The first part of this book discusses institutions and mechanisms of algorithmic trading, market microstructure, high-frequency data and stylized facts, time and event aggregation, order book dynamics, trading strategies and algorithms, transaction costs, market impact and execution strategies, and risk analysis and management. The second part covers market impact models, network models, multi-asset trading, machine learning techniques, and nonlinear filtering. The third part discusses electronic market making, liquidity, systemic risk, and recent developments and debates on the subject.
Financial, Macro and Micro Econometrics Using R, Volume 42, provides state-of-the-art information on important topics in econometrics, including multivariate GARCH, stochastic frontiers, fractional responses, specification testing and model selection, exogeneity testing, causal analysis and forecasting, GMM models, asset bubbles and crises, corporate investments, classification, forecasting, nonstandard problems, cointegration, financial market jumps and co-jumps, among other topics.
First published in 1992, The Efficiency of New Issue Markets provides a comprehensive overview of under-pricing and, through this, assesses the efficiency of new issue markets. The book further develops the adverse selection model of the new issue market and addresses the hypothesis that the method of distribution of new issues has an important bearing on the efficiency of these markets. To this end, the book tests the efficiency of the Offer for Sale new issue market, demonstrating the validity of the adverse selection model and contradicting the monopsony power hypothesis. It then examines the relative efficiency of the new issue markets, demonstrating the importance of distribution in determining relative efficiency.
This book is intended as an introduction to game theory that goes beyond its main field of application, economics, and that introduces the reader to as many different sides of game theory as possible within the limits of an introduction. The main goal is to give an impression of the diversity of game-theoretic models while covering the standard topics. The book gives equal coverage to non-cooperative and cooperative games, and it covers topics such as selecting Nash equilibria, non-transferable utility games, applications of game theory to logic, and combinatorial and differential games.
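Selecting Nash equilibria, mentioned above, can be made concrete with a short sketch: a brute-force search for pure-strategy Nash equilibria of a bimatrix game. This is a generic illustration, not an algorithm taken from the book, and the Prisoner's Dilemma payoffs in the usage note are an assumed example.

```python
import numpy as np

# Brute-force search for pure-strategy Nash equilibria of a bimatrix game:
# (i, j) is an equilibrium when i is a best response to column j
# (A[i, j] maximal in its column) and j is a best response to row i
# (B[i, j] maximal in its row).
def pure_nash(A, B):
    A, B = np.asarray(A), np.asarray(B)
    equilibria = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            if A[i, j] == A[:, j].max() and B[i, j] == B[i, :].max():
                equilibria.append((i, j))
    return equilibria
```

For the Prisoner's Dilemma with row payoffs [[-1, -3], [0, -2]] and column payoffs [[-1, 0], [-3, -2]], the only pure equilibrium is mutual defection, (1, 1).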
This book describes the activation functions frequently used in deep neural networks. For this purpose, 37 activation functions are explained both mathematically and visually, and each is given with its LaTeX implementation, reflecting their common use in scientific articles.
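For orientation, a few of the most common activation functions can be sketched in a handful of lines; the four shown here are standard examples and not a reproduction of the book's full catalog of 37.

```python
import numpy as np

# A few common neural-network activation functions, defined elementwise.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))      # squashes to (0, 1)

def relu(x):
    return np.maximum(0.0, x)            # zero for negative inputs

def tanh(x):
    return np.tanh(x)                    # squashes to (-1, 1)

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x) # small slope for negative inputs
```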
The volatility of financial returns changes over time and, for the last thirty years, Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models have provided the principal means of analyzing, modeling, and monitoring such changes. Taking into account that financial returns typically exhibit heavy tails (that is, extreme values can occur from time to time), Andrew Harvey's new book shows how a small but radical change in the way GARCH models are formulated leads to a resolution of many of the theoretical problems inherent in the statistical theory. The approach can also be applied to other aspects of volatility, such as those arising from data on the range of returns and the time between trades. Furthermore, the more general class of Dynamic Conditional Score models extends to robust modeling of outliers in the levels of time series and to the treatment of time-varying relationships. As such, there are applications not only to financial data but also to macroeconomic time series and to time series in other disciplines. The statistical theory draws on basic principles of maximum likelihood estimation and, by doing so, leads to an elegant and unified treatment of nonlinear time-series modeling. The practical value of the proposed models is illustrated by fitting them to real data sets.
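The classical GARCH(1,1) recursion that the book starts from can be simulated in a few lines; the parameter values below are illustrative assumptions. (The Dynamic Conditional Score models the book advocates instead drive the variance with the score of a heavy-tailed conditional density, which is not implemented in this sketch.)

```python
import numpy as np

# Minimal GARCH(1,1) simulation:
#   sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
#   r_t = sqrt(sigma2_t) * z_t,  z_t ~ N(0, 1)
# Parameters are illustrative; alpha + beta < 1 ensures a finite
# unconditional variance, used here as the starting value.
def simulate_garch(n, omega=0.1, alpha=0.05, beta=0.9, seed=0):
    rng = np.random.default_rng(seed)
    sigma2 = np.empty(n)
    r = np.empty(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance
    for t in range(n):
        if t > 0:
            sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2
```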
The Handbook of U.S. Labor Statistics is recognized as an authoritative resource on the U.S. labor force. It continues and enhances the Bureau of Labor Statistics' (BLS) discontinued publication, Labor Statistics. It allows the user to understand recent developments as well as to compare today's economy with past history. This edition includes new tables on occupational safety and health and on income in the United States. The Handbook is a comprehensive reference providing an abundance of data on a variety of topics, including: employment and unemployment; earnings; prices; productivity; consumer expenditures; occupational safety and health; union membership; the working poor; and much more. In addition to over 215 tables that present practical data, the Handbook provides: introductory material for each chapter that highlights salient data, with figures that call attention to noteworthy trends; notes and definitions, which contain concise descriptions of the data sources, concepts, definitions, and methodology from which the data are derived; and references to more comprehensive reports which provide additional data and more extensive descriptions of estimation methods, sampling, and reliability measures.
Suitable for statisticians, mathematicians, actuaries, and students interested in the problems of insurance and analysis of lifetimes, Statistical Methods with Applications to Demography and Life Insurance presents contemporary statistical techniques for analyzing life distributions and life insurance problems. It not only contains traditional material but also incorporates new problems and techniques not discussed in existing actuarial literature. The book mainly focuses on the analysis of an individual life and describes statistical methods based on empirical and related processes. Coverage ranges from analyzing the tails of distributions of lifetimes to modeling population dynamics with migrations. To help readers understand the technical points, the text covers topics such as the Stieltjes, Wiener, and Ito integrals. It also introduces other themes of interest in demography, including mixtures of distributions, analysis of longevity and extreme value theory, and the age structure of a population. In addition, the author discusses net premiums for various insurance policies. Mathematical statements are carefully and clearly formulated and proved while avoiding excessive technicalities as much as possible. The book illustrates how these statements help solve numerous statistical problems. It also includes more than 70 exercises.
This volume of Advances in Econometrics contains a selection of papers presented at the "Econometrics of Complex Survey Data: Theory and Applications" conference organized by the Bank of Canada, Ottawa, Canada, from October 19-20, 2017. The papers included in this volume span a range of methodological and practical topics including survey collection comparisons, imputation mechanisms, the bootstrap, nonparametric techniques, specification tests, and empirical likelihood estimation using complex survey data. For academics and students with an interest in econometrics and the ways in which complex survey data can be used and evaluated, this volume is essential.
This book explores the possibility of using social media data for detecting socio-economic recovery activities. In the last decade, there have been intensive research activities focusing on social media during and after disasters. This approach, which views people's communication on social media as a sensor for real-time situations, has been widely adopted as the "people as sensor" approach. Furthermore, to improve recovery efforts after large-scale disasters, detecting communities' real-time recovery situations is essential, since conventional socio-economic recovery indicators, such as governmental statistics, are not published in real time. Thanks to its timeliness, using social media data can fill the gap. Motivated by this possibility, this book especially focuses on the relationships between people's communication on Twitter and Facebook pages, and socio-economic recovery activities as reflected in the used-car market data and the housing market data in the case of two major disasters: the Great East Japan Earthquake and Tsunami of 2011 and Hurricane Sandy in 2012. The book pursues an interdisciplinary approach, combining e.g. disaster recovery studies, crisis informatics, and economics. In terms of its contributions, firstly, the book sheds light on the "people as sensors" approach for detecting socio-economic recovery activities, which has not been thoroughly studied to date but has the potential to improve situation awareness during the recovery phase. Secondly, the book proposes new socio-economic recovery indicators: used-car market data and housing market data. Thirdly, in the context of using social media during the recovery phase, the results demonstrate the importance of distinguishing between social media data posted both by people who are at or near disaster-stricken areas and by those who are farther away.
"It's the economy, stupid," as Democratic strategist James Carville would say. After many years of study, Ray C. Fair has found that the state of the economy has a dominant influence on national elections. Just in time for the 2012 presidential election, this new edition of his classic text, "Predicting Presidential Elections and Other Things," provides us with a look into the likely future of our nation's political landscape--but Fair doesn't stop there.
Valuable software, realistic examples, clear writing, and fascinating topics help you master key spreadsheet and business analytics skills with SPREADSHEET MODELING AND DECISION ANALYSIS, 8E. You'll find everything you need to become proficient in today's most widely used business analytics techniques using Microsoft® Office Excel® 2016. Author Cliff Ragsdale -- a respected innovator in business analytics -- guides you through the skills you need, using the latest Excel® for Windows. You gain the confidence to apply what you learn to real business situations with step-by-step instructions and annotated screen images that make examples easy to follow. The World of Management Science sections further demonstrate how each topic applies to a real company. Each new edition includes extended trial licenses for Analytic Solver Platform and XLMiner, with powerful simulation and optimization tools for descriptive and prescriptive analytics and a full suite of tools for data mining in Excel.
Originally published in 1978. This book is designed to enable students on main courses in economics to comprehend literature which employs econometric techniques as a method of analysis, to use econometric techniques themselves to test hypotheses about economic relationships and to understand some of the difficulties involved in interpreting results. While the book is mainly aimed at second-year undergraduates undertaking courses in applied economics, its scope is sufficiently wide to take in students at postgraduate level who have no background in econometrics - it integrates fully the mathematical and statistical techniques used in econometrics with micro- and macroeconomic case studies.
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics and stochastics, international macroeconomics, and financial econometrics. Financial Mathematics, Volatility and Covariance Modelling: Volume 2 provides a key repository on the current state of knowledge, the latest debates, and recent literature on financial mathematics, volatility, and covariance modelling. The first section is devoted to mathematical finance, stochastic modelling, and control optimization. Chapters explore the recent financial crisis and the increase of uncertainty and volatility, and propose an alternative approach to dealing with these issues. The second section covers financial volatility and covariance modelling and explores proposals for dealing with recent developments in financial econometrics. This book will be useful to students and researchers in applied econometrics, and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates, and relevant literature.
This report is a partial result of China's Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has been focusing on China's economic forecasts and macroeconomic policy analysis, and it started to develop the CQMM in 2005 for the purposes of short-term forecasting, policy analysis, and simulation. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019. The conference was jointly held by Xiamen University and the Economic Information Daily of Xinhua News Agency.
Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are familiar neither with time series analysis nor with state space methods. The only background required to understand the material presented in the book is a basic knowledge of classical linear regression models, of which a brief review is provided to refresh the reader's knowledge. A few sections assume familiarity with matrix algebra; however, these sections may be skipped without losing the flow of the exposition.
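The simplest unobserved components model of this kind is the local level model, which can be filtered with a short Kalman recursion; the variance parameters and diffuse starting values below are illustrative assumptions, not values from the book.

```python
import numpy as np

# Kalman filter for the local level model:
#   y_t = mu_t + eps_t,        eps_t ~ N(0, sigma_eps^2)
#   mu_{t+1} = mu_t + eta_t,   eta_t ~ N(0, sigma_eta^2)
# a and p track the state estimate and its variance; p0 large gives
# an approximately diffuse initialization.
def local_level_filter(y, sigma_eps2=1.0, sigma_eta2=0.1, a0=0.0, p0=1e7):
    a, p = a0, p0
    filtered = []
    for obs in y:
        f = p + sigma_eps2               # prediction-error variance
        k = p / f                        # Kalman gain
        a = a + k * (obs - a)            # update state with prediction error
        p = p * (1.0 - k) + sigma_eta2   # filtered variance plus state noise
        filtered.append(a)
    return np.array(filtered)
```

On a constant series the filtered level converges to that constant, which is a quick sanity check on the recursion.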
Over the last decade, dynamical systems theory and related nonlinear methods have had a major impact on the analysis of time series data from complex systems. Recent developments in mathematical methods of state-space reconstruction, time-delay embedding, and surrogate data analysis, coupled with readily accessible and powerful computational facilities used in gathering and processing massive quantities of high-frequency data, have provided theorists and practitioners unparalleled opportunities for exploratory data analysis, modelling, forecasting, and control.
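Time-delay embedding, one of the reconstruction methods named above, is compact enough to sketch directly; the embedding dimension m and delay tau below are illustrative choices (in practice they are selected with criteria such as false nearest neighbours and mutual information).

```python
import numpy as np

# Time-delay embedding: map a scalar series x_t into vectors
# (x_t, x_{t+tau}, ..., x_{t+(m-1)*tau}) for state-space reconstruction.
def delay_embed(x, m=3, tau=2):
    x = np.asarray(x)
    n = len(x) - (m - 1) * tau          # number of complete delay vectors
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
```

For example, embedding the series 0, 1, ..., 9 with m=3 and tau=2 yields six rows, the first being (0, 2, 4).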
The beginning of the age of artificial intelligence and machine learning has created new challenges and opportunities for data analysts, statisticians, mathematicians, econometricians, computer scientists and many others. At the root of these techniques are algorithms and methods for clustering and classifying different types of large datasets, including time series data. Time Series Clustering and Classification includes relevant developments on observation-based, feature-based and model-based traditional and fuzzy clustering methods, feature-based and model-based classification methods, and machine learning methods. It presents a broad and self-contained overview of techniques for both researchers and students. Features: provides an overview of the methods and applications of pattern recognition of time series; covers a wide range of techniques, including unsupervised and supervised approaches; includes a range of real examples from medicine, finance, environmental science, and more; R and MATLAB code and relevant data sets are available on a supplementary website.
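Feature-based clustering of time series, one of the families the book covers, can be sketched as: summarize each series by a few features, then cluster in feature space. The feature choice (lag-1 and lag-2 autocorrelations), the tiny k-means, and k=2 are all illustrative assumptions, not the book's prescription.

```python
import numpy as np

# Sample autocorrelation at a given lag, used as a clustering feature.
def acf(x, lag):
    xc = np.asarray(x, dtype=float)
    xc = xc - xc.mean()
    return (xc[:-lag] * xc[lag:]).sum() / (xc ** 2).sum()

def features(x):
    return np.array([acf(x, 1), acf(x, 2)])

# Minimal k-means: alternate nearest-center assignment and center update.
def kmeans(points, k=2, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)].copy()
    for _ in range(iters):
        d = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return labels
```

White-noise series (autocorrelations near zero) and random walks (autocorrelations near one) separate cleanly in this feature space, which is the point of the observation-versus-feature distinction the book draws.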
Microsimulation models provide an exciting new tool for analysing the distributional impact and cost of government policy changes. They can also be used to analyse the current or future structure of society. This volume contains papers describing new developments at the frontiers of microsimulation modelling, and it draws upon experience in a wide range of countries. Some papers aim to share with other modellers experience gained in designing and running microsimulation models and in their use in government policy formulation. They also examine issues at the frontiers of the discipline, such as how to include usage of health, education and welfare services in models. Other chapters focus upon describing the innovative new approaches being taken in dynamic microsimulation modelling. They describe some of the policy applications for which dynamic models are being used in Europe, Australia and New Zealand. Topics covered include retirement income modelling, pension reform, the behavioural impact of tax changes, child care demand, and the inclusion of government services within models. Attention is also given to validating the results of models and estimating their statistical reliability.
Pathwise Estimation and Inference for Diffusion Market Models discusses contemporary techniques for inferring, from options and bond prices, market participants' aggregate view of important financial parameters such as implied volatility, discount rate, future interest rate, and the uncertainty thereof. The focus is on pathwise inference methods that are applicable to a sole path of the observed prices and do not require the observation of an ensemble of such paths. The book is pitched at the level of senior undergraduate students undertaking research at honors year, and postgraduate candidates undertaking Master's or PhD degrees by research. From a research perspective, the book reaches out to academic researchers from backgrounds as diverse as mathematics and probability, econometrics and statistics, and computational mathematics and optimization, whose interests lie in the analysis and modelling of financial market data from a multi-disciplinary approach. Additionally, the book is aimed at financial market practitioners in capital-market-facing businesses who seek to keep abreast of, and draw inspiration from, novel approaches to market data analysis. The first two chapters of the book contain introductory material on stochastic analysis and the classical diffusion stock market models. The remaining chapters discuss more specialized stock and bond market models and special methods of pathwise inference of market parameters for different models. The final chapter describes applications of numerical methods of inference of bond market parameters to the forecasting of short rates. Nikolai Dokuchaev is an associate professor in Mathematics and Statistics at Curtin University. His research interests include mathematical and statistical finance, stochastic analysis, PDEs, control, and signal processing. Lin Yee Hin is a practitioner in the capital-market-facing industry. His research interests include econometrics, non-parametric regression, and scientific computing.
The papers collected in the two volumes of Nonlinear Models focus on the asymptotic theory of parameter estimators of nonlinear single-equation models and systems of nonlinear models, in particular weak and strong consistency, asymptotic normality, and parameter inference, for cross-sections as well as for time series. A selection of papers on testing for, and estimation and inference under, model misspecification is also included. The models under review are parametric, hence their functional form is assumed to be known up to a vector of unknown parameters, and the functional form involved is nonlinear in at least one of the parameters. The selection of earlier articles on nonlinear parametric models is extensive and, although they are not all equally influential, each has played a significant part in the development of the field. The more recent articles have been selected on the basis of their potential importance for the further development of this sphere of study.
Companion website materials: https://tzkeith.com/ Multiple Regression and Beyond offers a conceptually oriented introduction to multiple regression (MR) analysis and structural equation modeling (SEM), along with analyses that flow naturally from those methods. By focusing on the concepts and purposes of MR and related methods, rather than the derivation and calculation of formulae, this book introduces material to students more clearly and in a less threatening way. In addition to illuminating content necessary for coursework, the accessibility of this approach means students are more likely to be able to conduct research using MR or SEM--and more likely to use the methods wisely. This book: * Covers both MR and SEM, while explaining their relevance to one another * Includes path analysis, confirmatory factor analysis, and latent growth modeling * Makes extensive use of real-world research examples in the chapters and in the end-of-chapter exercises * Makes extensive use of figures and tables to provide examples and illustrate key concepts and techniques New to this edition: * New chapter on mediation, moderation, and common cause * New chapter on the analysis of interactions with latent variables and multilevel SEM * Expanded coverage of advanced SEM techniques in chapters 18 through 22 * International case studies and examples * Updated instructor and student online resources
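The multiple regression at the core of the book reduces computationally to ordinary least squares, which can be sketched in a few lines; the synthetic data in the test are illustrative, not an example from the book.

```python
import numpy as np

# Multiple regression via ordinary least squares: solve
# min_beta || X1 @ beta - y ||^2 with an intercept column prepended.
def ols(X, y):
    X1 = np.column_stack([np.ones(len(X)), X])     # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # least-squares solve
    return beta                                    # [intercept, slopes...]
```

On exactly linear data the recovered coefficients match the generating ones, a quick check that the intercept handling is right.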
Covering a broad range of topics, this text provides a comprehensive survey of the modeling of chaotic dynamics and complexity in the natural and social sciences. Its attention to models in both the physical and social sciences and the detailed philosophical approach make this a unique text in the midst of many current books on chaos and complexity. Including an extensive index and bibliography along with numerous examples and simplified models, this is an ideal course text.
Change of Time and Change of Measure provides a comprehensive account of two topics that are of particular significance in both theoretical and applied stochastics: random change of time and change of probability law. Random change of time is key to understanding the nature of various stochastic processes, and gives rise to interesting mathematical results and insights of importance for the modeling and interpretation of empirically observed dynamic processes. Change of probability law is a technique for solving central questions in mathematical finance, and also has a considerable role in insurance mathematics, large deviation theory, and other fields. The book comprehensively collects and integrates results from a number of scattered sources in the literature and discusses the importance of the results relative to the existing literature, particularly with regard to mathematical finance. In this Second Edition a Chapter 13 entitled 'A Wider View' has been added. This outlines some of the developments that have taken place in the area of Change of Time and Change of Measure since the publication of the First Edition. Most of these developments have their root in the study of the Statistical Theory of Turbulence rather than in Financial Mathematics and Econometrics, and they form part of the new research area termed 'Ambit Stochastics'.
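The change-of-probability-law technique the blurb refers to is classically illustrated by Girsanov's theorem, stated here for orientation (a standard result, not a reconstruction of the book's own notation): for a suitable process $\theta$,

```latex
\left.\frac{dQ}{dP}\right|_{\mathcal{F}_T}
  = \exp\!\left(-\int_0^T \theta_t \, dW_t
                - \tfrac{1}{2}\int_0^T \theta_t^2 \, dt\right),
\qquad
\widetilde{W}_t = W_t + \int_0^t \theta_s \, ds ,
```

under which $\widetilde{W}$ is a Brownian motion under $Q$. Random change of time is the complementary device of evaluating a process at an increasing random clock, $X_t = W_{T(t)}$.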
This handbook covers DEA topics that are extensively used and solidly based. The purpose of the handbook is to (1) describe and elucidate the state of the field and (2), where appropriate, extend the frontier of DEA research. It defines the state of the art of DEA methodology and its uses. This handbook is intended to represent a milestone in the progression of DEA. Written by experts who are generally major contributors to the topics covered, it includes a comprehensive review and discussion of basic DEA models, extensions to the basic DEA methods, and a collection of DEA applications in the areas of banking, engineering, health care, and services. The handbook's chapters are organized into two categories: (i) basic DEA models, concepts, and their extensions, and (ii) DEA applications. First-edition contributors have returned to update their work, and the second edition includes updated versions of selected first-edition chapters. New chapters have been added on: approaches that require no a priori choice of weights (called multipliers) reflecting meaningful trade-offs; construction of static and dynamic DEA technologies; the slacks-based model and its extensions; DEA models for DMUs that have internal structures; network DEA that can be used for measuring supply chain operations; and a selection of DEA applications in the service sector, with a focus on building a conceptual framework, research design, and interpreting results.
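The basic DEA model the handbook reviews, the input-oriented CCR model, is a linear program and can be sketched compactly; the toy data in the usage note are an assumption for illustration, not taken from the handbook.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA efficiency of DMU j via linear programming:
#   minimize theta  subject to  X @ lam <= theta * x_j,
#                               Y @ lam >= y_j,  lam >= 0,
# where X is (inputs x DMUs) and Y is (outputs x DMUs).
def ccr_efficiency(X, Y, j):
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # objective: minimize theta
    A_ub = np.block([[-X[:, [j]], X],             # X lam - theta x_j <= 0
                     [np.zeros((s, 1)), -Y]])     # -Y lam <= -y_j
    b_ub = np.r_[np.zeros(m), -Y[:, j]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun
```

With one input X = [[2, 4]] and one output Y = [[1, 1]], the first DMU is efficient (theta = 1) while the second, using twice the input for the same output, scores theta = 0.5.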