This Open Access Brief presents the KAPSARC Global Energy Macroeconometric Model (KGEMM). KGEMM is a policy analysis tool for examining the impacts of domestic policy measures and global economic and energy shocks on the Kingdom of Saudi Arabia. The model has eight blocks (real sector, fiscal, monetary, external sector, price, labor and wages, energy, and population and age cohorts) that interact with each other to represent the Kingdom's macroeconomy and energy linkages. It captures New Keynesian demand-side features anchored to medium-run equilibrium and long-run aggregate supply. It applies a cointegration and equilibrium correction modeling (ECM) methodology to time series data to estimate the model's behavioral equations in the framework of Autometrics, a general-to-specific econometric modeling strategy. Hence, the model combines a 'theory-driven' approach with a 'data-driven' approach. The Brief begins with an introduction to the theoretical framework of the model and the KGEMM methodology and then walks the reader through the structure of the model and its behavioral equations. The book closes with simulations showing the application of the model. Providing a detailed introduction to a cutting-edge, robust predictive model, this Brief will be of great use to researchers and policymakers interested in macroeconomics, energy economics, econometrics, and, more specifically, the economy of Saudi Arabia.
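For readers new to the ECM framework that the Brief builds on, a generic single-equation equilibrium correction specification (an illustrative textbook form, not one of KGEMM's actual behavioral equations) can be written as

\[
\Delta y_t = \alpha\,(y_{t-1} - \beta x_{t-1}) + \sum_{i=1}^{p} \gamma_i \Delta y_{t-i} + \sum_{j=0}^{q} \delta_j \Delta x_{t-j} + \varepsilon_t ,
\]

where the term in parentheses is the deviation from the long-run (cointegrating) relationship and a negative \(\alpha\) measures the speed of adjustment back toward equilibrium.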
The past twenty years have seen an extraordinary growth in the use of quantitative methods in financial markets. Finance professionals now routinely use sophisticated statistical techniques in portfolio management, proprietary trading, risk management, financial consulting, and securities regulation. This graduate-level textbook is intended for PhD students, advanced MBA students, and industry professionals interested in the econometrics of financial modeling. The book covers the entire spectrum of empirical finance, including: the predictability of asset returns, tests of the Random Walk Hypothesis, the microstructure of securities markets, event analysis, the Capital Asset Pricing Model and the Arbitrage Pricing Theory, the term structure of interest rates, dynamic models of economic equilibrium, and nonlinear financial models such as ARCH, neural networks, statistical fractals, and chaos theory. Each chapter develops statistical techniques within the context of a particular financial application. This exciting new text contains a unique and accessible combination of theory and practice, bringing state-of-the-art statistical techniques to the forefront of financial applications. Each chapter also includes a discussion of recent empirical evidence, for example, the rejection of the Random Walk Hypothesis, as well as problems designed to help readers incorporate what they have read into their own applications.
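To give a flavor of the random-walk tests discussed in the book, the sketch below computes a simple variance-ratio statistic in Python. It is an illustrative implementation only, without the finite-sample corrections and standard errors developed in the text, and the function and variable names are made up.

```python
import numpy as np

def variance_ratio(returns, q):
    """Simple variance-ratio statistic: under a random walk with
    uncorrelated increments, VR(q) should be close to 1."""
    r = np.asarray(returns, dtype=float)
    r = r - r.mean()                                # work with demeaned returns
    var_1 = np.var(r, ddof=1)                       # variance of 1-period returns
    rq = np.convolve(r, np.ones(q), mode="valid")   # overlapping q-period sums
    var_q = np.var(rq, ddof=1)                      # variance of q-period returns
    return var_q / (q * var_1)

# Toy check: i.i.d. returns should give a ratio near 1.
rng = np.random.default_rng(0)
print(variance_ratio(rng.normal(size=2000), q=5))
```

Values well above or below one point to positive or negative serial dependence, which is the kind of evidence against the Random Walk Hypothesis the book examines.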
Stochastic Limit Theory, published in 1994, has become a standard reference in its field. Now reissued in a new edition, offering updated and improved results and an extended range of topics, Davidson surveys asymptotic (large-sample) distribution theory with applications to econometrics, with particular emphasis on the problems of time dependence and heterogeneity. The book is designed to be useful on two levels. First, as a textbook and reference work, giving definitions of the relevant mathematical concepts, statements, and proofs of the important results from the probability literature, and numerous examples; and second, as an account of recent work in the field of particular interest to econometricians. It is virtually self-contained, with all but the most basic technical prerequisites being explained in their context; mathematical topics include measure theory, integration, metric spaces, and topology, with applications to random variables, and an extended treatment of conditional probability. Other subjects treated include: stochastic processes, mixing processes, martingales, mixingales, and near-epoch dependence; the weak and strong laws of large numbers; weak convergence; and central limit theorems for nonstationary and dependent processes. The functional central limit theorem and its ramifications are covered in detail, including an account of the theoretical underpinnings (the weak convergence of measures on metric spaces), Brownian motion, the multivariate invariance principle, and convergence to stochastic integrals. This material is of special relevance to the theory of cointegration. The new edition gives updated and improved versions of many of the results and extends the coverage of many topics, in particular the theory of convergence to alpha-stable limits of processes with infinite variance.
Continuous-Time Models in Corporate Finance synthesizes four decades of research to show how stochastic calculus can be used in corporate finance. Combining mathematical rigor with economic intuition, Santiago Moreno-Bromberg and Jean-Charles Rochet analyze corporate decisions such as dividend distribution, the issuance of securities, and capital structure and default. They pay particular attention to financial intermediaries, including banks and insurance companies. The authors begin by recalling the ways that option-pricing techniques can be employed for the pricing of corporate debt and equity. They then present the dynamic model of the trade-off between taxes and bankruptcy costs and derive implications for optimal capital structure. The core chapter introduces the workhorse liquidity-management model--where liquidity and risk management decisions are made in order to minimize the costs of external finance. This model is used to study corporate finance decisions and specific features of banks and insurance companies. The book concludes by presenting the dynamic agency model, where financial frictions stem from the lack of interest alignment between a firm's manager and its financiers. The appendix contains an overview of the main mathematical tools used throughout the book. Requiring some familiarity with stochastic calculus methods, Continuous-Time Models in Corporate Finance will be useful for students, researchers, and professionals who want to develop dynamic models of firms' financial decisions.
This book covers the econometric methods necessary for a practicing applied economist or data analyst, which requires both an understanding of statistical theory and of how it is used in actual applications. Chapters 1 to 9 present the material concerned with basic statistical theory. Chapters 10 to 13 introduce a number of topics which form the basis of more advanced option modules, such as time series methods in applied econometrics. To get the most out of these topics, companion files include Excel datasets and four-color figures; the datasets include pull-down menus to graph the data, calculate sample statistics, and estimate regression equations. FEATURES: integration of econometric methods with statistical foundations; worked examples of all models considered in the text; Excel datasheets to facilitate estimation and application of models; instructor ancillaries for use as a textbook.
This brief addresses the estimation of quantile regression models from a practical perspective, which will support researchers who need to use conditional quantile regression to measure economic relationships among a set of variables. It will also benefit students using the methodology for the first time, and practitioners at private or public organizations who are interested in modeling different fragments of the conditional distribution of a given variable. The book pursues a practical approach with reference to energy markets, helping readers learn the main features of the technique more quickly. Emphasis is placed on the implementation details and the correct interpretation of the quantile regression coefficients rather than on the technicalities of the method, unlike the approach used in the majority of the literature. All applications are illustrated with R.
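The Brief's applications are illustrated with R; purely to show the mechanics of interpreting coefficients across quantiles, the Python sketch below estimates the same kind of conditional-quantile regressions with statsmodels on simulated data. The variable names "load" and "price" are hypothetical and not taken from the book.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated example: the effect of 'load' on 'price' differs across the
# conditional distribution because the noise spreads out as load rises.
rng = np.random.default_rng(1)
n = 1000
load = rng.uniform(0, 10, n)
price = 2.0 + 0.5 * load + rng.gamma(shape=2.0, scale=1.0 + 0.3 * load, size=n)

X = sm.add_constant(pd.DataFrame({"load": load}))
for q in (0.1, 0.5, 0.9):
    res = sm.QuantReg(price, X).fit(q=q)
    print(f"tau={q}: load coefficient = {res.params['load']:.3f}")
```

Comparing the slope at the 0.1, 0.5, and 0.9 quantiles is exactly the kind of interpretation the Brief emphasizes: the effect in the upper tail of the conditional distribution can differ sharply from the median effect.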
This book unifies and extends the definition and measurement of economic efficiency and its use as a real-life benchmarking technique for actual organizations. Analytically, the book relies on the economic theory of duality as a guiding framework. Empirically, it shows how the alternative models can be implemented by way of Data Envelopment Analysis. Accompanying software programmed in the open-source Julia language is used to solve the models. The package is a self-contained set of functions that can be used for individual learning and instruction. The source code, associated documentation, and replication notebooks are available online. The book discusses the concept of economic efficiency at the firm level, comparing observed to optimal economic performance, and its decomposition according to technical and allocative criteria. Depending on the underlying technical efficiency measure, economic efficiency can be decomposed multiplicatively or additively. Part I of the book deals with the classic multiplicative approach that decomposes cost and revenue efficiency based on radial distance functions. Subsequently, the book examines how these partial approaches can be expanded to the notion of profitability efficiency, considering both the input and output dimensions of the firm, and relying on the generalized distance function for the measurement of technical efficiency. Part II is devoted to the recent additive framework related to the decomposition of economic inefficiency defined in terms of cost, revenue, and profit. The book presents economic models for the Russell and enhanced graph Russell measures, the weighted additive distance function, the directional distance function, the modified directional distance function, and the Hölder distance function. Each model is presented in a separate chapter. New approaches that qualify and generalize previous results are also introduced in the last chapters, including the reverse directional distance function and the general direct approach. The book concludes by highlighting the importance of benchmarking economic efficiency for all business stakeholders and recalling the main conclusions obtained from many years of research on this topic. The book offers different alternatives to measure economic efficiency based on a set of desirable properties and advises on the choice of specific economic efficiency models.
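The book's accompanying package is written in Julia; as a rough, language-agnostic illustration of the radial input-oriented measure that underlies the multiplicative approach of Part I, the sketch below solves the basic constant-returns envelopment LP with SciPy. The data, function name, and DMU labels are invented for the example and are not the book's software.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Radial (Farrell) input-oriented technical efficiency of DMU `o`
    under constant returns to scale, via the envelopment-form LP:
        min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0.
    X is (m inputs x n DMUs); Y is (s outputs x n DMUs)."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.r_[1.0, np.zeros(n)]                    # decision vector: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [o]], X])              # X @ lam - theta * x_o <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y])      # -Y @ lam <= -y_o
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# Three hypothetical DMUs, two inputs, one output.
X = np.array([[2.0, 4.0, 8.0],
              [4.0, 2.0, 8.0]])
Y = np.array([[1.0, 1.0, 1.0]])
for o in range(3):
    print(f"DMU {o}: theta = {ccr_input_efficiency(X, Y, o):.3f}")
```

In this toy data the third unit is dominated by a convex combination of the first two, so its Farrell input efficiency is well below one, while the first two units lie on the frontier.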
Handbook of Econometrics, Volume 7A, examines recent advances in foundational issues and "hot" topics within econometrics, such as inference for moment inequalities and estimation of high dimensional models. With its world-class editors and contributors, it succeeds in unifying leading studies of economic models, mathematical statistics and economic data. Our flourishing ability to address empirical problems in economics by using economic theory and statistical methods has driven the field of econometrics to unimaginable places. By designing methods of inference from data based on models of human choice behavior and social interactions, econometricians have created new subfields now sufficiently mature to require sophisticated literature summaries.
In this Element and its accompanying second Element, A Practical Introduction to Regression Discontinuity Designs: Extensions, Matias Cattaneo, Nicolas Idrobo, and Rocío Titiunik provide an accessible and practical guide for the analysis and interpretation of regression discontinuity (RD) designs that encourages the use of a common set of practices and facilitates the accumulation of RD-based empirical evidence. In this Element, the authors discuss the foundations of the canonical Sharp RD design, which has the following features: (i) the score is continuously distributed and has only one dimension, (ii) there is only one cutoff, and (iii) compliance with the treatment assignment is perfect. In the second Element, the authors discuss practical and conceptual extensions to this basic RD setup.
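As a purely illustrative sketch of the Sharp RD logic described above, the Python snippet below fits a naive local-linear regression on each side of the cutoff and takes the difference of the fitted values at the cutoff. The authors' recommended practice relies on data-driven bandwidth selection and robust bias-corrected inference (implemented, for example, in their rdrobust software); the fixed bandwidth, function name, and simulated data here are assumptions made only for the illustration.

```python
import numpy as np
import statsmodels.api as sm

def sharp_rd_estimate(score, outcome, cutoff=0.0, bandwidth=1.0):
    """Naive local-linear Sharp RD estimate: weighted linear fits of the
    outcome on the centered score within `bandwidth` on each side of the
    cutoff; the effect is the difference of intercepts at the cutoff."""
    x = np.asarray(score, dtype=float) - cutoff
    y = np.asarray(outcome, dtype=float)

    def side_fit(mask):
        xs, ys = x[mask], y[mask]
        w = np.clip(1.0 - np.abs(xs) / bandwidth, 0.0, None)  # triangular kernel
        X = sm.add_constant(xs)
        return sm.WLS(ys, X, weights=w).fit().params[0]        # value at the cutoff

    inside = np.abs(x) <= bandwidth
    above = inside & (x >= 0)
    below = inside & (x < 0)
    return side_fit(above) - side_fit(below)

# Toy data with a jump of 2 at the cutoff.
rng = np.random.default_rng(2)
score = rng.uniform(-1, 1, 3000)
outcome = 1.0 + 0.8 * score + 2.0 * (score >= 0) + rng.normal(0, 0.5, 3000)
print(sharp_rd_estimate(score, outcome, cutoff=0.0, bandwidth=0.5))
```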
Building on the success of Abadir and Magnus' Matrix Algebra in the Econometric Exercises Series, Statistics serves as a bridge between elementary and specialized statistics. Professors Abadir, Heijmans, and Magnus freely use matrix algebra to cover intermediate to advanced material. Each chapter contains a general introduction, followed by a series of connected exercises which build up knowledge systematically. The characteristic feature of the book (and indeed the series) is that all exercises are fully solved. The authors present many new proofs of established results, along with new results, often involving shortcuts that resort to statistical conditioning arguments.
In the last 20 years, econometric theory on panel data has developed rapidly, particularly for analyzing common behaviors among individuals over time. Meanwhile, the statistical methods employed by applied researchers have not kept pace. This book attempts to fill this gap by teaching researchers how to use the latest panel estimation methods correctly. Almost all applied economics articles use panel data or panel regressions. However, many empirical results from typical panel data analyses are not correctly executed. This book aims to help applied researchers run panel regressions correctly and avoid common mistakes. The book explains how to model cross-sectional dependence, how to estimate a few key common variables, and how to identify them. It also provides guidance on how to separate out the long-run relationship and the common dynamic and idiosyncratic dynamic relationships from a set of panel data. Aimed at applied researchers who want to learn about panel data econometrics by running statistical software, this book provides clear guidance and is supported by a full range of online teaching and learning materials. It includes practice sections on MATLAB, STATA, and GAUSS throughout, along with short and simple econometric theories on basic panel regressions for those who are unfamiliar with econometric theory on traditional panel regressions.
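As a minimal sketch of one common way to handle cross-sectional dependence through unobserved common factors, the Python snippet below implements a Pesaran-style common correlated effects mean-group estimator on simulated data: each unit's regression is augmented with the cross-sectional averages of the dependent and explanatory variables. The book's estimators and inference go well beyond this toy version; the function and data here are assumptions made for illustration only.

```python
import numpy as np

def cce_mean_group(y, x):
    """Common correlated effects mean-group estimator (sketch): augment each
    unit's regression with the period-by-period cross-sectional averages of
    y and x to proxy common factors, then average the unit-specific slopes.
    y and x are (N units x T periods) arrays."""
    ybar, xbar = y.mean(axis=0), x.mean(axis=0)    # cross-sectional averages
    slopes = []
    for i in range(y.shape[0]):
        Z = np.column_stack([np.ones_like(x[i]), x[i], ybar, xbar])
        coef, *_ = np.linalg.lstsq(Z, y[i], rcond=None)
        slopes.append(coef[1])                     # slope on the unit's own x
    return np.mean(slopes)

# Toy panel with a common factor loading differently on each unit.
rng = np.random.default_rng(3)
N, T = 50, 100
factor = rng.normal(size=T)
x = rng.normal(size=(N, T)) + np.outer(rng.normal(size=N), factor)
y = 1.5 * x + np.outer(rng.normal(size=N), factor) + rng.normal(size=(N, T))
print(cce_mean_group(y, x))   # should be close to the true slope of 1.5
```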
Applied Econometrics: A Practical Guide is an extremely user-friendly and application-focused book on econometrics. Unlike many econometrics textbooks that are heavily theoretical and abstract, this book is perfect for beginners and promises simplicity and practicality in the understanding of econometric models. Written in an easy-to-read manner, the book begins with hypothesis testing and moves on to simple and multiple regression models. It also includes advanced topics: endogeneity and two-stage least squares, simultaneous equations models, panel data models, qualitative and limited dependent variable models, vector autoregressive (VAR) models, autocorrelation and ARCH/GARCH models, and unit roots and cointegration. The book also illustrates the use of computer software (EViews, SAS, and R) for economic estimation and modeling. Its practical applications make the book an instrumental, go-to guide for a solid foundation in the fundamentals of econometrics. In addition, this book includes excerpts from relevant articles published in top-tier academic journals. This integration of published articles helps readers understand how econometric models are applied to real-world use cases.
Stata is one of the most popular statistical software packages in the world and is suited for all kinds of users, from absolute beginners to experienced veterans. This book offers a clear and concise introduction to the usage and workflow of Stata. Topics covered include importing and managing datasets, cleaning and preparing data, creating and manipulating variables, and producing descriptive statistics and meaningful graphs, as well as central quantitative methods such as linear (OLS) and binary logistic regressions and matching. Additional information about diagnostic tests ensures that these methods yield valid and correct results that live up to academic standards. Furthermore, users are instructed how to export results that can be used directly in popular software like Microsoft Word for seminar papers and publications. The book also offers a short yet focused introduction to scientific writing, which should guide readers through the process of writing a first quantitative seminar paper or research report. It underlines correct usage of the software and a productive workflow, and it introduces aspects like replicability and general standards for academic writing. While absolute beginners will enjoy the easy-to-follow point-and-click interface, more experienced users will benefit from the information about do-files and syntax that makes Stata so popular. Lastly, a wide range of user-contributed software ("Ados") is introduced, which further improves the general workflow and guarantees the availability of state-of-the-art statistical methods.
This important new dictionary - the first of its kind, now available in paperback - presents an accessible source of reference on the main concepts and techniques in econometrics. Featuring entries on all the major areas in theoretical econometrics, the dictionary will be used by students, both undergraduate and postgraduate, to aid their understanding of the subject. Arranged in alphabetical order, each entry is a short essay designed to present the essential points of a particular concept or technique and offer a concise guide to other relevant literature. Written in an accessible and discursive style, the book adopts non-technical language to make the topics accessible to those who need to know more about applied econometrics and the underlying econometric theory. It will be widely welcomed as an indispensable supplement to the standard textbook literature and will be particularly well suited to students following modular courses. An essential source of reference for both undergraduate and postgraduate students, the dictionary will also be useful for professional economists seeking to keep abreast of the latest developments in econometrics.
High-dimensional probability offers insight into the behavior of random vectors, random matrices, random subspaces, and objects used to quantify uncertainty in high dimensions. Drawing on ideas from probability, analysis, and geometry, it lends itself to applications in mathematics, statistics, theoretical computer science, signal processing, optimization, and more. This book is the first to integrate theory, key tools, and modern applications of high-dimensional probability. Concentration inequalities form the core, and the book covers both classical results, such as Hoeffding's and Chernoff's inequalities, and modern developments, such as the matrix Bernstein inequality. It then introduces the powerful methods based on stochastic processes, including such tools as Slepian's, Sudakov's, and Dudley's inequalities, as well as generic chaining and bounds based on VC dimension. A broad range of illustrations is embedded throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.
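As a reminder of the classical result referenced here (the book also develops sub-Gaussian, matrix, and other extensions), Hoeffding's inequality for independent random variables \(X_1, \dots, X_n\) with \(a_i \le X_i \le b_i\) states that

\[
\mathbb{P}\!\left( \left| \sum_{i=1}^{n} \bigl(X_i - \mathbb{E}X_i\bigr) \right| \ge t \right) \le 2 \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right),
\]

i.e., sums of bounded independent variables concentrate around their mean with sub-Gaussian tails.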
The last decade has brought dramatic changes in the way that researchers analyze economic and financial time series. This book synthesizes these recent advances and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. "Time Series Analysis" fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers.
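As a small, self-contained taste of the unit-root material (an illustration using statsmodels, not code from the book), the snippet below runs an augmented Dickey-Fuller test on a simulated random walk, where the test should fail to reject the unit-root null.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# Simulated random walk: y_t = y_{t-1} + e_t.
rng = np.random.default_rng(4)
y = np.cumsum(rng.normal(size=500))

stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, autolag="AIC")
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")  # large p-value expected
```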
Reflecting the fast pace and ever-evolving nature of the financial industry, the Handbook of High-Frequency Trading and Modeling in Finance details how high-frequency analysis presents new systematic approaches to implementing quantitative activities with high-frequency financial data. Introducing new and established mathematical foundations necessary to analyze realistic market models and scenarios, the handbook begins with a presentation of the dynamics and complexity of futures and derivatives markets as well as a portfolio optimization problem using quantum computers. Subsequently, the handbook addresses estimating complex model parameters using high-frequency data. Finally, the handbook focuses on the links between models used in financial markets and models used in other research areas such as geophysics, fossil records, and earthquake studies. The Handbook of High-Frequency Trading and Modeling in Finance also features: contributions by well-known experts within the academic, industrial, and regulatory fields; a well-structured outline of the various data analysis methodologies used to identify new trading opportunities; newly emerging quantitative tools that address growing concerns relating to high-frequency data, such as stochastic volatility and volatility tracking, stochastic jump processes for limit-order books and broader market indicators, and options markets; and practical applications using real-world data to help readers better understand the presented material. The Handbook of High-Frequency Trading and Modeling in Finance is an excellent reference for professionals in the fields of business, applied statistics, econometrics, and financial engineering. The handbook is also a good supplement for graduate and MBA-level courses on quantitative finance, volatility, and financial econometrics. Ionut Florescu, PhD, is Research Associate Professor in Financial Engineering and Director of the Hanlon Financial Systems Laboratory at Stevens Institute of Technology. His research interests include stochastic volatility, stochastic partial differential equations, Monte Carlo methods, and numerical methods for stochastic processes. Dr. Florescu is the author of Probability and Stochastic Processes, the coauthor of Handbook of Probability, and the coeditor of Handbook of Modeling High-Frequency Data in Finance, all published by Wiley. Maria C. Mariani, PhD, is Shigeko K. Chan Distinguished Professor in Mathematical Sciences and Chair of the Department of Mathematical Sciences at The University of Texas at El Paso. Her research interests include mathematical finance, applied mathematics, geophysics, and nonlinear and stochastic partial differential equations and numerical methods. Dr. Mariani is the coeditor of Handbook of Modeling High-Frequency Data in Finance, also published by Wiley. H. Eugene Stanley, PhD, is William Fairfield Warren Distinguished Professor at Boston University. Stanley is one of the key founders of the new interdisciplinary field of econophysics and has an ISI Hirsch index H=128 based on more than 1200 papers. In 2004 he was elected to the National Academy of Sciences. Frederi G. Viens, PhD, is Professor of Statistics and Mathematics and Director of the Computational Finance Program at Purdue University. He holds more than two dozen local, regional, and national awards, and he travels extensively worldwide to deliver lectures on his research interests, which range from quantitative finance to climate science and agricultural economics. A Fellow of the Institute of Mathematical Statistics, Dr. Viens is the coeditor of Handbook of Modeling High-Frequency Data in Finance, also published by Wiley.
Leonid Hurwicz (1917-2008) was a major figure in modern theoretical economics whose contributions over sixty-five years spanned at least five areas: econometrics, nonlinear programming, decision theory, microeconomic theory, and mechanism design. In 2007, at age ninety, he received the Nobel Memorial Prize in Economics (shared with Eric Maskin and Roger Myerson) for pioneering the field of mechanism design and incentive compatibility. Hurwicz made seminal contributions in the other areas as well. In nonlinear programming, he contributed to the understanding of Lagrange-Kuhn-Tucker problems (along with co-authors Kenneth Arrow and Hirofumi Uzawa). In econometrics, the Hurwicz bias in the least-squares analysis of time series is a fundamental and commonly cited benchmark. In decision theory, the Hurwicz criterion for decision-making under ambiguity is routinely invoked, sometimes without a citation since his original paper was never published. In microeconomic theory, Hurwicz (along with Arrow and H.D. Block) initiated the study of stability of the market mechanism, and (with Uzawa) solved the classic integrability of demand problem, a core result in neoclassical consumer theory. While some of Hurwicz's works were published in journals, many remain scattered as chapters in books that are difficult to access, and others were never published at all. The Collected Papers of Leonid Hurwicz is the first volume in a series of four that will bring his oeuvre together in one place, bringing to light the totality of his intellectual output and documenting his contribution to economics and the extent of his legacy, with the express purpose of making it easily available for future generations of researchers to build upon.