This book discusses the need to apply various regression techniques carefully and prudently in order to obtain their full benefits. It also describes some of the techniques developed and used by the authors, presenting their innovative ideas on the formulation and estimation of regression decomposition models, hidden Markov chains, the contribution of regressors in the set-theoretic approach, the calorie poverty rate, and the aggregate growth rate. Each of these techniques has applications that address a number of unanswered questions; for example, regression decomposition techniques reveal intra-household gender inequalities of consumption, intra-household allocation of resources and adult equivalent scales, while hidden Markov chain models can forecast the results of future elections. Most of these procedures are presented using real-world data, and the techniques can be applied in other similar situations. Showing how difficult questions can be answered by developing simple models with simple interpretations of parameters, the book is a valuable resource for students and researchers in the field of model building.
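The election-forecasting idea mentioned above can be illustrated with a plain (non-hidden) Markov chain; a hidden Markov model adds an observation layer on top of such a latent chain. The sketch below is not taken from the book, and its two states and transition probabilities are invented purely for illustration.

```python
import numpy as np

# Hypothetical two-state Markov chain for election outcomes:
# state 0 = incumbent party wins, state 1 = opposition wins.
# The transition probabilities are invented for illustration only.
P = np.array([[0.6, 0.4],
              [0.3, 0.7]])

def forecast(initial_state, steps, P):
    """Distribution over the two states after `steps` future elections."""
    dist = np.zeros(P.shape[0])
    dist[initial_state] = 1.0
    for _ in range(steps):
        dist = dist @ P          # propagate the state distribution one election ahead
    return dist

print(forecast(initial_state=0, steps=3, P=P))  # approx. [0.444 0.556]
```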
It is unlikely that any frontier of economics/econometrics is being pushed faster, or further, than that of computational techniques. The computer has become a tool for performing, as well as an environment in which to perform, economics and econometrics, taking over where theory bogs down and allowing at least approximate answers to questions that defy closed mathematical or analytical solutions. Tasks may now be attempted that were hitherto beyond human potential, and all the forces available can now be marshalled efficiently, leading to the achievement of desired goals. Computational Techniques for Econometrics and Economic Analysis is a collection of recent studies which exemplify all these elements, demonstrating the power that the computer brings to the economic analyst. The book is divided into four parts: 1 -- the computer and econometric methods; 2 -- the computer and economic analysis; 3 -- computational techniques for econometrics; and 4 -- the computer and econometric studies.
Many economic problems can be formulated as constrained optimization problems or as the equilibration of their solutions. Various mathematical theories have supplied economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.
This monograph deals with spatially dependent nonstationary time series in a way accessible to both time series econometricians wanting to understand spatial econometrics and spatial econometricians lacking a grounding in time series analysis. After charting key concepts in both time series and spatial econometrics, the book discusses how the spatial connectivity matrix can be estimated using spatial panel data instead of assuming it to be exogenously fixed. This is followed by a discussion of spatial nonstationarity in spatial cross-section data, and a full exposition of nonstationarity in both single and multi-equation contexts, including the estimation and simulation of spatial vector autoregression (VAR) models and spatial error correction (ECM) models. The book reviews the literature on panel unit root tests and panel cointegration tests for spatially independent data, and for data that are strongly spatially dependent. It provides for the first time critical values for panel unit root tests and panel cointegration tests when the spatial panel data are weakly or strongly spatially dependent. The volume concludes with a discussion of incorporating strong and weak spatial dependence in nonstationary panel data models. All discussions are accompanied by empirical testing based on spatial panel data of house prices in Israel.
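As a minimal illustration of the spatial connectivity matrix discussed above, the sketch below builds a small, exogenously fixed, row-normalized weight matrix and computes the spatial lag Wy; the contiguity pattern and price levels are hypothetical and are not drawn from the Israeli house-price panel used in the book.

```python
import numpy as np

# Hypothetical contiguity weight matrix for four spatial units (assumed, not estimated).
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
W /= W.sum(axis=1, keepdims=True)        # row-normalize: each row sums to one

y = np.array([10.0, 12.0, 9.0, 11.0])    # hypothetical house-price levels
spatial_lag = W @ y                       # the Wy term that enters spatial VAR/ECM models
print(spatial_lag)                        # [11.5  9.5 11.5  9.5]
```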
This book consists of four parts: I. Labour demand and supply, II. Productivity slowdown and innovative activity, III. Disequilibrium and business cycle analysis, and IV. Time series analysis of output and employment. It presents a fine selection of articles in the growing field of the empirical analysis of output and employment fluctuations, with applications in a micro-econometric or a time-series framework. The time-series literature has recently emphasized careful testing for stationarity and nonlinearity in the data, and the importance of cointegration theory. A substantial number of the papers make use of parametric and non-parametric methods developed in this literature and mostly connect their results to the hysteresis discussion about the existence of fragile equilibria. A second set of macro approaches uses the disequilibrium framework that has found so much interest in Europe in recent years. The remaining papers use newly developed methods for microdata, especially qualitative data or limited dependent variables, to study microeconomic models of behaviour that explain labour market and output decisions.
In the modern world of gigantic datasets, which scientists and practitioners in all fields of learning are confronted with, the availability of robust, scalable and easy-to-use methods for pattern recognition and data mining is of paramount importance, so as to be able to cope with the avalanche of data in a meaningful way. This concise and pedagogical research monograph introduces the reader to two specific aspects - clustering techniques and dimensionality reduction - in the context of complex network analysis. The first chapter provides a short introduction to relevant graph-theoretical notation; chapter 2 then reviews and compares a number of cluster definitions from different fields of science. In the subsequent chapters, a first-principles approach to graph clustering in complex networks is developed using methods from statistical physics, and the reader will learn that, even today, this field contributes significantly to the understanding and resolution of the related statistical inference issues. Finally, an application chapter examines real-world networks from the economic realm to show how the network clustering process can be used to deal with large, sparse datasets where conventional analyses fail.
A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument - an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this means important) at, say, the 0.05 level, but not the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by the experts, display the same terms and implement the same methodology. In short, there appears no visible alternative to the established methodology and no sign of reservations concerning its validity.
Testing for a Unit Root is now an essential part of time series analysis but the literature on the topic is so large that knowing where to start is difficult even for the specialist. This book provides a way into the techniques of unit root testing, explaining the pitfalls and nonstandard cases, using practical examples and simulation analysis.
Econometrics as an applied discipline attempts to use information in the most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and minimum discrepancy, are useful in several areas of statistical inference, e.g., Bayesian estimation, the expected maximum likelihood principle and fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: a critical survey of the uses of information theory in economics and econometrics; an integration of applied information theory and economic efficiency analysis; the development of a new economic hypothesis relating information theory to economic growth models; and an emphasis on new lines of research.
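A minimal sketch of the entropy measure underlying this approach (illustrative, not code from the book): the Shannon entropy of two hypothetical discrete distributions, showing that the uniform distribution attains the maximum.

```python
import numpy as np
from scipy.stats import entropy

# Shannon entropy (in bits) of two hypothetical distributions over four outcomes.
p_uniform = np.array([0.25, 0.25, 0.25, 0.25])
p_skewed = np.array([0.70, 0.15, 0.10, 0.05])

print(entropy(p_uniform, base=2))  # 2.0 bits: the maximum-entropy distribution
print(entropy(p_skewed, base=2))   # about 1.32 bits: more concentrated, less uncertain
```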
Written in honor of Emeritus Professor Georges Prat (University of Paris Nanterre, France), this book includes contributions from eminent authors on a range of topics that are of interest to researchers and graduates, as well as investors and portfolio managers. The topics discussed include the effects of information and transaction costs on informational and allocative market efficiency, bubbles and stock price dynamics, paradox of rational expectations and the principle of limited information, uncertainty and expectation hypotheses, oil price dynamics, and nonlinearity in asset price dynamics.
Capital theory is a cornerstone of modern economics. Its ideas are fundamental to dynamic equilibrium theory, and its concepts are applied in many branches of economics, such as game theory and resource and environmental economics, although this may not be recognized at first glance. In this monograph an approach is presented that allows important results of capital theory to be derived in a coherent and readily accessible framework. Special emphasis is given to infinite horizon and overlapping generations economies. The irreversibility of time, or the failure of the market system, appears in a different light if an infinite horizon framework is applied. To bridge the gap between pure and applied economic theory, the structure of our theoretical approach is integrated into a computable general equilibrium model.
This outstanding collection of William Brock's essays illustrates the power of dynamic modelling to shed light on the forces for stability and instability in economic systems. The articles selected reflect his best work and are indicative both of the type of policy problem that he finds challenging and the complex methodology that he uses to solve them. Also included is an introduction by Brock to his own work, which helps tie together the main aspects of his research to date. The volume covers: * stochastic models and optimal growth * financial and macroeconomic modelling * ecology, mechanism design and regulation * nonlinearity in economics.
In applications, and especially in mathematical finance, random time-dependent events are often modeled as stochastic processes. Assumptions are made about the structure of such processes, and serious researchers will want to justify those assumptions through the use of data. As statisticians are wont to say, "In God we trust; all others must bring data."
Stochastic Volatility in Financial Markets presents advanced topics in financial econometrics and theoretical finance, and is divided into three main parts. The first part aims at documenting an empirical regularity of financial price changes: the occurrence of sudden and persistent changes in financial market volatility. This phenomenon, technically termed 'stochastic volatility' or 'conditional heteroskedasticity', has been well known for at least 20 years; this part also uncovers further useful theoretical properties of conditionally heteroskedastic models. The second part goes beyond the statistical aspects of stochastic volatility models: it constructs and uses new, fully articulated, theoretically sound financial asset pricing models that allow for the presence of conditional heteroskedasticity. The third part shows how the inclusion of the statistical aspects of stochastic volatility in a rigorous economic scheme can be addressed from an empirical standpoint.
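The volatility clustering described above can be reproduced with a simulated discrete-time stochastic volatility model. The sketch below uses an AR(1) log-volatility specification with parameter values chosen purely for illustration; it is not one of the asset pricing models developed in the book.

```python
import numpy as np

rng = np.random.default_rng(0)
T, mu, phi, sigma_eta = 1000, -1.0, 0.95, 0.2   # illustrative parameter values

h = np.empty(T)                                  # latent log-volatility
r = np.empty(T)                                  # returns
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    r[t] = np.exp(h[t] / 2) * rng.standard_normal()

# Volatility clustering shows up as positive autocorrelation in squared returns.
sq = r[1:] ** 2
print(np.corrcoef(sq[:-1], sq[1:])[0, 1])
```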
What part does technological knowledge accumulation play in modern economic growth? This book investigates and examines the predictions of new growth theory, using OECD manufacturing data. Its empirical findings portray a novel and complex picture of the features of long-term growth, where technological knowledge production and diffusion play a central part, alongside variations in capital and employment. A parallel examination of long-run trade patterns and government policy issues completes a broader account of how knowledge-based growth in industrial output is at the heart of modern economic prosperity.
The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations.
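As one illustration of the kind of agent-based modelling such contributions undertake, the sketch below simulates a standard random wealth-exchange model from the econophysics literature; the number of agents, the trading rule and the parameters are assumptions made for illustration and are not a model taken from this volume.

```python
import numpy as np

rng = np.random.default_rng(5)
n_agents, n_trades = 1000, 200_000
wealth = np.ones(n_agents)                 # every agent starts with one unit of wealth

for _ in range(n_trades):
    i, j = rng.integers(n_agents, size=2)  # pick two agents at random
    if i == j:
        continue
    pot = wealth[i] + wealth[j]            # pool their wealth
    share = rng.random()                   # split the pool at a random point
    wealth[i], wealth[j] = share * pot, (1 - share) * pot

# The stationary wealth distribution of this toy model is approximately exponential.
print(wealth.mean(), np.median(wealth))
```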
The contributions in this volume, by leading economists from major universities in Europe and the USA, cover research at the front line of econometric analysis and labour market applications. The volume includes several papers on equilibrium search models (a relatively new field) and on job matching, seen from both a theoretical and an applied point of view. Methods for, and empirical analyses of, unemployment durations are also discussed. Finally, a large group of papers examines the structure and the dynamics of the labour market in a number of countries using panel data. This group includes papers on data quality and policy evaluation. The high unemployment in most countries makes it necessary to come up with studies and methods for analysing the impact of different elements of economic policies. This volume is intended to contribute to further development in the use of panel data in economic analyses.
This book provides an essential toolkit for all students wishing to know more about the modelling and analysis of financial data. Applications of econometric techniques are becoming increasingly common in the world of finance and this second edition of an established text covers the following key themes: - unit roots, cointegration and other developments in the study of time series models - time-varying volatility models of the GARCH type and the stochastic volatility approach - analysis of shock persistence and impulse responses - Markov switching and Kalman filtering - spectral analysis - present value relations and rationality - discrete choice models - analysis of truncated and censored samples - panel data analysis. This updated edition includes new chapters which cover limited dependent variables and panel data. It continues to be an essential guide for all graduate and advanced undergraduate students of econometrics and finance.
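As a small, hedged illustration of one theme in the list above, cointegration, the sketch below simulates two series that share a common stochastic trend and applies an Engle-Granger test from statsmodels; the data-generating process is invented for illustration and is not an example from the text.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)
T = 500
x = np.cumsum(rng.standard_normal(T))        # a random walk (unit-root process)
y = 0.8 * x + rng.standard_normal(T)         # cointegrated with x by construction

t_stat, p_value, crit_values = coint(y, x)   # Engle-Granger two-step test
print(p_value)                               # small p-value: reject "no cointegration"
```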
The availability of financial data recorded at high frequency has inspired a research area which over the last decade has emerged as a major field in econometrics and statistics. The growing popularity of high-frequency econometrics is driven by technological progress in trading systems and the increasing importance of intraday trading, liquidity risk, optimal order placement and high-frequency volatility. This book provides a state-of-the-art overview of the major approaches in high-frequency econometrics, including univariate and multivariate autoregressive conditional mean approaches for different types of high-frequency variables, intensity-based approaches for financial point processes and dynamic factor models. It discusses implementation details, provides insights into the properties of high-frequency data as well as institutional settings, and presents applications to volatility and liquidity estimation, order book modelling and market microstructure analysis.
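A minimal sketch of one duration model of the kind surveyed here, an autoregressive conditional duration ACD(1,1) with exponential innovations; the parameter values are assumptions chosen only to illustrate duration clustering, not estimates from the book.

```python
import numpy as np

rng = np.random.default_rng(2)
n, omega, alpha, beta = 5000, 0.1, 0.1, 0.8   # illustrative ACD(1,1) parameters

psi = np.empty(n)                             # conditional expected durations
x = np.empty(n)                               # simulated trade durations
psi[0] = omega / (1 - alpha - beta)           # unconditional mean duration
x[0] = psi[0] * rng.exponential()
for i in range(1, n):
    psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
    x[i] = psi[i] * rng.exponential()         # multiplicative Exp(1) error

print(x.mean(), psi.mean())                   # both close to omega/(1-alpha-beta) = 1.0
```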
PREFACE TO THE COLLECTION PREAMBLE The editors are pleased to present a selection of Henri Theil's contributions to economics and econometrics in three volumes. In Volume I we have provided an overview of Theil's contributions, a brief biography, an annotated bibliography of his research, and a selection of published and unpublished articles and chapters in books dealing with topics in econometrics. Volume II contains Theil's contributions to demand analysis and information theory. Volume III includes Theil's contributions in economic policy and forecasting, and management science. The selection of articles is intended to provide examples of Theil's many seminal and pathbreaking contributions to economics in such areas as econometrics, statistics, demand analysis, information theory, economic policy analysis, aggregation theory, forecasting, index numbers, management science, sociology, operations research, higher education and much more. The collection is also intended to serve as a tribute to him on the occasion of his 67th birthday. These three volumes also highlight some of Theil's contributions and service to the profession as a leader, advisor, administrator, teacher, and researcher. Theil's contributions, which encompass many disciplines, have been extensively cited both in scientific and professional journals. These citations often place Theil among the top 10 researchers (ranked according to number of times cited) in the world in various disciplines.
Mathematical Statistics for Economics and Business, Second Edition, provides a comprehensive introduction to the principles of mathematical statistics which underpin statistical analyses in the fields of economics, business, and econometrics. The selection of topics in this textbook is designed to provide students with a conceptual foundation that will facilitate a substantial understanding of statistical applications in these subjects. This new edition has been updated throughout and now also includes a downloadable Student Answer Manual containing detailed solutions to half of the over 300 end-of-chapter problems. After introducing the concepts of probability, random variables, and probability density functions, the author develops the key concepts of mathematical statistics, most notably: expectation, sampling, asymptotics, and the main families of distributions. The latter half of the book is then devoted to the theories of estimation and hypothesis testing with associated examples and problems that indicate their wide applicability in economics and business. Features of the new edition include: a reorganization of topic flow and presentation to facilitate reading and understanding; inclusion of additional topics of relevance to statistics and econometric applications; a more streamlined and simple-to-understand notation for multiple integration and multiple summation over general sets or vector arguments; updated examples; new end-of-chapter problems; a solution manual for students; a comprehensive answer manual for instructors; and a theorem and definition map. This book has evolved from numerous graduate courses in mathematical statistics and econometrics taught by the author, and will be ideal for students beginning graduate study as well as for advanced undergraduates.
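A short worked example of the estimation theory covered in the latter half of the book, maximum likelihood for a normal sample via numerical optimization; the simulated data, true parameter values and starting values are illustrative assumptions, not material from the text.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
data = rng.normal(loc=2.0, scale=1.5, size=500)   # simulated sample, true mu=2, sigma=1.5

def neg_log_likelihood(params, x):
    mu, log_sigma = params                        # log-parameterize sigma to keep it positive
    return -norm.logpdf(x, loc=mu, scale=np.exp(log_sigma)).sum()

result = minimize(neg_log_likelihood, x0=[0.0, 0.0], args=(data,))
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(mu_hat, sigma_hat)                          # ML estimates, close to 2.0 and 1.5
```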
This book contains a systematic analysis of allocation rules related to cost and surplus sharing problems. Broadly speaking, it examines various types of rules for allocating a common monetary value (cost) between individual members of a group (or network) when the characteristics of the problem are somehow objectively given. Without being an advanced text, it offers a comprehensive mathematical analysis of a series of well-known allocation rules. The aim is to provide an overview and synthesis of current knowledge concerning cost and surplus sharing methods. The text is accompanied by a description of several practical cases and numerous examples designed to make the theoretical results easily comprehensible for both students and practitioners alike. The book is based on a series of lectures given at the University of Copenhagen and Copenhagen Business School for graduate students joining the math/econ program. I am indebted to numerous colleagues, conference participants and students who over the years have shaped my approach and interests through collaboration, comments and questions that were greatly inspiring. In particular, I would like to thank Hans Keiding, Maurice Koster, Tobias Markeprand, Juan D. Moreno-Ternero, Hervé Moulin, Bezalel Peleg, Lars Thorlund-Petersen, Jorgen Tind, Mich Tvede and Lars Peter Osterdal.
Testing for a unit root is now an essential part of time series analysis. Indeed, no time series study in economics, or in other disciplines that use time series observations, can ignore the crucial issue of nonstationarity caused by a unit root. However, the literature on the topic is large and often technical, making it difficult to understand the key practical issues. This volume provides an accessible introduction to, and a critical overview of, tests for a unit root in time series, with extensive practical examples and illustrations using simulation analysis. It presents the concepts that enable the reader to understand the theoretical background, and the importance of random walks and Brownian motion, to the development of unit root tests. The book also examines the latest developments and practical concerns in unit root testing. This book is indispensable reading for all interested in econometrics, time series econometrics, applied econometrics and applied statistics. It will also be of interest to other disciplines, such as geography, climate change and meteorology, which use time series of data.
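As a practical illustration of the testing problem described above (not an example taken from the book), the sketch below simulates a random walk and a stationary AR(1) series and applies the augmented Dickey-Fuller test from statsmodels to each.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
T = 500
random_walk = np.cumsum(rng.standard_normal(T))   # unit-root process
stationary = np.zeros(T)
for t in range(1, T):                             # AR(1) with coefficient 0.5
    stationary[t] = 0.5 * stationary[t - 1] + rng.standard_normal()

for name, series in [("random walk", random_walk), ("AR(1)", stationary)]:
    stat, pvalue = adfuller(series)[:2]           # augmented Dickey-Fuller test
    print(name, round(stat, 2), round(pvalue, 3))
```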