Many economic problems can be formulated as constrained optimizations and the equilibration of their solutions. Various mathematical theories have supplied economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in finding new challenging stimuli in economic theories with those economists who are seeking effective mathematical tools for their research.
This book is intended for second-year graduate students and professionals who have an interest in linear and nonlinear simultaneous equations models. It basically traces the evolution of econometrics beyond the general linear model (GLM), beginning with the general linear structural econometric model (GLSEM) and ending with the generalized method of moments (GMM). Thus, it covers the identification problem (Chapter 3), maximum likelihood (ML) methods (Chapters 3 and 4), two and three stage least squares (2SLS, 3SLS) (Chapters 1 and 2), the general nonlinear model (GNLM) (Chapter 5), the general nonlinear simultaneous equations model (GNLSEM), the special case of GNLSEM with additive errors, nonlinear two and three stage least squares (NL2SLS, NL3SLS), and the GMM for GNLSEM, ending with a brief overview of causality and related issues (Chapter 6). There is no discussion either of limited dependent variables or of unit root related topics. It also contains a number of significant innovations. In a departure from the custom of the literature, identification and consistency for nonlinear models are handled through the Kullback information apparatus, as well as the theory of minimum contrast (MC) estimators. In fact, nearly all estimation problems handled in this volume can be approached through the theory of MC estimators. The power of this approach is demonstrated in Chapter 5, where the entire set of identification requirements for the GLSEM, in an ML context, is obtained almost effortlessly through the apparatus of Kullback information.
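To make one of the estimators named above concrete, here is a minimal numpy sketch of two-stage least squares (2SLS) on simulated data; the variable names and data-generating process are invented for the example, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Illustrative data: x is endogenous (correlated with the error u),
# z is an instrument correlated with x but not with u.
z = rng.normal(size=n)
u = rng.normal(size=n)
x = 0.8 * z + 0.5 * u + rng.normal(size=n)   # endogenous regressor
y = 1.0 + 2.0 * x + u                        # structural equation

X = np.column_stack([np.ones(n), x])         # regressors (with intercept)
Z = np.column_stack([np.ones(n), z])         # instruments

# Stage 1: project X onto the instrument space, X_hat = Z (Z'Z)^{-1} Z'X
X_hat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)

# Stage 2: OLS of y on the fitted values gives the 2SLS estimate
beta_2sls, *_ = np.linalg.lstsq(X_hat, y, rcond=None)

beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print("OLS: ", beta_ols)    # biased away from (1, 2) by endogeneity
print("2SLS:", beta_2sls)   # close to the structural (1, 2)
```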
It is unlikely that any frontier of economics/econometrics is being pushed faster, or further, than that of computational techniques. The computer has become a tool for performing, as well as an environment in which to perform, economics and econometrics, taking over where theory bogs down and allowing at least approximate answers to questions that defy closed mathematical or analytical solutions. Tasks may now be attempted that were hitherto beyond human potential, and all the forces available can now be marshalled efficiently, leading to the achievement of desired goals. Computational Techniques for Econometrics and Economic Analysis is a collection of recent studies that exemplify all these elements, demonstrating the power that the computer brings to economic analysts. The book is divided into four parts: 1 -- the computer and econometric methods; 2 -- the computer and economic analysis; 3 -- computational techniques for econometrics; and 4 -- the computer and econometric studies.
This book consists of four parts: I. Labour demand and supply, II. Productivity slowdown and innovative activity, III. Disequilibrium and business cycle analysis, and IV. Time series analysis of output and employment. It presents a fine selection of articles in the growing field of the empirical analysis of output and employment fluctuations, with applications in a micro-econometric or a time-series framework. The time-series literature has recently emphasized careful testing for stationarity and nonlinearity in the data, and the importance of cointegration theory. A substantial share of the papers make use of parametric and non-parametric methods developed in this literature, and mostly connect their results to the hysteresis discussion about the existence of fragile equilibria. A second set of macro approaches uses the disequilibrium framework that has found so much interest in Europe in recent years. The other papers use newly developed methods for microdata, especially qualitative data or limited dependent variables, to study microeconomic models of behaviour that explain labour market and output decisions.
In the modern world of gigantic datasets, which scientists and practitioners in all fields of learning are confronted with, the availability of robust, scalable and easy-to-use methods for pattern recognition and data mining is of paramount importance, so as to be able to cope with the avalanche of data in a meaningful way. This concise and pedagogical research monograph introduces the reader to two specific aspects - clustering techniques and dimensionality reduction - in the context of complex network analysis. The first chapter provides a short introduction to relevant graph theoretical notation; chapter 2 then reviews and compares a number of cluster definitions from different fields of science. In the subsequent chapters, a first-principles approach to graph clustering in complex networks is developed using methods from statistical physics, and the reader will learn that, even today, this field significantly contributes to the understanding and resolution of the related statistical inference issues. Finally, an application chapter examines real-world networks from the economic realm to show how the network clustering process can be used to deal with large, sparse datasets where conventional analyses fail.
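As a concrete taste of the clustering theme, here is a small sketch using the networkx library's greedy modularity maximisation, one cluster definition of the kind the monograph compares; the toy graph is invented for illustration.

```python
import networkx as nx
from networkx.algorithms import community

# Illustrative graph: two dense groups joined by a single bridge edge.
G = nx.Graph()
G.add_edges_from([(0, 1), (0, 2), (1, 2), (2, 3),          # cluster A
                  (3, 4),                                   # bridge
                  (4, 5), (4, 6), (5, 6), (5, 7), (6, 7)])  # cluster B

# Greedy maximisation of Newman's modularity Q over partitions.
clusters = community.greedy_modularity_communities(G)
for i, c in enumerate(clusters):
    print(f"cluster {i}: {sorted(c)}")
```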
Written in honor of Emeritus Professor Georges Prat (University of Paris Nanterre, France), this book includes contributions from eminent authors on a range of topics that are of interest to researchers and graduates, as well as investors and portfolio managers. The topics discussed include the effects of information and transaction costs on informational and allocative market efficiency, bubbles and stock price dynamics, paradox of rational expectations and the principle of limited information, uncertainty and expectation hypotheses, oil price dynamics, and nonlinearity in asset price dynamics.
A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument, an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this means important) at, say, the 0.05 level, but not the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by the experts, display the same terms and implement the same methodology. In short, there appears no visible alternative to the established methodology and no sign of reservations concerning its validity.
Testing for a Unit Root is now an essential part of time series analysis but the literature on the topic is so large that knowing where to start is difficult even for the specialist. This book provides a way into the techniques of unit root testing, explaining the pitfalls and nonstandard cases, using practical examples and simulation analysis.
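As an illustration of the kind of unit root test the book explains, the following sketch applies the augmented Dickey-Fuller test from statsmodels to a simulated random walk and a stationary AR(1); the series and parameters are illustrative only.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)

# A pure random walk (unit root) and a stationary AR(1) for contrast.
random_walk = np.cumsum(rng.normal(size=500))
ar1 = np.zeros(500)
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + rng.normal()

for name, series in [("random walk", random_walk), ("AR(1)", ar1)]:
    stat, pvalue, *_ = adfuller(series)
    print(f"{name}: ADF stat = {stat:.2f}, p-value = {pvalue:.3f}")
# The unit-root series should fail to reject the null of a unit root;
# the stationary AR(1) should reject it.
```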
Econometrics as an applied discipline attempts to use information in the most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and the minimum discrepancy, are useful in several areas of statistical inference, e.g. Bayesian estimation, the expected maximum likelihood principle, and fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: a critical survey of the uses of information theory in economics and econometrics; an integration of applied information theory and economic efficiency analysis; the development of a new economic hypothesis relating information theory to economic growth models; and an emphasis on new lines of research.
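A minimal sketch of two of the information-theoretic tools named above, Shannon entropy and the Kullback-Leibler divergence (the "minimum discrepancy"), using scipy; the distributions are invented for illustration.

```python
import numpy as np
from scipy.stats import entropy

# Illustrative empirical distribution p and a uniform reference model q.
# On a fixed support, the uniform q is the maximum entropy distribution.
p = np.array([0.10, 0.25, 0.40, 0.25])
q = np.array([0.25, 0.25, 0.25, 0.25])

# Shannon entropy: the uncertainty in p (in nats).
print("H(p)    =", entropy(p))

# Kullback-Leibler divergence D(p || q): the discrepancy between the
# data distribution and the reference model.
print("D(p||q) =", entropy(p, q))
```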
This outstanding collection of William Brock's essays illustrates the power of dynamic modelling to shed light on the forces for stability and instability in economic systems. The articles selected reflect his best work and are indicative both of the type of policy problem that he finds challenging and of the complex methodology that he uses to solve it. Also included is an introduction by Brock to his own work, which helps tie together the main aspects of his research to date. The volume covers: stochastic models and optimal growth; financial and macroeconomic modelling; ecology, mechanism design and regulation; and nonlinearity in economics.
Capital theory is a cornerstone of modern economics. Its ideas are fundamental to dynamic equilibrium theory, and its concepts are applied in many branches of economics, such as game theory and resource and environmental economics, although this may not be recognized at first glance. In this monograph, an approach is presented which allows one to derive important results of capital theory in a coherent and readily accessible framework. Special emphasis is given to infinite horizon and overlapping generations economies. The irreversibility of time and the failure of the market system appear in a different light when an infinite horizon framework is applied. To bridge the gap between pure and applied economic theory, the structure of our theoretical approach is integrated into a computable general equilibrium model.
In applications, and especially in mathematical finance, random time-dependent events are often modeled as stochastic processes. Assumptions are made about the structure of such processes, and serious researchers will want to justify those assumptions through the use of data. As statisticians are wont to say, "In God we trust; all others must bring data."
Students in both social and natural sciences often seek regression methods to explain the frequency of events, such as visits to a doctor, auto accidents, or new patents awarded. This book provides the most comprehensive and up-to-date account of models and methods to interpret such data. The authors have conducted research in the field for more than twenty-five years. In this book, they combine theory and practice to make sophisticated methods of analysis accessible to researchers and practitioners working with widely different types of data and software in areas such as applied statistics, econometrics, marketing, operations research, actuarial studies, demography, biostatistics, and quantitative social sciences. The book may be used as a reference work on count models or by students seeking an authoritative overview. Complementary material in the form of data sets, template programs, and bibliographic resources can be accessed on the Internet through the authors' homepages. This second edition is an expanded and updated version of the first, with new empirical examples and more than one hundred new references added. The new material includes new theoretical topics, an updated and expanded treatment of cross-section models, coverage of bootstrap-based and simulation-based inference, expanded treatment of time series, multivariate and panel data, expanded treatment of endogenous regressors, coverage of quantile count regression, and a new chapter on Bayesian methods.
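As a taste of the class of models the book covers, here is a minimal Poisson count regression sketch using statsmodels; the data-generating process is invented for the example (overdispersed counts, for which the negative binomial model is the usual next step, are among the book's central topics).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000

# Illustrative data: event counts (e.g. doctor visits) driven by one
# covariate through a log link: log E[y|x] = 0.3 + 0.7 x.
x = rng.normal(size=n)
mu = np.exp(0.3 + 0.7 * x)
y = rng.poisson(mu)

# Fit the Poisson regression by maximum likelihood via GLM.
X = sm.add_constant(x)
poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(poisson_fit.params)   # estimates should be close to (0.3, 0.7)
```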
Stochastic Volatility in Financial Markets presents advanced topics in financial econometrics and theoretical finance, and is divided into three main parts. The first part aims at documenting an empirical regularity of financial price changes: the occurrence of sudden and persistent changes in financial market volatility. This phenomenon, technically termed 'stochastic volatility' or 'conditional heteroskedasticity', has been well known for at least 20 years; this part also uncovers further useful theoretical properties of conditionally heteroskedastic models. The second part goes beyond the statistical aspects of stochastic volatility models: it constructs and uses new, fully articulated, theoretically sound financial asset pricing models that allow for the presence of conditional heteroskedasticity. The third part shows how the inclusion of the statistical aspects of stochastic volatility in a rigorous economic scheme can be addressed from an empirical standpoint.
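To illustrate the empirical regularity described in the first part, here is a minimal numpy simulation of a GARCH(1,1) process, a standard conditionally heteroskedastic model; the parameter values are illustrative, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2000

# GARCH(1,1): sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
omega, alpha, beta = 0.05, 0.10, 0.85   # illustrative parameters
r = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = omega / (1 - alpha - beta)  # unconditional variance

for t in range(1, T):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.normal()

# Volatility clustering: squared returns are autocorrelated even though
# the returns themselves are (approximately) serially uncorrelated.
r2 = r ** 2
print("corr(r_t,  r_{t-1}) =", np.corrcoef(r[1:], r[:-1])[0, 1])
print("corr(r2_t, r2_{t-1}) =", np.corrcoef(r2[1:], r2[:-1])[0, 1])
```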
The primary goal of this book is to present the research
findings and conclusions of physicists, economists, mathematicians
and financial engineers working in the field of "Econophysics" who
have undertaken agent-based modelling, comparison with empirical
studies and related investigations.
What part does technological knowledge accumulation play in modern economic growth? This book investigates and examines the predictions of new growth theory, using OECD manufacturing data. Its empirical findings portray a novel and complex picture of the features of long-term growth, where technological knowledge production and diffusion play a central part, alongside variations in capital and employment. A parallel examination of long-run trade patterns and government policy issues completes a broader account of how knowledge-based growth in industrial output is at the heart of modern economic prosperity.
This book provides an essential toolkit for all students wishing to know more about the modelling and analysis of financial data. Applications of econometric techniques are becoming increasingly common in the world of finance and this second edition of an established text covers the following key themes: - unit roots, cointegration and other developments in the study of time series models - time varying volatility models of the GARCH type and the stochastic volatility approach - analysis of shock persistence and impulse responses - Markov switching and Kalman filtering - spectral analysis - present value relations and rationality - discrete choice models - analysis of truncated and censored samples - panel data analysis. This updated edition includes new chapters which cover limited dependent variables and panel data. It continues to be an essential guide for all graduate and advanced undergraduate students of econometrics and finance.
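Among the themes listed above, Kalman filtering lends itself to a compact illustration. Below is a minimal numpy sketch of the filtering recursions for a local-level state space model; the model and parameters are invented for the example, not drawn from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200

# Local-level model:  y_t = mu_t + eps_t,   eps_t ~ N(0, sigma_eps^2)
#                     mu_t = mu_{t-1} + eta_t, eta_t ~ N(0, sigma_eta^2)
sigma_eps2, sigma_eta2 = 1.0, 0.1
mu = np.cumsum(rng.normal(scale=np.sqrt(sigma_eta2), size=T))
y = mu + rng.normal(scale=np.sqrt(sigma_eps2), size=T)

# Kalman filter for the latent level mu_t.
a, P = 0.0, 1e6                  # diffuse initial state and variance
filtered = np.zeros(T)
for t in range(T):
    P = P + sigma_eta2           # predict (random-walk state)
    K = P / (P + sigma_eps2)     # Kalman gain
    a = a + K * (y[t] - a)       # update with the observation
    P = (1 - K) * P
    filtered[t] = a

print("RMSE of filtered level:", np.sqrt(np.mean((filtered - mu) ** 2)))
```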
The contributions in this volume, by leading economists from major universities in Europe and USA, cover research at the front line of econometric analysis and labour market applications. The volume includes several papers on equilibrium search models (a relatively new field), and job matching, both seen from a theoretical and from an applied point of view. Methods on and empirical analyses of unemployment durations are also discussed. Finally, a large group of papers examine the structure and the dynamics of the labour market in a number of countries using panel data. This group includes papers on data quality and policy evaluation. The high unemployment in most countries makes it necessary to come up with studies and methods for analysing the impact of different elements of economic policies. This volume is intended to contribute to further development in the use of panel data in economic analyses.
The availability of financial data recorded at high frequency has inspired a research area which, over the last decade, has emerged as a major field in econometrics and statistics. The growing popularity of high-frequency econometrics is driven by technological progress in trading systems and the increasing importance of intraday trading, liquidity risk, optimal order placement and high-frequency volatility. This book provides a state-of-the-art overview of the major approaches in high-frequency econometrics, including univariate and multivariate autoregressive conditional mean approaches for different types of high-frequency variables, intensity-based approaches for financial point processes and dynamic factor models. It discusses implementation details, provides insights into properties of high-frequency data as well as institutional settings, and presents applications to volatility and liquidity estimation, order book modelling and market microstructure analysis.
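One standard member of the autoregressive conditional mean family surveyed here is the autoregressive conditional duration (ACD) model for the times between trades; below is a minimal simulation sketch with invented parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# ACD(1,1) for trade durations:
#   x_i = psi_i * eps_i,  psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1},
# with eps_i i.i.d. unit-mean innovations (here exponential).
omega, alpha, beta = 0.1, 0.1, 0.8      # illustrative parameters
x = np.zeros(n)
psi = np.zeros(n)
psi[0] = omega / (1 - alpha - beta)     # unconditional mean duration
x[0] = psi[0] * rng.exponential()

for i in range(1, n):
    psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
    x[i] = psi[i] * rng.exponential()

# Durations cluster much as volatility does: short durations tend to
# follow short durations during active trading periods.
print("mean duration:      ", x.mean())
print("corr(x_i, x_{i-1}): ", np.corrcoef(x[1:], x[:-1])[0, 1])
```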
The authors present a number of financial market studies that have as their general theme the econometric testing of the underlying assumptions of a number of financial models. More than 30 years of financial market research has convinced the authors that not enough attention has been paid to whether the estimated model is appropriate or, most importantly, whether the estimation technique is suitable for the problem under study. For many years linear models have been assumed with little or no testing of alternative specifications. The result has been models that force linearity assumptions on what are clearly nonlinear processes. Another major assumption of much financial research constrains the coefficients to be stable over time. This critical assumption has been attacked by Lucas (1976) on the grounds that when economic policy changes, the coefficients of macroeconomic models change. If this occurs, any policy forecasts of these models will be flawed. In financial modeling, omitted (possibly non-quantifiable) variables will bias coefficients. While it may be possible to model some financial variables for extended periods, in other periods the underlying models may either exhibit nonlinearity or show changes in linear models. The authors' research indicates that tests for changes in linear models, such as recursive residual analysis, or tests for episodic nonlinearity, can be used to signal changes in the underlying structure of the market. The book begins with a brief review of basic linear time series techniques, including autoregressive integrated moving average (ARIMA) models, vector autoregressive (VAR) models, and models from the ARCH/GARCH class. While the ARIMA and VAR approaches model the first moment of a series, models of the ARCH/GARCH class model both the first and the second moment, the latter interpreted as the conditional or explained volatility of a series. Recent work on nonlinearity detection has questioned the appropriateness of these essentially linear approaches. A number of such tests are shown and applied, both to the complete series and to subsets of the series. A major finding is that the structure of the series may change over time. Within the time frame of a study, there may be periods of episodic nonlinearity, episodic ARCH and episodic nonstationarity. Measures are developed to identify these events and to relate them both geographically and with mathematical models. This book will be of interest to applied finance researchers and to market participants.
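In the spirit of the episodic-ARCH theme above, here is a hedged sketch applying Engle's ARCH-LM test (het_arch in statsmodels) to subperiods of a constructed residual series; the construction is illustrative and is not the authors' own procedure.

```python
import numpy as np
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(0)

# Residuals with "episodic" ARCH: homoskedastic in the first half,
# volatility clustering in the second half (illustrative construction).
calm = rng.normal(size=500)
s2 = np.ones(500)
e = np.zeros(500)
for t in range(1, 500):
    s2[t] = 0.1 + 0.6 * e[t - 1] ** 2 + 0.3 * s2[t - 1]
    e[t] = np.sqrt(s2[t]) * rng.normal()
resid = np.concatenate([calm, e])

# Engle's ARCH-LM test applied to each subperiod separately; only the
# second half should show significant ARCH effects.
for name, sub in [("first half", resid[:500]), ("second half", resid[500:])]:
    lm, lm_pvalue, _, _ = het_arch(sub, nlags=5)
    print(f"{name}: LM = {lm:.1f}, p-value = {lm_pvalue:.4f}")
```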
Mathematical Statistics for Economics and Business, Second Edition, provides a comprehensive introduction to the principles of mathematical statistics which underpin statistical analyses in the fields of economics, business, and econometrics. The selection of topics in this textbook is designed to provide students with a conceptual foundation that will facilitate a substantial understanding of statistical applications in these subjects. This new edition has been updated throughout and now also includes a downloadable Student Answer Manual containing detailed solutions to half of the over 300 end-of-chapter problems. After introducing the concepts of probability, random variables, and probability density functions, the author develops the key concepts of mathematical statistics, most notably: expectation, sampling, asymptotics, and the main families of distributions. The latter half of the book is then devoted to the theories of estimation and hypothesis testing with associated examples and problems that indicate their wide applicability in economics and business. Features of the new edition include: a reorganization of topic flow and presentation to facilitate reading and understanding; inclusion of additional topics of relevance to statistics and econometric applications; a more streamlined and simple-to-understand notation for multiple integration and multiple summation over general sets or vector arguments; updated examples; new end-of-chapter problems; a solution manual for students; a comprehensive answer manual for instructors; and a theorem and definition map. This book has evolved from numerous graduate courses in mathematical statistics and econometrics taught by the author, and will be ideal for students beginning graduate study as well as for advanced undergraduates.
This book contains a systematic analysis of allocation rules related to cost and surplus sharing problems. Broadly speaking, it examines various types of rules for allocating a common monetary value (cost) between individual members of a group (or network) when the characteristics of the problem are somehow objectively given. Without being an advanced text, it offers a comprehensive mathematical analysis of a series of well-known allocation rules. The aim is to provide an overview and synthesis of current knowledge concerning cost and surplus sharing methods. The text is accompanied by a description of several practical cases and numerous examples designed to make the theoretical results easily comprehensible for students and practitioners alike. The book is based on a series of lectures given at the University of Copenhagen and Copenhagen Business School for graduate students joining the math/econ program. I am indebted to numerous colleagues, conference participants and students who over the years have shaped my approach and interests through collaboration, comments and questions that were greatly inspiring. In particular, I would like to thank Hans Keiding, Maurice Koster, Tobias Markeprand, Juan D. Moreno-Ternero, Hervé Moulin, Bezalel Peleg, Lars Thorlund-Petersen, Jørgen Tind, Mich Tvede and Lars Peter Østerdal.
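As an illustration of the kind of allocation rule such a book analyses, here is a minimal sketch computing the Shapley value of a small cost-sharing game by enumerating player orderings; the three-player cost function is hypothetical.

```python
from itertools import permutations

# Hypothetical cost-sharing game: cost[S] is the cost of serving
# coalition S (numbers chosen purely for the example).
cost = {
    frozenset(): 0,
    frozenset("A"): 6, frozenset("B"): 7, frozenset("C"): 9,
    frozenset("AB"): 10, frozenset("AC"): 12, frozenset("BC"): 13,
    frozenset("ABC"): 15,
}
players = ["A", "B", "C"]

# Shapley value: each player's marginal cost contribution averaged
# over all orderings, a classical allocation rule.
shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    coalition = frozenset()
    for p in order:
        shapley[p] += cost[coalition | {p}] - cost[coalition]
        coalition = coalition | {p}
shapley = {p: v / len(orders) for p, v in shapley.items()}

print(shapley)   # by efficiency, the allocations sum to cost(ABC) = 15
```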
Testing for a unit root is now an essential part of time series analysis. Indeed, no time series study in economics, or in other disciplines that use time series observations, can ignore the crucial issue of nonstationarity caused by a unit root. However, the literature on the topic is large and often technical, making it difficult to understand the key practical issues. This volume provides an accessible introduction to, and a critical overview of, tests for a unit root in time series, with extensive practical examples and illustrations using simulation analysis. It presents the concepts that enable the reader to understand the theoretical background, and the importance of random walks and Brownian motion, in the development of unit root tests. The book also examines the latest developments and practical concerns in unit root testing. This book is indispensable reading for all those interested in econometrics, time series econometrics, applied econometrics and applied statistics. It will also be of interest to other disciplines, such as geography, climate change and meteorology, which use time series of data.
The editors are pleased to present a selection of Henri Theil's contributions to economics and econometrics in three volumes. In Volume I we have provided an overview of Theil's contributions, a brief biography, an annotated bibliography of his research, and a selection of published and unpublished articles and chapters in books dealing with topics in econometrics. Volume II contains Theil's contributions to demand analysis and information theory. Volume III includes Theil's contributions in economic policy and forecasting, and management science. The selection of articles is intended to provide examples of Theil's many seminal and pathbreaking contributions to economics in such areas as econometrics, statistics, demand analysis, information theory, economic policy analysis, aggregation theory, forecasting, index numbers, management science, sociology, operations research, higher education and much more. The collection is also intended to serve as a tribute to him on the occasion of his 67th birthday. These three volumes also highlight some of Theil's contributions and service to the profession as a leader, advisor, administrator, teacher, and researcher. Theil's contributions, which encompass many disciplines, have been extensively cited in both scientific and professional journals. These citations often place Theil among the top 10 researchers (ranked according to number of times cited) in the world in various disciplines.