In the modern world of gigantic datasets, which confront scientists and practitioners in all fields of learning, the availability of robust, scalable and easy-to-use methods for pattern recognition and data mining is of paramount importance, so that the avalanche of data can be handled in a meaningful way. This concise and pedagogical research monograph introduces the reader to two specific aspects - clustering techniques and dimensionality reduction - in the context of complex network analysis. The first chapter provides a short introduction to relevant graph-theoretical notation; Chapter 2 then reviews and compares a number of cluster definitions from different fields of science. In the subsequent chapters, a first-principles approach to graph clustering in complex networks is developed using methods from statistical physics, and the reader will learn that even today this field contributes significantly to the understanding and resolution of the related statistical inference issues. Finally, an application chapter examines real-world networks from the economic realm to show how the network clustering process can be used to deal with large, sparse datasets where conventional analyses fail.
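By way of illustration only (the book develops its own statistical-physics approach, which is not reproduced here), the following minimal Python sketch clusters a toy two-community network using greedy modularity maximisation from the networkx library; the graph and the choice of algorithm are assumptions made purely for the example.

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # A toy network: two dense groups of five nodes joined by a single
    # bridge node, an idealised example of community structure.
    G = nx.barbell_graph(5, 1)

    # Greedy modularity maximisation, one standard clustering criterion.
    for i, community in enumerate(greedy_modularity_communities(G)):
        print(f"cluster {i}: {sorted(community)}")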
A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument, an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this means important) at, say, the 0.05 level, but not the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by the experts, display the same terms and implement the same methodology. In short, there appears to be no visible alternative to the established methodology and no sign of reservations concerning its validity.
Testing for a Unit Root is now an essential part of time series analysis but the literature on the topic is so large that knowing where to start is difficult even for the specialist. This book provides a way into the techniques of unit root testing, explaining the pitfalls and nonstandard cases, using practical examples and simulation analysis.
Econometrics as an applied discipline attempts to use information in the most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and minimum discrepancy, are useful in several areas of statistical inference, e.g., Bayesian estimation, the expected maximum likelihood principle and fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: a critical survey of the uses of information theory in economics and econometrics; an integration of applied information theory and economic efficiency analysis; the development of a new economic hypothesis relating information theory to economic growth models; and an emphasis on new lines of research.
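To make the entropy tools concrete, here is a minimal sketch, not an example from the book, that computes Shannon entropy and the mutual information of two discrete variables with numpy; the joint distribution is a hypothetical one invented for the illustration.

    import numpy as np

    def entropy(p):
        # Shannon entropy H(p) = -sum p*log(p), in nats; zero cells are skipped.
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    # A hypothetical joint distribution of two binary variables (rows: X, columns: Y).
    pxy = np.array([[0.30, 0.10],
                    [0.10, 0.50]])
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    # Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y): the information shared by X and Y.
    mi = entropy(px) + entropy(py) - entropy(pxy.ravel())
    print(f"I(X;Y) = {mi:.4f} nats")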
Written in honor of Emeritus Professor Georges Prat (University of Paris Nanterre, France), this book includes contributions from eminent authors on a range of topics that are of interest to researchers and graduates, as well as investors and portfolio managers. The topics discussed include the effects of information and transaction costs on informational and allocative market efficiency, bubbles and stock price dynamics, paradox of rational expectations and the principle of limited information, uncertainty and expectation hypotheses, oil price dynamics, and nonlinearity in asset price dynamics.
Capital theory is a cornerstone of modern economics. Its ideas are fundamental for dynamic equilibrium theory, and its concepts are applied in many branches of economics such as game theory and resource and environmental economics, although this may not be recognized at first glance. In this monograph, an approach is presented that allows important results of capital theory to be derived in a coherent and readily accessible framework. Special emphasis is given to infinite horizon and overlapping generations economies. Irreversibility of time, or the failure of the market system, appears in a different light if an infinite horizon framework is applied. To bridge the gap between pure and applied economic theory, the structure of our theoretical approach is integrated into a computable general equilibrium model.
This volume contains an accessible discussion of computationally intensive techniques and bootstrap methods, providing ways to improve the finite-sample performance of well-known asymptotic tests for regression models. The book uses the linear regression model as a framework for introducing simulation-based tests to help perform econometric analyses.
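As a rough sketch of the idea rather than the book's own examples, the following Python fragment bootstraps the null distribution of a regression t-statistic by imposing the null and resampling residuals; the simulated data, sample size and replication count are all assumptions chosen for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical small sample with heavy-tailed errors, a setting where
    # asymptotic critical values can be unreliable.
    n = 30
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    y = X @ np.array([1.0, 0.5]) + rng.standard_t(df=3, size=n)

    def tstat(X, y, j):
        # OLS t-statistic for H0: beta_j = 0.
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        e = y - X @ beta
        s2 = e @ e / (len(y) - X.shape[1])
        se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[j, j])
        return beta[j] / se

    t_obs = tstat(X, y, 1)

    # Residual bootstrap under the null beta_1 = 0: fit the restricted model
    # (intercept only), then resample its residuals to rebuild y.
    e_null = y - y.mean()
    B = 999
    t_boot = [tstat(X, y.mean() + rng.choice(e_null, size=n, replace=True), 1)
              for _ in range(B)]

    p_boot = (1 + np.sum(np.abs(t_boot) >= abs(t_obs))) / (B + 1)
    print(f"t = {t_obs:.3f}, bootstrap p-value = {p_boot:.3f}")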
This outstanding collection of William Brock's essays illustrates the power of dynamic modelling to shed light on the forces for stability and instability in economic systems. The articles selected reflect his best work and are indicative both of the type of policy problem that he finds challenging and the complex methodology that he uses to solve them. Also included is an introduction by Brock to his own work, which helps tie together the main aspects of his research to date. The volume covers: * stochastic models and optimal growth * financial and macroeconomic modelling * ecology, mechanism design and regulation * nonlinearity in economics.
As well as providing a history of economic statistics, the book includes contributions by economists from a number of countries, applying economic statistics to the past and to current economic issues.
How does innovation emerge from normal economic activity? Economic Interdependence and Innovative Activity is an original new book which tries to answer this question by reconciling inter-industrial analysis with the study of innovation. This book provides a bridge between economic statics and the dynamics of growth and development. As well as offering important and original empirical data for Canada, France, Italy, Greece and China, the authors make a series of theoretical advances and propose a new way to observe the innovative process, as well as new analytical tools to examine innovative activity. Their central thesis is that innovative outputs emerge out of increased social interaction and division of labour through cooperative networks. An authoritative theoretical introduction and some thought-provoking conclusions have been prepared by Christian DeBresson. Economic Interdependence and Innovative Activity encourages input-output economists to encompass innovative activities in dynamic models, and innovation researchers to look at technical interdependencies.
In applications, and especially in mathematical finance, random time-dependent events are often modeled as stochastic processes. Assumptions are made about the structure of such processes, and serious researchers will want to justify those assumptions through the use of data. As statisticians are wont to say, "In God we trust; all others must bring data."
Stochastic Volatility in Financial Markets presents advanced topics in financial econometrics and theoretical finance, and is divided into three main parts. The first part aims at documenting an empirical regularity of financial price changes: the occurrence of sudden and persistent changes in financial market volatility. This phenomenon, technically termed 'stochastic volatility' or 'conditional heteroskedasticity', has been well known for at least 20 years; this part also uncovers further useful theoretical properties of conditionally heteroskedastic models. The second part goes beyond the statistical aspects of stochastic volatility models: it constructs and uses new, fully articulated, theoretically sound financial asset pricing models that allow for the presence of conditional heteroskedasticity. The third part shows how the inclusion of the statistical aspects of stochastic volatility in a rigorous economic scheme can be approached from an empirical standpoint.
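A minimal simulation may help fix ideas. The sketch below generates returns from a basic discrete-time stochastic volatility model in which log-variance follows an AR(1) process; the parameter values are purely illustrative and are not taken from the book.

    import numpy as np

    rng = np.random.default_rng(42)

    # Log-variance h_t follows an AR(1); returns are r_t = exp(h_t/2) * eps_t.
    # mu, phi and sig_eta are assumed values for illustration.
    T, mu, phi, sig_eta = 1000, -1.0, 0.95, 0.25
    h = np.empty(T)
    h[0] = mu
    for t in range(1, T):
        h[t] = mu + phi * (h[t-1] - mu) + sig_eta * rng.normal()
    r = np.exp(h / 2) * rng.normal(size=T)

    # Persistent volatility shows up as autocorrelation in squared returns.
    r2 = r**2 - (r**2).mean()
    print(f"lag-1 autocorrelation of squared returns: {(r2[1:] @ r2[:-1]) / (r2 @ r2):.3f}")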
What part does technological knowledge accumulation play in modern economic growth? This book investigates and examines the predictions of new growth theory, using OECD manufacturing data. Its empirical findings portray a novel and complex picture of the features of long-term growth, where technological knowledge production and diffusion play a central part, alongside variations in capital and employment. A parallel examination of long-run trade patterns and government policy issues completes a broader account of how knowledge-based growth in industrial output is at the heart of modern economic prosperity.
The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations.
The contributions in this volume, by leading economists from major universities in Europe and the USA, cover research at the front line of econometric analysis and labour market applications. The volume includes several papers on equilibrium search models (a relatively new field) and job matching, seen from both a theoretical and an applied point of view. Methods for, and empirical analyses of, unemployment durations are also discussed. Finally, a large group of papers examines the structure and dynamics of the labour market in a number of countries using panel data. This group includes papers on data quality and policy evaluation. The high unemployment in most countries makes it necessary to come up with studies and methods for analysing the impact of different elements of economic policies. This volume is intended to contribute to further development in the use of panel data in economic analyses.
This timely volume brings together professors of finance and accounting from Japanese universities to examine the Japanese stock market in terms of its pricing and accounting systems. The papers report the results of empirical research into the Japanese stock market within the framework of new theories of finance. Academics, professionals, and anyone seeking to understand or enter the Japanese market will applaud the publication of this practical, informative volume. Having gathered data from the late 1970s through 1984, the authors analyze the market's behavior and the applicability of two major theoretical pricing frameworks -- the Capital Asset Pricing Model and the Efficient Market Hypothesis -- to that market. Chapter 1 provides background statistical evidence on the behavior of monthly returns on Tokyo Stock Exchange common stocks. Chapter 2 discusses an empirical test of the capital asset pricing model. Chapter 3 examines evidence on the price performance of unseasoned new issues. The authors also examine the Japanese accounting disclosure system: Chapter 4 deals empirically with the information content of the annual accounting announcements and related market efficiency. The next chapter presents empirical evidence on the relationship between unsystematic returns and earnings forecast errors. Next, empirical research into the usefulness to investors of the disclosure system is examined. Finally, Chapter 7 presents several interesting questions and topics for future research on the Japanese stock market.
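For readers unfamiliar with such tests, here is a minimal sketch of the market-model regression that underlies empirical CAPM tests; the returns are simulated stand-ins, not the book's Tokyo Stock Exchange data.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical monthly excess returns for one stock and the market index.
    T = 120
    mkt = rng.normal(0.005, 0.04, size=T)
    stock = 0.001 + 1.2 * mkt + rng.normal(0, 0.05, size=T)

    # OLS of stock excess returns on market excess returns; under the CAPM
    # the intercept (alpha) should be statistically indistinguishable from zero.
    X = np.column_stack([np.ones(T), mkt])
    alpha, beta = np.linalg.lstsq(X, stock, rcond=None)[0]
    print(f"alpha = {alpha:.4f}, beta = {beta:.2f}")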
This book provides an essential toolkit for all students wishing to know more about the modelling and analysis of financial data. Applications of econometric techniques are becoming increasingly common in the world of finance and this second edition of an established text covers the following key themes: - unit roots, cointegration and other developments in the study of time series models - time varying volatility models of the GARCH type and the stochastic volatility approach - analysis of shock persistence and impulse responses - Markov switching and Kalman filtering - spectral analysis - present value relations and rationality - discrete choice models - analysis of truncated and censored samples - panel data analysis. This updated edition includes new chapters which cover limited dependent variables and panel data. It continues to be an essential guide for all graduate and advanced undergraduate students of econometrics and finance.
The availability of financial data recorded at high frequency has inspired a research area which over the last decade has emerged as a major field in econometrics and statistics. The growing popularity of high-frequency econometrics is driven by technological progress in trading systems and the increasing importance of intraday trading, liquidity risk, optimal order placement and high-frequency volatility. This book provides a state-of-the-art overview of the major approaches in high-frequency econometrics, including univariate and multivariate autoregressive conditional mean approaches for different types of high-frequency variables, intensity-based approaches for financial point processes and dynamic factor models. It discusses implementation details, provides insights into properties of high-frequency data as well as institutional settings, and presents applications to volatility and liquidity estimation, order book modelling and market microstructure analysis.
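As one concrete instance of the autoregressive conditional mean models the book covers, the sketch below simulates an ACD(1,1) model for trade durations in the spirit of Engle and Russell; the parameter values and the exponential innovations are assumptions chosen for illustration.

    import numpy as np

    rng = np.random.default_rng(7)

    # ACD(1,1): x_i = psi_i * eps_i with unit-mean exponential eps_i, and
    # conditional mean duration psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}.
    n, omega, alpha, beta = 5000, 0.1, 0.1, 0.8
    x, psi = np.empty(n), np.empty(n)
    psi[0] = omega / (1 - alpha - beta)      # unconditional mean duration
    x[0] = psi[0] * rng.exponential()
    for i in range(1, n):
        psi[i] = omega + alpha * x[i-1] + beta * psi[i-1]
        x[i] = psi[i] * rng.exponential()

    # Duration clustering: autocorrelated durations despite iid innovations.
    xd = x - x.mean()
    print(f"lag-1 autocorrelation of durations: {(xd[1:] @ xd[:-1]) / (xd @ xd):.3f}")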
The authors present a number of financial market studies that have as their general theme the econometric testing of the underlying assumptions of a number of financial models. More than 30 years of financial market research has convinced the authors that not enough attention has been paid to whether the estimated model is appropriate or, most importantly, whether the estimation technique is suitable for the problem under study. For many years linear models have been assumed with little or no testing of alternative specifications. The result has been models that force linearity assumptions on what are clearly nonlinear processes. Another major assumption of much financial research constrains the coefficients to be stable over time. This critical assumption has been attacked by Lucas (1976) on the grounds that when economic policy changes, the coefficients of macroeconomic models change. If this occurs, any policy forecasts of these models will be flawed. In financial modeling, omitted (possibly non-quantifiable) variables will bias coefficients. While it may be possible to model some financial variables for extended periods, in other periods the underlying models may either exhibit nonlinearity or show changes in linear models. The authors' research indicates that tests for changes in linear models, such as recursive residual analysis, or tests for episodic nonlinearity, can be used to signal changes in the underlying structure of the market. The book begins with a brief review of basic linear time series techniques, including autoregressive integrated moving average (ARIMA) models, vector autoregressive (VAR) models, and models from the ARCH/GARCH class. While the ARIMA and VAR approaches model the first moment of a series, models of the ARCH/GARCH class model both the first moment and the second moment, the latter interpreted as the conditional or explained volatility of a series. Recent work on nonlinearity detection has questioned the appropriateness of these essentially linear approaches. A number of such tests are shown and applied, both to the complete series and to subsets of the series. A major finding is that the structure of the series may change over time. Within the time frame of a study, there may be periods of episodic nonlinearity, episodic ARCH and episodic nonstationarity. Measures are developed to detect and relate these events, both geographically and with mathematical models. This book will be of interest to applied finance researchers and to market participants.
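To illustrate the ARCH/GARCH idea mentioned above (a generic simulation, not the authors' estimated models), the following Python sketch generates a GARCH(1,1) series and checks that the returns themselves are nearly uncorrelated while their squares are not; all parameter values are assumptions.

    import numpy as np

    rng = np.random.default_rng(3)

    # GARCH(1,1): r_t = sigma_t * eps_t, with conditional variance
    # sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2.
    T, omega, alpha, beta = 2000, 0.05, 0.08, 0.90
    sig2, r = np.empty(T), np.empty(T)
    sig2[0] = omega / (1 - alpha - beta)     # unconditional variance
    r[0] = np.sqrt(sig2[0]) * rng.normal()
    for t in range(1, T):
        sig2[t] = omega + alpha * r[t-1]**2 + beta * sig2[t-1]
        r[t] = np.sqrt(sig2[t]) * rng.normal()

    def acf1(z):
        z = z - z.mean()
        return (z[1:] @ z[:-1]) / (z @ z)

    # First moment serially uncorrelated, second moment persistent.
    print(f"acf1(r) = {acf1(r):.3f}, acf1(r^2) = {acf1(r**2):.3f}")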
PREFACE TO THE COLLECTION - PREAMBLE: The editors are pleased to present a selection of Henri Theil's contributions to economics and econometrics in three volumes. In Volume I we have provided an overview of Theil's contributions, a brief biography, an annotated bibliography of his research, and a selection of published and unpublished articles and chapters in books dealing with topics in econometrics. Volume II contains Theil's contributions to demand analysis and information theory. Volume III includes Theil's contributions in economic policy and forecasting, and management science. The selection of articles is intended to provide examples of Theil's many seminal and pathbreaking contributions to economics in such areas as econometrics, statistics, demand analysis, information theory, economic policy analysis, aggregation theory, forecasting, index numbers, management science, sociology, operations research, higher education and much more. The collection is also intended to serve as a tribute to him on the occasion of his 67th birthday. These three volumes also highlight some of Theil's contributions and service to the profession as a leader, advisor, administrator, teacher, and researcher. Theil's contributions, which encompass many disciplines, have been extensively cited both in scientific and professional journals. These citations often place Theil among the top 10 researchers (ranked according to number of times cited) in the world in various disciplines.
Mathematical Statistics for Economics and Business, Second Edition, provides a comprehensive introduction to the principles of mathematical statistics which underpin statistical analyses in the fields of economics, business, and econometrics. The selection of topics in this textbook is designed to provide students with a conceptual foundation that will facilitate a substantial understanding of statistical applications in these subjects. This new edition has been updated throughout and now also includes a downloadable Student Answer Manual containing detailed solutions to half of the over 300 end-of-chapter problems. After introducing the concepts of probability, random variables, and probability density functions, the author develops the key concepts of mathematical statistics, most notably: expectation, sampling, asymptotics, and the main families of distributions. The latter half of the book is then devoted to the theories of estimation and hypothesis testing with associated examples and problems that indicate their wide applicability in economics and business. Features of the new edition include: a reorganization of topic flow and presentation to facilitate reading and understanding; inclusion of additional topics of relevance to statistics and econometric applications; a more streamlined and simple-to-understand notation for multiple integration and multiple summation over general sets or vector arguments; updated examples; new end-of-chapter problems; a solution manual for students; a comprehensive answer manual for instructors; and a theorem and definition map. This book has evolved from numerous graduate courses in mathematical statistics and econometrics taught by the author, and will be ideal for students beginning graduate study as well as for advanced undergraduates.
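As a small taste of the asymptotic theory such a text covers, the sketch below simulates the sampling distribution of the maximum likelihood estimator of an exponential rate; it is a generic illustration, not an exercise from the book.

    import numpy as np

    rng = np.random.default_rng(5)

    # For X_1,...,X_n iid Exponential(lambda), the MLE is 1/xbar, and
    # sqrt(n) * (lambda_hat - lambda) is approximately N(0, lambda^2).
    lam, n, reps = 2.0, 200, 5000
    samples = rng.exponential(scale=1/lam, size=(reps, n))
    lam_hat = 1 / samples.mean(axis=1)

    z = np.sqrt(n) * (lam_hat - lam) / lam   # should be roughly standard normal
    print(f"mean = {z.mean():.3f}, std = {z.std():.3f}")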
This book contains a systematic analysis of allocation rules related to cost and surplus sharing problems. Broadly speaking, it examines various types of rules for allocating a common monetary value (cost) between individual members of a group (or network) when the characteristics of the problem are somehow objectively given. Without being an advanced text, it offers a comprehensive mathematical analysis of a series of well-known allocation rules. The aim is to provide an overview and synthesis of current knowledge concerning cost and surplus sharing methods. The text is accompanied by a description of several practical cases and numerous examples designed to make the theoretical results easily comprehensible for both students and practitioners alike. The book is based on a series of lectures given at the University of Copenhagen and Copenhagen Business School for graduate students joining the math/econ program. I am indebted to numerous colleagues, conference participants and students who over the years have shaped my approach and interests through collaboration, comments and questions that were greatly inspiring. In particular, I would like to thank Hans Keiding, Maurice Koster, Tobias Markeprand, Juan D. Moreno-Ternero, Hervé Moulin, Bezalel Peleg, Lars Thorlund-Petersen, Jorgen Tind, Mich Tvede and Lars Peter Osterdal.
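One widely studied allocation rule of the kind the book analyses is the Shapley value; the sketch below computes it by brute-force enumeration for a hypothetical three-player cost game (the game and player names are invented for the example, and enumeration over orderings is only feasible for small groups).

    from itertools import permutations
    from math import factorial

    def shapley(players, cost):
        # Shapley value of a cost game: each player's marginal cost,
        # averaged over all orders in which the group could be assembled.
        phi = {p: 0.0 for p in players}
        for order in permutations(players):
            members = []
            for p in order:
                before = cost(frozenset(members))
                members.append(p)
                phi[p] += cost(frozenset(members)) - before
        n_fact = factorial(len(players))
        return {p: v / n_fact for p, v in phi.items()}

    # Hypothetical airport-style game: a coalition's cost is the largest
    # individual requirement among its members.
    need = {"A": 10, "B": 20, "C": 30}
    print(shapley(list(need), lambda S: max((need[p] for p in S), default=0)))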
Testing for a unit root is now an essential part of time series analysis. Indeed, no time series study in economics, or in other disciplines that use time series observations, can ignore the crucial issue of nonstationarity caused by a unit root. However, the literature on the topic is large and often technical, making it difficult to understand the key practical issues. This volume provides an accessible introduction and a critical overview of tests for a unit root in time series, with extensive practical examples and illustrations using simulation analysis. It presents the concepts that enable the reader to understand the theoretical background, and the importance of random walks and Brownian motion to the development of unit root tests. The book also examines the latest developments and practical concerns in unit root testing. This book is indispensable reading for all interested in econometrics, time series econometrics, applied econometrics and applied statistics. It will also be of interest to other disciplines, such as geography, climate change and meteorology, which use time series of data.
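A short simulation in the spirit of the book's examples (using Python and statsmodels as an assumed toolchain, since the book prescribes no particular software) contrasts the augmented Dickey-Fuller test on a random walk and on a stationary AR(1) series.

    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(11)
    T = 500

    walk = np.cumsum(rng.normal(size=T))     # unit-root (random walk) series
    ar = np.zeros(T)                         # stationary AR(1), coefficient 0.8
    for t in range(1, T):
        ar[t] = 0.8 * ar[t-1] + rng.normal()

    # The null of a unit root should be retained for the walk, rejected for the AR(1).
    for name, y in [("random walk", walk), ("AR(1), phi=0.8", ar)]:
        stat, pval = adfuller(y)[:2]
        print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pval:.3f}")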
Features: New chapters on Barrier Options, Lookback Options, Asian Options, Optimal Stopping Theorem, and Stochastic Volatility. Contains over 235 exercises, and 16 problems with complete solutions. Added over 150 graphs and figures, for more than 250 in total, to optimize presentation. 57 R coding examples now integrated into the book for implementation of the methods. Substantially class-tested, so ideal for course use or self-study.