This book discusses the need to apply various regression techniques carefully and prudently in order to obtain their full benefits. It also describes some of the techniques developed and used by the authors, presenting their innovative ideas on the formulation and estimation of regression decomposition models, hidden Markov chain models, the contribution of regressors in the set-theoretic approach, the calorie poverty rate, and the aggregate growth rate. Each of these techniques has applications that address a number of unanswered questions; for example, regression decomposition techniques reveal intra-household gender inequalities of consumption, intra-household allocation of resources and adult equivalent scales, while hidden Markov chain models can forecast the results of future elections. Most of these procedures are presented using real-world data, and the techniques can be applied in other, similar situations. Showing how difficult questions can be answered by developing simple models with simple interpretations of their parameters, the book is a valuable resource for students and researchers in the field of model building.
A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument, an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this means important) at, say, the 0.05 level, but not the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by experts, display the same terms and implement the same methodology. In short, there appears to be no visible alternative to the established methodology and no sign of reservations concerning its validity.
Econometrics as an applied discipline attempts to use information in the most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges this gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision in data and their interactions with subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and the minimum discrepancy, are useful in several areas of statistical inference, e.g., Bayesian estimation, the expected maximum likelihood principle, and fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: a critical survey of the uses of information theory in economics and econometrics; an integration of applied information theory and economic efficiency analysis; the development of a new economic hypothesis relating information theory to economic growth models; and an emphasis on new lines of research.
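By way of illustration, the maximum entropy principle mentioned above can be sketched numerically: among all distributions satisfying a moment constraint, choose the one with the largest Shannon entropy. The sketch below uses Jaynes' classic dice example; the target mean of 4.5 and the use of SciPy's constrained optimizer are illustrative assumptions, not material from the book.

```python
# A minimal sketch of the maximum entropy principle (Jaynes' dice example):
# among all distributions on {1,...,6} with a given mean, find the one with
# the largest Shannon entropy. The target mean of 4.5 is an illustrative
# assumption, not a figure from the book.
import numpy as np
from scipy.optimize import minimize

values = np.arange(1, 7)
target_mean = 4.5

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)       # avoid log(0)
    return np.sum(p * np.log(p))      # minimizing this maximizes entropy

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},            # probabilities sum to 1
    {"type": "eq", "fun": lambda p: p @ values - target_mean}, # mean constraint
]
p0 = np.full(6, 1.0 / 6.0)
res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * 6, constraints=constraints)
print(np.round(res.x, 4))             # probabilities tilted toward larger faces, as expected
```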
Capital theory is a cornerstone of modern economics. Its ideas are fundamental to dynamic equilibrium theory, and its concepts are applied in many branches of economics, such as game theory and resource and environmental economics, although this may not be recognized at first glance. This monograph presents an approach that allows important results of capital theory to be derived in a coherent and readily accessible framework. Special emphasis is given to infinite-horizon and overlapping-generations economies. The irreversibility of time and the failure of the market system appear in a different light when an infinite-horizon framework is applied. To bridge the gap between pure and applied economic theory, the structure of our theoretical approach is integrated into a computable general equilibrium model.
This volume contains an accessible discussion of computationally intensive techniques and bootstrap methods, providing ways to improve the finite-sample performance of well-known asymptotic tests for regression models. The book uses the linear regression model as a framework for introducing simulation-based tests to help perform econometric analyses.
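As a concrete illustration of the kind of simulation-based test described above, here is a minimal sketch of a residual-bootstrap version of the usual t-test for a regression slope. The data-generating process, sample size and replication count are illustrative assumptions, not material from the book.

```python
# A minimal sketch of a residual-bootstrap t-test for a regression slope;
# the simulated data and settings are illustrative assumptions only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.standard_t(df=3, size=n)    # heavy-tailed errors

X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()
t_obs = fit.tvalues[1]                               # observed t-statistic for the slope

# Impose the null (slope = 0), then resample residuals to approximate the
# finite-sample distribution of the t-statistic under that null.
null_fit = sm.OLS(y, np.ones((n, 1))).fit()
t_boot = np.empty(999)
for b in range(t_boot.size):
    y_star = null_fit.fittedvalues + rng.choice(null_fit.resid, size=n, replace=True)
    t_boot[b] = sm.OLS(y_star, X).fit().tvalues[1]

p_value = np.mean(np.abs(t_boot) >= abs(t_obs))
print(f"bootstrap p-value: {p_value:.3f}")
```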
Testing for a Unit Root is now an essential part of time series analysis but the literature on the topic is so large that knowing where to start is difficult even for the specialist. This book provides a way into the techniques of unit root testing, explaining the pitfalls and nonstandard cases, using practical examples and simulation analysis.
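For readers who want to see the starting point in code, a minimal sketch of an augmented Dickey-Fuller test on a simulated random walk follows; the simulated series and the statsmodels settings are illustrative assumptions rather than examples from the book.

```python
# A minimal sketch of an augmented Dickey-Fuller unit root test on a simulated
# random walk; the data and settings are illustrative assumptions only.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(size=500))                  # random walk: has a unit root

stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, regression="c", autolag="AIC")
print(f"ADF statistic: {stat:.3f}, p-value: {pvalue:.3f}")
print(f"5% critical value: {crit['5%']:.3f}")        # failing to reject leaves the unit root plausible
```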
This monograph deals with spatially dependent nonstationary time series in a way accessible both to time series econometricians wanting to understand spatial econometrics and to spatial econometricians lacking a grounding in time series analysis. After charting key concepts in both time series and spatial econometrics, the book discusses how the spatial connectivity matrix can be estimated using spatial panel data instead of being assumed exogenously fixed. This is followed by a discussion of spatial nonstationarity in spatial cross-section data, and a full exposition of nonstationarity in both single and multi-equation contexts, including the estimation and simulation of spatial vector autoregression (VAR) models and spatial error correction (ECM) models. The book reviews the literature on panel unit root tests and panel cointegration tests for spatially independent data and for data that are strongly spatially dependent. It provides, for the first time, critical values for panel unit root tests and panel cointegration tests when the spatial panel data are weakly or strongly spatially dependent. The volume concludes with a discussion of incorporating strong and weak spatial dependence in nonstationary panel data models. All discussions are accompanied by empirical testing based on spatial panel data of house prices in Israel.
As well as providing a history of economic statistics, the book includes contributions by economists from a number of countries, applying economic statistics to the past and to current economic issues.
This outstanding collection of William Brock's essays illustrates the power of dynamic modelling to shed light on the forces for stability and instability in economic systems. The articles selected reflect his best work and are indicative both of the type of policy problem that he finds challenging and of the complex methodology that he uses to solve them. Also included is an introduction by Brock to his own work, which helps tie together the main aspects of his research to date. The volume covers: stochastic models and optimal growth; financial and macroeconomic modelling; ecology, mechanism design and regulation; and nonlinearity in economics.
Self-contained chapters on the most important applications and methodologies in finance, which can easily be used for the reader’s research or as a reference for courses on empirical finance. Each chapter is reproducible in the sense that the reader can replicate every single figure, table, or number by simply copy-pasting the code we provide. A full-fledged introduction to machine learning with tidymodels, based on tidy principles, showing how factor selection and option pricing can benefit from machine learning methods. Chapter 2, on accessing & managing financial data, shows how to retrieve and prepare the most important datasets in the field of financial economics: CRSP and Compustat. The chapter also contains detailed explanations of the most important data characteristics. Each chapter provides exercises that are based on established lectures and exercise classes and which are designed to help students dig deeper. The exercises can be used for self-study or as a source of inspiration for teaching exercises.
Provides a comprehensive and accessible introduction to general insurance pricing, based on the author’s many years of experience as both a teacher and a practitioner. Suitable for students taking a course in general insurance pricing, notably those studying to become an actuary through the UK Institute of Actuaries exams. There is no other title quite like this on the market: it is perfect for teaching and study, and is also an excellent guide for practitioners.
How does innovation emerge from normal economic activity? Economic Interdependence and Innovative Activity is an original new book which tries to answer this question by reconciling inter-industrial analysis with the study of innovation. This book provides a bridge between economic statics and the dynamics of growth and development. As well as offering important and original empirical data for Canada, France, Italy, Greece and China, the authors make a series of theoretical advances and propose a new way to observe the innovative process, as well as new analytical tools to examine innovative activity. Their central thesis is that innovative outputs emerge out of increased social interaction and the division of labour through cooperative networks. An authoritative theoretical introduction and some thought-provoking conclusions have been prepared by Christian DeBresson. Economic Interdependence and Innovative Activity encourages input-output economists to encompass innovative activities in dynamic models, and innovation researchers to look at technical interdependencies.
Stochastic Volatility in Financial Markets presents advanced topics in financial econometrics and theoretical finance, and is divided into three main parts. The first part aims at documenting an empirical regularity of financial price changes: the occurrence of sudden and persistent changes in financial market volatility. This phenomenon, technically termed 'stochastic volatility' or 'conditional heteroskedasticity', has been well known for at least 20 years; in this part, further useful theoretical properties of conditionally heteroskedastic models are uncovered. The second part goes beyond the statistical aspects of stochastic volatility models: it constructs and uses new, fully articulated, theoretically sound financial asset pricing models that allow for the presence of conditional heteroskedasticity. The third part shows how the inclusion of the statistical aspects of stochastic volatility in a rigorous economic scheme can be approached from an empirical standpoint.
In applications, and especially in mathematical finance, random time-dependent events are often modeled as stochastic processes. Assumptions are made about the structure of such processes, and serious researchers will want to justify those assumptions through the use of data. As statisticians are wont to say, "In God we trust; all others must bring data."
An engaging and accessible examination of what ails insurance markets—and what to do about it—by three leading economists. Why is dental insurance so crummy? Why is pet insurance so expensive? Why does your auto insurer ask for your credit score? The answer to these questions lies in understanding how insurance works. Unlike the market for other goods and services—for instance, a grocer who doesn’t care who buys the store’s broccoli or carrots—insurance providers are more careful in choosing their customers, because some are more expensive than others. Unraveling the mysteries of insurance markets, Liran Einav, Amy Finkelstein, and Ray Fisman explore such issues as why insurers want to know so much about us and whether we should let them obtain this information; why insurance entrepreneurs often fail (and some tricks that may help them succeed); and whether we’d be better off with government-mandated health insurance instead of letting businesses, customers, and markets decide who gets coverage and at what price. With insurance at the center of divisive debates about privacy, equity, and the appropriate role of government, this book offers clear explanations for some of the critical business and policy issues you’ve often wondered about, as well as for others you haven’t yet considered.
What part does technological knowledge accumulation play in modern economic growth? This book investigates and examines the predictions of new growth theory, using OECD manufacturing data. Its empirical findings portray a novel and complex picture of the features of long-term growth, where technological knowledge production and diffusion play a central part, alongside variations in capital and employment. A parallel examination of long-run trade patterns and government policy issues completes a broader account of how knowledge-based growth in industrial output is at the heart of modern economic prosperity.
The contributions in this volume, by leading economists from major universities in Europe and the USA, cover research at the front line of econometric analysis and labour market applications. The volume includes several papers on equilibrium search models (a relatively new field) and on job matching, both seen from a theoretical and from an applied point of view. Methods for, and empirical analyses of, unemployment durations are also discussed. Finally, a large group of papers examines the structure and the dynamics of the labour market in a number of countries using panel data. This group includes papers on data quality and policy evaluation. The high unemployment in most countries makes it necessary to come up with studies and methods for analysing the impact of different elements of economic policy. This volume is intended to contribute to the further development of the use of panel data in economic analyses.
This timely volume brings together professors of finance and accounting from Japanese universities to examine the Japanese stock market in terms of its pricing and accounting systems. The papers report the results of empirical research into the Japanese stock market within the framework of new theories of finance. Academics, professionals, and anyone seeking to understand or enter the Japanese market will applaud the publication of this practical, informative volume. Having gathered data from the late 1970s through 1984, the authors analyze the market's behavior and the applicability of two major theoretical pricing models, the Capital Asset Pricing Model and the Efficient Market Hypothesis, to that market. Chapter 1 provides background statistical evidence on the behavior of monthly returns on Tokyo Stock Exchange common stocks. Chapter 2 discusses an empirical test of the capital asset pricing model. Chapter 3 examines evidence on the price performance of unseasoned new issues. The authors also examine the Japanese accounting disclosure system: Chapter 4 deals empirically with the information content of annual accounting announcements and related market efficiency. The next chapter presents empirical evidence on the relationship between unsystematic returns and earnings forecast errors. Next, empirical research into the usefulness of the disclosure system to investors is examined. Finally, Chapter 7 presents several interesting questions and topics for future research on the Japanese stock market.
The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations.
This book provides an essential toolkit for all students wishing to know more about the modelling and analysis of financial data. Applications of econometric techniques are becoming increasingly common in the world of finance, and this second edition of an established text covers the following key themes: unit roots, cointegration and other developments in the study of time series models; time-varying volatility models of the GARCH type and the stochastic volatility approach; analysis of shock persistence and impulse responses; Markov switching and Kalman filtering; spectral analysis; present value relations and rationality; discrete choice models; analysis of truncated and censored samples; and panel data analysis. This updated edition includes new chapters covering limited dependent variables and panel data. It continues to be an essential guide for all graduate and advanced undergraduate students of econometrics and finance.
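To make the GARCH theme concrete, the following minimal sketch simulates returns with volatility clustering and fits a GARCH(1,1) model using the third-party Python arch package; the parameter values and the package choice are illustrative assumptions, not drawn from the book.

```python
# A minimal sketch of fitting a GARCH(1,1) model: simulate returns with
# volatility clustering, then estimate with the third-party `arch` package.
# The parameter values and the package choice are illustrative assumptions.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(2)
omega, alpha, beta = 0.1, 0.1, 0.85
r = np.zeros(1000)
sigma2 = omega / (1.0 - alpha - beta)                # start at the unconditional variance
for t in range(1, r.size):
    sigma2 = omega + alpha * r[t - 1] ** 2 + beta * sigma2
    r[t] = np.sqrt(sigma2) * rng.normal()

res = arch_model(r, vol="GARCH", p=1, q=1, mean="Zero").fit(disp="off")
print(res.params)                                    # estimated omega, alpha[1], beta[1]
```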
The availability of financial data recorded at high frequency has inspired a research area which, over the last decade, has emerged as a major field in econometrics and statistics. The growing popularity of high-frequency econometrics is driven by technological progress in trading systems and the increasing importance of intraday trading, liquidity risk, optimal order placement and high-frequency volatility. This book provides a state-of-the-art overview of the major approaches in high-frequency econometrics, including univariate and multivariate autoregressive conditional mean approaches for different types of high-frequency variables, intensity-based approaches for financial point processes, and dynamic factor models. It discusses implementation details, provides insights into the properties of high-frequency data and institutional settings, and presents applications to volatility and liquidity estimation, order book modelling and market microstructure analysis.
The authors present a number of financial market studies that have as their general theme the econometric testing of the underlying assumptions of a number of financial models. More than 30 years of financial market research has convinced the authors that not enough attention has been paid to whether the estimated model is appropriate or, most importantly, whether the estimation technique is suitable for the problem under study. For many years linear models have been assumed with little or no testing of alternative specifications. The result has been models that force linearity assumptions on what are clearly nonlinear processes. Another major assumption of much financial research constrains the coefficients to be stable over time. This critical assumption has been attacked by Lucas (1976) on the grounds that when economic policy changes, the coefficients of macroeconomic models change. If this occurs, any policy forecasts of these models will be flawed. In financial modeling, omitted (possibly non-quantifiable) variables will bias coefficients. While it may be possible to model some financial variables for extended periods, in other periods the underlying models may either exhibit nonlinearity or show changes in linear models. The authors' research indicates that tests for changes in linear models, such as recursive residual analysis, or tests for episodic nonlinearity, can be used to signal changes in the underlying structure of the market. The book begins with a brief review of basic linear time series techniques, including autoregressive integrated moving average (ARIMA) models, vector autoregressive (VAR) models, and models from the ARCH/GARCH class. While the ARIMA and VAR approaches model the first moment of a series, models of the ARCH/GARCH class model both the first moment and the second moment, which is interpreted as the conditional or explained volatility of a series. Recent work on nonlinearity detection has questioned the appropriateness of these essentially linear approaches. A number of such tests are shown and applied to the complete series and to subsets of the series. A major finding is that the structure of the series may change over time. Within the time frame of a study, there may be periods of episodic nonlinearity, episodic ARCH and episodic nonstationarity. Measures are developed to quantify and relate these events, both geographically and with mathematical models. This book will be of interest to applied finance researchers and to market participants.
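As a rough illustration of the recursive-residual idea described above, the sketch below computes recursive residuals by hand for a regression whose slope shifts halfway through a simulated sample and forms a CUSUM-style statistic from them; the simulated break and the informal summary statistic are illustrative assumptions, not the authors' procedure or data.

```python
# A rough sketch of recursive residuals for detecting a change in a linear
# model: the slope shifts at mid-sample, and a CUSUM-style statistic built
# from the recursive residuals flags the instability. Illustrative only.
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=n)
beta = np.where(np.arange(n) < n // 2, 0.5, 1.5)     # slope changes at mid-sample
y = 1.0 + beta * x + rng.normal(scale=0.5, size=n)
X = np.column_stack([np.ones(n), x])

k = X.shape[1]
rec_resid = []
for t in range(k + 1, n):
    Xt, yt = X[:t], y[:t]
    b = np.linalg.lstsq(Xt, yt, rcond=None)[0]       # fit on observations before t
    xt = X[t]
    scale = np.sqrt(1.0 + xt @ np.linalg.inv(Xt.T @ Xt) @ xt)
    rec_resid.append((y[t] - xt @ b) / scale)        # scaled one-step-ahead prediction error

rec_resid = np.asarray(rec_resid)
cusum = np.cumsum(rec_resid) / rec_resid.std(ddof=1)
print("max |CUSUM|:", round(float(np.abs(cusum).max()), 2))   # large values signal instability
```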
PREFACE TO THE COLLECTION. PREAMBLE. The editors are pleased to present a selection of Henri Theil's contributions to economics and econometrics in three volumes. In Volume I we have provided an overview of Theil's contributions, a brief biography, an annotated bibliography of his research, and a selection of published and unpublished articles and chapters in books dealing with topics in econometrics. Volume II contains Theil's contributions to demand analysis and information theory. Volume III includes Theil's contributions in economic policy and forecasting, and management science. The selection of articles is intended to provide examples of Theil's many seminal and pathbreaking contributions to economics in such areas as econometrics, statistics, demand analysis, information theory, economic policy analysis, aggregation theory, forecasting, index numbers, management science, sociology, operations research, higher education and much more. The collection is also intended to serve as a tribute to him on the occasion of his 67th birthday. These three volumes also highlight some of Theil's contributions and service to the profession as a leader, advisor, administrator, teacher, and researcher. Theil's contributions, which encompass many disciplines, have been extensively cited in both scientific and professional journals. These citations often place Theil among the top 10 researchers (ranked according to number of times cited) in the world in various disciplines.