Books > Business & Economics > Economics > Econometrics
A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument, an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this means important) at, say, the 0.05 level, but not at the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by the experts, display the same terms and implement the same methodology. In short, there appears to be no visible alternative to the established methodology and no sign of reservations concerning its validity.
In the modern world of gigantic datasets, which scientists and practitioners in all fields of learning are confronted with, the availability of robust, scalable and easy-to-use methods for pattern recognition and data mining is of paramount importance, so that the avalanche of data can be handled in a meaningful way. This concise and pedagogical research monograph introduces the reader to two specific aspects, clustering techniques and dimensionality reduction, in the context of complex network analysis. The first chapter provides a short introduction to relevant graph-theoretical notation; Chapter 2 then reviews and compares a number of cluster definitions from different fields of science. In the subsequent chapters, a first-principles approach to graph clustering in complex networks is developed using methods from statistical physics, and the reader will learn that, even today, this field contributes significantly to the understanding and resolution of the related statistical inference issues. Finally, an application chapter examines real-world networks from the economic realm to show how the network clustering process can be used to deal with large, sparse datasets where conventional analyses fail.
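As a concrete illustration of the kind of task this monograph addresses, here is a minimal graph-clustering sketch using networkx's standard modularity-based community detection on a classic benchmark network; this is an off-the-shelf method chosen for illustration, not the statistical-physics formulation the book develops.

```python
# A minimal graph-clustering sketch (standard modularity maximization,
# not the book's statistical-physics approach).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()                     # classic 34-node benchmark network
communities = greedy_modularity_communities(G)
for i, members in enumerate(communities):
    print(f"cluster {i}: {sorted(members)}")   # node ids grouped by community
```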
It is increasingly common for analysts to seek out the opinions of individuals and organizations using attitudinal scales such as degree of satisfaction or importance attached to an issue. Examples include levels of obesity, seriousness of a health condition, attitudes towards service levels, opinions on products, voting intentions, and the degree of clarity of contracts. Ordered choice models provide a relevant methodology for capturing the sources of influence that explain the choice made amongst a set of ordered alternatives. The methods have evolved to a level of sophistication that can allow for heterogeneity in the threshold parameters, in the explanatory variables (through random parameters), and in the decomposition of the residual variance. This book brings together contributions in ordered choice modeling from a number of disciplines, synthesizing developments over the last fifty years, and suggests useful extensions to account for the wide range of sources of influence on choice.
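To make the methodology concrete, the sketch below fits a basic ordered probit, the simplest member of the family of ordered choice models discussed here, to simulated data; it assumes statsmodels' OrderedModel and does not include the random-parameter or threshold-heterogeneity extensions the book covers.

```python
# A minimal ordered-probit sketch on simulated data (illustrative only).
import numpy as np
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=(n, 2))                      # two explanatory variables
latent = x @ np.array([1.0, -0.5]) + rng.normal(size=n)
y = np.digitize(latent, bins=[-1.0, 0.5])        # three ordered categories: 0, 1, 2

model = OrderedModel(y, x, distr="probit")       # probit link for the latent index
result = model.fit(method="bfgs", disp=False)
print(result.summary())                          # slopes plus estimated threshold cutpoints
```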
Econometrics as an applied discipline attempts to use information in the most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges this gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision in data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and minimum discrepancy, are useful in several areas of statistical inference, e.g., Bayesian estimation, the expected maximum likelihood principle, and fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: a critical survey of the uses of information theory in economics and econometrics; an integration of applied information theory and economic efficiency analysis; the development of a new economic hypothesis relating information theory to economic growth models; and an emphasis on new lines of research.
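For readers new to these tools, the snippet below computes the Shannon entropy H(p) = -sum_i p_i log p_i for two discrete distributions, illustrating the idea behind the maximum entropy principle: absent further constraints, the uniform distribution is the least committal (highest-entropy) choice. This is a generic numerical illustration, not code from the book.

```python
# Shannon entropy of a discrete distribution: H(p) = -sum p_i * log(p_i).
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                    # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p))

uniform = np.full(4, 0.25)
skewed = np.array([0.7, 0.1, 0.1, 0.1])
print(entropy(uniform))   # log(4) ~ 1.386, the maximum over 4 outcomes
print(entropy(skewed))    # ~ 0.940, lower: the distribution is more concentrated
```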
Capital theory is a cornerstone of modern economics. Its ideas are fundamental for dynamic equilibrium theory, and its concepts are applied in many branches of economics, such as game theory and resource and environmental economics, although this may not be recognized at first glance. In this monograph, an approach is presented that allows important results of capital theory to be derived in a coherent and readily accessible framework. Special emphasis is given to infinite-horizon and overlapping-generations economies. The irreversibility of time and the failure of the market system appear in a different light if an infinite-horizon framework is applied. To bridge the gap between pure and applied economic theory, the structure of our theoretical approach is integrated into a computable general equilibrium model.
This volume offers an accessible discussion of computationally intensive techniques and bootstrap methods, providing ways to improve the finite-sample performance of well-known asymptotic tests for regression models. The book uses the linear regression model as a framework for introducing simulation-based tests to help perform econometric analyses.
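As an illustration of the general idea, here is a minimal residual-bootstrap test of a regression slope, assuming only numpy: the restricted model is estimated under the null, and the null distribution of the t-statistic is built by resampling restricted residuals. It is a generic sketch, not the book's own procedures.

```python
# A minimal residual-bootstrap test for H0: slope = 0 (illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.normal(size=n)
y = 1.0 + 0.3 * x + rng.standard_t(df=3, size=n)   # heavy-tailed errors

X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

def t_stat(yy):
    """OLS t-statistic on the slope coefficient."""
    b = XtX_inv @ X.T @ yy
    e = yy - X @ b
    se = np.sqrt(e @ e / (n - 2) * XtX_inv[1, 1])
    return b[1] / se

t_obs = t_stat(y)
e0 = y - y.mean()                  # residuals of the restricted (intercept-only) model
B = 999
t_boot = np.array([t_stat(y.mean() + rng.choice(e0, size=n, replace=True))
                   for _ in range(B)])
p_value = (1 + np.sum(np.abs(t_boot) >= np.abs(t_obs))) / (B + 1)
print(f"bootstrap p-value: {p_value:.3f}")
```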
This monograph deals with spatially dependent nonstationary time series in a way accessible both to time series econometricians wanting to understand spatial econometrics and to spatial econometricians lacking a grounding in time series analysis. After charting key concepts in both time series and spatial econometrics, the book discusses how the spatial connectivity matrix can be estimated using spatial panel data instead of being assumed exogenously fixed. This is followed by a discussion of spatial nonstationarity in spatial cross-section data, and a full exposition of nonstationarity in both single- and multi-equation contexts, including the estimation and simulation of spatial vector autoregression (VAR) models and spatial error correction (ECM) models. The book reviews the literature on panel unit root tests and panel cointegration tests for spatially independent data and for data that are strongly spatially dependent. It provides for the first time critical values for panel unit root tests and panel cointegration tests when the spatial panel data are weakly or strongly spatially dependent. The volume concludes with a discussion of incorporating strong and weak spatial dependence in nonstationary panel data models. All discussions are accompanied by empirical testing based on a spatial panel of house prices in Israel.
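The spatial connectivity matrix W is the central object here. The sketch below, assuming only numpy, builds a small binary contiguity matrix, row-normalizes it, and computes the spatial lag Wy (each region's weighted average of its neighbours' values), the building block of the spatial VAR and ECM models mentioned above; the matrix and values are invented for illustration.

```python
# A minimal spatial-lag sketch with a row-normalized connectivity matrix W.
import numpy as np

W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)    # binary contiguity between 4 regions
W = W / W.sum(axis=1, keepdims=True)         # row-normalize so each row sums to 1

y = np.array([10.0, 12.0, 9.0, 15.0])        # e.g. regional house prices
spatial_lag = W @ y                          # weighted average of neighbours' values
print(spatial_lag)
```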
Testing for a unit root is now an essential part of time series analysis, but the literature on the topic is so large that knowing where to start is difficult even for the specialist. This book provides a way into the techniques of unit root testing, explaining the pitfalls and nonstandard cases, using practical examples and simulation analysis.
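For orientation, the snippet below runs the augmented Dickey-Fuller test, the canonical unit root test, on a simulated random walk and on white noise, using statsmodels' adfuller; the pitfalls and nonstandard cases the book treats go well beyond this basic usage.

```python
# A minimal unit-root check: augmented Dickey-Fuller test on two simulated series.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
random_walk = np.cumsum(rng.normal(size=500))   # unit-root process
white_noise = rng.normal(size=500)              # stationary benchmark

for name, series in [("random walk", random_walk), ("white noise", white_noise)]:
    stat, pvalue, *_ = adfuller(series, autolag="AIC")
    print(f"{name}: ADF stat = {stat:.2f}, p-value = {pvalue:.3f}")
```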
As well as providing a history of economic statistics, the book includes contributions by economists from a number of countries, applying economic statistics to the past and to current economic issues.
Self-contained chapters on the most important applications and methodologies in finance, which can easily be used for the reader's research or as a reference for courses on empirical finance. Each chapter is reproducible in the sense that the reader can replicate every single figure, table, or number by simply copy-pasting the code we provide. A full-fledged introduction to machine learning with tidymodels, based on tidy principles, shows how factor selection and option pricing can benefit from machine learning methods. Chapter 2, on accessing and managing financial data, shows how to retrieve and prepare the most important datasets in the field of financial economics: CRSP and Compustat. The chapter also contains detailed explanations of the most important data characteristics. Each chapter provides exercises that are based on established lectures and exercise classes and that are designed to help students dig deeper. The exercises can be used for self-study or as a source of inspiration for teaching exercises.
This outstanding collection of William Brock's essays illustrates the power of dynamic modelling to shed light on the forces for stability and instability in economic systems. The articles selected reflect his best work and are indicative both of the types of policy problems he finds challenging and of the complex methodology he uses to solve them. Also included is an introduction by Brock to his own work, which helps tie together the main aspects of his research to date. The volume covers: stochastic models and optimal growth; financial and macroeconomic modelling; ecology, mechanism design and regulation; and nonlinearity in economics.
In applications, and especially in mathematical finance, random time-dependent events are often modeled as stochastic processes. Assumptions are made about the structure of such processes, and serious researchers will want to justify those assumptions through the use of data. As statisticians are wont to say, "In God we trust; all others must bring data."
Provides a comprehensive and accessible introduction to general insurance pricing, based on the author's many years of experience as both a teacher and a practitioner. Suitable for students taking a course in general insurance pricing, notably those studying to become an actuary through the UK Institute of Actuaries exams. There is no other title quite like this on the market: it is ideal for teaching and study, and is also an excellent guide for practitioners.
How does innovation emerge from normal economic activity? Economic Interdependence and Innovative Activity is an original new book which tries to answer this question by reconciling inter-industrial analysis with the study of innovation. The book provides a bridge between economic statics and the dynamics of growth and development. As well as offering important and original empirical data for Canada, France, Italy, Greece and China, the authors make a series of theoretical advances and propose a new way to observe the innovative process, as well as new analytical tools to examine innovative activity. Their central thesis is that innovative outputs emerge out of increased social interaction and the division of labour through cooperative networks. An authoritative theoretical introduction and some thought-provoking conclusions have been prepared by Christian DeBresson. Economic Interdependence and Innovative Activity encourages input-output economists to encompass innovative activities in dynamic models, and innovation researchers to look at technical interdependencies.
Stochastic Volatility in Financial Markets presents advanced topics in financial econometrics and theoretical finance, and is divided into three main parts. The first part aims at documenting an empirical regularity of financial price changes: the occurrence of sudden and persistent changes in financial market volatility. This phenomenon, technically termed 'stochastic volatility' or 'conditional heteroskedasticity', has been well known for at least 20 years; in this part, further useful theoretical properties of conditionally heteroskedastic models are uncovered. The second part goes beyond the statistical aspects of stochastic volatility models: it constructs and uses new, fully articulated, theoretically sound financial asset pricing models that allow for the presence of conditional heteroskedasticity. The third part shows how the inclusion of the statistical aspects of stochastic volatility in a rigorous economic scheme can be approached from an empirical standpoint.
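To see the phenomenon the first part documents, the sketch below simulates a GARCH(1,1) process, the workhorse conditionally heteroskedastic model, with numpy; the parameter values are illustrative, and the fourth-moment check shows the fat tails that volatility clustering induces.

```python
# A minimal GARCH(1,1) simulation: volatility clustering in returns.
import numpy as np

rng = np.random.default_rng(3)
omega, alpha, beta = 0.05, 0.10, 0.85        # alpha + beta < 1 for stationarity
n = 1000
r = np.zeros(n)                              # returns
h = np.zeros(n)                              # conditional variance
h[0] = omega / (1 - alpha - beta)            # start at the unconditional variance
for t in range(1, n):
    h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    r[t] = np.sqrt(h[t]) * rng.normal()

kurt = np.mean(r**4) / np.mean(r**2) ** 2    # sample kurtosis
print(f"sample kurtosis: {kurt:.2f}  (> 3 indicates fat tails)")
```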
What part does technological knowledge accumulation play in modern economic growth? This book investigates and examines the predictions of new growth theory, using OECD manufacturing data. Its empirical findings portray a novel and complex picture of the features of long-term growth, where technological knowledge production and diffusion play a central part, alongside variations in capital and employment. A parallel examination of long-run trade patterns and government policy issues completes a broader account of how knowledge-based growth in industrial output is at the heart of modern economic prosperity.
The contributions in this volume, by leading economists from major universities in Europe and the USA, cover research at the front line of econometric analysis and labour market applications. The volume includes several papers on equilibrium search models (a relatively new field) and on job matching, seen from both a theoretical and an applied point of view. Methods for, and empirical analyses of, unemployment durations are also discussed. Finally, a large group of papers examines the structure and dynamics of the labour market in a number of countries using panel data. This group includes papers on data quality and policy evaluation. The high unemployment in most countries makes it necessary to come up with studies and methods for analysing the impact of different elements of economic policies. This volume is intended to contribute to further development in the use of panel data in economic analyses.
This timely volume brings together professors of finance and accounting from Japanese universities to examine the Japanese stock market in terms of its pricing and accounting systems. The papers report the results of empirical research into the Japanese stock market within the framework of new theories of finance. Academics, professionals, and anyone seeking to understand or enter the Japanese market will applaud the publication of this practical, informative volume. Having gathered data from the late 1970s through 1984, the authors analyze the market's behavior and the applicability to that market of two major theoretical frameworks: the Capital Asset Pricing Model and the Efficient Market Hypothesis. Chapter 1 provides background statistical evidence on the behavior of monthly returns on Tokyo Stock Exchange common stocks. Chapter 2 discusses an empirical test of the capital asset pricing model. Chapter 3 examines evidence on the price performance of unseasoned new issues. The authors also examine the Japanese accounting disclosure system: Chapter 4 deals empirically with the information content of annual accounting announcements and related market efficiency. The next chapter presents empirical evidence on the relationship between unsystematic returns and earnings forecast errors. Next, empirical research into the usefulness of the disclosure system to investors is examined. Finally, Chapter 7 presents several interesting questions and topics for future research on the Japanese stock market.
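The CAPM test mentioned for Chapter 2 is, at its core, a time-series regression of excess stock returns on excess market returns; the sketch below runs that regression on simulated monthly data with statsmodels (the numbers are invented for illustration, not the book's data), where the CAPM prediction is an intercept (alpha) of zero.

```python
# A minimal CAPM time-series regression sketch on simulated monthly data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
market_excess = rng.normal(0.005, 0.04, size=120)   # monthly market excess returns
stock_excess = 0.001 + 1.2 * market_excess + rng.normal(0, 0.03, size=120)

X = sm.add_constant(market_excess)
fit = sm.OLS(stock_excess, X).fit()
alpha, beta = fit.params
print(f"alpha = {alpha:.4f} (CAPM predicts 0), beta = {beta:.2f}")
```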
The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations.
This book provides an essential toolkit for all students wishing to know more about the modelling and analysis of financial data. Applications of econometric techniques are becoming increasingly common in the world of finance, and this second edition of an established text covers the following key themes:
- unit roots, cointegration and other developments in the study of time series models
- time-varying volatility models of the GARCH type and the stochastic volatility approach
- analysis of shock persistence and impulse responses
- Markov switching and Kalman filtering
- spectral analysis
- present value relations and rationality
- discrete choice models
- analysis of truncated and censored samples
- panel data analysis
This updated edition includes new chapters covering limited dependent variables and panel data. It continues to be an essential guide for all graduate and advanced undergraduate students of econometrics and finance.
The authors present a number of financial market studies that share a general theme: testing the underlying econometric assumptions of a number of financial models. More than 30 years of financial market research has convinced the authors that not enough attention has been paid to whether the estimated model is appropriate or, most importantly, whether the estimation technique is suitable for the problem under study. For many years linear models have been assumed with little or no testing of alternative specifications. The result has been models that force linearity assumptions on what are clearly nonlinear processes. Another major assumption of much financial research constrains the coefficients to be stable over time. This critical assumption has been attacked by Lucas (1976) on the grounds that when economic policy changes, the coefficients of macroeconomic models change. If this occurs, any policy forecasts of these models will be flawed. In financial modeling, omitted (possibly non-quantifiable) variables will bias coefficients. While it may be possible to model some financial variables for extended periods, in other periods the underlying models may either exhibit nonlinearity or show changes in linear models. The authors' research indicates that tests for changes in linear models, such as recursive residual analysis, or tests for episodic nonlinearity, can be used to signal changes in the underlying structure of the market.

The book begins with a brief review of basic linear time series techniques, including autoregressive integrated moving average (ARIMA) models, vector autoregressive (VAR) models, and models from the ARCH/GARCH class. While the ARIMA and VAR approaches model the first moment of a series, models of the ARCH/GARCH class model both the first moment and the second moment, the latter interpreted as the conditional or explained volatility of a series. Recent work on nonlinearity detection has questioned the appropriateness of these essentially linear approaches. A number of such tests are shown and applied to the complete series and to subsets of the series. A major finding is that the structure of the series may change over time. Within the time frame of a study, there may be periods of episodic nonlinearity, episodic ARCH and episodic nonstationarity. Measures are developed to detect and relate these events, both graphically and with mathematical models. This book will be of interest to applied finance researchers and to market participants.
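One standard diagnostic for the "episodic ARCH" the authors describe is Engle's ARCH-LM test; the sketch below applies statsmodels' het_arch to a homoskedastic stretch and to a simulated ARCH stretch of data (an illustrative diagnostic, not the authors' own test battery).

```python
# Engle's ARCH-LM test on two simulated series: one homoskedastic,
# one with ARCH effects (a minimal sketch of one standard diagnostic).
import numpy as np
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(4)
calm = rng.normal(size=500)              # constant-variance benchmark

r = np.zeros(500)                        # ARCH(1) simulation
for t in range(1, 500):
    h_t = 0.1 + 0.6 * r[t - 1] ** 2      # conditional variance
    r[t] = np.sqrt(h_t) * rng.normal()

for name, series in [("homoskedastic", calm), ("ARCH", r)]:
    lm_stat, lm_pvalue, f_stat, f_pvalue = het_arch(series, nlags=4)
    print(f"{name}: ARCH-LM p-value = {lm_pvalue:.4f}")
```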
PREFACE TO THE COLLECTION (PREAMBLE): The editors are pleased to present a selection of Henri Theil's contributions to economics and econometrics in three volumes. In Volume I we have provided an overview of Theil's contributions, a brief biography, an annotated bibliography of his research, and a selection of published and unpublished articles and chapters in books dealing with topics in econometrics. Volume II contains Theil's contributions to demand analysis and information theory. Volume III includes Theil's contributions to economic policy and forecasting, and management science. The selection of articles is intended to provide examples of Theil's many seminal and pathbreaking contributions to economics in such areas as econometrics, statistics, demand analysis, information theory, economic policy analysis, aggregation theory, forecasting, index numbers, management science, sociology, operations research, higher education and much more. The collection is also intended to serve as a tribute to him on the occasion of his 67th birthday. These three volumes also highlight some of Theil's contributions and service to the profession as a leader, advisor, administrator, teacher, and researcher. Theil's contributions, which encompass many disciplines, have been extensively cited in both scientific and professional journals. These citations often place Theil among the top 10 researchers (ranked according to number of times cited) in the world in various disciplines.
This book contains a systematic analysis of allocation rules related to cost and surplus sharing problems. Broadly speaking, it examines various types of rules for allocating a common monetary value (cost) between individual members of a group (or network) when the characteristics of the problem are somehow objectively given. Without being an advanced text, it offers a comprehensive mathematical analysis of a series of well-known allocation rules. The aim is to provide an overview and synthesis of current knowledge concerning cost and surplus sharing methods. The text is accompanied by a description of several practical cases and numerous examples designed to make the theoretical results easily comprehensible for students and practitioners alike. The book is based on a series of lectures given at the University of Copenhagen and Copenhagen Business School for graduate students joining the math/econ program. I am indebted to numerous colleagues, conference participants and students who over the years have shaped my approach and interests through collaboration, comments and questions that were greatly inspiring. In particular, I would like to thank Hans Keiding, Maurice Koster, Tobias Markeprand, Juan D. Moreno-Ternero, Hervé Moulin, Bezalel Peleg, Lars Thorlund-Petersen, Jørgen Tind, Mich Tvede and Lars Peter Østerdal.
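One of the best-known allocation rules of the kind analyzed here is the Shapley value, which charges each agent its average marginal cost over all orders of arrival; the sketch below implements it by brute force for a hypothetical three-town cost-sharing problem (the names and cost figures are invented for illustration).

```python
# A minimal Shapley-value sketch: average marginal cost over all arrival orders.
from itertools import permutations

def shapley(players, cost):
    """Return each player's average marginal contribution to the coalition cost."""
    shares = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = cost(frozenset(coalition))
            coalition.add(p)
            shares[p] += cost(frozenset(coalition)) - before   # marginal cost of p
    return {p: s / len(orders) for p, s in shares.items()}

# Hypothetical example: three towns A, B, C sharing a water network.
costs = {frozenset(): 0, frozenset("A"): 6, frozenset("B"): 6, frozenset("C"): 9,
         frozenset("AB"): 9, frozenset("AC"): 12, frozenset("BC"): 12,
         frozenset("ABC"): 15}
print(shapley("ABC", lambda s: costs[s]))   # {'A': 4.0, 'B': 4.0, 'C': 7.0}
```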