This book is the result of my doctoral dissertation research at the Department of Econometrics of the University of Geneva, Switzerland. This research was also partially financed by the Swiss National Science Foundation (grants 12-31072.91 and 12-40300.94). First and foremost, I wish to express my deepest gratitude to Professor Manfred Gilli, my thesis supervisor, for his constant support and help. I would also like to thank the president of my jury, Professor Fabrizio Carlevaro, as well as the other members of the jury, Professor Andrew Hughes Hallett, Professor Jean-Philippe Vial and Professor Gerhard Wanner. I am grateful to my colleagues and friends of the Department of Econometrics, especially David Miceli, who provided constant help and kind understanding during all the stages of my research. I would also like to thank Pascale Mignon for proofreading my text and improving my English. Finally, I am greatly indebted to my parents for their kindness and encouragement, without which I could never have achieved my goals. Giorgio Pauletto, Department of Econometrics, University of Geneva, Geneva, Switzerland. Chapter 1 Introduction. The purpose of this book is to present the available methodologies for the solution of large-scale macroeconometric models. This work reviews classical solution methods and introduces more recent techniques, such as parallel computing and nonstationary iterative algorithms.
Econometrics as an applied discipline attempts to use information in the most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and the minimum discrepancy, are useful in several areas of statistical inference, e.g., Bayesian estimation, the expected maximum likelihood principle and fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: a critical survey of the uses of information theory in economics and econometrics; an integration of applied information theory and economic efficiency analysis; the development of a new economic hypothesis relating information theory to economic growth models; and an emphasis on new lines of research.
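To make the entropy tools named above concrete, here is a minimal R sketch (with illustrative numbers of my own, not taken from the book) that computes the Shannon entropies and mutual information of a small discrete joint distribution:

```r
# Minimal sketch (illustrative joint distribution, not from the book):
# Shannon entropy and mutual information for two discrete variables.
p_xy <- matrix(c(0.3, 0.1,
                 0.2, 0.4), nrow = 2, byrow = TRUE)  # joint P(X = i, Y = j)
p_x <- rowSums(p_xy)                                 # marginal of X
p_y <- colSums(p_xy)                                 # marginal of Y

entropy <- function(p) -sum(p[p > 0] * log2(p[p > 0]))  # H(p) in bits

H_x  <- entropy(p_x)
H_y  <- entropy(p_y)
H_xy <- entropy(as.vector(p_xy))
I_xy <- H_x + H_y - H_xy    # mutual information I(X;Y)
c(H_x = H_x, H_y = H_y, I_xy = I_xy)
```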
When von Neumann's and Morgenstern's Theory of Games and Economic Behavior appeared in 1944, one thought that a complete theory of strategic social behavior had appeared out of nowhere. However, game theory has, to this very day, remained a fast-growing assemblage of models which have gradually been united in a new social theory - a theory that is far from being completed even after recent advances in game theory, as evidenced by the work of the three Nobel Prize winners, John F. Nash, John C. Harsanyi, and Reinhard Selten. Two of them, Harsanyi and Selten, have contributed important articles to the present volume. This book leaves no doubt that the game-theoretical models are on the right track to becoming a respectable new theory, just like the great theories of the twentieth century originated from formerly separate models which merged in the course of decades. For social scientists, the age of great discoveries is not over. The recent advances of today's game theory surpass by far the results of traditional game theory. For example, modern game theory has a new empirical and social foundation, namely, societal experiences; this has changed its methods, its "rationality." Morgenstern (I worked together with him for four years) dreamed of an encompassing theory of social behavior. With the inclusion of the concept of evolution in mathematical form, this dream will become true. Perhaps the new foundation will even lead to a new name, "conflict theory" instead of "game theory."
Patrick Artus and Yves Barroux. The Applied Econometric Association organised an international conference on "Monetary and Financial Models" in Geneva in January 1987. The purpose of this book is to make available to the public a choice of the papers that were presented at the conference. The selected papers all deal with the setting of monetary targets and the effects of monetary policy on the economy, as well as with the analysis of the financial behaviours of economic agents. Other papers presented at the same conference but dealing with the external aspects of monetary policy (exchange rate policy, international coordination of economic policies, international transmission of business cycles, ...) are the subject of a distinct publication. The papers put together to make up this book are either theoretical research contributions or applied statistical or econometric work. It seemed more logical to start with the more theoretical papers. The topics tackled in the first two parts of the book have in common the fact that they appeared only recently in the field of economic research and deal with the analysis of the behaviour of Central Banks. They analyse this behaviour so as to exhibit its major determinants as well as the revealed preferences of Central Banks: this topic comes under the caption "optimal monetary policy and reaction function of the monetary authorities."
Aggregation of individual opinions into a social decision is a problem widely observed in everyday life. For centuries people tried to invent the 'best' aggregation rule. In 1951 the young American scientist and future Nobel Prize winner Kenneth Arrow formulated the problem in an axiomatic way, i.e., he specified a set of axioms which every reasonable aggregation rule has to satisfy, and showed that these axioms are inconsistent. This result, often called Arrow's Paradox or the General Impossibility Theorem, has become a cornerstone of social choice theory. The main condition used by Arrow was his famous Independence of Irrelevant Alternatives. This very condition pre-defines the 'local' treatment of the alternatives (or pairs of alternatives, or sets of alternatives, etc.) in aggregation procedures. Remaining within the framework of the axiomatic approach and based on the consideration of local rules, Arrovian Aggregation Models investigates three formulations of the aggregation problem according to the form in which the individual opinions about the alternatives are defined, as well as to the form of the desired social decision. In other words, we study three aggregation models. What is common between them is that in all models some analogue of the Independence of Irrelevant Alternatives condition is used, which is why we call these models Arrovian aggregation models. Chapter 1 presents a general description of the problem of axiomatic synthesis of local rules, and introduces problem formulations for various versions of formalization of individual opinions and collective decision. Chapter 2 formalizes precisely the notion of 'rationality' of individual opinions and social decision. Chapter 3 deals with the aggregation model for the case of individual opinions and social decisions formalized as binary relations. Chapter 4 deals with Functional Aggregation Rules, which transform individual opinions defined as choice functions into a social choice function. Chapter 5 considers another model - Social Choice Correspondences - where the individual opinions are formalized as binary relations and the collective decision is looked for as a choice function. Several new classes of rules are introduced and analyzed.
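A classic illustration of why 'local', pairwise aggregation by majority runs into trouble - a textbook example closely related to Arrow's result, not taken from this book - is the Condorcet cycle, sketched below in R:

```r
# Minimal sketch (a textbook illustration, not from this book): three voters'
# rankings over alternatives a, b, c; pairwise majority voting produces a
# cycle, so no transitive social ranking exists.
rankings <- list(c("a", "b", "c"),   # voter 1: a > b > c
                 c("b", "c", "a"),   # voter 2: b > c > a
                 c("c", "a", "b"))   # voter 3: c > a > b

prefers  <- function(r, x, y) match(x, r) < match(y, r)
majority <- function(x, y)
  sum(sapply(rankings, prefers, x = x, y = y)) > length(rankings) / 2

majority("a", "b")  # TRUE: a beats b
majority("b", "c")  # TRUE: b beats c
majority("c", "a")  # TRUE: c beats a, closing the cycle
```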
The generous social welfare system in Europe is one of the most important differences between Europe and the US. Defenders of the European welfare state argue that it improves social cohesion and prevents crime. Others argue that the "invisible hand" in the US economy is equally powerful in reducing unemployment and preventing crime. This book takes this trade-off as a starting point and contributes to a better interdisciplinary understanding of the interactions between crime, economic performance and social exclusion. In doing so, it evaluates the existing economic and criminological research and provides innovative empirical investigations on the basis of international panel data sets from different levels of regional aggregation.
Multivariate Statistical Analysis
This book focuses on tools and techniques for building regression models using real-world data and assessing their validity. A key theme throughout the book is that it makes sense to base inferences or conclusions only on valid models. Plots are shown to be an important tool for both building regression models and assessing their validity. We shall see that deciding what to plot and how each plot should be interpreted will be a major challenge. In order to overcome this challenge we shall need to understand the mathematical properties of the fitted regression models and associated diagnostic procedures. As such this will be an area of focus throughout the book. In particular, we shall carefully study the properties of residuals in order to understand when patterns in residual plots provide direct information about model misspecification and when they do not. The regression output and plots that appear throughout the book have been generated using R. The output from R that appears in this book has been edited in minor ways. On the book web site you will find the R code used in each example in the text.
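As a flavour of the kind of diagnostic plot the book has in mind, here is a minimal R sketch on made-up data (not the book's own code or data), in which curvature in the residual plot reveals a misspecified mean function:

```r
# Minimal sketch (made-up data, not the book's code): fit a linear model in R
# and look for structure in the residual plot.
set.seed(1)
x <- runif(100, 0, 10)
y <- 2 + 0.5 * x^2 + rnorm(100)      # the true mean function is quadratic

fit <- lm(y ~ x)                     # deliberately misspecified straight-line fit
plot(fitted(fit), resid(fit),
     xlab = "Fitted values", ylab = "Residuals",
     main = "Curvature in the residuals suggests a misspecified mean")
abline(h = 0, lty = 2)
```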
Data envelopment analysis develops a set of nonparametric and semiparametric techniques for measuring economic efficiency among firms and nonprofit organizations. Over the past decade this technique has found most widespread applications in public sector organizations. However these applications have been mostly static. This monograph extends this static framework of efficiency analysis in several new directions. These include but are not limited to the following: (1) a dynamic view of the production and cost frontier, where capital inputs are treated differently from the current inputs, (2) a direct role of technological progress and regress, which is so often stressed in total factor productivity discussions in modern growth theory in economics, (3) stochastic efficiency in a dynamic setting, where reliability improvement competes with technical efficiency, (4) flexible manufacturing systems, where flexibility of the production process and the economies of scope play an important role in efficiency analysis, and (5) the role of economic factors such as externalities and input interdependences. Efficiency is viewed here in the framework of a general systems theory model. Such a view is intended to broaden the scope of applications of this promising new technique of data envelopment analysis. The monograph stresses the various applied aspects of the dynamic theory, so that it can be empirically implemented in different situations. As far as possible, abstract mathematical treatments are avoided and the emphasis is placed on statistical examples and empirical illustrations.
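For readers new to the static framework the monograph extends, the following R sketch (toy data; it assumes the CRAN package lpSolve is available) computes an input-oriented CCR efficiency score for one decision-making unit:

```r
# Minimal sketch (toy data; assumes the CRAN package 'lpSolve' is installed):
# input-oriented CCR efficiency score for one decision-making unit (DMU)
# in a static DEA model with a single input and a single output.
library(lpSolve)

inputs  <- c(2, 3, 6, 5)   # input used by each DMU
outputs <- c(1, 4, 6, 3)   # output produced by each DMU
o <- 4                     # evaluate DMU 4

n <- length(inputs)
obj <- c(1, rep(0, n))                    # variables: (theta, lambda_1..lambda_n)
con <- rbind(c(-inputs[o], inputs),       # sum_j lambda_j x_j - theta x_o <= 0
             c(0,          outputs))      # sum_j lambda_j y_j            >= y_o
sol <- lp("min", obj, con, c("<=", ">="), c(0, outputs[o]))
sol$solution[1]   # theta for DMU 4; a value of 1 means technically efficient
```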
This book covers a highly relevant and timely topic that is of wide interest, especially in finance, engineering and computational biology. The introductory material on simulation and stochastic differential equations is very accessible and will prove popular with many readers. While there are several recent texts available that cover stochastic differential equations, the concentration here on inference makes this book stand out. No other direct competitors are known to date. With an emphasis on the practical implementation of the simulation and estimation methods presented, the text will be useful to practitioners and students with minimal mathematical background. What's more, because of the many R programs, the information here is appropriate for many mathematically well-educated practitioners, too.
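The simulation side is easy to get a feel for; here is a minimal R sketch of my own (not one of the book's programs) that simulates a geometric Brownian motion with the Euler-Maruyama scheme:

```r
# Minimal sketch (not one of the book's programs): Euler-Maruyama simulation
# of a geometric Brownian motion dX = mu X dt + sigma X dW.
set.seed(42)
mu <- 0.1; sigma <- 0.2
T_end <- 1; n <- 1000; dt <- T_end / n

x <- numeric(n + 1)
x[1] <- 1                                 # starting value X_0
for (i in 1:n) {
  dW <- rnorm(1, mean = 0, sd = sqrt(dt)) # Brownian increment
  x[i + 1] <- x[i] + mu * x[i] * dt + sigma * x[i] * dW
}
plot(seq(0, T_end, by = dt), x, type = "l", xlab = "t", ylab = expression(X[t]))
```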
Gerald P. Dwyer, Jr. and R. W. Hafer. The articles and commentaries included in this volume were presented at the Federal Reserve Bank of St. Louis' thirteenth annual economic policy conference, held on October 21-22, 1988. The conference focused on the behavior of asset market prices, a topic of increasing interest to both the popular press and to academic journals as the bull market of the 1980s continued. The events that transpired during October, 1987, both in the United States and abroad, provide an informative setting to test alternative theories. In assembling the papers presented during this conference, we asked the authors to explore the issue of asset pricing and financial market behavior from several vantages. Was the crash evidence of the bursting of a speculative bubble? Do we know enough about the workings of asset markets to hazard an intelligent guess why they dropped so dramatically in such a brief time? Do we know enough to propose regulatory changes that will prevent any such occurrence in the future, or do we want to even if we can? We think that the articles and commentaries contained in this volume provide significant insight to inform and to answer such questions. The article by Behzad Diba surveys existing theoretical and empirical research on rational bubbles in asset prices.
International Applications of Productivity and Efficiency Analysis features a complete range of techniques utilized in frontier analysis, including extensions of existing techniques and the development of new techniques. Another feature is that most of the contributions use panel data in a variety of approaches. Finally, the range of empirical applications is at least as great as the range of techniques, and many of the applications are of considerable policy relevance.
This book proposes new methods to build optimal portfolios and to analyze market liquidity and volatility under market microstructure effects, as well as new financial risk measures using parametric and non-parametric techniques. In particular, it investigates the market microstructure of foreign exchange and futures markets.
In this book, different quantitative approaches to the study of electoral systems have been developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization ones. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool to detect inconsistencies or poor performance in actual systems. Applications to concrete settings such as EU, American Congress, regional, and committee voting are discussed.
This highly useful book contains methodology for the analysis of data that arise from multiscale processes. It brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. These methods can handle different amounts of prior knowledge at different scales, as often occurs in practice.
The place in survival analysis now occupied by proportional hazards models and their generalizations is so large that it is no longer conceivable to offer a course on the subject without devoting at least half of the content to this topic alone. This book focuses on the theory and applications of a very broad class of models - proportional hazards and non-proportional hazards models, the former being viewed as a special case of the latter - which underlie modern survival analysis. Researchers and students alike will find that this text differs from most recent works in that it is mostly concerned with methodological issues rather than the analysis itself.
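As a concrete starting point for the class of models this text treats, a proportional hazards fit in R takes only a few lines; this is a minimal sketch using the survival package and its built-in lung data, not an example from the book:

```r
# Minimal sketch (not from the book; uses the 'survival' package and its
# built-in 'lung' data): fit a Cox proportional hazards model and check
# the proportionality assumption via scaled Schoenfeld residuals.
library(survival)

fit <- coxph(Surv(time, status) ~ age + sex, data = lung)
summary(fit)    # estimated log hazard ratios for age and sex
cox.zph(fit)    # test of the proportional hazards assumption
```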
WINNER OF THE 2007 DEGROOT PRIZE. The prominence of finite mixture modelling is greater than ever. Many important statistical topics like clustering data, outlier treatment, or dealing with unobserved heterogeneity involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models. In the more than one hundred years since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools emerged for dealing with these models which combine a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods. It is the first time that the Bayesian perspective of finite mixture modelling is systematically presented in book form. It is argued that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, the author reviews several frequentist techniques, especially for selecting the number of components of a finite mixture model, and discusses some of their shortcomings compared to the Bayesian approach. The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community. This includes not only statisticians, but also biologists, economists, engineers, financial agents, market researchers, medical researchers and any other frequent users of statistical models. This book should help newcomers to the field to understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they could be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading this book. The presentation is rather informal without abandoning mathematical correctness. Previous notions of Bayesian inference and Monte Carlo simulation are useful but not needed.
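To fix ideas about what a two-component normal mixture is before turning to the book's Bayesian MCMC treatment, here is a minimal R sketch on toy data using a hand-written EM step (a frequentist shortcut, deliberately not the book's approach):

```r
# Minimal sketch (toy data; a hand-written EM algorithm, not the book's
# Bayesian MCMC approach): estimate a two-component normal mixture.
set.seed(7)
y <- c(rnorm(300, mean = 0, sd = 1), rnorm(200, mean = 4, sd = 1.5))

p1 <- 0.5; mu <- c(-1, 1); s <- c(1, 1)    # crude starting values
for (iter in 1:200) {
  # E-step: posterior probability that each observation comes from component 1
  d1 <- p1 * dnorm(y, mu[1], s[1])
  d2 <- (1 - p1) * dnorm(y, mu[2], s[2])
  w  <- d1 / (d1 + d2)
  # M-step: update the weight, means and standard deviations
  p1    <- mean(w)
  mu[1] <- sum(w * y) / sum(w)
  mu[2] <- sum((1 - w) * y) / sum(1 - w)
  s[1]  <- sqrt(sum(w * (y - mu[1])^2) / sum(w))
  s[2]  <- sqrt(sum((1 - w) * (y - mu[2])^2) / sum(1 - w))
}
round(c(weight = p1, mu = mu, sd = s), 2)  # roughly 0.6, (0, 4), (1, 1.5)
```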
Written by a world leader in the field and aimed at researchers in applied and engineering sciences, this brilliant text has as its main goal imparting an understanding of the methods so that practitioners can make immediate use of existing algorithms and software, and so that researchers can extend the state of the art and find new applications. It includes algorithms on seeking feasibility and analyzing infeasibility, as well as describing new and surprising applications.
Globalization affects regional economies in a broad spectrum of aspects, from labor market conditions and development policies to climate change. This volume, written by an international cast of eminent regional scientists, provides new tools for analyzing the enormous changes in regional economies due to globalization. It offers timely conceptual refinements for regional analysis.
This book presents models and statistical methods for the analysis of recurrent event data. The authors provide broad, detailed coverage of the major approaches to analysis, while emphasizing the modeling assumptions that they are based on. More general intensity-based models are also considered, as well as simpler models that focus on rate or mean functions. Parametric, nonparametric and semiparametric methodologies are all covered, with procedures for estimation, testing and model checking.
Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation-based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process itself to the modelling of the volatility of the process. In less than 15 years, it has become one of the most successful fields of applied econometric research, with hundreds of published papers. As an alternative to the ARCH modelling of the volatility, Taylor (1986) introduced the stochastic volatility model, whose features are quite similar to the ARCH specification but which involves an unobserved or latent component for the volatility. While being more difficult to estimate than usual GARCH models, stochastic volatility models have found numerous applications in the modelling of volatility and more particularly in the econometric part of option pricing formulas. Although modelling volatility is one of the best known examples of applied financial econometrics, other topics (factor models, present value relationships, term structure models) were also successfully tackled.
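The key idea behind Engle's ARCH specification - that today's conditional variance depends on yesterday's squared shock - is easy to see in simulation; below is a minimal R sketch with parameter values chosen purely for illustration:

```r
# Minimal sketch (illustrative parameter values, not from the book): simulate
# an ARCH(1) process, where the conditional variance h_t depends on the
# previous squared shock.
set.seed(123)
n <- 1000
omega <- 0.1; alpha <- 0.6               # requires alpha < 1 for stationarity
eps <- numeric(n); h <- numeric(n)
h[1]   <- omega / (1 - alpha)            # start at the unconditional variance
eps[1] <- sqrt(h[1]) * rnorm(1)
for (t in 2:n) {
  h[t]   <- omega + alpha * eps[t - 1]^2 # conditional variance
  eps[t] <- sqrt(h[t]) * rnorm(1)        # return with time-varying volatility
}
plot(eps, type = "l", ylab = "return", main = "Volatility clustering in ARCH(1)")
```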
Empirical Studies on Volatility in International Stock Markets describes the existing techniques for the measurement and estimation of volatility in international stock markets with emphasis on the SV model and its empirical application. Eugenie Hol develops various extensions of the SV model, which allow for additional variables in both the mean and the variance equation. In addition, the forecasting performance of SV models is compared not only to that of the well-established GARCH model but also to implied volatility and so-called realised volatility models which are based on intraday volatility measures. The intended readers are financial professionals who seek to obtain more accurate volatility forecasts and wish to gain insight about state-of-the-art volatility modelling techniques and their empirical value, and academic researchers and students who are interested in financial market volatility and want to obtain an updated overview of the various methods available in this area.
A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument - an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this does not mean important) at, say, the 0.05 level, but not the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by the experts, display the same terms and implement the same methodology. In short, there appears to be no visible alternative to the established methodology and no sign of reservations concerning its validity.
The purpose of models is not to fit the data but to sharpen the questions. S. Karlin, 11th R. A. Fisher Memorial Lecture, Royal Society, 20 April 1983. We are proud to offer this volume in honour of the remarkable career of the Father of Spatial Econometrics, Professor Jean Paelinck, presently of the Tinbergen Institute, Rotterdam. Not one to model solely for the sake of modelling, the above quotation nicely captures Professor Paelinck's unceasing quest for the best question for which an answer is needed. His FLEUR model has sharpened many spatial economics and spatial econometrics questions! Jean Paelinck, arguably, is the founder of modern spatial econometrics, penning the seminal introductory monograph on this topic, Spatial Econometrics, with Klaassen in 1979. In the General Address to the Dutch Statistical Association, on May 2, 1974, in Tilburg, "he coined the term [spatial econometrics] to designate a growing body of the regional science literature that dealt primarily with estimation and testing problems encountered in the implementation of multiregional econometric models" (Anselin, 1988, p. 7); he had already introduced this idea in his introductory report to the 1966 Annual Meeting of the Association de Science Régionale de Langue Française.
You may like...
Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,179)
Handbook of Research Methods and… by Nigar Hashimzade, Michael A. Thornton (Hardcover, R7,916)
Spatial Analysis Using Big Data… by Yoshiki Yamagata, Hajime Seya (Paperback, R3,055)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover, R3,326)
Advances in Longitudinal Data Methods in… by Nicholas Tsounis, Aspasia Vlachvei (Hardcover, R7,157)