The contents of this volume are drawn from the seventh International Symposium in Economic Theory and Econometrics, and represent recent advances in the development of concepts and methods in political economy. Contributors include leading practitioners working on formal, applied, and historical approaches to the subject. The collection will interest scholars in the fields of political science and political sociology no less than economics. Section 1 investigates models of voting and representation, section 2 explores dimensions of political institutions, section 3 covers strategic aspects of competition, and section 4 examines key aspects of government behavior.
The book reports experimental studies and a theoretical investigation of non-cooperative bargaining games with joint production. Such games have rarely been studied in laboratory experiments despite being more general and more natural than bargaining without production. It is shown that equity theory is a good predictor of subjects' behavior. Furthermore, subjects exhibit different equity notions. One chapter addresses problems of statistical data analysis that are specific to experiments. Applying evolutionary game theory within a model of bargaining with production, it is shown theoretically that altruistic preferences, which generate moderate bargaining behavior, can survive the process of evolution.
Generalized method of moments (GMM) estimation of nonlinear systems has two important advantages over conventional maximum likelihood (ML) estimation: GMM estimation usually requires less restrictive distributional assumptions and remains computationally attractive when ML estimation becomes burdensome or even impossible. This book presents an in-depth treatment of the conditional moment approach to GMM estimation of models frequently encountered in applied microeconometrics. It covers both large sample and small sample properties of conditional moment estimators and provides an application to empirical industrial organization. With its comprehensive and up-to-date coverage of the subject which includes topics like bootstrapping and empirical likelihood techniques, the book addresses scientists, graduate students and professionals in applied econometrics.
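As a toy illustration of the moment-based estimation idea behind GMM (not drawn from the book; the Poisson model, sample size, identity weighting matrix, and grid search are illustrative assumptions), one can stack two moment conditions for a Poisson rate and minimize the quadratic form of their sample analogues:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.poisson(3.0, 5000)  # synthetic data with true rate 3

def gbar(lam):
    # Sample moment conditions: E[X] - lam and E[X^2] - lam - lam^2
    return np.array([x.mean() - lam, (x ** 2).mean() - lam - lam ** 2])

def objective(lam, w=np.eye(2)):
    g = gbar(lam)
    return g @ w @ g  # quadratic form g' W g

# Crude grid search; a real application would use a numerical optimizer
# and an efficient weighting matrix in a second step.
grid = np.linspace(0.5, 8.0, 3001)
lam_hat = grid[np.argmin([objective(l) for l in grid])]
```

Because the model is over-identified (two moments, one parameter), neither moment is fitted exactly; the estimate balances both, which is the essence of the GMM objective.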
In this book, time use behavior within households is modeled as the outcome of a bargaining process between family members who bargain over household resource allocation and the intrafamily distribution of welfare. In view of trends such as rising female employment along with falling fertility rates and increasing divorce rates, a strategic aspect of female employment is analyzed in a dynamic family bargaining framework. The division of housework between spouses and the observed leisure differential between women and men are investigated within non-cooperative bargaining settings. The models developed are tested empirically using data from the German Socio-Economic Panel and the German Time Budget Survey.
A novel methodology is put forward in this book, which empowers researchers to investigate and identify potential spatial processes among a set of regions. Spatial processes and their underlying functional spatial relationships are commonly observed in the geosciences and related disciplines. Examples are spatially autocorrelated random variables manifesting themselves in distinct global patterns as well as local clusters and hot spots, or spatial interaction leading to stochastic ties among the regions. An example from observational epidemiology demonstrates the flexibility of Moran's approach by analyzing the spatial distribution of cancer data from several perspectives. Recent advances in computing technology, computer algorithms and statistical techniques have made the exploration of global and local spatial patterns by means of Moran's "I" feasible. Moran's "I" is an extremely versatile tool for exploring and analyzing spatial data and testing spatial hypotheses.
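For readers unfamiliar with the statistic, global Moran's "I" is the spatially weighted cross-product of mean deviations, scaled by the total weight; a minimal sketch (the four-region rook-contiguity weight matrix and the toy values are hypothetical, not from the book):

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x under spatial weight matrix w."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()                  # deviations from the mean
    num = (w * np.outer(z, z)).sum()  # spatially weighted cross-products
    den = (z ** 2).sum()
    return len(x) / w.sum() * num / den

# Four regions on a line; neighbours share a border (rook contiguity).
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

clustered = morans_i([1, 1, 5, 5], w)    # similar values adjacent: positive I
alternating = morans_i([1, 5, 1, 5], w)  # dissimilar neighbours: negative I
```

Positive values of "I" indicate spatial clustering, negative values indicate a checkerboard-like pattern, and values near the expectation under independence indicate no spatial structure.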
Two-sided matching provides a model of search processes such as those between firms and workers in labor markets or between buyers and sellers in auctions. This book gives a comprehensive account of recent results concerning the game-theoretic analysis of two-sided matching. The focus of the book is on the stability of outcomes, on the incentives that different rules of organization give to agents, and on the constraints that these incentives impose on the ways such markets can be organized. The results for this wide range of related models and matching situations help clarify which conclusions depend on particular modeling assumptions and market conditions, and which are robust over a wide range of conditions.
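The classic constructive result in this literature is the Gale-Shapley deferred acceptance algorithm, which always terminates in a stable matching; a minimal sketch (the worker and firm names and preference lists are illustrative):

```python
def deferred_acceptance(proposer_prefs, receiver_prefs):
    """Proposer-proposing deferred acceptance; returns a stable matching."""
    # Precompute each receiver's ranking of proposers for O(1) comparisons.
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in receiver_prefs.items()}
    match = {}                               # receiver -> current proposer
    next_choice = {p: 0 for p in proposer_prefs}
    free = list(proposer_prefs)
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # p's best not-yet-tried option
        next_choice[p] += 1
        if r not in match:
            match[r] = p                       # r tentatively accepts
        elif rank[r][p] < rank[r][match[r]]:   # r prefers the new proposer
            free.append(match[r])              # previous holder becomes free
            match[r] = p
        else:
            free.append(p)                     # r rejects p
    return {p: r for r, p in match.items()}

workers = {'w1': ['f1', 'f2'], 'w2': ['f1', 'f2']}
firms = {'f1': ['w1', 'w2'], 'f2': ['w1', 'w2']}
print(deferred_acceptance(workers, firms))  # {'w1': 'f1', 'w2': 'f2'}
```

The proposer-proposing version yields the stable matching most favorable to proposers, which is exactly the kind of incentive asymmetry the book's analysis of market rules turns on.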
Applied Nonparametric Regression brings together in one place the techniques for regression curve smoothing involving more than one variable. The computer and the development of interactive graphics programs has made curve estimation popular. This volume focuses on the applications and practical problems of two central aspects of curve smoothing: the choice of smoothing parameters and the construction of confidence bounds. The methods covered in this text have numerous applications in many areas using statistical analysis. Examples are drawn from economics--such as the estimation of Engel curves--as well as other disciplines including medicine and engineering. For practical applications of these methods a computing environment for exploratory Regression--XploRe--is described.
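One of the simplest such smoothers, and one where the choice of smoothing parameter matters visibly, is the Nadaraya-Watson kernel estimator; a sketch under assumed data (the sine signal, noise level, and bandwidth are illustrative, not from the book):

```python
import numpy as np

def nw_smooth(x, y, grid, h):
    """Nadaraya-Watson regression: Gaussian-kernel weighted average of y."""
    x, y, grid = map(np.asarray, (x, y, grid))
    k = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)  # kernel weights
    return (k * y).sum(axis=1) / k.sum(axis=1)                  # local average

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)  # noisy curve
fit = nw_smooth(x, y, x, h=0.05)
```

Shrinking the bandwidth h toward zero reproduces the noisy data (undersmoothing); growing it flattens the fit toward the global mean (oversmoothing), which is why data-driven bandwidth selection is a central topic of the book.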
Neural networks have had considerable success in a variety of disciplines including engineering, control, and financial modelling. However, a major weakness is the lack of established procedures for testing mis-specified models and the statistical significance of the various parameters that have been estimated. This is particularly important in the majority of financial applications, where the data-generating processes are predominantly stochastic and only partially deterministic. Based on the latest, most significant developments in estimation theory, model selection and the theory of mis-specified models, this volume develops neural networks into an advanced financial econometrics tool for non-parametric modelling. It provides the theoretical framework required, and displays the efficient use of neural networks for modelling complex financial phenomena. Unlike most other books in this area, this one treats neural networks as statistical devices for non-linear, non-parametric regression analysis.
This book deals with omitted variable tests for a multivariate time-series regression model. What are the consequences of testing for the omission of a variable when the model is dynamically misspecified? What is the small sample bias of the omitted variable test when the model dynamics are correctly specified? The answers to these questions are proposed in this book. As an empirical illustration, the analysis is applied to the homogeneity test of a demand system. I particularly thank Professor Dr. Philippe J. Deschamps, who drew my attention to this subject and who made very helpful comments and suggestions. Additionally, I would like to thank Professor Dr. Reiner Wolff for his comments, especially on the chapter dealing with consumer theory. Special thanks go to Maria Jose Redondo, who read this book several times, and for the inspiring discussions with her. I would also like to thank Dr. Ali Vakili (always ready to answer any questions in mathematics), Prof. Dr. Hans Wolfgang Brachinger, Curzio De Gottardi, Peter Mantsch, Dr. Paul-Andre Monney, Dr. Uwe Steinhauser, Leon Stroeks and Dr. Peter Windlin. Frances Angell improved the English of this work. The research for this book was financially supported by the Universite de Fribourg (Switzerland). Finally, I appreciated the support from Springer-Verlag and I thank Dr.
This collection of papers delivered at the fifth International Symposium in Economic Theory and Econometrics in 1988 is devoted to recent advances in the estimation and testing of models that impose relatively weak restrictions on the stochastic behavior of data. Particularly in highly nonlinear models, empirical results are very sensitive to the choice of the parametric form of the distribution of the observable variables, and often nonparametric and semiparametric models are a preferable alternative. Methods and applications that do not require strong parametric assumptions for their validity, that are based on kernels and on series expansions, and methods for independent and dependent observations, are investigated and developed in these essays by renowned econometricians.
In the mid-eighties Mehra and Prescott showed that the risk premium earned by American stocks cannot reasonably be explained by conventional capital market models. Using time-additive utility, the observed risk premium can only be explained by unrealistically high risk aversion parameters. This phenomenon is well known as the equity premium puzzle. Shortly afterwards it was also observed that the risk-free rate is too low relative to the observed risk premium. This essay is the first to analyze these puzzles in the German capital market. It starts with a thorough discussion of the available theoretical models and then goes on to perform various empirical studies on the German capital market. After discussing natural properties of the pricing kernel, by which future cash flows are translated into securities prices, various multiperiod equilibrium models are investigated for their implied pricing kernels. The starting point is a representative investor who optimizes his investment and consumption policy over time. One important implication of time-additive utility is the identity of relative risk aversion and the inverse intertemporal elasticity of substitution. Since this identity is at odds with reality, the essay goes on to discuss recursive preferences, which violate the expected utility principle but allow relative risk aversion and intertemporal elasticity of substitution to be separated.
This book provides a synthesis of concepts and materials that ordinarily appear separately in time series and econometrics literature, presenting a comprehensive review of both theoretical and applied concepts. Perhaps the most novel feature of the book is its use of Kalman filtering together with econometric and time series methodology. From a technical point of view, state space models and the Kalman filter play a key role in the statistical treatment of structural time series models. This technique was originally developed in control engineering but is becoming increasingly important in economics and operations research. The book is primarily concerned with modeling economic and social time series and with addressing the special problems that the treatment of such series pose.
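The simplest structural time series model treated this way is the local level model; a minimal Kalman filter sketch (the variance parameters q and r, the diffuse prior, and the toy series with a level shift are illustrative assumptions, not from the book):

```python
import numpy as np

def local_level_filter(y, q, r, a0=0.0, p0=1e6):
    """Kalman filter for y_t = a_t + e_t, a_t = a_{t-1} + w_t,
    with Var(w_t) = q, Var(e_t) = r, and a diffuse prior on a_0."""
    a, p = a0, p0
    filtered = []
    for obs in y:
        p = p + q                 # predict: state variance grows by q
        k = p / (p + r)           # Kalman gain
        a = a + k * (obs - a)     # update with the prediction error
        p = (1 - k) * p           # posterior state variance
        filtered.append(a)
    return np.array(filtered)

y = np.array([1.0, 1.2, 0.9, 1.1, 5.0, 5.2, 4.9])  # level shift mid-series
level = local_level_filter(y, q=0.1, r=0.5)
```

The recursion makes the filtering logic visible: the gain k trades off trust in the prediction against trust in the new observation, so the filtered level adapts gradually to the shift rather than jumping with the data.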
The use of computer simulations to study social phenomena has grown rapidly during the last few years. Many social scientists from the fields of economics, sociology, psychology and other disciplines now use computer simulations to study a wide range of social phenomena. The availability of powerful personal computers, the development of multidisciplinary approaches and the use of artificial intelligence models have all contributed to this development. The benefits of using computer simulations in the social sciences are obvious. This holds true for the use of simulations as tools for theory building and for their use as tools for sensitivity analysis and parameter optimization in application-oriented models. In both cases, simulation provides powerful tools for the study of complex social systems, especially for dynamic and multi-agent social systems that are often mathematically intractable. The graphical display of simulation output renders it user friendly to many social scientists who lack sufficient familiarity with the language of mathematics. The present volume aims to contribute in four directions: (1) To examine theoretical and methodological issues related to the application of simulations in the social sciences. By this we wish to promote the objective of designing a unified, user-friendly simulation toolkit which could be applied to diverse social problems. While no claim is made that this objective has been met, the theoretical issues treated in Part 1 of this volume are a contribution towards this objective.
Many econometric models contain unknown functions as well as finite-dimensional parameters. Examples of such unknown functions are the distribution function of an unobserved random variable or a transformation of an observed variable. Econometric methods for estimating population parameters in the presence of unknown functions are called "semiparametric." During the past 15 years, much research has been carried out on semiparametric econometric models that are relevant to empirical economics. This book synthesizes the results that have been achieved for five important classes of models. The book is aimed at graduate students in econometrics and statistics as well as professionals who are not experts in semiparametric methods. The usefulness of the methods is illustrated with applications that use real data.
As most econometricians will readily agree, the data used in applied econometrics seldom provide accurate measurements for the pertinent theory's variables. Here, Bernt Stigum offers the first systematic and theoretically sound way of accounting for such inaccuracies. He and a distinguished group of contributors bridge econometrics and the philosophy of economics--two topics that seem worlds apart. They ask: How is a science of economics possible? The answer is elusive. Economic theory seems to be about abstract ideas or, it might be said, about toys in a toy community. How can a researcher with such tools learn anything about the social reality in which he or she lives? This book shows that an econometrician with the proper understanding of economic theory and the right kind of questions can gain knowledge about characteristic features of the social world. It addresses varied topics in both classical and Bayesian econometrics, offering ample evidence that its answer to the fundamental question is sound. The first book to comprehensively explore economic theory and econometrics simultaneously, Econometrics and the Philosophy of Economics represents an authoritative account of contemporary economic methodology. About a third of the chapters are authored or coauthored by Heather Anderson, Erik Biorn, Christophe Bontemps, Jeffrey A. Dubin, Harald E. Goldstein, Clive W.J. Granger, David F. Hendry, Herman Ruge-Jervell, Dale W. Jorgenson, Hans-Martin Krolzig, Nils Lid Hjort, Daniel L. McFadden, Grayham E. Mizon, Tore Schweder, Geir Storvik, and Herman K. van Dijk.
This manuscript is about the joint dynamics of stock returns and trading volume. It grew out of my attempt to construct an intertemporal asset pricing model with rational agents which can explain the relation between volume, volatility and persistence of stock returns documented in the empirical literature. Most of the manuscript is taken from my thesis. I wish to express my deep appreciation to Peter Kugler and Benedikt Poetscher, my advisors of the thesis, for their invaluable guidance and support. I wish to thank Gerhard Orosel and Gerhard Sorger for their encouraging and helpful discussions. Finally, my thanks go to George Tauchen, who has been generous in giving me the benefit of his numerical and computational experience, in providing me with programs and in his encouragement. Contents: 1 Introduction; 2 Efficient Stock Markets: 2.1 Equilibrium Models of Asset Pricing (2.1.1 The Martingale Model of Stock Prices; 2.1.2 Lucas' Consumption-Based Asset Pricing Model); 2.2 Econometric Tests of the Efficient Market Hypothesis (2.2.1 Autocorrelation-Based Tests; 2.2.2 Volatility Tests; 2.2.3 Time-Varying Expected Returns); 3 The Informational Role of Volume: 3.1 Standard Grossman-Stiglitz Model; 3.2 The No-Trade Result of the BEO Model; 3.3 A Model with a Nontradable Asset; 4 Volume and Volatility of Stock Returns: 4.1 Empirical and Numerical Results...
This volume contains revised versions of 43 papers presented during the 21st Annual Conference of the Gesellschaft für Klassifikation (GfKl), the German Classification Society. The conference took place at the University of Potsdam (Germany) in March 1997; the local organizer was Prof. I. Balderjahn, Chair of Business Administration and Marketing at Potsdam. The scientific program of the conference included 103 plenary and contributed papers, software and book presentations as well as special (tutorial) courses. Researchers and practitioners interested in data analysis and clustering methods, information sciences and database techniques, and in the main topic of the conference, data highways and their importance for classification and data analysis, had the opportunity to discuss recent developments and to establish cross-disciplinary cooperation in these fields. The conference owed much to its sponsors - Berliner Volksbank - Daimler Benz AG - Deutsche Telekom AG Direktion Potsdam - Dresdner Bank AG Filiale Potsdam - Henkel KGaA - Landeszentralbank in Berlin und Brandenburg - Ministerium für Wissenschaft, Forschung und Kultur des Landes Brandenburg - Scicon GmbH - Siemens AG - Universität Potsdam - Unternehmensgruppe Roland Ernst - who helped in many ways. Their generous support is gratefully acknowledged. In the present proceedings volume, selected and peer-reviewed papers are presented in six chapters as follows.
Recent advances in establishing the nature and scope of estimators in econometrics have shed more light on the importance of instrumental variables. In this book, the authors argue that such methods may be regarded as a strong organizing principle for a wide variety of estimation and hypothesis testing problems in econometrics and statistics. In support of this claim they present and develop the methodology of instrumental variables in its most general and explanatory form. They show, for instance, that techniques commonly used to handle simultaneity and related problems can be reduced to one of two generic types of instrumental variables estimators, allowing them to explore further the conditions under which different proposed estimators are efficient.
One aim of this book is to examine the causes of fluctuations in the mark/dollar, pound/dollar, and yen/dollar real exchange rates over the period 1972-1994, using quarterly data, in order to determine appropriate policy recommendations for reducing these movements. A second aim is to investigate whether, and to what extent, the three real exchange rates are covariance-stationary. These aims are pursued by using a two-country overshooting model for real exchange rates with real government expenditure and by applying Johansen's maximum likelihood cointegration procedure and the factor model of Gonzalo and Granger to this model.
This study was written while I was a doctoral student in the Graduiertenkolleg Finanz- und Gütermärkte at the University of Mannheim; it was accepted as a doctoral dissertation in February 1997. I am indebted to my advisors, Professors Axel Börsch-Supan and Martin Hellwig at Mannheim and John Rust at Madison, for their encouragement and for many helpful discussions and comments. At various stages, I benefited from comments on portions of the manuscript by, and from discussions with, Thomas Astebro, Charles Calomiris, Timothy Dunne, Frank Gerhard, Annette Kohler, Jens Koke, Stephan Monissen, Gordon Phillips, Winfried Pohlmeier, Kenneth Troske, Wolfram Wissler and seminar participants at Columbia Business School, the University of Mannheim, the University of Tübingen, the University of Wisconsin at Madison, Yale University, the ENTER Jamborees at University College London, January 1995, and at Tilburg University, January 1997, at a meeting of the DFG-Schwerpunktprogramm Industrieökonomik und Inputmärkte, Heidelberg, November 1996, and at the annual meeting of the Verein für Socialpolitik, Bern, September 1997. Silke Januszewski and Melanie Lührmann provided dedicated assistance during the preparation of the final version of the manuscript.
This book comprises the articles of the 6th Econometric Workshop in Karlsruhe, Germany. In the first part, approaches from traditional econometrics and innovative methods from machine learning, such as neural nets, are applied to financial issues. Neural networks are successfully applied to different areas such as debtor analysis, forecasting and corporate finance. In the second part, various aspects of Value-at-Risk are discussed. The proceedings describe the legal framework, review the basics and discuss new approaches such as shortfall measures and credit risk.
Considerable work has been done on chaotic dynamics in the field of economic growth and dynamic macroeconomic models during the last two decades. This book considers numerous new developments: introduction of infrastructure in growth models, heterogeneity of agents, hysteresis systems, overlapping models with "pay-as-you-go" systems, Keynesian approaches with finance considerations, interactions between relaxation cycles and chaotic dynamics, methodological issues, long memory processes and fractals... A volume of contributions which shows the relevance and fruitfulness of non-linear analysis for the explanation of complex dynamics in economic systems.
Since a multi-level policy-making system exists in market economies, the choices of decision makers at different levels should be considered explicitly in the formulation of sectoral plans and policies. To support this hypothesis, a theoretical energy planning approach is developed within the framework of the theory of economic policy planning, policy systems analysis and multi-level programming, and a Parametric Programming Search Algorithm is developed. On the basis of this theoretical model, an Australian Energy Policy System Optimisation Model (AEPSOM) has been developed and is used to formulate an Australian multi-level energy plan.
The advent of electronic computing permits the empirical analysis of economic models of far greater subtlety and rigour than before, when many interesting ideas were not followed up because the calculations involved made this impracticable. The estimation and testing of these more intricate models is usually based on the method of Maximum Likelihood, which is a well-established branch of mathematical statistics. Its use in econometrics has led to the development of a number of special techniques; the specific conditions of econometric research moreover demand certain changes in the interpretation of the basic argument. This book is a self-contained introduction to this field. It consists of three parts. The first deals with general features of Maximum Likelihood methods; the second with linear and nonlinear regression; and the third with discrete choice and related micro-economic models. Readers should already be familiar with elementary statistical theory, with applied econometric research papers, or with the literature on the mathematical basis of Maximum Likelihood theory. They can also try their hand at some advanced econometric research of their own.
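As a compact illustration of the computational side of Maximum Likelihood (the exponential model, the sample, and the Newton-Raphson iteration are illustrative assumptions, not drawn from the book), the score and Hessian of an exponential log-likelihood yield an iteration that converges to the closed-form MLE, the reciprocal of the sample mean:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=1000)  # synthetic data, true rate 0.5

# Exponential log-likelihood: l(r) = n*log(r) - r*sum(x)
# Score: n/r - sum(x); Hessian: -n/r^2; closed-form MLE: n/sum(x)
rate = 0.1  # start below the MLE so the iteration stays in the stable region
for _ in range(25):
    score = x.size / rate - x.sum()
    hess = -x.size / rate ** 2
    rate -= score / hess  # Newton-Raphson step
```

For this model the maximizer is available in closed form, so the iteration serves only to show the generic recipe; in the nonlinear models the book treats, the same score-and-Hessian machinery is the practical route to the estimates.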
The most common mode of analysis in economic theory is to assume equilibrium. Yet, without a proper theory of how economies behave in disequilibrium, there is no foundation for such a practice. The necessary step in proposing a foundation is the formulation of a theory of stability, and in this 1984 book, Professor Fisher is primarily concerned with this subject, although disequilibrium behavior itself is analyzed. The author first undertakes a review of the existing literature on the stability of general equilibrium. He then proposes a more satisfactory general model in which agents realize their state of disequilibrium and act on arbitrage opportunities. The interrelated topics of the role of money, the nature of quantity constraints, and the optimal behaviour of arbitraging agents are extensively treated.