The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the series:

- T. W. Anderson, Statistical Analysis of Time Series
- T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics
- Emil Artin, Geometric Algebra
- Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences
- George E. P. Box & George C. Tiao, Bayesian Inference in Statistical Analysis
- R. W. Carter, Simple Groups of Lie Type
- William G. Cochran & Gertrude M. Cox, Experimental Designs, Second Edition
- Richard Courant, Differential and Integral Calculus, Volume I
- Richard Courant, Differential and Integral Calculus, Volume II
- Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume I
- Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume II
- D. R. Cox, Planning of Experiments
- Harold M. S. Coxeter, Introduction to Modern Geometry, Second Edition
- Charles W. Curtis & Irving Reiner, Representation Theory of Finite Groups and Associative Algebras
- Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I
- Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II
- Bruno de Finetti, Theory of Probability, Volume 1
- Bruno de Finetti, Theory of Probability, Volume 2
- W. Edwards Deming, Sample Design in Business Research
- Amos de Shalit & Herman Feshbach, Theoretical Nuclear Physics, Volume 1: Nuclear Structure
- J. L. Doob, Stochastic Processes
- Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part One: General Theory
- Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part Two: Spectral Theory; Self Adjoint Operators in Hilbert Space
- Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part Three: Spectral Operators
- Herman Feshbach, Theoretical Nuclear Physics: Nuclear Reactions
- Bernard Friedman, Lectures on Applications-Oriented Mathematics
- Gerald J. Hahn & Samuel S. Shapiro, Statistical Models in Engineering
- Morris H. Hansen, William N. Hurwitz & William G. Madow, Sample Survey Methods and Theory, Volume I: Methods and Applications
- Morris H. Hansen, William N. Hurwitz & William G. Madow, Sample Survey Methods and Theory, Volume II: Theory
- Peter Henrici, Applied and Computational Complex Analysis, Volume 1: Power Series, Integration, Conformal Mapping, Location of Zeros
- Peter Henrici, Applied and Computational Complex Analysis, Volume 2: Special Functions, Integral Transforms, Asymptotics, Continued Fractions
- Peter Henrici, Applied and Computational Complex Analysis, Volume 3: Discrete Fourier Analysis, Cauchy Integrals, Construction of Conformal Maps, Univalent Functions
- Peter Hilton & Yel-Chiang Wu, A Course in Modern Algebra
- Harry Hochstadt, Integral Equations
- Erwin O. Kreyszig, Introductory Functional Analysis with Applications
- William H. Louisell, Quantum Statistical Properties of Radiation
- Ali Hasan Nayfeh, Introduction to Perturbation Techniques
- Emanuel Parzen, Modern Probability Theory and Its Applications
- P. M. Prenter, Splines and Variational Methods
- Walter Rudin, Fourier Analysis on Groups
- C. L. Siegel, Topics in Complex Function Theory, Volume I: Elliptic Functions and Uniformization Theory
- C. L. Siegel, Topics in Complex Function Theory, Volume II: Automorphic Functions and Abelian Integrals
- C. L. Siegel, Topics in Complex Function Theory, Volume III: Abelian Functions and Modular Functions of Several Variables
- J. J. Stoker, Differential Geometry
- J. J. Stoker, Water Waves: The Mathematical Theory with Applications
- J. J. Stoker, Nonlinear Vibrations in Mechanical and Electrical Systems
Modelling trends and cycles in economic time series has a long history, with the use of linear trends and moving averages forming the basic tool kit of economists until the 1970s. Several developments in econometrics then led to an overhaul of the techniques used to extract trends and cycles from time series. Terence Mills introduces these various approaches to allow students and researchers to appreciate the variety of techniques and the considerations that underpin their choice for modelling trends and cycles.
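The older toolkit the blurb mentions is easy to illustrate: a centered moving average extracts a trend, and the residual is read as the cycle. The sketch below is a minimal illustration of that classical idea, not a method from Mills' book; all names are invented for the example.

```python
def centered_ma_trend(x, window):
    """Centered moving-average trend; window must be odd."""
    half = window // 2
    return [sum(x[t - half:t + half + 1]) / window
            for t in range(half, len(x) - half)]

# toy series: linear trend plus an alternating "cycle" component
series = [0.5 * t + (1 if t % 2 == 0 else -1) for t in range(20)]
trend = centered_ma_trend(series, 3)
# the cycle is what remains after removing the estimated trend
cycle = [series[t + 1] - trend[t] for t in range(len(trend))]
```

Note that a centered moving average reproduces a linear trend exactly on the interior of the sample, which is why it served as the basic detrending device for so long.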
This book develops and analyzes dynamic decision models (DDM) with one trajectorial objective according to the methodology of multi-criteria decision making (MCDM). Moreover, DDMs which concomitantly pursue multiple objectives are analyzed, with special emphasis given to hybrid models with scalar and trajectorial objectives as well as models with multiple trajectorial objectives. Introducing the method of distance maximization crucially augments MCDM and proves to be invaluable for DDMs with a nonexistent utopia trajectory or with sustainability as objective. The notions of efficiency and sustainability are formally developed and counterposed by means of the construct of the trajectorial objective, which is presented here, along with its implications, as a natural advance upon the classical scalar objective.
For some seven decades, econometrics has dealt almost exclusively with constructing and applying econometric equation systems, which constitute constraints in econometric optimization models. The second major component, the scalar-valued objective function, has only in recent years attracted more attention, and some progress has been made. This book is devoted to theories, models and methods for constructing scalar-valued objective functions for econometric optimization models, to their applications, and to some related topics, such as historical issues surrounding the pioneering contributions of Ragnar Frisch and Jan Tinbergen.
Empirical measurement of the impacts of active labour market programmes has become a central task of economic researchers. New, improved econometric methods have been developed that will probably influence future empirical work in various other fields of economics as well. This volume contains a selection of original papers from leading experts, among them James J. Heckman, winner of the 2000 Nobel Prize in economics, addressing these econometric issues at the theoretical and empirical level. The theoretical part contains papers on tight bounds of average treatment effects, instrumental variables estimators, impact measurement with multiple programme options, and statistical profiling. The empirical part provides the reader with econometric evaluations of active labour market programmes in Canada, Germany, France, Italy, the Slovak Republic and Sweden.
and Feldman, 1996 or Audretsch and Stephan, 1996) show that unformalized knowledge may play a major role in the innovation of new products. Now if unformalized knowledge is communicated personally, distance will be an important variable in this process, since the intensity of contacts between persons can be expected to be negatively correlated with the distance between them. In the discussion of section 3.3.1 (page 42) we saw that it was this aspect of localization that Marshall had in mind when he was alluding to "local trade secrets." Note that if this spatial dimension of communication between agents exists, it is possible to transfer it to regional aggregates of agents: the closer two regions, the more they will be able to profit from the respective pool of human capital (R&D output etc.) of the other region. This argument gives a spatial interpretation of the literature on endogenous growth. Now if these spillovers have a spatial dimension, then it follows from the discussion in chapter 3 that they will be one driving force in the dynamics of agglomeration. With the model to be developed in this chapter I will investigate the hypothesis that it is these forces of agglomeration (i.e. spatial spillovers of nonrival goods or factors) that are responsible for the inhomogeneous pattern of growth convergence. To analyze this phenomenon, I consider different types of regional aggregates and different distances in the model.
This book, and its companion volume, present a collection of papers by Clive W.J. Granger. His contributions to economics and econometrics, many of them seminal, span more than four decades and touch on all aspects of time series analysis. The papers assembled in this volume explore topics in spectral analysis, seasonality, nonlinearity, methodology, and forecasting. Those in the companion volume investigate themes in causality, integration and cointegration, and long memory. The two volumes contain the original articles as well as an introduction written by the editors.
The book reports experimental studies and a theoretical investigation of non-cooperative bargaining games with joint production. Such games have rarely been studied within laboratory experiments, despite being more general and more natural than bargaining without production. It is shown that equity theory is a good predictor of subjects' behavior. Furthermore, subjects exhibit different equity notions. One chapter addresses problems of statistical data analysis that are specific to experiments. Applying evolutionary game theory within a model of bargaining with production, it is shown theoretically that altruistic preferences, which generate moderate bargaining behavior, can survive the process of evolution.
Generalized method of moments (GMM) estimation of nonlinear systems has two important advantages over conventional maximum likelihood (ML) estimation: GMM estimation usually requires less restrictive distributional assumptions and remains computationally attractive when ML estimation becomes burdensome or even impossible. This book presents an in-depth treatment of the conditional moment approach to GMM estimation of models frequently encountered in applied microeconometrics. It covers both large sample and small sample properties of conditional moment estimators and provides an application to empirical industrial organization. With its comprehensive and up-to-date coverage of the subject which includes topics like bootstrapping and empirical likelihood techniques, the book addresses scientists, graduate students and professionals in applied econometrics.
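The moment-matching idea behind GMM can be sketched in a few lines. The toy below is my own illustration, not the book's conditional moment estimators: for data assumed to be N(θ, 1), two overidentified moment conditions are stacked and θ is chosen to minimize a quadratic form in the sample moments (identity weighting, derivative-free minimization). All names are invented for the example.

```python
import random

random.seed(0)
theta_true = 2.0
xs = [random.gauss(theta_true, 1.0) for _ in range(2000)]

def moment_conditions(theta):
    # overidentified conditions for x ~ N(theta, 1):
    # E[x - theta] = 0 and E[(x - theta)^2 - 1] = 0
    n = len(xs)
    g1 = sum(x - theta for x in xs) / n
    g2 = sum((x - theta) ** 2 - 1.0 for x in xs) / n
    return g1, g2

def gmm_objective(theta):
    # quadratic form g'Wg with the identity weighting matrix
    g1, g2 = moment_conditions(theta)
    return g1 * g1 + g2 * g2

# crude one-dimensional minimization by repeated grid refinement
lo, hi = -10.0, 10.0
for _ in range(25):
    grid = [lo + (hi - lo) * i / 60 for i in range(61)]
    best = min(grid, key=gmm_objective)
    step = (hi - lo) / 60
    lo, hi = best - step, best + step
theta_hat = 0.5 * (lo + hi)
```

Because no likelihood is written down, only the two moment conditions are assumed, which is exactly the sense in which GMM needs weaker distributional assumptions than ML.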
In this book, time use behavior within households is modeled as the outcome of a bargaining process between family members who bargain over household resource allocation and the intrafamily distribution of welfare. In view of trends such as rising female employment along with falling fertility rates and increasing divorce rates, a strategic aspect of female employment is analyzed in a dynamic family bargaining framework. The division of housework between spouses and the observed leisure differential between women and men are investigated within non-cooperative bargaining settings. The models developed are tested empirically using data from the German Socio-Economic Panel and the German Time Budget Survey.
A novel methodology is put forward in this book, which empowers researchers to investigate and identify potential spatial processes among a set of regions. Spatial processes and their underlying functional spatial relationships are commonly observed in the geosciences and related disciplines. Examples are spatially autocorrelated random variables manifesting themselves in distinct global patterns as well as local clusters and hot spots, or spatial interaction leading to stochastic ties among the regions. An example from observational epidemiology demonstrates the flexibility of Moran's approach by analyzing the spatial distribution of cancer data from several perspectives. Recent advances in computing technology, computer algorithms and statistical techniques have made the exploration of global and local spatial patterns by means of Moran's "I" feasible. Moran's "I" is an extremely versatile tool for exploring and analyzing spatial data and testing spatial hypotheses.
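Moran's "I" itself is straightforward to compute: it is a spatially weighted cross-product of deviations from the mean, normalized by the variance. The following is a minimal pure-Python sketch; the variable names and the toy adjacency structure are mine, invented for illustration.

```python
def morans_i(values, weights):
    """Moran's I for a list of n observations and an n x n spatial
    weight matrix. Positive values indicate that similar observations
    cluster in space; negative values indicate alternation."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# toy example: four regions on a line, binary rook adjacency
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
x = [1.0, 1.0, -1.0, -1.0]   # high values cluster at one end
I_stat = morans_i(x, W)      # positive spatial autocorrelation
```

In practice the weight matrix is often row-standardized and the statistic compared against its permutation distribution, but the core computation is no more than this.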
The past twenty years have seen an extraordinary growth in the use of quantitative methods in financial markets. Finance professionals now routinely use sophisticated statistical techniques in portfolio management, proprietary trading, risk management, financial consulting, and securities regulation. This graduate-level textbook is intended for PhD students, advanced MBA students, and industry professionals interested in the econometrics of financial modeling. The book covers the entire spectrum of empirical finance, including: the predictability of asset returns, tests of the Random Walk Hypothesis, the microstructure of securities markets, event analysis, the Capital Asset Pricing Model and the Arbitrage Pricing Theory, the term structure of interest rates, dynamic models of economic equilibrium, and nonlinear financial models such as ARCH, neural networks, statistical fractals, and chaos theory. Each chapter develops statistical techniques within the context of a particular financial application. This exciting new text contains a unique and accessible combination of theory and practice, bringing state-of-the-art statistical techniques to the forefront of financial applications. Each chapter also includes a discussion of recent empirical evidence, for example, the rejection of the Random Walk Hypothesis, as well as problems designed to help readers incorporate what they have read into their own applications.
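One standard test of the Random Walk Hypothesis in this literature is the variance-ratio test: for a random walk, the variance of q-period returns should be roughly q times the one-period variance, so VR(q) should be close to 1. The simulation below is my own minimal sketch of that idea, not the book's implementation, and omits the asymptotic standard errors a real test would use.

```python
import random

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / (len(xs) - 1)

def variance_ratio(prices, q):
    """VR(q): variance of non-overlapping q-period returns divided by
    q times the one-period return variance. Near 1 for a random walk."""
    r1 = [prices[t + 1] - prices[t] for t in range(len(prices) - 1)]
    rq = [prices[t + q] - prices[t] for t in range(0, len(prices) - q, q)]
    return variance(rq) / (q * variance(r1))

# simulate log prices as a pure random walk
random.seed(1)
p = [0.0]
for _ in range(20000):
    p.append(p[-1] + random.gauss(0.0, 1.0))
vr2 = variance_ratio(p, 2)
```

Mean-reverting series push VR(q) below 1 and trending (positively autocorrelated) series push it above 1, which is what makes the statistic informative about return predictability.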
Neural networks have had considerable success in a variety of disciplines, including engineering, control, and financial modelling. However, a major weakness is the lack of established procedures for testing mis-specified models and the statistical significance of the various parameters which have been estimated. This is particularly important in the majority of financial applications, where the data generating processes are predominantly stochastic and only partially deterministic. Based on the latest, most significant developments in estimation theory, model selection and the theory of mis-specified models, this volume develops neural networks into an advanced financial econometrics tool for non-parametric modelling. It provides the required theoretical framework and displays the efficient use of neural networks for modelling complex financial phenomena. Unlike most other books in this area, this one treats neural networks as statistical devices for non-linear, non-parametric regression analysis.
This book deals with omitted variable tests for a multivariate time-series regression model. What are the consequences of testing for the omission of a variable when the model is dynamically misspecified? What is the small sample bias of the omitted variable test when the model dynamics are correctly specified? The answers to these questions are proposed in this book. As an empirical illustration, the analysis is applied to the homogeneity test of a demand system. I particularly thank Professor Dr. Philippe J. Deschamps, who drew my attention to this subject and who made very helpful comments and suggestions. Additionally, I would like to thank Professor Dr. Reiner Wolff for his comments, especially on the chapter dealing with consumer theory. Special thanks go to Maria Jose Redondo, who read this book several times, and for the inspiring discussions with her. I would also like to thank Dr. Ali Vakili (always ready to answer any questions in mathematics), Prof. Dr. Hans Wolfgang Brachinger, Curzio De Gottardi, Peter Mantsch, Dr. Paul-Andre Monney, Dr. Uwe Steinhauser, Leon Stroeks and Dr. Peter Windlin. Frances Angell improved the English of this work. The research for this book was financially supported by the Universite de Fribourg (Switzerland). Finally, I appreciated the support from Springer-Verlag and I thank Dr.
In the mid-eighties Mehra and Prescott showed that the risk premium earned by American stocks cannot reasonably be explained by conventional capital market models. Using time additive utility, the observed risk premium can only be explained by unrealistically high risk aversion parameters. This phenomenon is well known as the equity premium puzzle. Shortly afterwards it was also observed that the risk-free rate is too low relative to the observed risk premium. This essay is the first to analyze these puzzles in the German capital market. It starts with a thorough discussion of the available theoretical models and then goes on to perform various empirical studies on the German capital market. After discussing natural properties of the pricing kernel by which future cash flows are translated into securities prices, various multiperiod equilibrium models are investigated for their implied pricing kernels. The starting point is a representative investor who optimizes his investment and consumption policy over time. One important implication of time additive utility is the identity of relative risk aversion and the inverse intertemporal elasticity of substitution. Since this identity is at odds with reality, the essay goes on to discuss recursive preferences, which violate the expected utility principle but allow one to separate relative risk aversion and intertemporal elasticity of substitution.
The use of computer simulations to study social phenomena has grown rapidly during the last few years. Many social scientists from the fields of economics, sociology, psychology and other disciplines now use computer simulations to study a wide range of social phenomena. The availability of powerful personal computers, the development of multidisciplinary approaches and the use of artificial intelligence models have all contributed to this development. The benefits of using computer simulations in the social sciences are obvious. This holds true for the use of simulation as a tool for theory building and for its implementation as a tool for sensitivity analysis and parameter optimization in application-oriented models. In both, simulation provides powerful tools for the study of complex social systems, especially for dynamic and multi-agent social systems in which mathematical tractability is often impossible. The graphical display of simulation output renders it user-friendly to many social scientists who lack sufficient familiarity with the language of mathematics. The present volume aims to contribute in four directions: (1) To examine theoretical and methodological issues related to the application of simulations in the social sciences. By this we wish to promote the objective of designing a unified, user-friendly simulation toolkit which could be applied to diverse social problems. While no claim is made that this objective has been met, the theoretical issues treated in Part 1 of this volume are a contribution towards this objective.
Many econometric models contain unknown functions as well as finite-dimensional parameters. Examples of such unknown functions are the distribution function of an unobserved random variable or a transformation of an observed variable. Econometric methods for estimating population parameters in the presence of unknown functions are called "semiparametric." During the past 15 years, much research has been carried out on semiparametric econometric models that are relevant to empirical economics. This book synthesizes the results that have been achieved for five important classes of models. The book is aimed at graduate students in econometrics and statistics as well as professionals who are not experts in semiparametric methods. The usefulness of the methods is illustrated with applications that use real data.
This manuscript is about the joint dynamics of stock returns and trading volume. It grew out of my attempt to construct an intertemporal asset pricing model with rational agents which can explain the relation between volume, volatility and persistence of stock returns documented in the empirical literature. Most of the manuscript is taken from my thesis. I wish to express my deep appreciation to Peter Kugler and Benedikt Poetscher, my thesis advisors, for their invaluable guidance and support. I wish to thank Gerhard Orosel and Gerhard Sorger for their encouraging and helpful discussions. Finally, my thanks go to George Tauchen, who has been generous in giving me the benefit of his numerical and computational experience, in providing me with programs and in his encouragement. Contents:
1 Introduction
2 Efficient Stock Markets
2.1 Equilibrium Models of Asset Pricing
2.1.1 The Martingale Model of Stock Prices
2.1.2 Lucas' Consumption Based Asset Pricing Model
2.2 Econometric Tests of the Efficient Market Hypothesis
2.2.1 Autocorrelation Based Tests
2.2.2 Volatility Tests
2.2.3 Time-Varying Expected Returns
3 The Informational Role of Volume
3.1 Standard Grossman-Stiglitz Model
3.2 The No-Trade Result of the BEO Model
3.3 A Model with Nontradable Asset
4 Volume and Volatility of Stock Returns
4.1 Empirical and Numerical Results
...
This volume contains revised versions of 43 papers presented during the 21st Annual Conference of the Gesellschaft für Klassifikation (GfKl), the German Classification Society. The conference took place at the University of Potsdam (Germany) in March 1997; the local organizer was Prof. I. Balderjahn, Chair of Business Administration and Marketing at Potsdam. The scientific program of the conference included 103 plenary and contributed papers, software and book presentations, as well as special (tutorial) courses. Researchers and practitioners interested in data analysis and clustering methods, information sciences and database techniques, and in the main topic of the conference, data highways and their importance for classification and data analysis, had the opportunity to discuss recent developments and to establish cross-disciplinary cooperation in these fields. The conference owed much to its sponsors - Berliner Volksbank - Daimler Benz AG - Deutsche Telekom AG Direktion Potsdam - Dresdner Bank AG Filiale Potsdam - Henkel KGaA - Landeszentralbank in Berlin und Brandenburg - Ministerium für Wissenschaft, Forschung und Kultur des Landes Brandenburg - Sci con GmbH - Siemens AG - Universität Potsdam - Unternehmensgruppe Roland Ernst - who helped in many ways. Their generous support is gratefully acknowledged. In the present proceedings volume, selected and peer-reviewed papers are presented in six chapters as follows.
One aim of this book is to examine the causes of fluctuations in the mark/dollar, pound/dollar, and yen/dollar real exchange rates over the period 1972-1994, using quarterly data, in order to determine appropriate policy recommendations for reducing these movements. A second aim is to investigate whether, and to what extent, the three real exchange rates are covariance-stationary. These aims are pursued by using a two-country overshooting model for real exchange rates with real government expenditure, and by applying Johansen's maximum likelihood cointegration procedure and a factor model of Gonzalo and Granger to this model.
This study was written while I was a doctoral student in the Graduiertenkolleg Finanz- und Gütermärkte at the University of Mannheim; it was accepted as a doctoral dissertation in February 1997. I am indebted to my advisors, Professors Axel Börsch-Supan and Martin Hellwig at Mannheim and John Rust at Madison, for their encouragement and for many helpful discussions and comments. At various stages, I benefited from comments on portions of the manuscript by, and from discussions with, Thomas Astebro, Charles Calomiris, Timothy Dunne, Frank Gerhard, Annette Kohler, Jens Köke, Stephan Monissen, Gordon Phillips, Winfried Pohlmeier, Kenneth Troske, Wolfram Wissler and seminar participants at Columbia Business School, the University of Mannheim, the University of Tübingen, the University of Wisconsin at Madison, Yale University, the ENTER Jamborees at University College London, January 1995, and at Tilburg University, January 1997, at a meeting of the DFG-Schwerpunktprogramm Industrieökonomik und Inputmärkte, Heidelberg, November 1996, and at the annual meeting of the Verein für Socialpolitik, Bern, September 1997. Silke Januszewski and Melanie Lührmann provided dedicated assistance during the preparation of the final version of the manuscript.
This book comprises the articles of the 6th Econometric Workshop in Karlsruhe, Germany. In the first part approaches from traditional econometrics and innovative methods from machine learning such as neural nets are applied to financial issues. Neural Networks are successfully applied to different areas such as debtor analysis, forecasting and corporate finance. In the second part various aspects from Value-at-Risk are discussed. The proceedings describe the legal framework, review the basics and discuss new approaches such as shortfall measures and credit risk.
Considerable work has been done on chaotic dynamics in the field of economic growth and dynamic macroeconomic models during the last two decades. This book considers numerous new developments: introduction of infrastructure in growth models, heterogeneity of agents, hysteresis systems, overlapping models with "pay-as-you-go" systems, Keynesian approaches with finance considerations, interactions between relaxation cycles and chaotic dynamics, methodological issues, long memory processes and fractals... This volume of contributions shows the relevance and fruitfulness of non-linear analysis for the explanation of complex dynamics in economic systems.
Since a multi-level policy-making system exists in market economies, the choices of decision makers at different levels should be considered explicitly in the formulation of sectoral plans and policies. To support this hypothesis, a theoretical energy planning approach is developed within the framework of the theory of economic policy planning, policy systems analysis and multi-level programming. The Parametric Programming Search Algorithm has been developed. On the basis of this theoretical model, an Australian Energy Policy System Optimisation Model (AEPSOM) has been developed and is used to formulate an Australian multi-level energy plan.
The subject is the description of univariate and multivariate business cycle stylized facts. A spectral analysis method novel in the analysis of economic time series, Maximum Entropy spectral estimation, is described and utilized. The method turns out to be superior to widely used time domain methods and to the "classical" spectral estimate, the periodogram. The results for eleven OECD countries confirm and extend the basic set of stylized facts of traditional business cycle theory. The changing characteristics of the business cycle are analyzed by comparing the cyclical structure for the postwar and the prewar period. The results show that the business cycle is mainly due to investment fluctuations.
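Maximum Entropy spectral estimation amounts in practice to fitting an autoregressive model, commonly via Burg's recursion, and reading the spectrum off the fitted coefficients. The sketch below is my own minimal version with invented function names, not the book's procedure: it fits an AR(1) to simulated data and evaluates the implied power spectrum.

```python
import math
import random

def burg_ar(x, order):
    """Fit an AR(order) model by Burg's recursion.
    Returns (a, e): coefficients of 1 + a[0]z^-1 + ... + a[p-1]z^-p
    and the residual (prediction error) power."""
    n = len(x)
    f = list(x)                      # forward prediction errors
    b = list(x)                      # backward prediction errors
    a = []
    e = sum(v * v for v in x) / n    # zeroth-order error power
    for m in range(1, order + 1):
        num = -2.0 * sum(f[t] * b[t - 1] for t in range(m, n))
        den = sum(f[t] ** 2 + b[t - 1] ** 2 for t in range(m, n))
        k = num / den                # reflection coefficient
        a = [a[i] + k * a[m - 2 - i] for i in range(m - 1)] + [k]
        f_new, b_new = list(f), list(b)
        for t in range(m, n):
            f_new[t] = f[t] + k * b[t - 1]
            b_new[t] = b[t - 1] + k * f[t]
        f, b = f_new, b_new
        e *= 1.0 - k * k
    return a, e

def ar_spectrum(a, e, freq):
    """Maximum-entropy PSD at normalized frequency freq in [0, 0.5]."""
    re = 1.0 + sum(c * math.cos(2 * math.pi * freq * (i + 1))
                   for i, c in enumerate(a))
    im = sum(c * math.sin(2 * math.pi * freq * (i + 1))
             for i, c in enumerate(a))
    return e / (re * re + im * im)

# simulate a persistent AR(1) process, x[t] = 0.8 x[t-1] + noise
random.seed(7)
x = [0.0]
for _ in range(2000):
    x.append(0.8 * x[-1] + random.gauss(0.0, 1.0))
a, e = burg_ar(x[1:], 1)   # a[0] should be close to -0.8
```

Because the AR fit yields a smooth rational spectrum from short samples, it avoids the high variance of the raw periodogram, which is the sense in which the method is claimed to be superior for short economic time series.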