A novel methodology is put forward in this book, which empowers researchers to investigate and identify potential spatial processes among a set of regions. Spatial processes and their underlying functional spatial relationships are commonly observed in the geosciences and related disciplines. Examples are spatially autocorrelated random variables manifesting themselves in distinct global patterns as well as local clusters and hot spots, or spatial interaction leading to stochastic ties among the regions. An example from observational epidemiology demonstrates the flexibility of Moran's approach by analyzing the spatial distribution of cancer data from several perspectives. Recent advances in computing technology, computer algorithms, and statistical techniques have made the exploration of global and local spatial patterns by means of Moran's "I" feasible. Moran's "I" is an extremely versatile tool for exploring and analyzing spatial data and testing spatial hypotheses.
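As an illustrative sketch (not taken from the book), the global Moran's "I" statistic described above can be computed directly from its definition; the six-region chain and the toy value patterns below are invented for demonstration:

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I: I = (n / S0) * (z' W z) / (z' z),
    where z = x - mean(x) and S0 is the sum of all spatial weights."""
    z = np.asarray(x, dtype=float) - np.mean(x)
    return len(z) / w.sum() * (z @ w @ z) / (z @ z)

# Six regions on a line; each region is weighted 1 with its immediate neighbours.
n = 6
w = np.zeros((n, n))
for i in range(n - 1):
    w[i, i + 1] = w[i + 1, i] = 1.0

i_clustered = morans_i([1, 1, 1, -1, -1, -1], w)    # spatially clustered values
i_alternating = morans_i([1, -1, 1, -1, 1, -1], w)  # checkerboard pattern
```

Positive values of I indicate clustering of similar values among neighbours, negative values indicate alternation; here the clustered pattern yields a positive I and the checkerboard a strongly negative one.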
An accessible treatment of Monte Carlo methods, techniques, and applications in the field of finance and economics Providing readers with an in-depth and comprehensive guide, the Handbook in Monte Carlo Simulation: Applications in Financial Engineering, Risk Management, and Economics presents a timely account of the applications of Monte Carlo methods in financial engineering and economics. Written by a leading international expert in the field, the handbook illustrates the challenges confronting present-day financial practitioners and provides various applications of Monte Carlo techniques to address these issues. The book is organized into five parts: introduction and motivation; input analysis, modeling, and estimation; random variate and sample path generation; output analysis and variance reduction; and applications ranging from option pricing and risk management to optimization. The Handbook in Monte Carlo Simulation features: * An introductory section for basic material on stochastic modeling and estimation aimed at readers who may need a summary or review of the essentials * Carefully crafted examples in order to spot potential pitfalls and drawbacks of each approach * An accessible treatment of advanced topics such as low-discrepancy sequences, stochastic optimization, dynamic programming, risk measures, and Markov chain Monte Carlo methods * Numerous pieces of R code used to illustrate fundamental ideas in concrete terms and encourage experimentation The Handbook in Monte Carlo Simulation: Applications in Financial Engineering, Risk Management, and Economics is a complete reference for practitioners in the fields of finance, business, applied statistics, econometrics, and engineering, as well as a supplement for MBA and graduate-level courses on Monte Carlo methods and simulation.
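The handbook's own examples are in R and are not reproduced here. As a minimal Python sketch of the option-pricing application it mentions, the following prices a European call by simulating terminal prices under geometric Brownian motion and compares the estimate with the closed-form Black-Scholes value; all parameter values are invented for illustration:

```python
import math
import numpy as np

def bs_call(s0, k, r, sigma, t):
    """Closed-form Black-Scholes price of a European call, used as a benchmark."""
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda v: 0.5 * (1.0 + math.erf(v / math.sqrt(2.0)))  # standard normal CDF
    return s0 * cdf(d1) - k * math.exp(-r * t) * cdf(d2)

def mc_call(s0, k, r, sigma, t, n_paths, rng):
    """Monte Carlo estimate: simulate terminal prices, discount the mean payoff."""
    z = rng.standard_normal(n_paths)
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
    return math.exp(-r * t) * np.maximum(st - k, 0.0).mean()

rng = np.random.default_rng(42)
exact = bs_call(100.0, 105.0, 0.05, 0.2, 1.0)
est = mc_call(100.0, 105.0, 0.05, 0.2, 1.0, 200_000, rng)
```

With 200,000 paths the Monte Carlo estimate typically lands within a few cents of the analytic price; variance-reduction techniques of the kind the handbook covers shrink that error further for a fixed budget of paths.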
Neural networks have had considerable success in a variety of disciplines including engineering, control, and financial modelling. However, a major weakness is the lack of established procedures for testing mis-specified models and for assessing the statistical significance of the various parameters that have been estimated. This is particularly important in the majority of financial applications where the data generating processes are dominantly stochastic and only partially deterministic. Based on the latest, most significant developments in estimation theory, model selection and the theory of mis-specified models, this volume develops neural networks into an advanced financial econometrics tool for non-parametric modelling. It provides the theoretical framework required, and displays the efficient use of neural networks for modelling complex financial phenomena. Unlike most other books in this area, this one treats neural networks as statistical devices for non-linear, non-parametric regression analysis.
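The view of a neural network as a non-linear, non-parametric regression device can be made concrete with a small single-hidden-layer network fit by gradient descent; this is an illustrative sketch (the data, architecture and learning rate are invented, not taken from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(x) + 0.1 * rng.standard_normal((200, 1))  # unknown regression function + noise

h = 10  # hidden units
W1 = 0.5 * rng.standard_normal((1, h)); b1 = np.zeros(h)
W2 = 0.5 * rng.standard_normal((h, 1)); b2 = np.zeros(1)

def forward(x):
    a = np.tanh(x @ W1 + b1)          # hidden-layer activations
    return a, a @ W2 + b2             # network output (the regression estimate)

_, pred0 = forward(x)
mse0 = ((pred0 - y) ** 2).mean()      # error at random initialisation

lr = 0.1
for _ in range(5000):                 # full-batch gradient descent on squared error
    a, pred = forward(x)
    err = pred - y
    gW2 = a.T @ err / len(x); gb2 = err.mean(axis=0)
    da = err @ W2.T * (1 - a**2)      # backpropagate through tanh
    gW1 = x.T @ da / len(x); gb1 = da.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(x)
mse = ((pred - y) ** 2).mean()
```

The fitted network approximates the unknown function without any parametric assumption about its form, which is exactly the regression reading of neural networks the book adopts; the statistical questions the book raises (significance of the estimated weights, mis-specification tests) sit on top of this estimation step.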
This book links the questions people ask about why things exist, why the world is the way it is, and whether and how it is possible to change their society or world with the societal myths they develop and teach to answer those questions and organize and bring order to their communal lives. It also is about the need for change in western societies’ current organizing concept, classical (Lockean) liberalism. Despite the attempts of numerous insightful political thinkers, the myth of classical liberalism has developed so many cracks that it cannot be put back together again. If not entirely failed, it is at this point unsalvageable in its present form. Never the thought of just one person, the liberal model of individual religious, political, and economic freedom developed over hundreds of years starting with Martin Luther’s dictum that every man should be his own priest. Although classical liberalism means different things to different people, at its most basic level this model sees human beings as individuals who exist prior to government and have rights over government and the social good. That is, the individual right always trumps the moral and social good and individuals have few obligations to one another unless they actively choose to undertake them. Possibility’s Parents argues that Lockean liberalism has reached the end of its logic in ways that make it unable to handle the western world’s most pressing problems and that novelists whose writing includes the form and texture of myth have important insights to offer on the way forward.
Recent advances in establishing the nature and scope of estimators in econometrics have shed more light on the importance of instrumental variables. In this book, the authors argue that such methods may be regarded as a strong organizing principle for a wide variety of estimation and hypothesis testing problems in econometrics and statistics. In support of this claim they present and develop the methodology of instrumental variables in its most general and explanatory form. They show, for instance, that techniques commonly used to handle simultaneity and related problems can be reduced to one of two generic forms of instrumental variables estimators, allowing them to explore further the conditions under which different proposed estimators are efficient.
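The simultaneity problem the book refers to can be seen in a few lines: when a regressor is correlated with the structural error, OLS is inconsistent while the instrumental variables estimator recovers the true coefficient. This is a generic textbook sketch with invented data, not an example from the book:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
z = rng.standard_normal(n)                     # instrument: moves x, unrelated to u
u = rng.standard_normal(n)                     # structural error
x = z + 0.8 * u + rng.standard_normal(n)       # regressor, endogenous through u
y = 2.0 * x + u                                # true coefficient is 2

beta_ols = (x @ y) / (x @ x)                   # inconsistent: picks up cov(x, u)
beta_iv = (z @ y) / (z @ x)                    # simple IV estimator, consistent
```

Here OLS converges to roughly 2.3 (the endogeneity bias), while the IV estimate is centred on the true value 2; the book's point is that two-stage least squares and related simultaneous-equations techniques are all special cases of this generic IV construction.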
In the mid-eighties Mehra and Prescott showed that the risk premium earned by American stocks cannot reasonably be explained by conventional capital market models. Using time additive utility, the observed risk premium can only be explained by unrealistically high risk aversion parameters. This phenomenon is well known as the equity premium puzzle. Shortly afterwards it was also observed that the risk-free rate is too low relative to the observed risk premium. This essay is the first one to analyze these puzzles in the German capital market. It starts with a thorough discussion of the available theoretical models and then goes on to perform various empirical studies on the German capital market. After discussing natural properties of the pricing kernel by which future cash flows are translated into securities prices, various multiperiod equilibrium models are investigated for their implied pricing kernels. The starting point is a representative investor who optimizes his investment and consumption policy over time. One important implication of time additive utility is the identity of relative risk aversion and the inverse intertemporal elasticity of substitution. Since this identity is at odds with reality, the essay goes on to discuss recursive preferences, which violate the expected utility principle but allow one to separate relative risk aversion and intertemporal elasticity of substitution.
The contents of this volume comprise the proceedings of the International Symposia in Economic Theory and Econometrics conference held in 1987 at the IC2 (Innovation, Creativity, and Capital) Institute at the University of Texas at Austin. The essays present fundamental new research on the analysis of complicated outcomes in relatively simple macroeconomic models. The book covers econometric modelling and time series analysis techniques in five parts. Part I focuses on sunspot equilibria, the study of uncertainty generated by nonstochastic economic models. Part II examines the more traditional examples of deterministic chaos: bubbles, instability, and hyperinflation. Part III contains the most current literature dealing with empirical tests for chaos and strange attractors. Part IV deals with chaos and informational complexity. Part V, Nonlinear Econometric Modelling, includes tests for and applications of nonlinearity.
This study was written while I was a doctoral student in the Graduiertenkolleg Finanz- und Gütermärkte at the University of Mannheim; it was accepted as a doctoral dissertation in February 1997. I am indebted to my advisors, Professors Axel Börsch-Supan and Martin Hellwig at Mannheim and John Rust at Madison, for their encouragement and for many helpful discussions and comments. At various stages, I benefited from comments on portions of the manuscript by, and from discussions with, Thomas Astebro, Charles Calomiris, Timothy Dunne, Frank Gerhard, Annette Kohler, Jens Köke, Stephan Monissen, Gordon Phillips, Winfried Pohlmeier, Kenneth Troske, Wolfram Wissler and seminar participants at Columbia Business School, the University of Mannheim, the University of Tübingen, the University of Wisconsin at Madison, Yale University, the ENTER Jamborees at University College London, January 1995, and at Tilburg University, January 1997, at a meeting of the DFG-Schwerpunktprogramm Industrieökonomik und Inputmärkte, Heidelberg, November 1996, and at the annual meeting of the Verein für Socialpolitik, Bern, September 1997. Silke Januszewski and Melanie Lührmann provided dedicated assistance during the preparation of the final version of the manuscript.
This book comprises the articles of the 6th Econometric Workshop in Karlsruhe, Germany. In the first part, approaches from traditional econometrics and innovative methods from machine learning, such as neural nets, are applied to financial issues. Neural networks have been successfully applied to areas such as debtor analysis, forecasting and corporate finance. In the second part, various aspects of Value-at-Risk are discussed. The proceedings describe the legal framework, review the basics and discuss new approaches such as shortfall measures and credit risk.
The advent of electronic computing permits the empirical analysis of economic models of far greater subtlety and rigour than before, when many interesting ideas were not followed up because the calculations involved made this impracticable. The estimation and testing of these more intricate models is usually based on the method of Maximum Likelihood, which is a well-established branch of mathematical statistics. Its use in econometrics has led to the development of a number of special techniques; the specific conditions of econometric research moreover demand certain changes in the interpretation of the basic argument. This book is a self-contained introduction to this field. It consists of three parts. The first deals with general features of Maximum Likelihood methods; the second with linear and nonlinear regression; and the third with discrete choice and related micro-economic models. Readers should already be familiar with elementary statistical theory, whether from applied econometric research papers or from the literature on the mathematical basis of Maximum Likelihood theory. They can also try their hand at some advanced econometric research of their own.
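The discrete choice models of the book's third part are a standard setting for Maximum Likelihood in practice. As an illustrative sketch (simulated data, invented parameter values; not an example from the book), the following fits a binary logit model by Newton-Raphson on the log-likelihood:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5_000
X = np.column_stack([np.ones(n), rng.standard_normal(n)])  # intercept + one covariate
beta_true = np.array([-0.5, 1.5])
p = 1.0 / (1.0 + np.exp(-X @ beta_true))
y = (rng.uniform(size=n) < p).astype(float)                # simulated binary choices

beta = np.zeros(2)
for _ in range(25):                          # Newton-Raphson on the logit likelihood
    mu = 1.0 / (1.0 + np.exp(-X @ beta))     # fitted choice probabilities
    grad = X.T @ (y - mu)                    # score vector
    W = mu * (1.0 - mu)
    info = (X * W[:, None]).T @ X            # information matrix (negative Hessian)
    beta = beta + np.linalg.solve(info, grad)
```

Because the logit log-likelihood is globally concave, the iteration converges quickly from any starting value, and the maximizer is consistent for the true coefficients as the sample grows.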
The most common mode of analysis in economic theory is to assume equilibrium. Yet, without a proper theory of how economies behave in disequilibrium, there is no foundation for such a practice. The necessary step in proposing a foundation is the formulation of a theory of stability, and in this 1984 book, Professor Fisher is primarily concerned with this subject, although disequilibrium behaviour itself is analyzed. The author first undertakes a review of the existing literature on the stability of general equilibrium. He then proposes a more satisfactory general model in which agents realize their state of disequilibrium and act on arbitrage opportunities. The interrelated topics of the role of money, the nature of quantity constraints, and the optimal behaviour of arbitraging agents are extensively treated.
Many econometric models contain unknown functions as well as finite-dimensional parameters. Examples of such unknown functions are the distribution function of an unobserved random variable or a transformation of an observed variable. Econometric methods for estimating population parameters in the presence of unknown functions are called "semiparametric." During the past 15 years, much research has been carried out on semiparametric econometric models that are relevant to empirical economics. This book synthesizes the results that have been achieved for five important classes of models. The book is aimed at graduate students in econometrics and statistics as well as professionals who are not experts in semiparametric methods. The usefulness of the methods will be illustrated with applications that use real data.
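The building block of many semiparametric methods is a nonparametric estimate of an unknown function. As an illustrative sketch (simulated data and an invented bandwidth; not drawn from the book), the Nadaraya-Watson kernel estimator recovers a regression function as a locally weighted mean:

```python
import numpy as np

def nw_regression(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson estimator with a Gaussian kernel:
    a weighted mean of y_train with weights decaying in |x_eval - x_train|."""
    d = (x_eval[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * d**2)
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(3)
x = rng.uniform(0, 1, 400)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(400)  # unknown function + noise

grid = np.linspace(0.1, 0.9, 9)
fit = nw_regression(x, y, grid, bandwidth=0.05)
```

No functional form is assumed for the regression curve; the bandwidth controls the bias-variance trade-off, and choosing it well is one of the practical issues a treatment like this book addresses.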
Considerable work has been done on chaotic dynamics in the field of economic growth and dynamic macroeconomic models during the last two decades. This book considers numerous new developments: introduction of infrastructure in growth models, heterogeneity of agents, hysteresis systems, overlapping generations models with "pay-as-you-go" systems, Keynesian approaches with finance considerations, interactions between relaxation cycles and chaotic dynamics, methodological issues, long memory processes and fractals... A volume of contributions which shows the relevance and fruitfulness of non-linear analysis for the explanation of complex dynamics in economic systems.
This manuscript is about the joint dynamics of stock returns and trading volume. It grew out of my attempt to construct an intertemporal asset pricing model with rational agents which can explain the relation between volume, volatility and persistence of stock returns documented in the empirical literature. Most of the manuscript is taken from my thesis. I wish to express my deep appreciation to Peter Kugler and Benedikt Poetscher, my advisors of the thesis, for their invaluable guidance and support. I wish to thank Gerhard Orosel and Gerhard Sorger for their encouraging and helpful discussions. Finally, my thanks go to George Tauchen who has been generous in giving me the benefit of his numerical and computational experience, in providing me with programs and in his encouragement. Contents: 1 Introduction (1); 2 Efficient Stock Markets (7); 2.1 Equilibrium Models of Asset Pricing (8); 2.1.1 The Martingale Model of Stock Prices (8); 2.1.2 Lucas' Consumption Based Asset Pricing Model (9); 2.2 Econometric Tests of the Efficient Market Hypothesis (13); 2.2.1 Autocorrelation Based Tests (14); 2.2.2 Volatility Tests (16); 2.2.3 Time-Varying Expected Returns (25); 3 The Informational Role of Volume (29); 3.1 Standard Grossman-Stiglitz Model (31); 3.2 The No-Trade Result of the BEO Model (34); 3.3 A Model with Nontradable Asset (37); 4 Volume and Volatility of Stock Returns (43); 4.1 Empirical and Numerical Results (45); …
This volume contains revised versions of 43 papers presented during the 21st Annual Conference of the Gesellschaft für Klassifikation (GfKl), the German Classification Society. The conference took place at the University of Potsdam (Germany) in March 1997; the local organizer was Prof. I. Balderjahn, Chair of Business Administration and Marketing at Potsdam. The scientific program of the conference included 103 plenary and contributed papers, software and book presentations as well as special (tutorial) courses. Researchers and practitioners interested in data analysis and clustering methods, information sciences and database techniques, and in the main topic of the conference, data highways and their importance for classification and data analysis, had the opportunity to discuss recent developments and to establish cross-disciplinary cooperation in these fields. The conference owed much to its sponsors - Berliner Volksbank - Daimler Benz AG - Deutsche Telekom AG Direktion Potsdam - Dresdner Bank AG Filiale Potsdam - Henkel KGaA - Landeszentralbank in Berlin und Brandenburg - Ministerium für Wissenschaft, Forschung und Kultur des Landes Brandenburg - Scicon GmbH - Siemens AG - Universität Potsdam - Unternehmensgruppe Roland Ernst - who helped in many ways. Their generous support is gratefully acknowledged. In the present proceedings volume, selected and peer-reviewed papers are presented in six chapters as follows.
One aim of this book is to examine the causes of fluctuations in the mark/dollar, pound/dollar, and yen/dollar real exchange rates for the period 1972-1994, using quarterly data, in order to determine appropriate policy recommendations to reduce these movements. A second aim is to investigate whether the three real exchange rates are covariance-stationary and, if not, to what extent they deviate from covariance-stationarity. These aims are pursued by using a two-country overshooting model for real exchange rates with real government expenditure and by applying Johansen's maximum likelihood cointegration procedure and a factor model of Gonzalo and Granger to this model.
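The book applies Johansen's system-based procedure; a simpler way to see what cointegration testing does is the two-step residual-based approach: regress one integrated series on another and test whether the residual is stationary. This sketch uses simulated data and invented parameters, and a Dickey-Fuller-style regression in place of Johansen's likelihood machinery:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 500
x = np.cumsum(rng.standard_normal(n))          # random walk (integrated series)
u = np.zeros(n)
for t in range(1, n):                           # stationary AR(1) deviation
    u[t] = 0.3 * u[t - 1] + rng.standard_normal()
y = 1.0 + 2.0 * x + u                           # y and x share one stochastic trend

# Step 1: estimate the cointegrating regression y = a + b*x by OLS.
X = np.column_stack([np.ones(n), x])
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - (a + b * x)

# Step 2: regress residual changes on the lagged level; a strongly negative
# t-statistic indicates the residual mean-reverts, i.e. the pair cointegrates.
du, lag = np.diff(resid), resid[:-1]
phi = (lag @ du) / (lag @ lag)
se = np.sqrt(((du - phi * lag) ** 2).sum() / (len(du) - 1) / (lag @ lag))
t_stat = phi / se
```

For genuinely cointegrated series the t-statistic falls far below the relevant critical values (around -3.4 at the 5% level for this two-variable case), while for two unrelated random walks it does not; Johansen's procedure generalizes this idea to systems with possibly several cointegrating relations.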
This book brings together presentations of some of the fundamental new research that has begun to appear in the areas of dynamic structural modeling, nonlinear structural modeling, time series modeling, nonparametric inference, and chaotic attractor inference. The contents of this volume comprise the proceedings of the third conference in a series entitled International Symposia in Economic Theory and Econometrics. This conference was held at the IC2 (Innovation, Creativity and Capital) Institute at the University of Texas at Austin on May 22-23, 1986.
The present book was accepted as a dissertation at the Humboldt-Universität zu Berlin in summer 1996. I am very much obliged to my advisor, Professor Wolfgang Härdle, for his continuous, always inspiring support and for opening up the world of nonparametric statistics to me. Without him I probably would have worked on a different, less exciting topic and this book would not exist. Also, I would like to thank my second advisor, Professor Helmut Lütkepohl, for his excellent introduction to time series analysis and for always helpful comments on my work. This work was financially supported by the Deutsche Forschungsgemeinschaft, in the first stage while I was a member of the Graduiertenkolleg "Applied Microeconomics," and later when I came to the Sonderforschungsbereich 373. For an interestingly widespread academic surrounding I want to thank the members of the Graduiertenkolleg and the Sonderforschungsbereich, especially Stefan Sperlich and Axel Werwatz. For the use of XploRe and many other issues I received substantial help from my colleagues Sigbert Klinke, Thomas Kötter, Marlene Müller and Swetlana Schmelzer. Concerning many central topics of this dissertation, helpful and improving comments were given by Jörg Breitung, Helmut Herwartz, Rolf Tschernig and Lijian Yang, who also revised most parts of the manuscript. I have much reason to thank them for their help. Of course, all remaining errors are mine. Berlin, July 1997. Christian M. Hafner
Since there exists a multi-level policy-making system in market economies, the choices of decision makers at different levels should be considered explicitly in the formulation of sectoral plans and policies. To support this hypothesis, a theoretical energy planning approach is developed within the framework of the theory of economic policy planning, policy systems analysis and multi-level programming, and a Parametric Programming Search Algorithm is developed. On the basis of this theoretical model, an Australian Energy Policy System Optimisation Model (AEPSOM) has been developed and is used to formulate an Australian multi-level energy plan.
Methods for Estimation and Inference in Modern Econometrics provides a comprehensive introduction to a wide range of emerging topics, such as generalized empirical likelihood estimation and alternative asymptotics under drifting parameterizations, which have not been discussed in detail outside of highly technical research papers. The book also addresses several problems often arising in the analysis of economic data, including weak identification, model misspecification, and possible nonstationarity. The book's appendix provides a review of some basic concepts and results from linear algebra, probability theory, and statistics that are used throughout the book. Topics covered include: Well-established nonparametric and parametric approaches to estimation and conventional (asymptotic and bootstrap) frameworks for statistical inference Estimation of models based on moment restrictions implied by economic theory, including various method-of-moments estimators for unconditional and conditional moment restriction models, and asymptotic theory for correctly specified and misspecified models Non-conventional asymptotic tools that lead to improved finite sample inference, such as higher-order asymptotic analysis that allows for more accurate approximations via various asymptotic expansions, and asymptotic approximations based on drifting parameter sequences Offering a unified approach to studying econometric problems, Methods for Estimation and Inference in Modern Econometrics links most of the existing estimation and inference methods in a general framework to help readers synthesize all aspects of modern econometric theory. Various theoretical exercises and suggested solutions are included to facilitate understanding.
We live in a time of economic virtualism, whereby our lives are made to conform to the virtual reality of economic thought. Globalization, transnational capitalism, structural adjustment programmes and the decay of welfare are all signs of the growing power of economics, one of the most potent forces of recent decades. In the last thirty years, economics has ceased to be just an academic discipline concerned with the study of economy, and has come to be the only legitimate way to think about all aspects of society and how we order our lives. Economic models are no longer measured against the world they seek to describe, but instead the world is measured against them, found wanting and made to conform. This profound and dangerous change in the power of abstract economics to shape the lives of people in rich and poor countries alike is the subject of this interdisciplinary study. Contributors show how economics has come to portray a virtual reality -- a world that seems real but is merely a reflection of a neo-classical model -- and how governments, the World Bank and the IMF combine to stamp the world with a virtual image that condemns as irrational our local social and cultural arrangements. Further, it is argued that virtualism represents the worrying emergence of new forms of abstraction in the political economy, of which economics is just one example.
For Masters and PhD students in Economics. In this textbook, the duality between the equilibrium concept used in dynamic economic theory and the stationarity of economic variables is explained and used in the presentation of single-equation models and systems of equations such as VARs, recursive models and simultaneous equations models. The book also contains chapters on: exogeneity, in the context of estimation, policy analysis and forecasting; automatic (computer-based) variable selection, and how it can aid in the specification of an empirical macroeconomic model; and finally, a common framework for model-based economic forecasting. Supplementary materials and notes are available on the publisher's website.
This book presents an extensive survey of the theory and empirics of international parity conditions which are critical to our understanding of the linkages between world markets and the movement of interest and exchange rates across countries. The book falls into three parts dealing with the theory, methods of econometric testing and existing empirical evidence. Although it is intended to provide a consensus view on the subject, the authors also make some controversial propositions, particularly on the purchasing power parity conditions.
1.1 Economic issues to be analyzed. This research examines two elements of the Swiss market for electricity: the residential electricity demand by time-of-use and the cost structure of municipal electricity distribution utilities. The empirical results of demand and cost elasticities allow the investigation of interesting economic and policy issues such as the desirability of a widespread introduction of time-of-use pricing for residential customers, the desirability of side-by-side competition in the distribution of electricity and, more generally, the economic effects on costs of a reduction of the load factor and of mergers between electric distribution utilities. Desirability of time-of-use pricing. In the last decade there has been an intensifying debate in Switzerland about the efficacy of electricity rate reforms in improving the efficiency of electricity use. This debate was initiated by two main events. First, there was an important growth of electricity consumption. Second, the Chernobyl accident in 1986 aroused widespread public concern about the problems associated with nuclear power and waste disposal. As a result, in 1991 the Swiss approved, in a referendum, a 10-year moratorium on the construction of new nuclear power plants. Moreover, plans to expand production of hydroelectric power (construction of new dams or expansion of existing ones) have been stiffly opposed by environmental groups. These developments have considerably curtailed potential expansion of domestic electricity supply. As a result, Switzerland has to import electricity from foreign countries during the winter.
The subject is the description of univariate and multivariate business cycle stylized facts. A spectral analysis method (Maximum Entropy spectral estimation), novel in the analysis of economic time series, is described and utilized. The method turns out to be superior to widely used time domain methods and to the "classical" spectral estimate, the periodogram. The results for eleven OECD countries confirm and extend the basic set of stylized facts of traditional business cycle theory. The changing characteristics of the business cycle are analyzed by comparing the cyclical structure of the postwar and the prewar periods. The results show that the business cycle is mainly due to investment fluctuations.
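Maximum Entropy spectral estimation amounts to fitting an autoregressive model and reading off its spectral density, which is why it resolves cyclical peaks more sharply than the raw periodogram. The following is an illustrative sketch on simulated data with invented parameters (using Yule-Walker estimates rather than the Burg recursion often used in practice): an AR(2) with a cycle of period 20 is simulated, and the AR-based spectrum locates the cycle frequency.

```python
import numpy as np

rng = np.random.default_rng(5)
r, theta = 0.95, 2 * np.pi / 20                   # damped cycle of period ~20
phi1, phi2 = 2 * r * np.cos(theta), -r**2          # implied AR(2) coefficients
n, burn = 2000, 200
x = np.zeros(n + burn)
for t in range(2, n + burn):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + rng.standard_normal()
x = x[burn:]

# Yule-Walker estimates of the AR(2) coefficients from sample autocovariances.
c = np.array([((x[:-k] if k else x) @ (x[k:] if k else x)) / len(x) for k in range(3)])
R = np.array([[c[0], c[1]], [c[1], c[0]]])
a = np.linalg.solve(R, c[1:])

# AR spectral density S(f) = sigma^2 / |1 - a1 e^{-2pi i f} - a2 e^{-4pi i f}|^2;
# sigma^2 is a constant, so the peak is where the denominator is smallest.
freqs = np.linspace(0.001, 0.5, 2000)
denom = np.abs(1 - a[0] * np.exp(-2j * np.pi * freqs)
                 - a[1] * np.exp(-4j * np.pi * freqs))**2
f_peak = freqs[np.argmin(denom)]
```

The estimated peak frequency sits near 1/20, the simulated business-cycle period; because the spectrum is a smooth rational function of the handful of AR coefficients, the estimate avoids the erratic variance of the periodogram that the book contrasts it with.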