We live in a time of economic virtualism, whereby our lives are made to conform to the virtual reality of economic thought. Globalization, transnational capitalism, structural adjustment programmes and the decay of welfare are all signs of the growing power of economics, one of the most potent forces of recent decades. In the last thirty years, economics has ceased to be just an academic discipline concerned with the study of economy, and has come to be the only legitimate way to think about all aspects of society and how we order our lives. Economic models are no longer measured against the world they seek to describe, but instead the world is measured against them, found wanting and made to conform. This profound and dangerous change in the power of abstract economics to shape the lives of people in rich and poor countries alike is the subject of this interdisciplinary study. Contributors show how economics has come to portray a virtual reality -- a world that seems real but is merely a reflection of a neo-classical model -- and how governments, the World Bank and the IMF combine to stamp the world with a virtual image that condemns as irrational our local social and cultural arrangements. Further, it is argued that virtualism represents the worrying emergence of new forms of abstraction in the political economy, of which economics is just one example.
A collection of articles presented at the XLVI Applied Econometrics Association conference on exchange rates held at Heigerloch Castle, Germany, in 1995. The book consists of three parts examining the experience of the exchange rate in Europe. The first part examines some aspects of exchange rate determination in Europe; the second part deals with exchange rate policy within the European Monetary System; the third part presents an analysis of recent intervention practices in the European exchange rate markets.
Population aging raises a number of issues regarding the optimality of public debt policy and the systems of public pension provision that are in use in developed countries. The studies in this book address these questions using computable general equilibrium models. They give illuminating insights and new empirical estimates of the future prospects of pay-as-you-go pension schemes in the "big seven" OECD countries, the possible distortions introduced by the pension systems in four large European economies, the effects of lifetime uncertainty in analyzing a potential reform of the Dutch pension system, the effects of increasing international mobility of financial capital on pension policies, and public debt reduction policies in relation to possible adverse effects of taxation on wage formation and unemployment.
This book provides a practical introduction to mathematics for economics using R software. Using R as a basis, this book guides the reader through foundational topics in linear algebra, calculus, and optimization. The book is organized in order of increasing difficulty, beginning with a rudimentary introduction to R and progressing through exercises that require the reader to code their own functions in R. All chapters include applications for topics in economics and econometrics. As a fully reproducible book, this volume gives readers the opportunity to learn by doing and develop research skills as they go. As such, it is appropriate for students in economics and econometrics.
This book promotes the use of machine learning tools and techniques in econometrics and explains how machine learning can enhance and expand the econometrics toolbox in theory and in practice. Throughout the volume, the authors raise and answer six questions: 1) What are the similarities between existing econometric and machine learning techniques? 2) To what extent can machine learning techniques assist econometric investigation? Specifically, how robust or stable is the prediction from machine learning algorithms given the ever-changing nature of human behavior? 3) Can machine learning techniques assist in testing statistical hypotheses and identifying causal relationships in 'big data'? 4) How can existing econometric techniques be extended by incorporating machine learning concepts? 5) How can new econometric tools and approaches be elaborated based on machine learning techniques? 6) Is it possible to develop machine learning techniques further and make them even more readily applicable in econometrics? As the data structures in economic and financial data become more complex and models become more sophisticated, the book takes a multidisciplinary approach in developing both disciplines of machine learning and econometrics in conjunction, rather than in isolation. This volume is a must-read for scholars, researchers, students, policy-makers, and practitioners who use econometrics in theory or in practice.
This book was born out of five years of research at Sonderforschungsbereich 303 of the Deutsche Forschungsgemeinschaft (DFG) at the Rheinische Friedrich-Wilhelms-Universität Bonn and was approved as my doctoral thesis by the Rechts- und Staatswissenschaftliche Fakultät in December 1994. It was my former colleague Wolfgang Peters who had drawn my attention to overlapping-generations models and to problems of intergenerational efficiency and distribution. The subtle connection between the latter two has fascinated me from the very beginning: redistribution of the results of free trade can become necessary from the point of view of efficiency, although no externalities hamper the development of an economy. In spite of being a mature part of economics, neoclassical growth theory had left many questions unsolved, some of them even unrecognized by a large part of our profession. I took up the challenge to contribute to the investigation of some of these thorny problems. One of these issues is the often quoted idea of the intergenerational contract. Although intergenerational transfers can improve intertemporal efficiency, the design of pension schemes that improve the well-being of some generations without hurting that of any other is not an easy task in an economy with flexible prices. Quite frequently, only the interest rate and the growth rate are taken into account when deciding whether a generation wins or loses.
This book/software package divulges the combined knowledge of a whole international community of Mathematica users - from the fields of economics, finance, investments, quantitative business and operations research. The 23 contributors - all experts in their fields - take full advantage of the latest updates of Mathematica in their presentations and equip both current and prospective users with tools for professional, research and educational projects. The real-world and self-contained models provided are applicable to an extensive range of contemporary problems. The DOS disk contains Notebooks and packages which are also available online from the TELOS site.
Major transport infrastructures are increasingly in the news as both the engineering and financing possibilities come together. However, these projects have also demonstrated the inadequacy of most existing approaches to forecasting their impacts and their overall evaluation. This collection of papers from a conference organised by the Applied Econometric Association represents a state of the art look at issues of forecasting traffic, developing pricing strategies and estimating the impacts in a set of papers by leading authorities from Europe, North America and Japan.
This book provides a self-contained account of periodic models for seasonally observed economic time series with stochastic trends. Two key concepts are periodic integration and periodic cointegration. Periodic integration implies that a seasonally varying differencing filter is required to remove a stochastic trend. Periodic cointegration amounts to allowing cointegration parameters and short-term adjustment parameters to vary with the season. The emphasis is on useful econometric models that explicitly describe seasonal variation and can reasonably be interpreted in terms of economic behaviour. The analysis considers econometric theory, Monte Carlo simulation, and forecasting, and it is illustrated with numerous empirical time series. A key feature of the proposed models is that changing seasonal fluctuations depend on the trend and business cycle fluctuations. In the case of such dependence, it is shown that seasonal adjustment leads to inappropriate results.
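The idea of a seasonally varying differencing filter can be made concrete with a small simulation. The sketch below (illustrative only, not taken from the book) generates a quarterly series whose AR(1) coefficient changes with the season but whose coefficients multiply to one over a full year, the textbook condition for a periodic unit root; applying the season-specific filter y_t - phi_s * y_{t-1} then recovers the underlying white-noise shocks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Season-specific AR(1) coefficients whose product over the four
# quarters equals 1, giving a periodically integrated series: an
# ordinary first difference does not remove the stochastic trend,
# but the seasonally varying filter y_t - phi_s * y_{t-1} does.
phi = np.array([2.0, 0.5, 1.25, 0.8])   # product = 1.0
assert np.isclose(phi.prod(), 1.0)

T = 400
y = np.zeros(T)
for t in range(1, T):
    s = t % 4                            # season of observation t
    y[t] = phi[s] * y[t - 1] + rng.standard_normal()

# Apply the seasonally varying differencing filter
seasons = np.arange(1, T) % 4
filtered = y[1:] - phi[seasons] * y[:-1]

# The filtered series equals the i.i.d. shocks, so its sample
# variance should be close to 1.
print(round(filtered.var(), 2))
```

The season labels and coefficient values here are arbitrary choices for the demonstration; the point is only that the correct filter varies with the season.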
This book links the questions people ask about why things exist, why the world is the way it is, and whether and how it is possible to change their society or world with the societal myths they develop and teach to answer those questions and organize and bring order to their communal lives. It also is about the need for change in western societies' current organizing concept, classical (Lockean) liberalism. Despite the attempts of numerous insightful political thinkers, the myth of classical liberalism has developed so many cracks that it cannot be put back together again. If not entirely failed, it is at this point unsalvageable in its present form. Never the thought of just one person, the liberal model of individual religious, political, and economic freedom developed over hundreds of years, starting with Martin Luther's dictum that every man should be his own priest. Although classical liberalism means different things to different people, at its most basic level this model sees human beings as individuals who exist prior to government and have rights over government and the social good. That is, the individual right always trumps the moral and social good, and individuals have few obligations to one another unless they actively choose to undertake them. Possibility's Parents argues that Lockean liberalism has reached the end of its logic in ways that make it unable to handle the western world's most pressing problems, and that novelists whose writing includes the form and texture of myth have important insights to offer on the way forward.
1.1 Introduction. In economics, one often observes time series that exhibit different patterns of qualitative behavior, both regular and irregular, symmetric and asymmetric. There exist two different perspectives to explain this kind of behavior within the framework of a dynamical model. The traditional belief is that the time evolution of the series can be explained by a linear dynamic model that is exogenously disturbed by a stochastic process. In that case, the observed irregular behavior is explained by the influence of external random shocks which do not necessarily have an economic reason. A more recent theory has evolved in economics that attributes the patterns of change in economic time series to an underlying nonlinear structure, which means that fluctuations can as well be caused endogenously by the influence of market forces, preference relations, or technological progress. One of the main reasons why nonlinear dynamic models are so interesting to economists is that they are able to produce a great variety of possible dynamic outcomes - from regular predictable behavior to the most complex irregular behavior - rich enough to meet the economists' objectives of modeling. The traditional linear models can only capture a limited number of possible dynamic phenomena, which are basically convergence to an equilibrium point, steady oscillations, and unbounded divergence. In any case, for a linear system one can write down exactly the solutions to a set of differential or difference equations and classify them.
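The contrast between the two perspectives described above can be seen in a few lines of code. This sketch (an illustration, not material from the book) iterates a stable linear map, which can only settle at its fixed point, alongside the logistic map x_{t+1} = r * x_t * (1 - x_t), a standard one-dimensional nonlinear map that generates bounded, irregular fluctuations for r = 4 without any external shocks:

```python
def iterate(f, x0, n):
    """Iterate a map f starting from x0 for n steps; return the path."""
    xs = [x0]
    for _ in range(n):
        xs.append(f(xs[-1]))
    return xs

# Linear dynamics: convergence to the equilibrium point at 0.
linear = iterate(lambda x: 0.5 * x, 1.0, 50)

# Nonlinear dynamics: the logistic map with r = 4 stays in [0, 1]
# but never settles down, producing endogenous irregular behavior.
chaotic = iterate(lambda x: 4.0 * x * (1 - x), 0.2, 50)

print(abs(linear[-1]) < 1e-9)                        # linear path has converged
print(0.0 <= min(chaotic) and max(chaotic) <= 1.0)   # chaotic path stays bounded
```

The linear path is fully classifiable in advance, exactly as the passage notes; the nonlinear path is bounded yet unpredictable in detail, which is the richness the blurb attributes to nonlinear models.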
The problem of disparities between different estimates of GDP is, according to this text, well-known and widely discussed. Here, the authors describe a method for examining the discrepancies using a technique that allocates them with reference to data reliability. The method enhances the reliability of the underlying data and leads to maximum-likelihood estimates. It is illustrated by application to the UK national accounts for the period 1920-1990. The book includes a full set of estimates for this period, including runs of industrial data for the period 1948-1990 which are longer than those available from any other source. The statistical technique allows estimates of standard errors of the data to be calculated and verified; these are presented both for data in levels and for changes in variables over one-, two- and five-year periods. A disk with the dataset in machine-readable form is available separately.
High-dimensional probability offers insight into the behavior of random vectors, random matrices, random subspaces, and objects used to quantify uncertainty in high dimensions. Drawing on ideas from probability, analysis, and geometry, it lends itself to applications in mathematics, statistics, theoretical computer science, signal processing, optimization, and more. This is the first book to integrate theory, key tools, and modern applications of high-dimensional probability. Concentration inequalities form the core, and it covers both classical results such as Hoeffding's and Chernoff's inequalities and modern developments such as the matrix Bernstein inequality. It then introduces the powerful methods based on stochastic processes, including such tools as Slepian's, Sudakov's, and Dudley's inequalities, as well as generic chaining and bounds based on VC dimension. A broad range of illustrations is embedded throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.
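Hoeffding's inequality, one of the classical concentration results named above, bounds the probability that an average of bounded independent variables deviates from its mean: for variables in [0, 1], P(|X̄ - μ| ≥ t) ≤ 2·exp(-2nt²). The following Monte Carlo check (a sketch, not an excerpt from the book) verifies the bound empirically for Bernoulli(0.5) averages:

```python
import numpy as np

rng = np.random.default_rng(1)

n, t, trials = 200, 0.1, 10_000
# Each row is a sample of n Bernoulli(0.5) draws taking values in {0, 1}.
samples = rng.random((trials, n)) < 0.5
deviations = np.abs(samples.mean(axis=1) - 0.5)

empirical = (deviations >= t).mean()       # observed tail frequency
bound = 2 * np.exp(-2 * n * t**2)          # Hoeffding's bound, ~0.037 here

print(empirical <= bound)
```

The bound is deliberately loose (it uses only boundedness, not the variance), so the empirical tail frequency typically falls well below it, as this experiment shows.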
As large physical capital stock projects need long periods to be built, a time-to-build specification is incorporated in factor demand models. Time-to-build and adjustment cost dynamics are separately identified, since the former induces moving-average dynamics whereas the latter induces autoregressive dynamics. Empirical evidence for time-to-build is obtained from data on the Dutch construction industry and from estimation results for the manufacturing industries of six OECD countries.
Migration, commuting, and tourism are prominent phenomena demonstrating the political and economic relevance of the spatial choice behavior of households. The identification of the determinants and effects of households' location choices is necessary for both entrepreneurial and policy planners who attempt to predict (or regulate) the future demand for location-specific commodities, such as infrastructure, land, or housing, and the supply of labor. Microeconomic studies of the spatial behavior of individuals have typically focused upon the demand for a single, homogeneous, yet location-specific commodity (such as land or housing) or upon their supply of labor, and have investigated the formation of location-specific prices and wages in the presence of transportation and migration costs, or analyzed the individual- and location-specific characteristics triggering spatial rather than quantitative or temporal adjustments. In contrast to many theoretical analyses, empirical studies of the causes or consequences of individual demand for location-specific commodities have often considered several "brands" of a heterogeneous good that are offered at various locations, are perfect substitutes, and may be produced by varying production technologies. Cf. Alonso (1964); Muth (1969); Sjaastad (1962) and Greenwood (1975).
The papers collected in this volume are contributions to the T.I.Tech./K.E.S. Conference on Nonlinear and Convex Analysis in Economic Theory, which was held at Keio University, July 2-4, 1993. The conference was organized by the Tokyo Institute of Technology (T.I.Tech.) and the Keio Economic Society (K.E.S.), and supported by Nihon Keizai Shimbun Inc. A lot of economic problems can be formulated as constrained optimizations and equilibrations of their solutions. Nonlinear-convex analysis has been supplying economists with indispensable mathematical machinery for these problems arising in economic theory. Conversely, mathematicians working in this discipline of analysis have been stimulated by various mathematical difficulties raised by economic theories. Although our special emphasis was laid upon "nonlinearity" and "convexity" in relation to economic theories, we also incorporated stochastic aspects of financial economics in our project, taking account of the remarkably rapid growth of this discipline during the last decade. The conference was designed to bring together those mathematicians who were seriously interested in getting new challenging stimuli from economic theories with those economists who were seeking effective mathematical weapons for their research. Thirty invited talks (six of them plenary talks) given at the conference were roughly classified under the following six headings: 1) Nonlinear Dynamical Systems and Business Fluctuations, 2) Fixed Point Theory, 3) Convex Analysis and Optimization, 4) Eigenvalue of Positive Operators, 5) Stochastic Analysis and Financial Market, 6) General Equilibrium Analysis.
As a new type of technique, simplicial methods have yielded extremely important contributions toward solutions of a system of nonlinear equations. Theoretical investigations and numerical tests have shown that the performance of simplicial methods depends critically on the triangulations underlying them. This monograph describes some recent developments in triangulations and simplicial methods. It includes the D1-triangulation and its applications to simplicial methods. As a result, efficiency of simplicial methods has been improved significantly. Thus more effective simplicial methods have been developed.
Methods for Estimation and Inference in Modern Econometrics provides a comprehensive introduction to a wide range of emerging topics, such as generalized empirical likelihood estimation and alternative asymptotics under drifting parameterizations, which have not been discussed in detail outside of highly technical research papers. The book also addresses several problems often arising in the analysis of economic data, including weak identification, model misspecification, and possible nonstationarity. The book's appendix provides a review of some basic concepts and results from linear algebra, probability theory, and statistics that are used throughout the book. Topics covered include: well-established nonparametric and parametric approaches to estimation and conventional (asymptotic and bootstrap) frameworks for statistical inference; estimation of models based on moment restrictions implied by economic theory, including various method-of-moments estimators for unconditional and conditional moment restriction models, and asymptotic theory for correctly specified and misspecified models; and non-conventional asymptotic tools that lead to improved finite sample inference, such as higher-order asymptotic analysis that allows for more accurate approximations via various asymptotic expansions, and asymptotic approximations based on drifting parameter sequences. Offering a unified approach to studying econometric problems, Methods for Estimation and Inference in Modern Econometrics links most of the existing estimation and inference methods in a general framework to help readers synthesize all aspects of modern econometric theory. Various theoretical exercises and suggested solutions are included to facilitate understanding.
This book brings together a wide range of topics and perspectives in the growing field of Classification and related methods of Exploratory and Multivariate Data Analysis. It gives a broad view of the state of the art, useful for those in the scientific community who gather data and seek tools for analyzing and interpreting large sets of data. As it presents a wide field of applications, this book is not only of interest for data analysts, mathematicians and statisticians, but also for scientists from many areas and disciplines concerned with real data, e.g., medicine, biology, astronomy, image analysis, pattern recognition, social sciences, psychology, marketing, etc. It contains 79 invited or selected and refereed papers presented during the Fourth Biennial Conference of the International Federation of Classification Societies (IFCS'93) held in Paris. Previous conferences were held at Aachen (Germany), Charlottesville (USA) and Edinburgh (U.K.). The conference at Paris emerged from the close cooperation between the eight members of the IFCS: British Classification Society (BCS), Classification Society of North America (CSNA), Gesellschaft für Klassifikation (GfKl), Japanese Classification Society (JCS), Jugoslovenska Sekcija za Klasifikacije (JSK), Société Francophone de Classification (SFC), Società Italiana di Statistica (SIS), Vereniging voor Ordinatie en Classificatie (VOC), and was organized by INRIA ("Institut National de Recherche en Informatique et en Automatique"), Rocquencourt, and the "École Nationale Supérieure des Télécommunications," Paris.
This study is a revised version of my doctoral dissertation at the Economics Department of the University of Munich. I want to take the opportunity to express my gratitude to some people who have helped me in my work. My greatest thanks go to the supervisor of this dissertation, Professor Claude Billinger. His ideas have formed the basis of my work. He permanently supported it with a host of ideas, criticism and encouragement. Furthermore, he provided a stimulating research environment at SEMECON. This study would not have been possible in this form without the help of my present and former colleagues at SEMECON. I am indebted to Rudolf Kohne-Volland, Monika Sebold-Bender and Ulrich Woitek for providing software and guidance for the data analysis. Discussions with them and with Thilo Weser have helped me to take many hurdles, particularly in the early stages of the project. My sincere thanks go to them all. I had the opportunity to present a former version of my growth model at a workshop of Professor Klaus Zimmermann. I want to thank all the participants for their helpful comments. I also acknowledge critical and constructive comments from an anonymous referee. Table of Contents: Introduction; Part I. Methodology; 1. Importance of Stylized Facts; 1.1 Limitations of statistical testing; 1.2 Evaluating economic models; 2. Further Methodological Issues.
This book presents a review of recent developments in the theory and construction of index numbers using the stochastic approach, demonstrating the versatility of this approach in handling various index number problems within a single conceptual framework. It also contains a brief, but complete, review of the existing approaches to index numbers with illustrative numerical examples. The stochastic approach considers the index number problem as a signal extraction problem. The strength and reliability of the signal extracted from price and quantity changes for different commodities depends on the messages received and the information content of the messages. The most important applications of the new approach are to be found in the context of measuring the rate of inflation and fixed and chain base index numbers for temporal comparisons and for spatial inter-country comparisons - the latter generally require special index number formulae that result in transitive and base-invariant comparisons.
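The signal-extraction idea behind the stochastic approach can be sketched in a few lines. In its simplest form, each commodity's log price relative is modelled as a common inflation signal plus commodity-specific noise, log(p_i1 / p_i0) = π + e_i, so the inflation rate is estimated by least squares (here the unweighted mean) and comes with a standard error. The prices below are made up purely for illustration:

```python
import numpy as np

# Hypothetical base-period and current-period prices for five commodities.
p0 = np.array([10.0, 25.0, 4.0, 80.0, 1.5])
p1 = np.array([10.6, 26.1, 4.3, 83.0, 1.6])

# Each log price relative = common inflation signal + idiosyncratic noise.
log_relatives = np.log(p1 / p0)

pi_hat = log_relatives.mean()                       # estimated inflation signal
se = log_relatives.std(ddof=1) / np.sqrt(len(p0))   # standard error of the index

print(f"inflation estimate: {pi_hat:.3%} (s.e. {se:.3%})")
```

The standard error is the distinctive payoff of the stochastic approach noted in the blurb: unlike a conventional fixed-formula index, the estimate carries a measure of its own reliability.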
A discussion of various aspects of dynamic behavior of empirical macroeconomic, and in particular, macroeconometric models, is presented in this book. The book addresses in depth several theoretical and practical aspects concerning the modeling and analysis of long-run equilibrium behavior, adjustment dynamics and stability. Tools are developed to identify and interpret the main determinants of the dynamics of models. The tools involve, among others, error-correction mechanisms, eigenvalue analysis, feedback closure rules, graph theory, learning behavior, steady-state analysis, and stochastic simulation. Their usefulness is demonstrated by interesting applications to a number of well-known national and multi-national models.
'This most commendable volume brings together a set of papers which permits ready access to the means of estimating quantitative relationships using cointegration and error correction procedures. Providing the data to show fully the basis for calculation, this approach is an excellent perception of the needs of senior undergraduates and graduate students.' - Professor W.P. Hogan, The University of Sydney. Applied economists with a modest econometric background are now desperately looking for expository literature on unit root and cointegration techniques. This volume of expository essays is written for them. It explains in a simple style various tests for the existence of unit roots and how to estimate cointegration relationships. Original data are given to enable easy replication. Limitations of some existing unit root tests are also discussed.
Many models in this volume can be used in solving portfolio problems, in assessing forecasts, and in understanding the possible effects of shocks and disturbances.