Welcome to Loot.co.za!
The idea that simplicity matters in science is as old as science itself, as the much-cited example of Ockham's Razor attests: 'entia non sunt multiplicanda praeter necessitatem' (entities are not to be multiplied beyond necessity). A problem with Ockham's Razor is that nearly everybody seems to accept it, yet few are able to define its exact meaning or to make it operational in a non-arbitrary way. Drawing on a multidisciplinary group of contributors, including philosophers, mathematicians, econometricians and economists, this 2002 monograph examines simplicity by asking six questions: What is meant by simplicity? How is simplicity measured? Is there an optimum trade-off between simplicity and goodness of fit? What is the relation between simplicity and empirical modelling? What is the relation between simplicity and prediction? What is the connection between simplicity and convenience? The book concludes with reflections on simplicity by Nobel Laureates in Economics.
First published in 2004, this is a rigorous but user-friendly book on the application of stochastic control theory to economics. A distinctive feature of the book is that mathematical concepts are introduced in a language and terminology familiar to graduate students of economics. The standard topics of many mathematics, economics and finance books are illustrated with real examples documented in the economic literature. Moreover, the book emphasises the dos and don'ts of stochastic calculus, cautioning the reader that certain results and intuitions cherished by many economists do not extend to stochastic models. A special chapter (Chapter 5) is devoted to exploring various methods of finding a closed-form representation of the value function of a stochastic control problem, which is essential for ascertaining the optimal policy functions. The book also includes many practice exercises for the reader. Notes and suggested readings are provided at the end of each chapter for more references and possible extensions.
This book proposes new methods of detecting causality among several dynamic variables and of estimating divisions of nominal income changes into changes in output and prices. Amano builds on established traditions of macro-dynamics and the theories of Keynes and Friedman, while providing innovative perspectives and important policy implications.
This book develops the major themes of time series analysis from its formal beginnings in the early part of the 20th century to the present day through the research of six distinguished British statisticians, all of whose work is characterised by the British traits of pragmatism and the desire to solve practical problems of importance.
This book offers an overview of the research conducted to date in the field of wine economics. The contributions have in common the use of econometric techniques and mathematical formalization to describe the new challenges of this economic sector.
Monetary Policy and the Economy in South Africa covers both modern theories and empirical analysis, linking monetary policy to housing wealth, the drivers of the current account under the asset approach, the expenditure-switching and income-absorption effects of monetary policy on the trade balance, the effects of inflation uncertainty on output growth, and international spillovers. Each chapter uses data and relevant methodology to answer empirical and pertinent policy questions in South Africa. The book gives new insights into these areas of economic policy in South Africa and in emerging markets more widely.
Observers and Macroeconomic Systems is concerned with the computational aspects of using a control-theoretic approach to the analysis of dynamic macroeconomic systems. The focus is on using a separate model for the development of the control policies. In particular, it uses the observer-based approach whereby the separate model learns to behave in a similar manner to the economic system through output-injections. The book shows how this approach can be used to learn the forward-looking behaviour of economic actors which is a distinguishing feature of dynamic macroeconomic models. It also shows how it can be used in conjunction with low-order models to undertake policy analysis with a large practical econometric model. This overcomes some of the computational problems arising from using just the large econometric models to compute optimal policy trajectories. The work also develops visual simulation software tools that can be used for policy analysis with dynamic macroeconomic systems.
Each chapter of Macroeconometrics is written by respected econometricians with the aim of providing useful information and perspectives for those who wish to apply econometrics in macroeconomics. The chapters are all written from clear methodological perspectives, making the virtues and limitations of particular econometric approaches accessible to a general readership familiar with applied macroeconomics. The real tensions in macroeconometrics are revealed by the critical comments, contributed by econometricians with alternative perspectives, that follow each chapter.
This book covers recent advances in efficiency evaluations, most notably Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) methods. It introduces the underlying theories, shows how to make the relevant calculations and discusses applications. The aim is to make the reader aware of the pros and cons of the different methods and to show how to use these methods in both standard and non-standard cases. Several software packages have been developed to solve some of the most common DEA and SFA models. This book relies on R, a free, open-source software environment for statistical computing and graphics. This enables the reader to solve not only standard problems, but also many other problem variants. Using R, one can focus on understanding the context and developing a good model. One is not restricted to predefined model variants and a one-size-fits-all approach. To facilitate the use of R, the authors have developed an R package called Benchmarking, which implements the main methods within both DEA and SFA. The book uses mathematical formulations of models and assumptions, but it de-emphasizes formal proofs, in part by placing them in appendices or by referring to the original sources. Moreover, the book emphasizes the usage of the theories and the interpretations of the mathematical formulations. It includes a series of small examples, graphical illustrations, simple extensions and questions to think about. It also combines the formal models with less formal economic and organizational thinking. Last but not least, it discusses some larger applications with significant practical impacts, including the design of benchmarking-based regulations of energy companies in different European countries, and the development of merger control programs for competition authorities.
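As a minimal illustration of the kind of DEA calculation described above (a sketch only, written in Python with scipy rather than the authors' R package Benchmarking, and using made-up toy data), the input-oriented CCR efficiency score of one unit is the optimum of a small linear program: shrink the unit's inputs by a factor theta while a convex combination of peer units still produces at least its outputs.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns) efficiency score of unit o.

    X: (n_units, n_inputs) input matrix, Y: (n_units, n_outputs) output matrix.
    Solves: min theta  s.t.  sum_j lam_j x_ij <= theta * x_io  (each input i),
                             sum_j lam_j y_rj >= y_ro          (each output r),
                             lam_j >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    # Decision vector z = [theta, lam_1, ..., lam_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    # Input rows:  -theta * x_io + sum_j lam_j x_ij <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Output rows: -sum_j lam_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Toy data: one input, one output. Units 0 and 1 lie on the frontier
# (output/input = 1); unit 2 uses twice the input its output requires.
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[2.0], [4.0], [4.0]])
```

With these data, `dea_efficiency(X, Y, 0)` returns a score of 1 (frontier unit) and `dea_efficiency(X, Y, 2)` a score of 0.5, i.e. unit 2 could in principle produce its output with half its input.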
This volume seeks to go beyond the microeconomic view of wages as a cost having negative consequences on a given firm, to consider the positive macroeconomic dynamics associated with wages as a major component of aggregate demand.
This book presents an exciting new set of econometric methods. They have been developed as a result of the increase in power and affordability of computers which allow simulations to be run. The authors have played a large role in developing the techniques.
Modelling and Forecasting Financial Data brings together a coherent and accessible set of chapters on recent research results on this topic. To make such methods readily useful in practice, the contributors to this volume have agreed to make available to readers upon request all computer programs used to implement the methods discussed in their respective chapters. Modelling and Forecasting Financial Data is a valuable resource for researchers and graduate students studying complex systems in finance, biology, and physics, as well as those applying such methods to nonlinear time series analysis and signal processing.
The modern system-wide approach to applied demand analysis emphasizes a unity between theory and applications. Its firm foundations in economic theory make it one of the most successful areas of applied econometrics. A System-Wide Analysis of International Consumption Patterns presents a large number of applications of recent innovations in the area and uses consumption data for 18 OECD countries to provide convincing evidence, one way or the other, about the validity of consumption theory. The empirical results presented in the book have a number of uses. Reliable estimates of income and price elasticities of demand are provided for 10 commodity groups in 18 countries. A feature of these results is that a number of major empirical regularities are identified that seem to hold across different periods and different countries. A System-Wide Analysis of International Consumption Patterns also presents an extensive application of recently developed Monte Carlo testing procedures - to test demand theory and the structure of preferences. The results so obtained are in stark contrast to most previous findings based on the conventional asymptotic tests. Other results presented in the book include: (i) Differences in economic variables (prices and incomes in particular) account for observed differences in consumption patterns internationally, while differences in tastes seem to play a much smaller role. (ii) Own-price elasticities are approximately proportional to the corresponding income elasticities, a result coinciding with Pigou's law. (iii) The income elasticity of the marginal utility of income does not seem to depend on income, which contradicts Frisch's famous conjecture.
Unlike uncertain dynamical systems in physical sciences where models for prediction are somewhat given to us by physical laws, uncertain dynamical systems in economics need statistical models. In this context, modeling and optimization surface as basic ingredients for fruitful applications. This volume concentrates on the current methodology of copulas and maximum entropy optimization. This volume contains main research presentations at the Sixth International Conference of the Thailand Econometrics Society held at the Faculty of Economics, Chiang Mai University, Thailand, during January 10-11, 2013. It consists of keynote addresses, theoretical and applied contributions. These contributions to Econometrics are somewhat centered around the theme of Copulas and Maximum Entropy Econometrics. The method of copulas is applied to a variety of economic problems where multivariate model building and correlation analysis are needed. As for the art of choosing copulas in practical problems, the principle of maximum entropy surfaces as a potential way to do so. The state-of-the-art of Maximum Entropy Econometrics is presented in the first keynote address, while the second keynote address focusses on testing stationarity in economic time series data.
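The copula construction referred to above can be sketched in a few lines: draw correlated standard normals, push each coordinate through the normal CDF to obtain dependent uniforms, then apply arbitrary marginal quantile functions. This is a generic Gaussian-copula illustration, not code from the conference volume; the correlation value and the margins are arbitrary choices for demonstration.

```python
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(42)
rho = 0.8                                  # illustrative dependence parameter
cov = np.array([[1.0, rho], [rho, 1.0]])

# Step 1: correlated standard normals.
z = rng.multivariate_normal(np.zeros(2), cov, size=10_000)
# Step 2: the normal CDF maps each margin to Uniform(0,1);
# the dependence structure (the copula) survives the transform.
u = norm.cdf(z)
# Step 3: impose arbitrary margins via quantile (inverse-CDF) transforms.
x = expon.ppf(u[:, 0], scale=2.0)          # exponential margin
y = norm.ppf(u[:, 1], loc=0.0, scale=1.0)  # standard normal margin
```

The pair (x, y) then has exponential and normal margins respectively, yet inherits the dependence of the Gaussian copula with parameter rho - precisely the separation of margins from dependence that makes copulas useful in multivariate model building.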
These proceedings, from a conference held at the Federal Reserve Bank of St. Louis on October 17-18, 1991, attempted to lay out what we currently know about aggregate economic fluctuations. Identifying what we know inevitably reveals what we do not know about such fluctuations as well. From the vantage point of where the conference's participants view our current understanding to be, these proceedings can be seen as suggesting an agenda for further research. The conference was divided into five sections. It began with the formulation of an empirical definition of the "business cycle" and a recitation of the stylized facts that must be explained by any theory that purports to capture the business cycle's essence. After outlining the historical development and key features of the current "theories" of business cycles, the conference evaluated these theories on the basis of their ability to explain the facts. Included in this evaluation was a discussion of whether (and how) the competing theories could be distinguished empirically. The conference then examined the implications for policy of what is known and not known about business cycles. A panel discussion closed the conference, highlighting important unresolved theoretical and empirical issues that should be taken up in future business cycle research. What is a business cycle? Before gaining a genuine understanding of business cycles, economists must agree and be clear about what they mean when they refer to the cycle.
Cost Structure and the Measurement of Economic Performance is designed to provide a comprehensive guide for students, researchers or consultants who wish to model, construct, interpret, and use economic performance measures. The topical emphasis is on productivity growth and its dependence on the cost structure. The methodological focus is on application of the tools of economic analysis - the 'thinking structure' provided by microeconomic theory - to measure technological or cost structure, and link it with market and regulatory structure. This provides a rich basis for evaluation of economic performance and its determinants. The format of the book stresses topics or questions of interest rather than the theoretical tools for analysis. Traditional productivity growth modeling and measurement practices that result in a productivity residual often called the 'measure of our ignorance' are initially overviewed, and then the different aspects of technological, market and regulatory structure that might underlie this residual are explored. The ultimate goal is to decompose or explain the residual, by modeling and measuring a multitude of impacts that determine the economic performance of firms, sectors, and economies. The chapters are organized with three broad goals in mind. The first is to introduce the overall ideas involved in economic performance measurement and traditional productivity growth analysis. Issues associated with different types of (short and long run, internal and external) cost economies, market and regulatory impacts, and other general cost efficiencies that might impact these measures are then explored. Finally, some of the theoretical, data construction and econometric tools necessary to justify and implement these models are emphasized.
Nonlinear Time Series Analysis of Economic and Financial Data provides an examination of the flourishing interest that has developed in this area over the past decade. The constant theme throughout this work is that standard linear time series tools leave unexamined and unexploited economically significant features in frequently used data sets. The book comprises original contributions written by specialists in the field, and offers a combination of both applied and methodological papers. It will be useful to both seasoned veterans of nonlinear time series analysis and those searching for an informative panoramic look at front-line developments in the area.
How successful is PPP, and its extension in the monetary model, as a measure of the equilibrium exchange rate? What are the determinants and dynamics of equilibrium real exchange rates? How can misalignments be measured, and what are their causes? What are the effects of specific policies upon the equilibrium exchange rate? The answers to these questions are important to academic theorists, policymakers, international bankers and investment fund managers. This volume encompasses all of the competing views of equilibrium exchange rate determination, from PPP, through other reduced form models, to the macroeconomic balance approach. This volume is essentially empirical: what do we know about exchange rates? The different econometric and theoretical approaches taken by the various authors in this volume lead to mutually consistent conclusions. This consistency gives us confidence that significant progress has been made in understanding what are the fundamental determinants of exchange rates and what are the forces operating to bring them back in line with the fundamentals.
Studies in Consumer Demand - Econometric Methods Applied to Market Data contains eight previously unpublished studies of consumer demand. Each study stands on its own as a complete econometric analysis of demand for a well-defined consumer product. The econometric methods range from simple regression techniques applied in the first four chapters, to the use of logit and multinomial logit models used in chapters 5 and 6, to the use of nested logit models in chapters 6 and 7, and finally to the discrete/continuous modeling methods used in chapter 8. Emphasis is on applications rather than econometric theory. In each case, enough detail is provided for the reader to understand the purpose of the analysis, the availability and suitability of data, and the econometric approach to measuring demand.
Aspects of Robust Statistics are important in many areas. Based on the International Conference on Robust Statistics 2001 (ICORS 2001) in Vorau, Austria, this volume discusses future directions of the discipline, bringing together leading scientists, experienced researchers and practitioners, as well as younger researchers. The papers cover a multitude of different aspects of Robust Statistics. For instance, the fundamental problem of data summary (weights of evidence) is considered and its robustness properties are studied. Further theoretical subjects include robust methods for skewness, time series, longitudinal data, multivariate methods, and tests. Some papers deal with computational aspects and algorithms. Finally, the aspects of application and programming tools complete the volume.
These three volumes contain an account of Professor Henri Theil's distinguished career as a leader, advisor, administrator, teacher, and researcher in economics and econometrics. The books also contain a selection of his contributions in many areas, such as econometrics, demand analysis, information theory, forecasting, statistics, economic policy analysis and management science. To date he has contributed over 250 articles in refereed journals and chapters in books, and 15 books, three of which became citation classics. His books and articles have appeared in (and have been translated into) many languages, such as Polish, Russian, Dutch, English, French, German, Hungarian, Italian and Japanese. This collection provides excellent reference material to researchers and graduate students working in a variety of disciplines, such as econometrics, economics, management science, operations research, and statistics. Moreover, Professor Theil's career serves as a role model for younger generations of scholars, both in terms of his approach to research and his commitment to his profession. Professor Theil's distinguished career as an academic began in 1953 when he was appointed Professor of Econometrics at the Netherlands School of Economics in Rotterdam (now Erasmus University). Three years later he founded the Econometric Institute in Rotterdam and served as its first director until 1966, when he accepted a joint appointment at the Graduate School of Business and Department of Economics, University of Chicago, U.S.A. In 1981, Theil was appointed to the McKethan-Matherly Eminent Chair at the Graduate School of Business Administration of the University of Florida in Gainesville. Theil has received many international honours including four honorary degrees.
Models, Methods, Concepts and Applications of the Analytic Hierarchy Process is a volume dedicated to selected applications of the Analytic Hierarchy Process (AHP) focused on three themes: economics, the social sciences, and the linking of measurement with human values. (1) The AHP offers economists a substantially different approach to dealing with economic problems through ratio scales. The main mathematical models on which economics has based its quantitative thinking up to now are utility theory, which uses interval scales, and linear programming. We hope that the variety of examples included here can perhaps stimulate researchers in economics to try applying this new approach. (2) The second theme is concerned with the social sciences. The AHP offers psychologists and political scientists the methodology to quantify and derive measurements for intangibles. We hope that the examples included in this book will encourage them to examine the methods of AHP in terms of the problems they seek to solve. (3) The third theme is concerned with providing people in the physical and engineering sciences with a quantitative method to link hard measurement to human values. In such a process one needs to interpret what the measurements mean. A number is useless until someone understands what it means. It can have different meanings in different problems. Ten dollars are plenty to satisfy one's hunger but are useless by themselves in buying a new car. Such measurements are only indicators of the state of a system, but do not relate to the values of the human observers of that system. AHP methods can help resolve the conflicts between hard measurement data and human values.
The twelve papers in this collection grew out of the workshop on "Economic Evolution, Learning, and Complexity" held at the University of Augsburg, Augsburg, Germany on May 23-25, 1997. The Augsburg workshop was the second of two events in the Euroconference Series on Evolutionary Economics, the first of which was held in Athens, Greece in September 1993. A special issue of the Journal of Evolutionary Economics (1993(4)) edited by Yannis Katsoulacos on "Evolutionary and Neoclassical Perspectives on Market Structure and Economic Growth" contains selected papers from the Athens conference. The Athens conference explored neoclassical and evolutionary perspectives on technological competition and increasing returns. It helped to identify the distinguishing features of evolutionary scholarship. The Augsburg workshop was more oriented toward exploring methodological issues in evolutionary and related scholarship. A number of the papers employed new methods, such as genetic programming and experimental analysis, some developed new econometric techniques or raised new empirical issues in evolutionary economics, and some relied on simulation techniques. Twelve papers covering a range of areas were selected for this collection. The papers address central issues in evolutionary and Schumpeterian accounts of industrial competition, learning, and innovation.
Stochastic Volatility in Financial Markets presents advanced topics in financial econometrics and theoretical finance, and is divided into three main parts. The first part aims at documenting an empirical regularity of financial price changes: the occurrence of sudden and persistent changes in financial market volatility. This phenomenon, technically termed 'stochastic volatility' or 'conditional heteroskedasticity', has been well known for at least 20 years; in this part, further useful theoretical properties of conditionally heteroskedastic models are uncovered. The second part goes beyond the statistical aspects of stochastic volatility models: it constructs and uses new fully articulated, theoretically sound financial asset pricing models that allow for the presence of conditional heteroskedasticity. The third part shows how the inclusion of the statistical aspects of stochastic volatility in a rigorous economic scheme can be approached from an empirical standpoint.
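The empirical regularity the first part documents - sudden, persistent swings in volatility - can be reproduced with a few lines of simulation. The sketch below uses a standard GARCH(1,1) recursion, a common workhorse model of conditional heteroskedasticity; the parameter values are illustrative and are not taken from the book.

```python
import numpy as np

def simulate_garch11(n, omega=0.05, alpha=0.10, beta=0.85, seed=0):
    """Simulate n returns eps_t = sigma_t * z_t with z_t ~ N(0,1) and
    sigma_t^2 = omega + alpha * eps_{t-1}^2 + beta * sigma_{t-1}^2.

    Requires alpha + beta < 1, so the unconditional variance
    omega / (1 - alpha - beta) exists (here 0.05 / 0.05 = 1).
    """
    rng = np.random.default_rng(seed)
    eps = np.empty(n)
    sigma2 = np.empty(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # start at unconditional variance
    for t in range(n):
        if t > 0:
            # Large shocks raise next period's variance -> volatility clusters.
            sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return eps, sigma2
```

The simulated returns are serially uncorrelated, yet their squares are positively autocorrelated - exactly the clustering signature that distinguishes conditionally heteroskedastic series from i.i.d. noise.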
Quantitative Methods for Portfolio Analysis provides practical models and methods for the quantitative analysis of financial asset prices, construction of various portfolios, and computer-assisted trading systems. In particular, this book is required reading for: (1) 'quants' (quantitatively-inclined analysts) in financial industries; (2) financial engineers in investment banks, securities companies, derivative-trading companies, software houses, etc., who are developing portfolio trading systems; (3) graduate students and specialists in the areas of finance, business, economics, statistics, financial engineering; and (4) investors who are interested in Japanese financial markets. Throughout the book the emphasis is placed on the originality and usefulness of models and methods for the construction of portfolios and investment decision making, and examples are provided to demonstrate, with practical analysis, models for Japanese financial markets.