This book proposes a new methodology for the selection of one (model) from among a set of alternative econometric models. Let us recall that a model is an abstract representation of reality which brings out what is relevant to a particular economic issue. An econometric model is also an analytical characterization of the joint probability distribution of some random variables of interest, which yields some information on how the actual economy works. This information will be useful only if it is accurate and precise; that is, the information must be far from ambiguous and close to what we observe in the real world. Thus, model selection should be performed on the basis of statistics which summarize the degree of accuracy and precision of each model. A model is accurate if it predicts correctly; it is precise if it produces tight confidence intervals. A first general approach to model selection includes those procedures based on both characteristics, precision and accuracy. A particularly interesting example of this approach is that of Hildebrand, Laing and Rosenthal (1980). See also Hendry and Richard (1982). A second general approach includes those procedures that use only one of the two dimensions to discriminate among models. In general, most of the tests we are going to examine correspond to this category.
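As a concrete illustration of the two dimensions this blurb distinguishes, the short sketch below fits a regression and reports one accuracy statistic (root mean squared prediction error) and one precision statistic (average confidence-interval width). It is a minimal sketch in Python using simulated data, not an implementation of the book's methodology:

    import numpy as np
    import statsmodels.api as sm

    # Simulated data for illustration; real model selection would use observed data.
    rng = np.random.default_rng(1)
    x = rng.normal(size=100)
    y = 0.5 + 1.5 * x + rng.normal(scale=1.0, size=100)

    X = sm.add_constant(x)
    fit = sm.OLS(y, X).fit()

    # Accuracy: how close the model's predictions are to what we observe.
    rmse = np.sqrt(np.mean(fit.resid ** 2))

    # Precision: how tight the model's confidence intervals are.
    ci = fit.conf_int(alpha=0.05)          # columns: lower and upper bounds
    avg_width = np.mean(ci[:, 1] - ci[:, 0])

    print(f"accuracy (RMSE): {rmse:.3f}  precision (mean CI width): {avg_width:.3f}")

Comparing such statistics across candidate models is one simple way to operationalize the accuracy/precision distinction drawn above.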
It is unlikely that any frontier of economics/econometrics is being pushed faster and further than that of computational techniques. The computer has become a tool for performing, as well as an environment in which to perform, economics and econometrics, taking over where theory bogs down and allowing at least approximate answers to questions that defy closed mathematical or analytical solutions. Tasks may now be attempted that were hitherto beyond human potential, and all the forces available can now be marshalled efficiently, leading to the achievement of desired goals. Computational Techniques for Econometrics and Economic Analysis is a collection of recent studies which exemplify all these elements, demonstrating the power that the computer brings to the economic analyst. The book is divided into four parts: 1 -- the computer and econometric methods; 2 -- the computer and economic analysis; 3 -- computational techniques for econometrics; and 4 -- the computer and econometric studies.
This book proposes a uniform logic and probabilistic (LP) approach to risk estimation and analysis in engineering and economics. It covers the methodological and theoretical basis of risk management at the design, test, and operation stages of economic, banking, and engineering systems with groups of incompatible events (GIE). This edition includes new chapters providing a detailed treatment of scenario logic and probabilistic models for revealing bribes. It also contains clear definitions and notations, revised sections and chapters, an extended list of references, and a new subject index, as well as more than a hundred illustrations and tables which motivate the presentation.
During 1985-86, the acquisition editor for the humanities and social sciences division of Kluwer Academic Publishers in the Netherlands visited the University of Florida (where I was also visiting while on sabbatical leave from Wilfrid Laurier University as the McKethan-Matherly Senior Research Fellow) to discuss publishing plans of the faculty. He expressed a keen interest in publishing the proceedings of the conference of the Canadian Econometric Study Group (CESG) that was to be held the following year at WLU. This volume is the end product of his interest, endurance, and persistence. But for his persistence I would have given up on the project. Most of the papers (though not all) included in this volume are based on presentations at CESG conferences. In some cases scholars were invited to contribute to this volume where their research complemented that presented at these conferences, even though they were not conference participants. Since papers selected for presentation at the CESG conferences are generally the finished product of scholarly research and often under submission to refereed journals, it was not possible to publish the conference proceedings in their entirety. Accordingly, it was decided, in consultation with the publisher, to invite a select list of authors to submit significant extensions of the papers they presented at the CESG conferences for inclusion in this volume. The editor wishes to express gratitude to all those authors who submitted their papers for evaluation by anonymous referees and for making revisions to conform to our editorial process.
Covers applications to risky assets traded on the markets for funds, fixed-income products and electricity derivatives.
Econometrics as an applied discipline attempts to use information in a most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and minimum discrepancy, are useful in several areas of statistical inference, e.g., Bayesian estimation, the expected maximum likelihood principle, and fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: a critical survey of the uses of information theory in economics and econometrics; an integration of applied information theory and economic efficiency analysis; the development of a new economic hypothesis relating information theory to economic growth models; and an emphasis on new lines of research.
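For readers unfamiliar with the first of these tools, the maximum entropy principle has a compact standard statement; the following is the generic textbook formulation, not the book's own notation. Among all probability distributions consistent with known moment constraints, choose the one that maximizes Shannon entropy:

\[
\max_{p_1,\dots,p_n} \; -\sum_{i=1}^{n} p_i \log p_i
\quad \text{subject to} \quad
\sum_{i=1}^{n} p_i = 1, \qquad
\sum_{i=1}^{n} p_i\, g_k(x_i) = \mu_k, \quad k = 1,\dots,K,
\]

where the g_k are the moment functions whose expectations mu_k are taken as known. The solution is the least committal distribution consistent with the data, which is what makes the principle attractive for inference.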
Gerald P. Dwyer, Jr. and R. W. Hafer. The articles and commentaries included in this volume were presented at the Federal Reserve Bank of St. Louis' thirteenth annual economic policy conference, held on October 21-22, 1988. The conference focused on the behavior of asset market prices, a topic of increasing interest to both the popular press and to academic journals as the bull market of the 1980s continued. The events that transpired during October 1987, both in the United States and abroad, provide an informative setting to test alternative theories. In assembling the papers presented during this conference, we asked the authors to explore the issue of asset pricing and financial market behavior from several vantages. Was the crash evidence of the bursting of a speculative bubble? Do we know enough about the workings of asset markets to hazard an intelligent guess why they dropped so dramatically in such a brief time? Do we know enough to propose regulatory changes that will prevent any such occurrence in the future, or do we want to even if we can? We think that the articles and commentaries contained in this volume provide significant insight to inform and to answer such questions. The article by Behzad Diba surveys existing theoretical and empirical research on rational bubbles in asset prices.
Given the magnitude of currency speculation and sports gambling, it is surprising that the literature contains mostly negative forecasting results. Majority opinion still holds that short-term fluctuations in financial markets follow a random walk. In this non-random walk through financial and sports gambling markets, parallels are drawn between modeling short-term currency movements and modeling outcomes of athletic encounters. The forecasting concepts and methodologies are identical; only the variables change names. If, in fact, these markets are driven by non-random-walk mechanisms, there must be some explanation for the negative forecasting results. The Analysis of Sports Forecasting: Modeling Parallels Between Sports Gambling and Financial Markets examines this issue.
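For reference, the random-walk hypothesis this book pushes against can be written in one line; this is the standard formulation rather than anything specific to the book:

\[
p_{t+1} = p_t + \varepsilon_t, \qquad \mathbb{E}[\varepsilon_t] = 0,
\]

where p_t is the log price (or spread) at time t and epsilon_t is an unforecastable shock. Under this hypothesis the best forecast of tomorrow's value is today's value, so any systematic forecasting edge is evidence of non-random-walk behavior.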
Modern apparatuses allow us to collect samples of functional data, mainly curves but also images. On the other hand, nonparametric statistics produces useful tools for standard data exploration. This book links these two fields of modern statistics by explaining how functional data can be studied through parameter-free statistical ideas, and it offers an original presentation of new nonparametric statistical methods for functional data analysis.
This book focuses on tools and techniques for building regression models using real-world data and assessing their validity. A key theme throughout the book is that it makes sense to base inferences or conclusions only on valid models. Plots are shown to be an important tool for both building regression models and assessing their validity. We shall see that deciding what to plot and how each plot should be interpreted will be a major challenge. In order to overcome this challenge we shall need to understand the mathematical properties of the fitted regression models and associated diagnostic procedures. As such this will be an area of focus throughout the book. In particular, we shall carefully study the properties of residuals in order to understand when patterns in residual plots provide direct information about model misspecification and when they do not. The regression output and plots that appear throughout the book have been generated using R. The output from R that appears in this book has been edited in minor ways. On the book web site you will find the R code used in each example in the text.
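As a flavour of the diagnostic this blurb describes, the sketch below fits a simple regression and plots residuals against fitted values. It is written in Python with statsmodels and matplotlib rather than the book's R, and the data are simulated purely for illustration:

    import numpy as np
    import matplotlib.pyplot as plt
    import statsmodels.api as sm

    # Simulated data for illustration only; the book works with real datasets.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 200)
    y = 1.0 + 2.0 * x + rng.normal(scale=2.0, size=200)

    # Fit ordinary least squares with an intercept.
    fit = sm.OLS(y, sm.add_constant(x)).fit()

    # Residuals vs. fitted values: a patternless band around zero is
    # consistent with a valid model; curvature suggests misspecification,
    # and a funnel shape suggests non-constant variance.
    plt.scatter(fit.fittedvalues, fit.resid, s=10)
    plt.axhline(0, linestyle="--")
    plt.xlabel("Fitted values")
    plt.ylabel("Residuals")
    plt.show()

Reading such a plot correctly, and knowing when an apparent pattern is informative, is precisely the challenge the book sets out to address.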
International Applications of Productivity and Efficiency Analysis features a complete range of techniques utilized in frontier analysis, including extensions of existing techniques and the development of new techniques. Another feature is that most of the contributions use panel data in a variety of approaches. Finally, the range of empirical applications is at least as great as the range of techniques, and many of the applications are of considerable policy relevance.
Are foreign exchange markets efficient? Are fundamentals important for predicting exchange rate movements? What is the signal-to-noise ratio of high-frequency exchange rate changes? Is it possible to define a measure of the equilibrium exchange rate that is useful from an assessment perspective? The book is a selective survey of current thinking on key topics in exchange rate economics, supplemented throughout by new empirical evidence. The focus is on the use of advanced econometric tools to find answers to these and other questions which are important to practitioners, policy-makers and academic economists. In addition, the book addresses more technical econometric considerations such as the importance of the choice between single-equation and system-wide approaches to modelling the exchange rate, and the reduced-form versus structural equation problems. Readers will gain both a comprehensive overview of the way macroeconomists approach exchange rate modelling, and an understanding of how advanced techniques can help them explain and predict the behavior of this crucial economic variable.
Standard macroeconomic monographs often discuss the mechanism of monetary transmission, usually ending by highlighting the complexities and uncertainties involved in this mechanism. Conversely, The Preparation of Monetary Policy takes these uncertainties as a starting point, analytically investigating their nature and spelling out their consequences for the monetary policy maker. The second innovative aspect of this book is its focus on policy preparation instead of well-covered topics such as monetary policy strategy, tactics, and implementation. Thirdly, a general, multi-model framework for preparing monetary policy is proposed, which is illustrated by case studies stressing the role of international economic linkages and of expectations. Written in a self-contained fashion, these case studies are of interest by themselves. The book is written for an audience that is interested in the art and science of monetary policy making, which includes central bankers, academics, and (graduate) students in the field of monetary economics, macroeconomics, international economics and finance.
This monograph examines the domain of classical political economy using the methodologies developed in recent years both by the new discipline of econophysics and by computing science. This approach is used to re-examine the classical subdivisions of political economy: production, exchange, distribution and finance. The book begins by examining the most basic feature of economic life - production - and asks what it is about physical laws that allows production to take place. How is it that human labour is able to modify the world? It looks at the role that information has played in the process of mass production and the extent to which human labour still remains a key resource. The Ricardian labour theory of value is re-examined in the light of econophysics, presenting agent-based models in which the Ricardian theory of value appears as an emergent property. The authors present models giving rise to the class distribution of income and the long-term evolution of profit rates in market economies. Money is analysed using tools drawn both from computer science and the recent Chartalist school of financial theory. Covering a combination of techniques drawn from three areas - classical political economy, theoretical computer science and econophysics - to produce models that deepen our understanding of economic reality, this new title will be of interest to higher-level doctoral and research students, as well as scientists working in the field of econophysics.
The search for symmetry is part of the fundamental scientific paradigm in mathematics and physics. Can this be valid also for economics? This book represents an attempt to explore this possibility. The behavior of price-taking producers, monopolists, monopsonists, sectoral market equilibria, behavior under risk and uncertainty, and two-person zero- and non-zero-sum games are analyzed and discussed under the unifying structure called the linear complementarity problem. Furthermore, the equilibrium problem allows for the relaxation of often-stated but unnecessary assumptions. This unifying approach offers the advantage of a better understanding of the structure of economic models. It also introduces the simplest and most elegant algorithm for solving a wide class of problems.
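For readers unfamiliar with the unifying structure named here, the linear complementarity problem has a compact standard definition (this is the generic textbook form, not this book's notation): given a square matrix M and a vector q, find z such that

\[
z \ge 0, \qquad w = Mz + q \ge 0, \qquad z^{\top} w = 0.
\]

The complementarity condition forces, component by component, either the slack w_i or the variable z_i to be zero, which is exactly how equilibrium conditions of the form "positive output implies price equals marginal cost" are encoded.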
Franz Ferschl is seventy. According to his birth certificate this is true, but it is hard to believe. Two of the three editors remember very well the Golden Age of Operations Research at Bonn, when Franz Ferschl worked together with Wilhelm Krelle, Martin Beckmann and Horst Albach. The importance of this fruitful cooperation is reflected by the fact that half of the contributors to this book were strongly influenced by Franz Ferschl and his colleagues at the University of Bonn. Clearly, Franz Ferschl left his traces at all the other places of his professional activities, in Vienna and Munich. This is demonstrated by the present volume as well. Born in 1929 in the Upper-Austrian Mühlviertel, his scientific education brought him to Vienna, where he studied mathematics. In his early years he was attracted by statistics and operations research. During his employment at the Österreichische Bundeskammer für Gewerbliche Wirtschaft in Vienna he prepared his famous book on queueing theory and stochastic processes in economics. This work was achieved during the scarce time left by his duties at the Bundeskammer, mostly between 6 a.m. and midnight. All those troubles were, however, soon rewarded by the chair of statistics at Bonn University. As a real Austrian, the amenities of the Rhineland could not prevent him from returning to Vienna, where he took the chair of statistics.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about the vagueness of expectations appears to be adequate to real situations, in which coalitional bargaining anticipates a proper realization of the game with strategic behaviour of players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision-making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set theoretical approaches to game-theoretical models of human behaviour.
In this book leading German econometricians in different fields present survey articles of the most important new methods in econometrics. The book gives an overview of the field, showing the progress made in recent years as well as the problems that remain.
Following the recent publication of the award-winning and much acclaimed "The New Palgrave Dictionary of Economics," second edition, which brings together Nobel Prize winners and the brightest young scholars to survey the discipline, we are pleased to announce "The New Palgrave Economics Collection." Due to demand from the economics community, these books address key subject areas within the field. Each title is comprised of specially selected articles from the Dictionary and covers a fundamental theme within the discipline. All of the articles have been specifically chosen by the editors of the Dictionary, Steven N. Durlauf and Lawrence E. Blume, and are written by leading practitioners in the field. The Collections provide the reader with easy-to-access information on complex and important subject areas, and allow individual scholars and students to have their own personal reference copy.
Spatial Microeconometrics introduces the reader to the basic concepts of spatial statistics, spatial econometrics and the spatial behavior of economic agents at the microeconomic level. Incorporating useful examples and presenting real data and datasets on real firms, the book takes the reader through the key topics in a systematic way. The book outlines what distinguishes data representing a set of interacting individuals from the data assumed in traditional econometrics, which treats locational choices as exogenous and economic behavior as independent. In particular, the authors address the consequences of neglecting such important sources of information for statistical inference, and show how to improve model predictive performance. The book presents the theory, clarifies the concepts and instructs the readers on how to perform their own analyses, describing in detail the code needed when using the statistical language R. The book is written by leading figures in the field and is completely up to date with the very latest research. It will be invaluable for graduate students and researchers in economic geography, regional science, spatial econometrics, spatial statistics and urban economics.
The availability of microdata has increased rapidly over the last decades, and standard statistical and econometric software packages for data analysis include ever more sophisticated modeling options. The goal of this book, now in its second edition, is to familiarize the reader with a wide range of commonly used models, and thereby to enable her/him to become a critical consumer of current empirical research and to properly conduct her/his own empirical analyses. The book can be used as a textbook for an advanced undergraduate, a Master's or a first-year Ph.D. course on the topic of microdata analysis. In economics and related disciplines, such a course is typically offered after a first course on the linear regression model. Alternatively, the book can also serve as a supplementary text to applied field courses, such as those dealing with empirical analyses in labor, health or education. Finally, it might provide a useful reference for graduate students, researchers and practitioners who encounter microdata in their work. The focus of the book is on regression-type models in the context of large cross-section samples where the dependent variable is qualitative or discrete, or where the sample is not randomly drawn from the population of interest, due to censoring or truncation of the dependent variable. While our background is in economics, and we occasionally refer to problems and applications from empirical economics, the models discussed in this book should be equally relevant wherever microdata are used, inside the social sciences, including for example quantitative political science and sociology, as well as outside.
This book presents in detail methodologies for the Bayesian estimation of single-regime and regime-switching GARCH models. These models are widespread and essential tools in financial econometrics and have, until recently, mainly been estimated using the classical Maximum Likelihood technique. As this study aims to demonstrate, the Bayesian approach offers an attractive alternative which enables small sample results, robust estimation, model discrimination and probabilistic statements on nonlinear functions of the model parameters. The author is indebted to numerous individuals for help in the preparation of this study. Primarily, I owe a great debt to Prof. Dr. Philippe J. Deschamps who inspired me to study Bayesian econometrics, suggested the subject, guided me under his supervision and encouraged my research. I would also like to thank Prof. Dr. Martin Wallmeier and my colleagues of the Department of Quantitative Economics, in particular Michael Beer, Roberto Cerratti and Gilles Kaltenrieder, for their useful comments and discussions. I am very indebted to my friends Carlos Ordás Criado, Julien A. Straubhaar, Jérôme Ph. A. Taillard and Mathieu Vuilleumier, for their support in the fields of economics, mathematics and statistics. Thanks also to my friend Kevin Barnes who helped with my English in this work. Finally, I am greatly indebted to my parents and grandparents for their support and encouragement while I was struggling with the writing of this text.
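For orientation, the single-regime GARCH(1,1) model at the heart of this material has the standard generic form (the textbook statement, not the author's notation):

\[
y_t = \sigma_t \varepsilon_t, \qquad
\sigma_t^2 = \omega + \alpha\, y_{t-1}^2 + \beta\, \sigma_{t-1}^2,
\]

with omega > 0, alpha, beta >= 0 and epsilon_t an i.i.d. standardized shock. Bayesian estimation places a prior on (omega, alpha, beta) and summarizes the posterior, typically via MCMC, which is what makes exact small-sample inference and probabilistic statements about nonlinear functions of the parameters feasible.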
This book is a review of the analytical methods required in most of the quantitative courses taught at MBA programs. Students with no technical background, or who have not studied mathematics since college or even earlier, may easily feel overwhelmed by the mathematical formalism that is typical of economics and finance courses. These students will benefit from a concise and focused review of the analytical tools that will become a necessary skill in their MBA classes. The objective of this book is to present the essential quantitative concepts and methods in a self-contained, non-technical, and intuitive way.
This edition contains a large number of additions and corrections scattered throughout the text, including the incorporation of a new chapter on state-space models. The companion diskette for the IBM PC has expanded into the software package ITSM: An Interactive Time Series Modelling Package for the PC, which includes a manual and can be ordered from Springer-Verlag.* We are indebted to many readers who have used the book and programs and made suggestions for improvements. Unfortunately there is not enough space to acknowledge all who have contributed in this way; however, special mention must be made of our prize-winning fault-finders, Sid Resnick and F. Pukelsheim. Special mention should also be made of Anthony Brockwell, whose advice and support on computing matters was invaluable in the preparation of the new diskettes. We have been fortunate to work on the new edition in the excellent environments provided by the University of Melbourne and Colorado State University. We thank Duane Boes particularly for his support and encouragement throughout, and the Australian Research Council and National Science Foundation for their support of research related to the new material. We are also indebted to Springer-Verlag for their constant support and assistance in preparing the second edition. Fort Collins, Colorado, November 1990. P. J. Brockwell and R. A. Davis. * ITSM: An Interactive Time Series Modelling Package for the PC by P. J. Brockwell and R. A. Davis. ISBN: 0-387-97482-2; 1991.
Decision-theoretic ideas can structure the process of inference together with the decision-making that inference supports. Statistical decision theory is the sub-discipline of statistics which explores and develops this structure. Typically, discussion of decision theory within one discipline does not recognise that other disciplines may have considered the same or similar problems. This text, Volume 9 in the prestigious Kendall's Library of Statistics, provides an overview of the main ideas and concepts of statistical decision theory and sets it within the broader context of decision theory, decision analysis and decision support as they are practised in many disciplines beyond statistics, including artificial intelligence, economics, operational research, philosophy and psychology.
You may like...
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover) R3,286 (Discovery Miles 32 860)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover) R3,567 (Discovery Miles 35 670)
Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover) R2,160 (Discovery Miles 21 600)
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover) R2,970 (Discovery Miles 29 700)