This volume contains a selection of papers presented at the first conference of the Society for Computational Economics, held at the ICC Institute, Austin, Texas, May 21-24, 1995. Twenty-two papers are included, devoted to applications of computational methods for the empirical analysis of economic and financial systems; the development of computing methodology, including software, related to economics and finance; and the overall impact of developments in computing. The contributions reflect the growing interest in the topic, driven by the increased availability of computational concepts and tools and the necessity of analyzing complex decision problems. The papers are divided into four sections: computational methods in econometrics, computational methods in finance, computational methods for a social environment, and new computational methods.
When von Neumann's and Morgenstern's Theory of Games and Economic Behavior appeared in 1944, it seemed that a complete theory of strategic social behavior had appeared out of nowhere. However, game theory has, to this very day, remained a fast-growing assemblage of models which have gradually been united in a new social theory - a theory that is far from being completed even after recent advances in game theory, as evidenced by the work of the three Nobel Prize winners, John F. Nash, John C. Harsanyi, and Reinhard Selten. Two of them, Harsanyi and Selten, have contributed important articles to the present volume. This book leaves no doubt that the game-theoretical models are on the right track to becoming a respectable new theory, just as the great theories of the twentieth century originated from formerly separate models which merged in the course of decades. For social scientists, the age of great discoveries is not over. The recent advances of today's game theory surpass by far the results of traditional game theory. For example, modern game theory has a new empirical and social foundation, namely, societal experiences; this has changed its methods, its "rationality." Morgenstern (I worked together with him for four years) dreamed of an encompassing theory of social behavior. With the inclusion of the concept of evolution in mathematical form, this dream will come true. Perhaps the new foundation will even lead to a new name, "conflict theory" instead of "game theory."
Scientific visualization may be defined as the transformation of numerical scientific data into informative graphical displays. The text introduces a nonverbal model to subdisciplines that until now have mostly employed mathematical or verbal-conceptual models. The focus is on how scientific visualization can help revolutionize the manner in which the tendencies for (dis)similar numerical values to cluster together in location on a map are explored and analyzed. In doing so, the concept known as spatial autocorrelation - which characterizes these tendencies - is further demystified.
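The clustering tendency the blurb describes is conventionally summarized by Moran's I statistic, which is positive when similar values sit near each other. A minimal pure-Python sketch on hypothetical toy data (four cells on a line with rook adjacency; not from the book):

```python
def morans_i(values, weights):
    """Moran's I for values x_i under a spatial weights matrix w_ij."""
    n = len(values)
    mean = sum(values) / n
    dev = [x - mean for x in values]
    w_total = sum(weights[i][j] for i in range(n) for j in range(n))
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_total) * (num / den)

# Four cells in a row; neighbors share an edge. Values increase smoothly,
# so similar values cluster and I should be positive.
values = [1.0, 2.0, 3.0, 4.0]
weights = [[0, 1, 0, 0],
           [1, 0, 1, 0],
           [0, 1, 0, 1],
           [0, 0, 1, 0]]
print(morans_i(values, weights))  # 1/3: positive spatial autocorrelation
```

A checkerboard pattern of highs and lows on the same weights would instead drive I negative, which is the (dis)similarity contrast the text alludes to.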
Recent economic history suggests that a key element in economic growth and development for many countries has been an aggressive export policy and a complementary import policy. Such policies can be very effective provided that resources are used wisely to encourage exports from industries that can be competitive in the international arena. Also, import protection must be used carefully so that it encourages infant industries instead of providing rents to industries that are not competitive. Policy makers may use a variety of methods of analysis in planning trade policy. As computing power has grown in recent years, increasing attention has been given to economic models as one of the most powerful aids to policy making. These models can be used on the one hand to help in selecting export industries to encourage and infant industries to protect, and on the other hand to chart the larger effects of trade policy on the entire economy. While many models have been developed in recent years, there has not been any analysis of the strengths and weaknesses of the various types of models. Therefore, this monograph provides a review and analysis of the models which can be used to analyze dynamic comparative advantage.
The field of econometrics has gone through remarkable changes during the last thirty-five years. Widening its earlier focus on testing macroeconomic theories, it has become a rather comprehensive discipline concerned with the development of statistical methods and their application to the whole spectrum of economic data. This development becomes apparent when looking at the biography of an econometrician whose illustrious research and teaching career started about thirty-five years ago and who will retire very soon after his 65th birthday. This is Gerd Hansen, professor of econometrics at the Christian Albrechts University at Kiel, to whom this volume with contributions from colleagues and students has been dedicated. He has shaped the econometric landscape in and beyond Germany throughout these thirty-five years. At the end of the 1960s he developed one of the first econometric models for the German economy, which adhered closely to the traditions put forth by the Cowles commission.
Data envelopment analysis develops a set of nonparametric and semiparametric techniques for measuring economic efficiency among firms and nonprofit organizations. Over the past decade this technique has found its most widespread applications in public sector organizations. However, these applications have been mostly static. This monograph extends this static framework of efficiency analysis in several new directions. These include but are not limited to the following: (1) a dynamic view of the production and cost frontier, where capital inputs are treated differently from the current inputs, (2) a direct role for technological progress and regress, which is so often stressed in total factor productivity discussions in modern growth theory in economics, (3) stochastic efficiency in a dynamic setting, where reliability improvement competes with technical efficiency, (4) flexible manufacturing systems, where flexibility of the production process and the economies of scope play an important role in efficiency analysis, and (5) the role of economic factors such as externalities and input interdependences. Efficiency is viewed here in the framework of a general systems theory model. Such a view is intended to broaden the scope of applications of this promising new technique of data envelopment analysis. The monograph stresses the various applied aspects of the dynamic theory, so that it can be empirically implemented in different situations. As far as possible, abstract mathematical treatments are avoided and emphasis is placed on statistical examples and empirical illustrations.
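A full DEA computation solves one linear program per decision-making unit, but in the degenerate one-input, one-output case the CCR efficiency score reduces to each unit's productivity ratio relative to the best observed ratio. A toy sketch of that special case, with entirely hypothetical units and numbers:

```python
def ratio_efficiency(inputs, outputs):
    """Efficiency scores for the one-input, one-output special case of DEA:
    each unit's output/input ratio divided by the best observed ratio."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical units (e.g. clinics): input = staff, output = patients.
inputs = [10.0, 20.0, 30.0]
outputs = [100.0, 150.0, 300.0]
print(ratio_efficiency(inputs, outputs))  # [1.0, 0.75, 1.0]
```

Units scoring 1.0 lie on the efficiency frontier; the multi-input, multi-output case the monograph treats requires the linear-programming formulation rather than this simple ratio.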
This book covers a highly relevant and timely topic that is of wide interest, especially in finance, engineering and computational biology. The introductory material on simulation and stochastic differential equations is very accessible and will prove popular with many readers. While there are several recent texts available that cover stochastic differential equations, the concentration here on inference makes this book stand out. No other direct competitors are known to date. With an emphasis on the practical implementation of the simulation and estimation methods presented, the text will be useful to practitioners and students with minimal mathematical background. What's more, because of the many R programs, the information here is appropriate for many mathematically well-educated practitioners, too.
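The simulation side of such a text typically starts from the Euler-Maruyama scheme, which discretizes an SDE like geometric Brownian motion, dX = mu X dt + sigma X dW. The book's own programs are in R; this is a hypothetical Python sketch of the scheme, with made-up parameter values:

```python
import math
import random

def euler_maruyama_gbm(x0, mu, sigma, t_end, n_steps, seed=0):
    """Simulate dX = mu*X dt + sigma*X dW with the Euler-Maruyama scheme."""
    rng = random.Random(seed)
    dt = t_end / n_steps
    path = [x0]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = path[-1]
        path.append(x + mu * x * dt + sigma * x * dw)
    return path

# One year of 250 "daily" steps for a hypothetical asset.
path = euler_maruyama_gbm(x0=1.0, mu=0.05, sigma=0.2, t_end=1.0, n_steps=250)
print(len(path), path[-1])
```

Inference, the book's distinguishing emphasis, then works backwards: treating such simulated or observed paths as data from which mu and sigma are estimated.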
Every advance in computer architecture and software tempts statisticians to tackle numerically harder problems. To do so intelligently requires a good working knowledge of numerical analysis. This book equips students to craft their own software and to understand the advantages and disadvantages of different numerical methods. Issues of numerical stability, accurate approximation, computational complexity, and mathematical modeling share the limelight in a broad yet rigorous overview of those parts of numerical analysis most relevant to statisticians. In this second edition, the material on optimization has been completely rewritten. There is now an entire chapter on the MM algorithm in addition to more comprehensive treatments of constrained optimization, penalty and barrier methods, and model selection via the lasso. There is also new material on the Cholesky decomposition, Gram-Schmidt orthogonalization, the QR decomposition, the singular value decomposition, and reproducing kernel Hilbert spaces. The discussions of the bootstrap, permutation testing, independent Monte Carlo, and hidden Markov chains are updated, and a new chapter on advanced MCMC topics introduces students to Markov random fields, reversible jump MCMC, and convergence analysis in Gibbs sampling. Numerical Analysis for Statisticians can serve as a graduate text for a course surveying computational statistics. With a careful selection of topics and appropriate supplementation, it can be used at the undergraduate level. It contains enough material for a graduate course on optimization theory. Because many chapters are nearly self-contained, professional statisticians will also find the book useful as a reference.
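One of the linear-algebra tools the second edition adds, the Cholesky decomposition, illustrates the text's theme that statisticians benefit from understanding the numerics they rely on. A minimal sketch for a symmetric positive-definite matrix (toy data, not from the book):

```python
import math

def cholesky(a):
    """Return lower-triangular L with A = L L^T,
    for symmetric positive-definite A given as a list of rows."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                l[i][j] = math.sqrt(a[i][i] - s)  # diagonal entry
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]  # below-diagonal entry
    return l

a = [[4.0, 2.0], [2.0, 3.0]]
l = cholesky(a)
print(l)  # [[2.0, 0.0], [1.0, sqrt(2)]]
```

In statistical computing this factorization underlies, for example, solving normal equations and sampling from multivariate normal distributions, both stably and in O(n^3/3) flops.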
When my husband died in 1973, I had to go through his papers. Some of them were still in manuscript form and had never before been published. I selected several of these, plus a number of other articles that had appeared in periodicals but were no longer in print. This book is the result. At my request Richard Ebeling wrote an introduction, which he has done in great detail. The depth of Ebeling's understanding of my husband's work is certainly apparent in his writing. I am pleased to have the Ludwig von Mises Institute present this volume to the public. Margit von Mises, New York City, September 1989. Introduction: In the 1920s and the 1930s, Ludwig von Mises was recognized as one of the leading economic theorists on the European Continent. F. A. Hayek has said that Mises's critique of the possibilities for economic calculation under socialism had "the most profound impression on my generation... To none of us who read his book Socialism when it appeared was the world ever the same again." Lord Lionel Robbins, in introducing the Austrian School literature on money and the trade cycle to English-speaking readers in 1931, emphasized the "marvelous renaissance" the "School of Vienna" had experienced "under the leadership of... Professor Mises."
Econometrics of Health Care - which we have sometimes called 'medicometrics' - is a field in full expansion. The reasons are numerous: our knowledge of quantitative relations in the field of health econometrics is far from being perfect; a large number of analytical difficulties - combining medical facts (latent factors, e.g.) and economic facts (spatial behaviour, e.g.) - are faced by the research worker; medical and pharmaceutical techniques change rapidly; medical costs rocket more than proportionally with available resources; and medical budgets are in the process of being tightened. So it is not surprising that the practice of 'hygieconometrics' - to produce a neologism - is more and more included in the programmes of econometricians. The Applied Econometrics Association has devoted two symposia to the topic in less than three years (Lyons, February 1983; Rotterdam, December 1985), without experiencing any difficulties in getting valuable papers: on econometrics of risks and medical insurance, on the measurement of health status and of efficiency of medical techniques, and on general models allowing simulation. These were the themes for the second meeting, but other aspects of medical-economic problems had presented themselves already to the analyst: medical decision making and its consequences, the behaviour of the actors - patients and physicians -, regional medicometrics and so on; some of them were covered by the first meeting. Finally, in July 1988 the Fourth International Conference on System Science in Health Care took place in Lyons.
Empirical Studies on Volatility in International Stock Markets describes the existing techniques for the measurement and estimation of volatility in international stock markets with emphasis on the SV model and its empirical application. Eugenie Hol develops various extensions of the SV model, which allow for additional variables in both the mean and the variance equation. In addition, the forecasting performance of SV models is compared not only to that of the well-established GARCH model but also to implied volatility and so-called realised volatility models which are based on intraday volatility measures. The intended readers are financial professionals who seek to obtain more accurate volatility forecasts and wish to gain insight about state-of-the-art volatility modelling techniques and their empirical value, and academic researchers and students who are interested in financial market volatility and want to obtain an updated overview of the various methods available in this area.
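The GARCH(1,1) benchmark against which the SV models are compared updates the conditional variance through a simple recursion, sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}. A toy filter with hypothetical returns and textbook-style parameter values (not the book's data):

```python
def garch_variance(returns, omega, alpha, beta, sigma2_0):
    """Conditional variance path of a GARCH(1,1) model:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    sigma2 = [sigma2_0]
    for r in returns:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

# Two hypothetical daily returns; a large shock (-2.0) raises next-day variance.
path = garch_variance([1.0, -2.0], omega=0.1, alpha=0.1, beta=0.8, sigma2_0=1.0)
print(path)  # approximately [1.0, 1.0, 1.3]
```

The SV models the book develops differ in that the variance itself carries a random innovation, so it is a latent state to be estimated rather than a deterministic function of past returns as in this recursion.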
This highly useful book contains methodology for the analysis of data that arise from multiscale processes. It brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. These methods can handle different amounts of prior knowledge at different scales, as often occurs in practice.
The place in survival analysis now occupied by proportional hazards models and their generalizations is so large that it is no longer conceivable to offer a course on the subject without devoting at least half of the content to this topic alone. This book focuses on the theory and applications of a very broad class of models - proportional hazards and non-proportional hazards models, the former being viewed as a special case of the latter - which underlie modern survival analysis. Researchers and students alike will find that this text differs from most recent works in that it is mostly concerned with methodological issues rather than the analysis itself.
Written by a world leader in the field and aimed at researchers in applied and engineering sciences, this brilliant text has as its main goal imparting an understanding of the methods so that practitioners can make immediate use of existing algorithms and software, and so that researchers can extend the state of the art and find new applications. It includes algorithms on seeking feasibility and analyzing infeasibility, as well as describing new and surprising applications.
This book provides a synthesis of some recent issues and an up-to-date treatment of some of the major important issues in distributional analysis that I have covered in my previous book Ethical Social Index Numbers, which was widely accepted by students, teachers, researchers and practitioners in the area. Wide coverage of on-going and advanced topics and their analytical, articulate and authoritative presentation make the book theoretically and methodologically quite contemporary and inclusive, and highly responsive to the practical problems of recent concern. Since many countries of the world are still characterized by high levels of income inequality, Chap. 1 analyzes the problems of income inequality measurement in detail. Poverty alleviation is an overriding goal of development and social policy. To formulate antipoverty policies, research on poverty has mostly focused on income-based indices. In view of this, a substantive analysis of income-based poverty has been presented in Chap. 2. The subject of Chap. 3 is people's perception about income inequality in terms of deprivation. Since polarization is of current concern to analysts and social decision-makers, a discussion on polarization is presented in Chap. 4.
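The income-inequality measurement analyzed in Chap. 1 conventionally starts from the Gini coefficient: the mean absolute difference between all pairs of incomes, normalized by twice the mean. A minimal sketch with hypothetical incomes (an illustration of the standard formula, not the book's own indices):

```python
def gini(incomes):
    """Gini coefficient: sum of all pairwise absolute income differences,
    divided by 2 * n^2 * mean income. 0 = perfect equality, 1 = maximal."""
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2.0 * n * n * mean)

print(gini([1.0, 2.0, 3.0, 4.0]))  # 0.25
print(gini([2.5, 2.5, 2.5, 2.5]))  # 0.0: perfect equality
```

The quadratic pairwise sum is fine for illustration; for large samples the same coefficient is usually computed from sorted incomes in O(n log n).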
Income Elasticity and Economic Development: Methods and Applications is mainly concerned with methods of estimating income elasticity. This field is connected with economic development, which can be furthered by reducing income inequality. This is highly relevant in today's world, where the gap between rich and poor is widening as economies develop. The book provides a good example of how to calculate income elasticity, using a number of methods applied to widely available grouped data. Some of the techniques presented here can be used in a wide range of policy areas in all developed, developing and under-developed countries. Policy analysts, economists, business analysts and market researchers will find this book very useful.
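A common way to estimate an income elasticity is as the slope of an OLS regression of log expenditure on log income, since in a log-log specification the slope is the elasticity directly. A sketch on hypothetical data constructed with a known elasticity of 1.5 (illustrating the general idea, not the book's specific grouped-data methods):

```python
import math

def log_log_elasticity(incomes, expenditures):
    """OLS slope of ln(expenditure) on ln(income), i.e. the income elasticity."""
    lx = [math.log(x) for x in incomes]
    ly = [math.log(y) for y in expenditures]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    var = sum((a - mx) ** 2 for a in lx)
    return cov / var

# Hypothetical group means where expenditure = income ** 1.5 exactly,
# so the recovered elasticity should be 1.5.
incomes = [10.0, 20.0, 40.0, 80.0]
expenditures = [x ** 1.5 for x in incomes]
print(log_log_elasticity(incomes, expenditures))
```

With real grouped data the regression would be run on group means, possibly weighted by group sizes; that is where the book's more refined methods come in.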
The econometric consequences of nonstationary data have wide-ranging implications for empirical research in economics. Specifically, these issues have implications for the study of empirical relations such as a money demand function that links macroeconomic aggregates: real money balances, real income and a nominal interest rate. Traditional monetary theory predicts that these nonstationary series form a cointegrating relation and, accordingly, that the dynamics of a vector process comprised of these variables generates distinct patterns. Recent econometric developments designed to cope with nonstationarities have changed the course of empirical research in the area, but many fundamental challenges, for example the issue of identification, remain. This book represents the efforts undertaken by the authors in recent years to determine the consequences that nonstationarity has for the study of aggregate money demand relations. We have brought together an empirical methodology that we find useful in conducting empirical research. Some of the work was undertaken during the authors' sabbatical periods, and we wish to acknowledge the generous support of Arizona State University and Michigan State University respectively. Professor Hoffman wishes to acknowledge the support of the Fulbright-Hays Foundation, which supported sabbatical research in Europe, and separate support of the Council of 100 Summer Research Program at Arizona State University.
Lawrence Klein, University of Pennsylvania; Jaime Marquez, Federal Reserve Board.* An examination of the economics literature over the last twenty years reveals a marked tendency towards polarisation. On the one hand, there has been a propensity to develop theoretical models which have little connection with either empirical verification or problems requiring immediate attention. On the other hand, empirical analyses are generally typified by testing for its own sake, with limited examination of the implications of the results. As a result, the number of papers confronting theory with facts towards the solution of economic problems has been on the decline for years. To fill this growing gap in the literature, we have invited a number of authors to write papers using both theoretical and empirical techniques to address current issues of interest to the profession at large: the US trade deficit and the global implications of policies that attempt to reduce it, the international ramifications of the debt crisis, the international oil market and its implications for the US oil industry, and the development of new econometric techniques. In addressing these issues, each author has approached the subject matter from an eclectic standpoint - that is, avoiding strict adherence to a given doctrine.
Markov chains have become an increasingly useful way of capturing the stochastic nature of many economic and financial variables. Although hidden Markov processes have been widely employed for some time in many engineering applications, e.g. speech recognition, their effectiveness has now been recognized in areas of social science research as well. The main aim of Hidden Markov Models: Applications to Financial Economics is to make such techniques available to more researchers in financial economics. As such, we cover only the necessary theoretical aspects in each chapter while focusing on real-life applications using contemporary data, mainly from the OECD group of countries. The underlying assumption is that researchers in financial economics will be familiar with such applications, although the empirical techniques involved are more traditional econometrics. By keeping the applications at this familiar level, we focus on the methodology based on hidden Markov processes. This will, we believe, help the reader to develop a more in-depth understanding of the modeling issues, thereby benefiting their future research.
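The core computation behind any hidden-Markov application is the forward algorithm, which evaluates the likelihood of an observation sequence by propagating state probabilities one step at a time. A toy two-state, two-symbol sketch (all probabilities hypothetical, not from the book):

```python
def forward_likelihood(pi, trans, emit, obs):
    """Likelihood of a discrete observation sequence under an HMM,
    via the forward algorithm. pi[i]: initial state probability;
    trans[i][j]: transition probability; emit[i][k]: emission probability."""
    n = len(pi)
    # Initialize with the first observation.
    alpha = [pi[i] * emit[i][obs[0]] for i in range(n)]
    # Propagate: sum over predecessor states, weight by emission.
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(n)) * emit[j][o]
                 for j in range(n)]
    return sum(alpha)

pi = [0.5, 0.5]
trans = [[0.9, 0.1], [0.1, 0.9]]    # states are "sticky" (regime persistence)
emit = [[0.8, 0.2], [0.2, 0.8]]     # two observable symbols, 0 and 1
print(forward_likelihood(pi, trans, emit, [0, 0, 1]))  # 0.0962
```

In financial applications the two states would typically represent, say, low- and high-volatility regimes, with Gaussian rather than discrete emissions; the recursion is the same.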
In a relatively short period of time, data envelopment analysis (DEA) has grown into a powerful analytical tool for measuring and evaluating performance. DEA is computational at its core, and this book is one of several that Springer aims to publish on the subject. This work deals with the micro aspects of handling and modeling data issues in DEA problems. It is a handbook treatment dealing with specific data problems, including imprecise data and undesirable outputs.
This book is the result of my doctoral dissertation research at the Department of Econometrics of the University of Geneva, Switzerland. This research was also partially financed by the Swiss National Science Foundation (grants 12-31072.91 and 12-40300.94). First and foremost, I wish to express my deepest gratitude to Professor Manfred Gilli, my thesis supervisor, for his constant support and help. I would also like to thank the president of my jury, Professor Fabrizio Carlevaro, as well as the other members of the jury, Professor Andrew Hughes Hallett, Professor Jean-Philippe Vial and Professor Gerhard Wanner. I am grateful to my colleagues and friends of the Department of Econometrics, especially David Miceli, who provided constant help and kind understanding during all the stages of my research. I would also like to thank Pascale Mignon for proofreading my text and improving my English. Finally, I am greatly indebted to my parents for their kindness and encouragements, without which I could never have achieved my goals. Giorgio Pauletto, Department of Econometrics, University of Geneva, Geneva, Switzerland. The purpose of this book is to present the available methodologies for the solution of large-scale macroeconometric models. This work reviews classical solution methods and introduces more recent techniques, such as parallel computing and nonstationary iterative algorithms.
Patrick Artus and Yves Barroux. The Applied Econometrics Association organised an international conference on "Monetary and Financial Models" in Geneva in January 1987. The purpose of this book is to make available to the public a selection of the papers that were presented at the conference. The selected papers all deal with the setting of monetary targets and the effects of monetary policy on the economy, as well as with the analysis of the financial behaviours of economic agents. Other papers presented at the same conference but dealing with the external aspects of monetary policy (exchange rate policy, international coordination of economic policies, international transmission of business cycles, etc.) are the subject of a distinct publication. The papers put together to make up this book are either theoretical research contributions or applied statistical or econometric work. It seemed more logical to start with the more theoretical papers. The topics tackled in the first two parts of the book have in common the fact that they appeared only recently in the field of economic research and deal with the analysis of the behaviour of Central Banks. They analyse this behaviour so as to exhibit its major determinants as well as the revealed preferences of Central Banks: this topic comes under the caption "optimal monetary policy and reaction function of the monetary authorities."
Simulation methods are revolutionizing the practice of applied economic analysis. This volume collects eighteen chapters written by leading researchers from prestigious research institutions the world over. The common denominator of the papers is their relevance for applied research in environmental and resource economics. The topics range from discrete choice modeling with heterogeneity of preferences, to Bayesian estimation, to Monte Carlo experiments, to structural estimation of Kuhn-Tucker demand systems, to evaluation of simulation noise in maximum simulated likelihood estimates, to dynamic natural resource modeling. Empirical cases are used to show the practical use and the results brought forth by the different methods.
A non-technical introduction to the question of modeling with time-varying parameters, using the beta coefficient from financial economics as the main example. After a brief introduction to this coefficient for those not versed in finance, the book presents a number of rather well known tests for constant coefficients and then performs these tests on data from the Stockholm Exchange. The Kalman filter is then introduced, and a simple example is used to demonstrate the power of the filter. The filter is then used to estimate the market model with time-varying betas. The book concludes with further examples of how the Kalman filter may be used in estimation models for analyzing other aspects of finance. Since both the programs and the data used in the book are available for downloading, the book is especially valuable for students and other researchers interested in learning the art of modeling with time-varying coefficients.
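A time-varying beta of the kind the book estimates can be filtered with a scalar Kalman filter: the state equation lets beta follow a random walk, beta_t = beta_{t-1} + w_t, and the observation equation is the market model r_t = beta_t * r_mt + v_t. A minimal sketch with hypothetical variances (not the book's programs or data):

```python
def kalman_beta(asset_returns, market_returns, q, r_var, beta0=0.0, p0=1.0):
    """Filtered betas for r_t = beta_t * r_mt + v_t, where beta_t follows
    a random walk with innovation variance q and v_t has variance r_var."""
    beta, p = beta0, p0
    betas = []
    for y, x in zip(asset_returns, market_returns):
        p_pred = p + q                              # predict: random-walk state
        k = p_pred * x / (x * x * p_pred + r_var)   # Kalman gain
        beta = beta + k * (y - x * beta)            # update with forecast error
        p = (1.0 - k * x) * p_pred                  # updated state variance
        betas.append(beta)
    return betas

# One step with q=0, r_var=1, market return 1, asset return 1:
# gain is 0.5, so beta moves halfway from the prior 0.0 toward 1.0.
print(kalman_beta([1.0], [1.0], q=0.0, r_var=1.0))  # [0.5]
```

Setting q = 0 recovers recursive least squares with a constant beta; a positive q lets the filtered beta drift, which is exactly the time variation the constancy tests in the book are designed to detect.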