Economists and psychologists have, on the whole, exhibited sharply different perspectives on the elicitation of preferences. Economists, who have made preference the central primitive in their thinking about human behavior, have for the most part rejected elicitation and have instead sought to infer preferences from observations of choice behavior. Psychologists, who have tended to think of preference as a context-determined subjective construct, have embraced elicitation as their dominant approach to measurement. This volume, based on a symposium organized by Daniel McFadden at the University of California at Berkeley, provides a provocative and constructive engagement between economists and psychologists on the elicitation of preferences.
Multivariate Statistical Analysis
Aggregation of individual opinions into a social decision is a problem widely observed in everyday life. For centuries people have tried to invent the 'best' aggregation rule. In 1951 the young American scientist and future Nobel Prize winner Kenneth Arrow formulated the problem in an axiomatic way, i.e., he specified a set of axioms which every reasonable aggregation rule has to satisfy, and showed that these axioms are inconsistent. This result, often called Arrow's Paradox or the General Impossibility Theorem, has become a cornerstone of social choice theory. The main condition used by Arrow was his famous Independence of Irrelevant Alternatives. This very condition pre-defines the 'local' treatment of the alternatives (or pairs of alternatives, or sets of alternatives, etc.) in aggregation procedures. Remaining within the framework of the axiomatic approach and based on the consideration of local rules, Arrovian Aggregation Models investigates three formulations of the aggregation problem according to the form in which the individual opinions about the alternatives are defined, as well as the form of the desired social decision. In other words, we study three aggregation models. What is common between them is that in all models some analogue of the Independence of Irrelevant Alternatives condition is used, which is why we call these models Arrovian aggregation models. Chapter 1 presents a general description of the problem of axiomatic synthesis of local rules, and introduces problem formulations for various versions of formalization of individual opinions and collective decision. Chapter 2 formalizes precisely the notion of 'rationality' of individual opinions and social decision. Chapter 3 deals with the aggregation model for the case of individual opinions and social decisions formalized as binary relations. Chapter 4 deals with Functional Aggregation Rules, which transform individual opinions defined as choice functions into a social choice function.
Chapter 5 considers another model - Social Choice Correspondences - in which the individual opinions are formalized as binary relations and the collective decision is sought as a choice function. Several new classes of rules are introduced and analyzed.
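The inconsistency Arrow uncovered can be felt in miniature. The following hypothetical Python sketch (not from the book; the voter profile is invented for the example) shows Condorcet's paradox: purely 'local', pairwise majority aggregation can fail to produce a transitive social ordering.

```python
# Condorcet's paradox: three voters with cyclic preferences make the
# pairwise majority relation intransitive, illustrating why "local"
# aggregation rules run into Arrow-style trouble.
profile = [
    ["a", "b", "c"],  # voter 1: a > b > c
    ["b", "c", "a"],  # voter 2: b > c > a
    ["c", "a", "b"],  # voter 3: c > a > b
]

def majority_prefers(x, y, profile):
    """True if a strict majority of voters rank x above y."""
    wins = sum(1 for ranking in profile if ranking.index(x) < ranking.index(y))
    return wins > len(profile) / 2

pairs = {(x, y): majority_prefers(x, y, profile)
         for x, y in [("a", "b"), ("b", "c"), ("c", "a")]}
print(pairs)  # every pairwise contest is won: a > b, b > c, c > a - a cycle
```

Each pairwise decision looks perfectly reasonable on its own; only when the 'local' verdicts are assembled does the irrationality of the social ordering appear.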
JEAN-FRANÇOIS MERTENS. This book presents a systematic exposition of the use of game theoretic methods in general equilibrium analysis. Clearly the first such use was by Arrow and Debreu, with the "birth" of general equilibrium theory itself, in using Nash's existence theorem (or a generalization) to prove the existence of a competitive equilibrium. But this use appeared possibly to be merely technical, borrowing some tools for proving a theorem. This book stresses the later contributions, where game theoretic concepts were used as such, to explain various aspects of the general equilibrium model. But clearly, each of those later approaches also provides per se a game theoretic proof of the existence of competitive equilibrium. Part A deals with the first such approach: the equality between the set of competitive equilibria of a perfectly competitive (i.e., every trader has negligible market power) economy and the core of the corresponding cooperative game.
Industrial Price, Quantity, and Productivity Indices: The Micro-Economic Theory and an Application gives a comprehensive account of the micro-economic foundations of industrial price, quantity, and productivity indices. The various results available from the literature have been brought together into a consistent framework, based upon modern duality theory. This integration also made it possible to generalize several of these results. Thus, this book will be an important resource for theoretically as well as empirically-oriented researchers who seek to analyse economic problems with the help of index numbers. Although this book's emphasis is on micro-economic theory, it is also intended as a practical guide. A full chapter is therefore devoted to an empirical application. Three different approaches are pursued: a straightforward empirical approach, a non-parametric estimation approach, and a parametric estimation approach. As well as illustrating some of the more important concepts explored in this book, and showing to what extent different computational approaches lead to different outcomes for the same measures, this chapter also makes a powerful case for the use of enterprise micro-data in economic research.
This volume contains a selection of papers presented at the first conference of the Society for Computational Economics held at the ICC Institute, Austin, Texas, May 21-24, 1995. Twenty-two papers are included in this volume, devoted to applications of computational methods for the empirical analysis of economic and financial systems; the development of computing methodology, including software, related to economics and finance; and the overall impact of developments in computing. The various contributions represented in the volume indicate the growing interest in the topic due to the increased availability of computational concepts and tools and the necessity of analyzing complex decision problems. The papers in this volume are divided into four sections: Computational methods in econometrics, Computational methods in finance, Computational methods for a social environment, and New computational methods.
When von Neumann's and Morgenstern's Theory of Games and Economic Behavior appeared in 1944, one thought that a complete theory of strategic social behavior had appeared out of nowhere. However, game theory has, to this very day, remained a fast-growing assemblage of models which have gradually been united in a new social theory - a theory that is far from being completed even after recent advances in game theory, as evidenced by the work of the three Nobel Prize winners, John F. Nash, John C. Harsanyi, and Reinhard Selten. Two of them, Harsanyi and Selten, have contributed important articles to the present volume. This book leaves no doubt that the game-theoretical models are on the right track to becoming a respectable new theory, just like the great theories of the twentieth century originated from formerly separate models which merged in the course of decades. For social scientists, the age of great discoveries is not over. The recent advances of today's game theory surpass by far the results of traditional game theory. For example, modern game theory has a new empirical and social foundation, namely, societal experiences; this has changed its methods, its "rationality." Morgenstern (I worked together with him for four years) dreamed of an encompassing theory of social behavior. With the inclusion of the concept of evolution in mathematical form, this dream will become true. Perhaps the new foundation will even lead to a new name, "conflict theory" instead of "game theory."
Scientific visualization may be defined as the transformation of numerical scientific data into informative graphical displays. The text introduces a nonverbal model to subdisciplines that until now have mostly employed mathematical or verbal-conceptual models. The focus is on how scientific visualization can help revolutionize the manner in which the tendencies for (dis)similar numerical values to cluster together in location on a map are explored and analyzed. In doing so, the concept known as spatial autocorrelation - which characterizes these tendencies - is further demystified.
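The clustering tendency described here is classically quantified by Moran's I. A minimal Python sketch (illustrative only; the chain of locations, weights, and data values are invented for the example, and are not taken from the book):

```python
import numpy as np

# Moran's I, the standard index of spatial autocorrelation:
#   I = (n / S0) * (z^T W z) / (z^T z),  z = values - mean,  S0 = sum of weights.
# Positive I: similar values cluster; negative I: dissimilar values neighbour.
def morans_i(values, weights):
    z = values - values.mean()
    n = len(values)
    s0 = weights.sum()
    return (n / s0) * (z @ weights @ z) / (z @ z)

# Five locations in a line; each location's neighbours are the adjacent ones.
w = np.zeros((5, 5))
for i in range(4):
    w[i, i + 1] = w[i + 1, i] = 1.0

clustered = np.array([1.0, 1.0, 1.0, 5.0, 5.0])    # similar values adjacent
alternating = np.array([1.0, 5.0, 1.0, 5.0, 1.0])  # dissimilar neighbours
print(morans_i(clustered, w) > 0)    # positive spatial autocorrelation
print(morans_i(alternating, w) < 0)  # negative spatial autocorrelation
```

The sign of the index captures exactly the map-clustering tendency the text sets out to visualize.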
Recent economic history suggests that a key element in economic growth and development for many countries has been an aggressive export policy and a complementary import policy. Such policies can be very effective provided that resources are used wisely to encourage exports from industries that can be competitive in the international arena. Also, import protection must be used carefully so that it encourages infant industries instead of providing rents to industries that are not competitive. Policy makers may use a variety of methods of analysis in planning trade policy. As computing power has grown in recent years, increasing attention has been given to economic models as one of the most powerful aids to policy making. These models can be used on the one hand to help in selecting export industries to encourage and infant industries to protect, and on the other hand to chart the larger effects of trade policy on the entire economy. While many models have been developed in recent years, there has not been any analysis of the strengths and weaknesses of the various types of models. Therefore, this monograph provides a review and analysis of the models which can be used to analyze dynamic comparative advantage.
The field of econometrics has gone through remarkable changes during the last thirty-five years. Widening its earlier focus on testing macroeconomic theories, it has become a rather comprehensive discipline concerned with the development of statistical methods and their application to the whole spectrum of economic data. This development becomes apparent when looking at the biography of an econometrician whose illustrious research and teaching career started about thirty-five years ago and who will retire very soon after his 65th birthday. This is Gerd Hansen, professor of econometrics at the Christian Albrechts University at Kiel, to whom this volume with contributions from colleagues and students has been dedicated. He has shaped the econometric landscape in and beyond Germany throughout these thirty-five years. At the end of the 1960s he developed one of the first econometric models for the German economy which adhered closely to the traditions put forth by the Cowles commission.
Data envelopment analysis develops a set of nonparametric and semiparametric techniques for measuring economic efficiency among firms and nonprofit organizations. Over the past decade this technique has found most widespread applications in public sector organizations. However, these applications have been mostly static. This monograph extends this static framework of efficiency analysis in several new directions. These include but are not limited to the following: (1) a dynamic view of the production and cost frontier, where capital inputs are treated differently from the current inputs; (2) a direct role of technological progress and regress, which is so often stressed in total factor productivity discussion in modern growth theory in economics; (3) stochastic efficiency in a dynamic setting, where reliability improvement competes with technical efficiency; (4) flexible manufacturing systems, where flexibility of the production process and the economies of scope play an important role in efficiency analysis; and (5) the role of economic factors such as externalities and input interdependences. Efficiency is viewed here in the framework of a general systems theory model. Such a view is intended to broaden the scope of applications of this promising new technique of data envelopment analysis. The monograph stresses the various applied aspects of the dynamic theory, so that it can be empirically implemented in different situations. As far as possible, abstract mathematical treatments are avoided, and emphasis is placed on statistical examples and empirical illustrations.
This book covers a highly relevant and timely topic that is of wide interest, especially in finance, engineering and computational biology. The introductory material on simulation and stochastic differential equations is very accessible and will prove popular with many readers. While there are several recent texts available that cover stochastic differential equations, the concentration here on inference makes this book stand out. No other direct competitors are known to date. With an emphasis on the practical implementation of the simulation and estimation methods presented, the text will be useful to practitioners and students with minimal mathematical background. What's more, because of the many R programs, the information here is appropriate for many mathematically well-educated practitioners, too.
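The book's own programs are in R; as a taste of the kind of simulation method it covers, here is a minimal Python sketch of the Euler-Maruyama scheme for a stochastic differential equation (geometric Brownian motion; all parameter values are invented for the example and are not taken from the text):

```python
import numpy as np

# Euler-Maruyama discretisation of the SDE  dS = mu*S dt + sigma*S dW
# (geometric Brownian motion), simulated over many independent paths.
def euler_maruyama(s0, mu, sigma, T, n_steps, n_paths, rng):
    dt = T / n_steps
    s = np.full(n_paths, s0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)  # Brownian increment
        s += mu * s * dt + sigma * s * dw                # one Euler step
    return s

rng = np.random.default_rng(0)
terminal = euler_maruyama(s0=1.0, mu=0.05, sigma=0.2, T=1.0,
                          n_steps=250, n_paths=20000, rng=rng)
# For GBM, E[S_T] = s0 * exp(mu * T); the Monte Carlo mean should be close.
print(abs(terminal.mean() - np.exp(0.05)) < 0.01)
```

Simulated paths like these are the raw material on which the book's estimation methods for discretely observed SDEs operate.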
When my husband died in 1973 I had to go through his papers. Some of them were still in manuscript form and had never before been published. I selected several of these, plus a number of other articles that had appeared in periodicals but were no longer in print. This book is the result. At my request Richard Ebeling wrote an introduction, which he has done in great detail. The depth of Ebeling's understanding of my husband's work is certainly apparent in his writing. I am pleased to have the Ludwig von Mises Institute present this volume to the public. Margit von Mises, New York City, September 1989. Introduction: In the 1920s and the 1930s, Ludwig von Mises was recognized as one of the leading economic theorists on the European Continent. F. A. Hayek has said that Mises's critique of the possibilities for economic calculation under socialism had "the most profound impression on my generation ... To none of us who read his book Socialism when it appeared was the world ever the same again." Lord Lionel Robbins, in introducing the Austrian School literature on money and the trade cycle to English-speaking readers in 1931, emphasized the "marvelous renaissance" the "School of Vienna" had experienced "under the leadership of ... Professor Mises."
Econometrics of Health Care - which we have sometimes called 'medicometrics' - is a field in full expansion. The reasons are numerous: our knowledge of quantitative relations in the field of health econometrics is far from being perfect; a large number of analytical difficulties - combining medical (latent factors, e.g.) and economic facts (spatial behaviour, e.g.) - are faced by the research worker; medical and pharmaceutical techniques change rapidly; medical costs rocket more than proportionally with available resources; medical budgets are in the process of being tightened. So it is not surprising that the practice of 'hygieconometrics' - to produce a neologism - is more and more included in the programmes of econometricians. The Applied Econometrics Association has devoted two symposia to the topic in less than three years (Lyons, February 1983; Rotterdam, December 1985), without experiencing any difficulties in getting valuable papers: on econometrics of risks and medical insurance, on the measurement of health status and of the efficiency of medical techniques, and on general models allowing simulation. These were the themes for the second meeting, but other aspects of medical-economic problems had already presented themselves to the analyst: medical decision making and its consequences, the behaviour of the actors - patients and physicians - regional medicometrics, and what not; some of them were covered by the first meeting. Finally, in July 1988 the Fourth International Conference on System Science in Health Care took place in Lyons; this should not be astonishing.
Empirical Studies on Volatility in International Stock Markets describes the existing techniques for the measurement and estimation of volatility in international stock markets with emphasis on the SV model and its empirical application. Eugenie Hol develops various extensions of the SV model, which allow for additional variables in both the mean and the variance equation. In addition, the forecasting performance of SV models is compared not only to that of the well-established GARCH model but also to implied volatility and so-called realised volatility models which are based on intraday volatility measures. The intended readers are financial professionals who seek to obtain more accurate volatility forecasts and wish to gain insight about state-of-the-art volatility modelling techniques and their empirical value, and academic researchers and students who are interested in financial market volatility and want to obtain an updated overview of the various methods available in this area.
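As a point of reference for the GARCH benchmark mentioned above, here is a minimal Python sketch (not Hol's code; all parameter values are invented for the example) that simulates a GARCH(1,1) process and checks two of its textbook properties:

```python
import numpy as np

# GARCH(1,1):  r_t = sigma_t * eps_t,
#              sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
# The unconditional variance is omega / (1 - alpha - beta) when alpha+beta < 1.
def simulate_garch(omega, alpha, beta, n, rng):
    r = np.empty(n)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(var) * rng.normal()
        var = omega + alpha * r[t] ** 2 + beta * var
    return r

rng = np.random.default_rng(1)
r = simulate_garch(omega=0.05, alpha=0.1, beta=0.85, n=50000, rng=rng)

uncond = 0.05 / (1 - 0.1 - 0.85)       # = 1.0, the unconditional variance
print(abs(r.var() - uncond) < 0.15)    # sample variance is close to it
# Volatility clustering: squared returns are positively autocorrelated.
sq = r**2 - (r**2).mean()
print(np.mean(sq[1:] * sq[:-1]) > 0)
```

Stochastic volatility models target the same clustering phenomenon but let the variance follow its own latent stochastic process rather than a deterministic function of past returns, which is what makes their estimation, and the comparisons in this book, considerably harder.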
This highly useful book contains methodology for the analysis of data that arise from multiscale processes. It brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. These methods can handle different amounts of prior knowledge at different scales, as often occurs in practice.
The place in survival analysis now occupied by proportional hazards models and their generalizations is so large that it is no longer conceivable to offer a course on the subject without devoting at least half of the content to this topic alone. This book focuses on the theory and applications of a very broad class of models - proportional hazards and non-proportional hazards models, the former being viewed as a special case of the latter - which underlie modern survival analysis. Researchers and students alike will find that this text differs from most recent works in that it is mostly concerned with methodological issues rather than the analysis itself.
Written by a world leader in the field and aimed at researchers in applied and engineering sciences, this brilliant text has as its main goal imparting an understanding of the methods so that practitioners can make immediate use of existing algorithms and software, and so that researchers can extend the state of the art and find new applications. It includes algorithms on seeking feasibility and analyzing infeasibility, as well as describing new and surprising applications.
This book provides a synthesis of some recent issues and an up-to-date treatment of some of the major important issues in distributional analysis that I have covered in my previous book Ethical Social Index Numbers, which was widely accepted by students, teachers, researchers and practitioners in the area. Wide coverage of on-going and advanced topics and their analytical, articulate and authoritative presentation make the book theoretically and methodologically quite contemporary and inclusive, and highly responsive to the practical problems of recent concern. Since many countries of the world are still characterized by high levels of income inequality, Chap. 1 analyzes the problems of income inequality measurement in detail. Poverty alleviation is an overriding goal of development and social policy. To formulate antipoverty policies, research on poverty has mostly focused on income-based indices. In view of this, a substantive analysis of income-based poverty has been presented in Chap. 2. The subject of Chap. 3 is people's perception about income inequality in terms of deprivation. Since polarization is of current concern to analysts and social decision-makers, a discussion on polarization is presented in Chap. 4.
Income Elasticity and Economic Development: Methods and Applications is mainly concerned with methods of estimating income elasticity. This field is connected with economic development, which can be furthered by reducing income inequality. This is highly relevant in today's world, where the gap between rich and poor is widening with economic growth. The book provides a good example in showing how to calculate income elasticity, using a number of methods from widely available grouped data. Some of the techniques presented here can be used in a wide range of policy areas in all developed, developing and under-developed countries. Policy analysts, economists, business analysts and market researchers will find this book very useful.
The econometric consequences of nonstationary data have wide-ranging implications for empirical research in economics. Specifically, these issues have implications for the study of empirical relations such as a money demand function that links macroeconomic aggregates: real money balances, real income and a nominal interest rate. Traditional monetary theory predicts that these nonstationary series form a cointegrating relation and accordingly, that the dynamics of a vector process comprised of these variables generates distinct patterns. Recent econometric developments designed to cope with nonstationarities have changed the course of empirical research in the area, but many fundamental challenges, for example the issue of identification, remain. This book represents the efforts undertaken by the authors in recent years in an effort to determine the consequences that nonstationarity has for the study of aggregate money demand relations. We have brought together an empirical methodology that we find useful in conducting empirical research. Some of the work was undertaken during the authors' sabbatical periods and we wish to acknowledge the generous support of Arizona State University and Michigan State University respectively. Professor Hoffman wishes to acknowledge the support of the Fulbright-Hays Foundation that supported sabbatical research in Europe and separate support of the Council of 100 Summer Research Program at Arizona State University.
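The cointegration idea at the heart of the book can be sketched in a few lines. The following Python example (illustrative only, not the authors' methodology; the series and the coefficient value are invented) builds a pair of cointegrated series and demonstrates two hallmarks of cointegration: the OLS estimate of the cointegrating coefficient is extremely accurate even in a spurious-regression-prone setting, and the residual stays bounded while the underlying series wanders.

```python
import numpy as np

# x is I(1) (a random walk, hence nonstationary); y tracks 2*x up to
# stationary noise, so y and x are cointegrated with coefficient 2.
rng = np.random.default_rng(2)
n = 5000
x = np.cumsum(rng.normal(size=n))   # I(1) random walk
y = 2.0 * x + rng.normal(size=n)    # cointegrated with x, true beta = 2

beta_hat = (x @ y) / (x @ x)        # OLS without intercept
resid = y - beta_hat * x            # estimated cointegrating residual

# Crude stationarity contrast: the random walk's variance grows with the
# sample, while the cointegrating residual's variance stays near 1.
half = n // 2
print(abs(beta_hat - 2.0) < 0.01)         # "superconsistent" estimate
print(np.var(x[half:]) > np.var(resid))   # residual variance stays bounded
```

A proper analysis would use formal unit-root and cointegration tests rather than this variance contrast, but the sketch conveys why a stationary linear combination of nonstationary aggregates is such a powerful restriction for money demand studies.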
Lawrence Klein, University of Pennsylvania; Jaime Marquez, Federal Reserve Board*. An examination of the economics literature over the last twenty years reveals a marked tendency towards polarisation. On the one hand, there has been a propensity to develop theoretical models which have little connection with either empirical verification or problems requiring immediate attention. On the other hand, empirical analyses are generally typified by testing for its own sake, with limited examination of the implications of the results. As a result, the number of papers confronting theory with facts towards the solution of economic problems has been on the decline for years. To fill this growing gap in the literature, we have invited a number of authors to write papers using both theoretical and empirical techniques to address current issues of interest to the profession at large: the US trade deficit and the global implications of policies that attempt to reduce it, the international ramifications of the debt crisis, the international oil market and its implications for the US oil industry, and the development of new econometric techniques. In addressing these issues, each author has approached the subject matter from an eclectic standpoint - that is, avoiding strict adherence to a given doctrine.
Markov chains have become an increasingly useful way of capturing the stochastic nature of many economic and financial variables. Although hidden Markov processes have been widely employed for some time in many engineering applications, e.g. speech recognition, their effectiveness has now been recognized in areas of social science research as well. The main aim of Hidden Markov Models: Applications to Financial Economics is to make such techniques available to more researchers in financial economics. As such, we cover only the necessary theoretical aspects in each chapter while focusing on real life applications using contemporary data, mainly from the OECD group of countries. The underlying assumption here is that researchers in financial economics will be familiar with such applications, although the empirical techniques would be more traditional econometrics. Keeping the applications at this more familiar level, we focus on the methodology based on hidden Markov processes. This will, we believe, help the reader to develop a more in-depth understanding of the modelling issues, thereby benefiting their future research.
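As a taste of the machinery involved, here is a minimal Python sketch (assumed for illustration, not the authors' implementation; the two-regime parameters are invented) of the forward algorithm, which computes the likelihood of an observation sequence under a hidden Markov model by summing over all hidden state paths:

```python
import numpy as np

# Forward algorithm for a discrete-observation HMM: computes
# P(observations) in O(n_states^2 * T) instead of enumerating all
# n_states^T hidden paths.
def forward_likelihood(pi, A, B, obs):
    """pi: initial distribution, A: transition matrix, B: emission
    probabilities (rows = states, cols = symbols), obs: symbol indices."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate, then weight by emission
    return alpha.sum()

# Two hidden regimes (e.g. calm / turbulent markets), two observable symbols.
pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],
              [0.3, 0.7]])
obs = [0, 0, 1]

# Sanity check: brute force over all 2^3 hidden paths gives the same answer.
brute = sum(
    pi[s0] * B[s0, obs[0]] * A[s0, s1] * B[s1, obs[1]] * A[s1, s2] * B[s2, obs[2]]
    for s0 in range(2) for s1 in range(2) for s2 in range(2)
)
print(abs(forward_likelihood(pi, A, B, obs) - brute) < 1e-12)
```

In empirical work this likelihood is what gets maximised over the model parameters, and the same recursion underlies regime inference, which is why the forward algorithm is the workhorse of hidden-Markov applications in financial economics.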