Game Theory: Stochastics, Information, Strategies and Cooperation provides a discussion of some relevant topics in game theory. It is composed partly of material compiled by Professor Joachim Rosenmüller while lecturing at the IMW, the Institute of Mathematical Economics at the University of Bielefeld, but it also contains research topics that are not presented in a typical game theory textbook. The volume may thus provide the basis for an advanced course in game theory; at the same time it may be called a monograph, and, as a third aspect, it also supplies rather elementary versions of advanced topics in the field. The volume has a non-cooperative and a cooperative part, and in both the reader is assumed to have some basic knowledge of game theory, for instance concerning the normal form (bimatrix games, Nash equilibria of the mixed extension, backward induction in games with perfect information) on the one hand and the coalitional function (simple games, convex games, superadditive games, the core, the Shapley value) on the other. Some emphasis is laid on the probabilistic background; the author treats stochastic games using the language of probability in order to consider simple models in which measure theory can be omitted.
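The normal-form prerequisites the blurb mentions can be made concrete in a few lines. The sketch below checks whether a pure strategy pair is a Nash equilibrium of a bimatrix game; the payoff matrices and the function name are illustrative, not taken from the book:

```python
import numpy as np

def is_pure_nash(A, B, i, j):
    """Return True if the pure strategy pair (i, j) is a Nash equilibrium of
    the bimatrix game with payoff matrix A for the row player and B for the
    column player: neither player can gain by a unilateral deviation."""
    A, B = np.asarray(A), np.asarray(B)
    return bool(A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max())

# Prisoner's dilemma: strategy 0 = cooperate, 1 = defect.
A = [[3, 0], [5, 1]]             # row player's payoffs
B = [[3, 5], [0, 1]]             # column player's payoffs
print(is_pure_nash(A, B, 1, 1))  # → True: mutual defection is an equilibrium
print(is_pure_nash(A, B, 0, 0))  # → False: mutual cooperation is not
```

The same best-response check, applied over all strategy pairs, is the standard brute-force way to enumerate pure equilibria of small games.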
The approach to many problems in economic analysis has changed drastically with the development and dissemination of new and more efficient computational techniques. Computational Economic Systems: Models, Methods & Econometrics presents a selection of papers illustrating the use of new computational methods and computing techniques to solve economic problems. Part I of the volume consists of papers that focus on modelling economic systems, presenting computational methods to investigate the evolution of the behavior of economic agents, techniques to solve complex inventory models on a parallel computer, and an original approach to the construction and solution of multicriteria models involving logical conditions. The contributions to Part II concern new computational approaches to economic problems. We find an application of wavelets to outlier detection. New estimation algorithms are presented: one concerning seemingly unrelated regression models, a second on nonlinear rational expectation models, and a third dealing with switching GARCH estimation. Three contributions contain original approaches to the solution of nonlinear rational expectation models.
Simulation methods are revolutionizing the practice of applied economic analysis. This volume collects eighteen chapters written by leading researchers from prestigious research institutions the world over. The common denominator of the papers is their relevance for applied research in environmental and resource economics. The topics range from discrete choice modeling with heterogeneity of preferences, to Bayesian estimation, to Monte Carlo experiments, to structural estimation of Kuhn-Tucker demand systems, to evaluation of simulation noise in maximum simulated likelihood estimates, to dynamic natural resource modeling. Empirical cases are used to show the practical use and the results brought forth by the different methods.
Covers applications to risky assets traded on the markets for funds, fixed-income products and electricity derivatives.
Patrick Artus and Yves Barroux. The Applied Econometric Association organised an international conference on "Monetary and Financial Models" in Geneva in January 1987. The purpose of this book is to make available to the public a selection of the papers that were presented at the conference. The selected papers all deal with the setting of monetary targets and the effects of monetary policy on the economy, as well as with the analysis of the financial behaviour of economic agents. Other papers presented at the same conference but dealing with the external aspects of monetary policy (exchange rate policy, international coordination of economic policies, international transmission of business cycles, ...) are the subject of a separate publication. The papers put together to make up this book are either theoretical research contributions or applied statistical or econometric work. It seemed more logical to start with the more theoretical papers. The topics tackled in the first two parts of the book have in common that they have appeared only recently in the field of economic research and concern the analysis of the behaviour of Central Banks. They analyse this behaviour so as to exhibit its major determinants as well as the revealed preferences of Central Banks: this topic comes under the caption "optimal monetary policy and reaction function of the monetary authorities."
Econometrics as an applied discipline attempts to use information in a most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and the minimum discrepancy are useful in several areas of statistical inference, e.g., Bayesian estimation, expected maximum likelihood principle, the fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: A critical survey of the uses of information theory in economics and econometrics; An integration of applied information theory and economic efficiency analysis; The development of a new economic hypothesis relating information theory to economic growth models; New lines of research are emphasized.
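The entropy notions this blurb refers to are easy to state in code. Below is a minimal sketch of Shannon entropy, the quantity that the maximum entropy principle maximizes subject to moment constraints; the function name is illustrative, not from the book:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats; zero-probability
    outcomes contribute nothing, by the convention 0 * log 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Among distributions on four outcomes, the uniform one maximizes entropy:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # log(4) ≈ 1.386
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # strictly lower
```

With an added linear constraint on the mean, maximizing this objective yields the exponential-family distributions that underlie the maximum entropy estimators the book surveys.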
The generous social welfare system in Europe is one of the most important differences between Europe and the US. Defenders of the European welfare state argue that it improves social cohesion and prevents crime. Others argue that the "invisible hand" in the US economy is equally powerful in reducing unemployment and preventing crime. This book takes this trade-off as a starting point and contributes to a better interdisciplinary understanding of the interactions between crime, economic performance and social exclusion. In doing so, it evaluates the existing economic and criminological research and provides innovative empirical investigations on the basis of international panel data sets from different levels of regional aggregation.
Non-Parametric Statistical Diagnosis
A new approach to explaining the existence of firms and markets, focusing on variability and coordination. It stands in contrast to the emphasis on transaction costs, and on monitoring and incentive structures, which are prominent in most of the modern literature in this field. This approach, called the variability approach, allows us to: show why both the need for communication and the coordination costs increase when the division of labor increases; explain why the firm relies on direction while the market does not; rigorously formulate the optimum divisionalization problem; better understand the relationship between technology and organization; show why the 'size' of the firm is limited; and refine the analysis of whether the existence of a sharable input or the presence of an external effect leads to the emergence of a firm. The book provides a wealth of insights for students and professionals in economics, business, law and organization.
A non-technical introduction to the question of modeling with time-varying parameters, using the beta coefficient from Financial Economics as the main example. After a brief introduction to this coefficient for those not versed in finance, the book presents a number of rather well-known tests for constant coefficients and then performs these tests on data from the Stockholm Exchange. The Kalman filter is then introduced, and a simple example is used to demonstrate the power of the filter. The filter is then used to estimate the market model with time-varying betas. The book concludes with further examples of how the Kalman filter may be used in estimation models for analyzing other aspects of finance. Since both the programs and the data used in the book are available for downloading, the book is especially valuable for students and other researchers interested in learning the art of modeling with time-varying coefficients.
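The filtering recursion described in this blurb is compact enough to sketch. A minimal version, assuming a random-walk law of motion for beta in the market model r_t = beta_t * m_t + eps_t (the noise variances and the function name are illustrative choices, not the book's):

```python
import numpy as np

def kalman_tv_beta(r, m, q=1e-5, h=1e-4, beta0=0.0, p0=1.0):
    """Scalar Kalman filter for the market model with a time-varying beta:
        observation:  r_t = beta_t * m_t + eps_t,   Var(eps) = h
        state:        beta_t = beta_{t-1} + eta_t,  Var(eta) = q
    Returns the filtered beta path."""
    betas = np.empty(len(r))
    beta, p = beta0, p0
    for t, (rt, mt) in enumerate(zip(r, m)):
        p = p + q                            # predict: state variance grows
        s = mt * p * mt + h                  # innovation variance
        k = p * mt / s                       # Kalman gain
        beta = beta + k * (rt - beta * mt)   # update with the forecast error
        p = (1.0 - k * mt) * p
        betas[t] = beta
    return betas

# On simulated data with a constant true beta of 1.5, the filter settles near it:
rng = np.random.default_rng(0)
m = rng.normal(size=500)
r = 1.5 * m + 0.01 * rng.normal(size=500)
print(round(kalman_tv_beta(r, m)[-1], 2))
```

Setting q to zero recovers recursive least squares with a constant beta; a larger q lets the estimate track genuine drift at the cost of noisier paths.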
During 1985-86, the acquisition editor for the humanities and social sciences division of Kluwer Academic Publishers in the Netherlands visited the University of Florida (where I was also visiting while on sabbatical leave from Wilfrid Laurier University as the McKethan-Matherly Senior Research Fellow) to discuss publishing plans of the faculty. He expressed a keen interest in publishing the proceedings of the conference of the Canadian Econometric Study Group (CESG) that was to be held the following year at WLU. This volume is the end product of his interest, endurance, and persistence; but for his persistence I would have given up on the project. Most of the papers (though not all) included in this volume are based on presentations at CESG conferences. In some cases scholars who were not conference participants were invited to contribute to this volume where their research complemented that presented at these conferences. Since papers selected for presentation at the CESG conferences are generally the finished product of scholarly research and often under submission to refereed journals, it was not possible to publish the conference proceedings in their entirety. Accordingly it was decided, in consultation with the publisher, to invite a select list of authors to submit significant extensions of the papers they presented at the CESG conferences for inclusion in this volume. The editor wishes to express gratitude to all those authors who submitted their papers for evaluation by anonymous referees and made revisions to conform to our editorial process.
This book is the result of my doctoral dissertation research at the Department of Econometrics of the University of Geneva, Switzerland. This research was also partially financed by the Swiss National Science Foundation (grants 12-31072.91 and 12-40300.94). First and foremost, I wish to express my deepest gratitude to Professor Manfred Gilli, my thesis supervisor, for his constant support and help. I would also like to thank the president of my jury, Professor Fabrizio Carlevaro, as well as the other members of the jury, Professor Andrew Hughes Hallett, Professor Jean-Philippe Vial and Professor Gerhard Wanner. I am grateful to my colleagues and friends of the Department of Econometrics, especially David Miceli, who provided constant help and kind understanding during all the stages of my research. I would also like to thank Pascale Mignon for proofreading my text and improving my English. Finally, I am greatly indebted to my parents for their kindness and encouragement, without which I could never have achieved my goals. Giorgio Pauletto, Department of Econometrics, University of Geneva, Geneva, Switzerland. Chapter 1, Introduction: The purpose of this book is to present the available methodologies for the solution of large-scale macroeconometric models. This work reviews classical solution methods and introduces more recent techniques, such as parallel computing and nonstationary iterative algorithms.
It is unlikely that any frontier of economics and econometrics is being pushed faster, or further, than that of computational techniques. The computer has become a tool for performing economics and econometrics as well as an environment in which to perform them, taking over where theory bogs down and allowing at least approximate answers to questions that defy closed mathematical or analytical solutions. Tasks may now be attempted that were hitherto beyond human potential, and all the forces available can now be marshalled efficiently, leading to the achievement of desired goals. Computational Techniques for Econometrics and Economic Analysis is a collection of recent studies which exemplify all these elements, demonstrating the power that the computer brings to economic analysts. The book is divided into four parts: 1 -- the computer and econometric methods; 2 -- the computer and economic analysis; 3 -- computational techniques for econometrics; and 4 -- the computer and econometric studies.
Productivity growth is a keyword for sustainable economic growth in a knowledge-based society. There has been significant methodological development in the literature on productivity and efficiency analysis, e.g. SFA (Stochastic Frontier Analysis) and DEA (Data Envelopment Analysis). All these methodological developments should be matched with applications in order to provide practical implications for private and public decision-makers. This volume provides a collection of up-to-date and new applications of productivity and efficiency analysis. In particular, the case studies cover various economic issues in the Asia-Pacific region. The authors analyze the performance of manufacturing firms, banks, venture capital, broadcasting firms, as well as the issues of efficiency in the education sector, regional development, and defense industry. These case studies will shed light on the potential contribution of productivity and efficiency analysis to the enhancement of economic performance.
Spatial Econometrics is a rapidly evolving field born from the joint efforts of economists, statisticians, econometricians and regional scientists. The book provides the reader with a broad view of the topic by including both methodological and application papers. Indeed the application papers relate to a number of diverse scientific fields ranging from hedonic models of house pricing to demography, from health care to regional economics, from the analysis of R&D spillovers to the study of retail market spatial characteristics. Particular emphasis is given to regional economic applications of spatial econometrics methods with a number of contributions specifically focused on the spatial concentration of economic activities and agglomeration, regional paths of economic growth, regional convergence of income and productivity and the evolution of regional employment. Most of the papers appearing in this book were solicited from the International Workshop on Spatial Econometrics and Statistics held in Rome (Italy) in 2006.
E. Dijkgraaf and R. H. J. M. Gradus. 1.1 Introduction. In 2004 Elbert Dijkgraaf finished a PhD thesis, 'Regulating the Dutch waste market', at the Erasmus University Rotterdam. It was striking that little had been published about the waste market, although it is a very important sector from an economic and environmental viewpoint. In 2006 we were participants at a very interesting conference on Local Government Reform: privatization and public-private collaboration in Barcelona, organized by Germà Bel. It was interesting to notice that researchers from Spain, the Scandinavian countries, the UK and the USA were studying this issue as well. From this we brought forward the idea to publish a book about the waste market. Because of its legal framework we want to focus on Europe. In this chapter we give an introduction to this book. In the next paragraph we present a short overview of the waste collection market. Since 1960 the importance of the waste sector has increased substantially, both in the waste streams and in the costs of waste collection and treatment. Furthermore, we discuss policy measures to deal with these increases and give an overview of the different measures in European countries. In the last paragraph we present the different chapters of our book. 1.2 Empirical Update of the Waste Collection Market. The Dutch case provides a nice example of why studying the waste market is interesting from an economic point of view.
This second edition sees the light three years after the first one: too short a time to feel seriously concerned to redesign the entire book, but sufficient to be challenged by the prospect of sharpening our investigation of the working of econometric dynamic models, and to be inclined to change the title of the new edition by dropping the "Topics in" of the former edition. After considerable soul-searching we agreed to include several results related to topics already covered, as well as additional sections devoted to new and sophisticated techniques, which hinge mostly on the latest research work on linear matrix polynomials by the second author. This explains the growth of chapter one and the deeper insight into representation theorems in the last chapter of the book. The role of the second chapter is to provide a bridge between the mathematical techniques in the backstage and the econometric profiles in the forefront of dynamic modelling. For this purpose, we decided to add a new section where the reader can find the stochastic rationale of vector autoregressive specifications in econometrics. The third (and last) chapter improves on that of the first edition by reaping the fruits of the thorough analytic equipment previously drawn up.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about the vagueness of expectations appears adequate to real situations, in which the coalitional bargaining anticipates a proper realization of the game with strategic behaviour of the players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision-making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which has recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set theoretical approaches to game theoretical models of human behaviour.
The present book is a collection of panel data papers, both theoretical and applied. Theoretical topics include methodology papers on panel data probit models, treatment models, error component models with an ARMA process on the time-specific effects, asymptotic tests for poolability and their bootstrapped versions, confidence intervals for doubly heteroskedastic stochastic production frontiers, estimation of semiparametric dynamic panel data models, and a review of survey attrition and nonresponse in the European Community Household Panel. Applications cover topics as diverse as the impact of uncertainty on UK investment, a Tobin-q investment model using US firm data, cost efficiency of Spanish banks, immigrant integration in Canada, the dynamics of individual health in the UK, the relation between inflation and growth among OECD and APEC countries, technical efficiency of cereal farms in England, and employment effects of education for disabled workers in Norway.
Education and training are key to explain the current competitive strengths of national economies. While in the past educational and training institutions were often seen as providers of necessary skills for national economies, this view has changed, with education and training now being seen as a key ingredient for international competitiveness. This collection of papers on various aspects of the economics of education and training reflects this new interest.
Shedding light on some of the most pressing open questions in the analysis of high frequency data, this volume presents cutting-edge developments in high frequency financial econometrics. Coverage spans a diverse range of topics, including market microstructure, tick-by-tick data, bond and foreign exchange markets, and large dimensional volatility modeling. The volume is of interest to graduate students, researchers, and industry professionals.
This book examines non-Gaussian distributions. It addresses the causes and consequences of non-normality and time dependency in both asset returns and option prices. The book is written for non-mathematicians who want to model financial market prices so the emphasis throughout is on practice. There are abundant empirical illustrations of the models and techniques described, many of which could be equally applied to other financial time series.
World-renowned experts in spatial statistics and spatial econometrics present the latest advances in specification and estimation of spatial econometric models. This includes information on the development of tools and software, and various applications. The text introduces new tests and estimators for spatial regression models, including discrete choice and simultaneous equation models. The performance of techniques is demonstrated through simulation results and a wide array of applications related to economic growth, international trade, knowledge externalities, population-employment dynamics, urban crime, land use, and environmental issues. An exciting new text for academics with a theoretical interest in spatial statistics and econometrics, and for practitioners looking for modern and up-to-date techniques.
In this book leading German econometricians in different fields present survey articles of the most important new methods in econometrics. The book gives an overview of the field and it shows progress made in recent years and remaining problems.
This text provides a new approach to the subject, including a comprehensive survey of novel theoretical approaches, methods, and models used in macroeconomics and macroeconometrics. The book gives extensive insight into economic policy, incorporates a strong international perspective, and offers a broad historical perspective.