Individuals and families make key decisions that impact many aspects of financial stability and determine the future of the economy. These decisions involve balancing current sacrifice against future benefits. People have to decide how much to invest in health care, exercise, their diet, and insurance. They must decide how much debt to take on, and how much to save. And they make choices about jobs that determine employment and unemployment levels. "Forward-Looking Decision Making" is about modeling this individual or family-based decision making using an optimizing dynamic programming model. Robert Hall first reviews ideas about dynamic programs and introduces new ideas about numerical solutions and the representation of solved models as Markov processes. He surveys recent research on the parameters of preferences--the intertemporal elasticity of substitution, the Frisch elasticity of labor supply, and the Frisch cross-elasticity. He then examines dynamic programming models applied to health spending, long-term care insurance, employment, entrepreneurial risk-taking, and consumer debt. Linking theory with data and applying them to real-world problems, "Forward-Looking Decision Making" uses dynamic optimization programming models to shed light on individual behaviors and their economic implications.
Econometrics occupies a prominent position in the empirical validation of economic hypotheses and theories. Knowledge of econometric methods is now a prerequisite in many fields, such as business-cycle analysis, policy simulation, financial market analysis, regional economics, and market research. The authors focus on recent developments in time series analysis, panel econometrics, and robust statistics that can be applied to advantage in empirically grounded economic analyses.
What happens to risk as the economic horizon goes to zero and risk is seen as an exposure to a change in state that may occur instantaneously at any time? All activities that have been undertaken statically at a fixed finite horizon can now be reconsidered dynamically at a zero time horizon, with arrival rates at the core of the modeling. This book, aimed at practitioners and researchers in financial risk, delivers the theoretical framework and various applications of the newly established dynamic conic finance theory. The result is a nonlinear non-Gaussian valuation framework for risk management in finance. Risk-free assets disappear and low risk portfolios must pay for their risk reduction with negative expected returns. Hedges may be constructed to enhance value by exploiting risk interactions. Dynamic trading mechanisms are synthesized by machine learning algorithms. Optimal exposures are designed for option positioning simultaneously across all strikes and maturities.
"Econometric Modeling" provides a new and stimulating introduction to econometrics, focusing on modeling. The key issue confronting empirical economics is to establish sustainable relationships that are both supported by data and interpretable from economic theory. The unified likelihood-based approach of this book gives students the required statistical foundations of estimation and inference, and leads to a thorough understanding of econometric techniques. David Hendry and Bent Nielsen introduce modeling for a range of situations, including binary data sets, multiple regression, and cointegrated systems. In each setting, a statistical model is constructed to explain the observed variation in the data, with estimation and inference based on the likelihood function. Substantive issues are always addressed, showing how both statistical and economic assumptions can be tested and empirical results interpreted. Important empirical problems such as structural breaks, forecasting, and model selection are covered, and Monte Carlo simulation is explained and applied. "Econometric Modeling" is a self-contained introduction for advanced undergraduate or graduate students. Throughout, data illustrate and motivate the approach, and are available for computer-based teaching. Technical issues from probability theory and statistical theory are introduced only as needed. Nevertheless, the approach is rigorous, emphasizing the coherent formulation, estimation, and evaluation of econometric models relevant for empirical research.
This collection of essays honors a remarkable man and his work. Erik Thorbecke has made significant contributions to the microeconomic and the macroeconomic analysis of poverty, inequality and development, ranging from theory to empirics and policy. The essays in this volume display the same range. As a collection they make the fundamental point that deep understanding of these phenomena requires both the micro and the macro perspectives together, utilizing the strengths of each but also the special insights that come when the two are linked together. After an overview section, which contains the introductory chapter and a chapter examining the historical roots of Erik Thorbecke's motivations, the essays in this volume are grouped into four parts, each part identifying a major strand of Erik's work: Measurement of Poverty and Inequality; Micro Behavior and Market Failure; SAMs and CGEs; and Institutions and Development. The range of topics covered in the essays, written by leading authorities in their own areas, highlights the extraordinary depth and breadth of Erik Thorbecke's influence in research and policy on poverty, inequality and development. Acknowledgements: These papers were presented at a conference in honor of Erik Thorbecke held at Cornell University on October 10-11, 2003. The conference was supported by the funds of the H. E. Babcock Chair in Food, Nutrition and Public Policy, and the T. H. Lee Chair in World Affairs at Cornell University.
Ragnar Frisch (1895-1973) played a major role in the foundation of econometrics as a discipline. Joint winner with Jan Tinbergen of the first Nobel prize in economics, he exerted a strong influence both on its development in the 1930s and, as editor of Econometrica for more than twenty years, its subsequent growth following the Second World War. Beginning with his early contributions to utility measurement and index problems, macrodynamics and econometric methods, this outstanding collection of his essays also features his later work on macroeconomic models and planning methods, including programming techniques and preference functions. Edited with a biographical introduction by Olav Bjerkholt, this two volume set makes many important papers readily accessible and provides a broad assessment of the work of a great econometrician.
Sociological theories of crime include: theories of strain, which blame crime on personal stressors; theories of social learning, which blame crime on its social rewards and see crime more as an institution in conflict with other institutions than as individual deviance; and theories of control, which regard crime as natural and rewarding, and explore the formation of institutions that control crime. Theorists of corruption generally agree that corruption is an expression of the patron-client relationship, in which a person with access to resources trades resources with kin and members of the community in exchange for loyalty. Some approaches to modeling crime and corruption do not involve an explicit simulation: rule-based systems; Bayesian networks; game-theoretic approaches, often based on rational choice theory; and neoclassical econometrics, a rational-choice-based approach. Simulation-based approaches take into account greater complexities of the interacting parts of social phenomena. These include fuzzy cognitive maps and fuzzy rule sets that may incorporate feedback; and agent-based simulation, which can go a step farther by computing new social structures not previously identified in theory. The latter include cognitive agent models, in which agents learn how to perceive their environment and act upon the perceptions of their individual experiences; and reactive agent simulation, which, while less capable than cognitive-agent simulation, is adequate for testing a policy's effects within existing societal structures. For example, NNL is a cognitive agent model based on the Repast Simphony toolkit.
This two volume set brings together a key selection of papers written by Jacques J. Polak over the last 50 years in the fields of economics, econometrics and finance. Presented under five broad headings, the collection begins with his work on international and national business cycles, a subject on which the author worked with Nobel Prize winner Jan Tinbergen, and on problems of international trade and balance of payments adjustment. Later sections examine exchange rates and how they affect the balance of payments, inflation and hyperinflation; the monetary approach to the balance of payments, a subject that the author pioneered in the IMF and that became the framework of the conditionality of IMF credits; and international liquidity, with particular reference to the special drawing right (SDR). The final section features the author's essays on the international monetary system itself, including topics such as the international coordination of national economic policies, the changes over time in the objectives of national policy making in the main industrial countries, and reform of the system. Economic Theory and Financial Policy will be welcomed by researchers, students and practitioners concerned with economics, government finance, banking and international economic relations.
This important reference work offers readers, researchers and students a thoughtful, balanced selection of core articles from the voluminous literature on panel data. The Econometrics of Panel Data will be welcomed by econometricians and economists as a central reference point and guide to current thinking. The first volume features work on the variance components model, its extensions and applications, estimation of variances, dynamic models, instrumental variable estimators and random coefficient models. The second volume covers errors in variables and incomplete data, specification tests, limited dependent variables, frontier production functions and some practical problems with panel data. G.S. Maddala has chosen a series of key contributions by leading econometricians which guide the reader through the literature. As well as reproducing the central articles and papers, intact with their original pagination, the editor provides a comprehensive introduction and additional references which will allow students and researchers to pursue their studies further.
This guidebook provides tools for disaggregated data production, analysis, and communication relevant for measuring progress in line with the 2030 Agenda for Sustainable Development. The "leave no one behind" principle espoused by the 2030 Agenda requires measures of progress for different segments of the population. This entails detailed disaggregated data to identify subgroups that might be falling behind, to ensure progress toward achieving the Sustainable Development Goals (SDGs). ADB and the Statistics Division of the United Nations Department of Economic and Social Affairs developed this practical guidebook with tools to collect, compile, analyze, and disseminate disaggregated data. It also provides materials on issues and experiences of countries regarding data disaggregation for the SDGs.
How do groups form, how do institutions come into being, and when do moral norms and practices emerge? This volume explores how game-theoretic approaches can be extended to consider broader questions that cross scales of organization, from individuals to cooperatives to societies. Game theory's strategic formulation of central problems in the analysis of social interactions is used to develop multi-level theories that examine the interplay between individuals and the collectives they form. The concept of cooperation is examined at a higher level than that usually addressed by game theory, especially focusing on the formation of groups and the role of social norms in maintaining their integrity, with positive and negative implications. The authors suggest that conventional analyses need to be broadened to explain how heuristics, like concepts of fairness, arise and become formalized into the ethical principles embraced by a society.
The main theme of this volume is credit risk and credit derivatives. Recent developments in financial markets show that appropriate modeling and quantification of credit risk is fundamental in the context of modern complex structured financial products. The reader will find several points of view on credit risk when looked at from the perspective of Econometrics and Financial Mathematics. The volume consists of eleven contributions by both practitioners and theoreticians with expertise in financial markets, in general, and econometrics and mathematical finance in particular. The challenge of modeling defaults and their correlations is addressed, and new results on copula, reduced form and structural models, and the top-down approach are presented. After the so-called subprime crisis that hit global markets in the summer of 2007, the volume is very timely and will be useful to researchers in the area of credit risk.
Econometric issues have provoked a lively and sometimes adversarial debate in the economics profession. The excitement and intellectual vitality of that debate is captured here for the reader in a lucid overview of econometric approaches, describing their advantages and limitations. This ambitious book focuses on the underlying methodological issues rather than concentrating upon econometric techniques. The limits of econometric investigations are identified through a critical appraisal of three different approaches associated with the work of Professors Hendry, Leamer and Sims. After explaining why the early optimism in econometrics was misplaced, it argues that rejection is not an appropriate response. It offers a rich spectrum of approaches to a problem of central importance in the development of modern economics. The book will appeal not only to all econometricians whatever their persuasion but also to all those with an interest in the methodology of economics.
Many problems in statistics and econometrics lend themselves naturally to solution by optimization heuristics. The book opens with an overview of optimization in statistics and econometrics, followed by a detailed discussion of a relatively new and very powerful optimization heuristic, threshold accepting. The final part consists of many applications of the methods described earlier, encompassing experimental design, model selection, aggregation of time series, and censored quantile regression models. Those researching and working in econometrics, statistics and operations research are given the tools to apply optimization heuristic methods in their work. Postgraduate students of statistics and econometrics will find the book provides a good introduction to optimization heuristic methods.
"Nonparametric Econometrics" by Li and Racine is a must for any serious econometrician or statistician who is working on cutting-edge problems. The theoretical treatment of nonparametric methods is remarkably complete in its coverage of mainstream and relatively arcane topics. I particularly like Li and Racine's general treatment of continuous and discrete regressors and of specification testing, topics that I have not seen handled in such a comprehensive fashion. I will certainly use this in my graduate econometrics courses and in conducting my own research."--Robin Sickles, Rice University "Very few studies have tried to apply nonparametric techniques to analyze real data. The lack of applications of those techniques is perhaps attributable to the lack of a good textbook that explains intuitively how and why those techniques work. This book by Li and Racine serves both applied researchers and graduate students. It is written in plain language so that it can be understood by anyone with basic econometrics but zero knowledge of nonparametric methods. And it contains enough specifics to clearly spell out the steps to implement those methods."--Chunrong Ai, University of Florida "This book represents a very significant contribution to the field of econometrics. It provides an extremely thorough coverage of our knowledge in the area of nonparametric and semiparametric methods as they apply to economic models and economic data. And it makes accessible, for the first time, a body of relatively new material relating to discrete and 'mixed' data. There is a good balance of theoretical material and applications. Apart from serving as a superb teaching text in graduate-level courses where the students have a strong econometrics/statistics preparation, I believe this book will become a must-have reference resource for many researchers."--David E. Giles, University of Victoria
"Asset Pricing Theory" is an advanced textbook for doctoral students and researchers that offers a modern introduction to the theoretical and methodological foundations of competitive asset pricing. Costis Skiadas develops in depth the fundamentals of arbitrage pricing, mean-variance analysis, equilibrium pricing, and optimal consumption/portfolio choice in discrete settings, but with emphasis on geometric and martingale methods that facilitate an effortless transition to the more advanced continuous-time theory. Among the book's many innovations are its use of recursive utility as the benchmark representation of dynamic preferences, and an associated theory of equilibrium pricing and optimal portfolio choice that goes beyond the existing literature. "Asset Pricing Theory" is complete with extensive exercises at the end of every chapter and comprehensive mathematical appendixes, making this book a self-contained resource for graduate students and academic researchers, as well as mathematically sophisticated practitioners seeking a deeper understanding of concepts and methods on which practical models are built. Covers in depth the modern theoretical foundations of competitive asset pricing and consumption/portfolio choice. Uses recursive utility as the benchmark preference representation in dynamic settings. Sets the foundations for advanced modeling using geometric arguments and martingale methodology. Features self-contained mathematical appendixes. Includes extensive end-of-chapter exercises.