The Handbook is a definitive reference source and teaching aid.
Originally published in 1971, this is a rigorous analysis of the economic aspects of the efficiency of public enterprises at the time. The author first restates and extends the relevant parts of welfare economics, and then illustrates its application to particular cases, drawing on the work of the National Board for Prices and Incomes, of which he was Deputy Chairman. The analysis is developed stage by stage, with the emphasis on applicability and ease of comprehension, rather than on generality or mathematical elegance. Financial performance, the second-best, the optimal degree of complexity of price structures and problems of optimal quality are first discussed in a static framework. Time is next introduced, leading to a marginal cost concept derived from a multi-period optimizing model. The analysis is then related to urban transport, shipping, gas and coal. This is likely to become a standard work of more general scope than the author's earlier book on electricity supply. It rests, however, on a similar combination of economic theory and high-level experience of the real problems of public enterprises.
This book provides a practical introduction to mathematics for economics using R software. Using R as a basis, this book guides the reader through foundational topics in linear algebra, calculus, and optimization. The book is organized in order of increasing difficulty, beginning with a rudimentary introduction to R and progressing through exercises that require the reader to code their own functions in R. All chapters include applications for topics in economics and econometrics. As a fully reproducible book, this volume gives readers the opportunity to learn by doing and develop research skills as they go. As such, it is appropriate for students in economics and econometrics.
Bringing together leading-edge research and innovative energy market econometrics, this book collects the author's most important recent contributions in energy economics. In particular, the book:

* applies recent advances in the field of applied econometrics to investigate a number of issues regarding energy markets, including the theory of storage and the efficient markets hypothesis
* presents the basic stylized facts on energy price movements using correlation analysis, causality tests, integration theory, cointegration theory, as well as recently developed procedures for testing for shared and codependent cycles
* uses recent advances in the financial econometrics literature to model time-varying returns and volatility in energy prices and to test for causal relationships between energy prices and their volatilities
* explores the functioning of electricity markets and applies conventional models of time series analysis to investigate a number of issues regarding wholesale power prices in the western North American markets
* applies tools from statistics and dynamical systems theory to test for nonlinear dynamics and deterministic chaos in a number of North American hydrocarbon markets (those of ethane, propane, normal butane, iso-butane, naphtha, crude oil, and natural gas)
It has been held that when economic policy makers use economic models, there is a one-way flow of information from the models to policy analysis. This text challenges that assumption, recognizing that in practice the requirements and questions of policy makers play an important role in the development and revision of those very models. Written by highly placed practitioners and academic economists, it provides a picture, with depth, insight and conviction, of how modellers and policy makers interact. It offers international case studies of particular interactions between models and policy making, exploring questions such as: How does interaction work? What roles do different professional groups play in interaction? What strategies make the use of models in policy preparation successful? What insights can sociologists and historians give on the interaction between models and policy makers?
'This is a book macroeconomists were waiting for - a lucid history of macroeconometric model building. A must read.' - Willi Semmler, New School for Social Research, US

'The authors have succeeded in orchestrating a lively debate over the scientific foundations of structural econometrics. Their book deserves a broad readership.' - From the foreword by Lawrence R. Klein

'The wide range of issues brilliantly discussed, the entertaining and accessible language, the sharpness of the authors' judgement and the constructive critical mood make this book recommended reading for anyone who feels uneasy with the empirical scope and policy significance of standard econometric methods and of mainstream economic models.' - Alessandro Vercelli, University of Siena, Italy

This challenging and original book takes a fresh, innovative look at econometrics, and re-examines the scientific standing of structural econometrics as developed by its founders (Frisch and Tinbergen) and extended by Haavelmo and the Cowles modellers (particularly Klein) during the period 1930-1960. The authors begin by rethinking the scientific foundations of structural econometrics, offering a way around the problem of induction that also justifies the assumption of a 'data generating mechanism', and of ways to model this. They go on to explain how current critiques of the methodological foundations of structural econometrics are direct consequences of implicitly accepted but seriously flawed elements in neoclassical thinking. In the final part they present their distinctive methodological contribution: a blend of fieldwork and conceptual analysis designed to ensure that their models are well grounded in reality and, at the same time, conceptually coherent as well as statistically adequate. In so doing, they outline a number of elements that will be needed to develop a 'good' macroeconometric model of an advanced economy.
Rational Econometric Man will prove a stimulating and thought-provoking read for scholars and researchers in the field of economics and, more specifically, heterodox economics.

Contents:
Foreword by Lawrence R. Klein
Introduction
Part I: From Rational Economic Man to Rational Econometric Man
1. Re-reading Hollis and Nell
2. Haavelmo Reconsidered as Rational Econometric Man
3. Induction and the Empiricist Account of General Laws
4. Variables, Laws and Induction I: Are There Laws of Nature?
5. Variables, Laws and Induction II: Scientific Variables and Scientific Laws in Economics
6. The Concept of the 'Model' and the Methodology of Model Building
Part II: The Critiques and the Foundations
7. Debating the Foundations: A New Perspective?
8. Scientific Issues in Structural Econometrics
9. Haavelmo and Beyond: Probability, Uncertainty, Specification and Stochasticism
Part III: Structural Econometrics in its Place: Mapping New Directions
10. Conceptual Analysis, Fieldwork and the Methodology of Model Building
11. Working with Open Models: Lawlike Relations and an Uncertain Future
Conclusion
References
Index
Distributional issues may not have always been among the main concerns of the economic profession. Today, at the beginning of the 2000s, the position is different. During the last quarter of a century, economic growth proved to be unsteady and rather slow on average. The situation of those at the bottom ceased to improve regularly, as it had in the preceding fast-growth and full-employment period. Europe has seen prolonged unemployment, and there has been widening wage dispersion in a number of OECD countries. Rising affluence in rich countries coexists, in a number of such countries, with the persistence of poverty. As a consequence, it is difficult nowadays to think of an issue ranking high in the public economic debate without some strong explicit distributive implications. Monetary policy, fiscal policy, taxes, monetary or trade union, privatisation, price and competition regulation, and the future of the Welfare State are all issues which are now often perceived as conflictual because of their strong redistributive content.
For more information on the Handbooks in Economics series, please see our home page on http://www.elsevier.nl/locate/hes
Volumes 45a and 45b of Advances in Econometrics honor Professor Joon Y. Park, who has made numerous and substantive contributions to the field of econometrics over a career spanning four decades since the 1980s and counting. This second volume, Essays in Honor of Joon Y. Park: Econometric Methodology in Empirical Applications, focuses on econometric applications related, some closely and some very loosely, to Professor Park’s more recent work before concluding with a retrospective summarizing four decades of Advances in Econometrics.
This study, first published in 1979, examines and contrasts two concepts of credit rationing. The first concept takes the relevant price of credit to be the explicit interest rate on the loan and defines the demand for credit as the amount an individual borrower would like to receive at that rate. Under the alternative definition, the price of credit consists of the complete set of loan terms confronting a class of borrowers with given characteristics, while the demand for credit equals the total number of loans which members of the class would like to receive at those terms. This title will be of interest to students of monetary economics.
This study, first published in 1994, is intended to deepen the reader's understanding of the phenomenon of equilibrium credit rationing in two areas. The first area concerns the form that equilibrium credit rationing assumes and its importance in determining the behaviour of interest rates. The second concerns the role of equilibrium credit rationing in transmitting monetary shocks to the real sector. This title will be of interest to students of monetary economics.
First Published in 2000. Routledge is an imprint of Taylor & Francis, an informa company.
The object of this work, first published in 1977, is to examine the history of the economic and monetary union (EMU) in the European Community, the policies of the parties involved and the conflicts of interest created in the political and economic environment within which all this has taken place. This title will be of interest to students of monetary economics and finance.
This title, first published in 1984, considers a temporary monetary equilibrium theory under certainty in a differentiable framework. Using the techniques of differential topology the author investigates the structure of the set of temporary monetary equilibria. Temporary Monetary Equilibrium Theory: A Differentiable Approach will be of interest to students of monetary economics.
This is an open access title available under the terms of a CC BY-NC-ND 4.0 International licence. It is free to read at Oxford Academic and offered as a free PDF download from OUP and selected open access locations. In October 2019, Abhijit Banerjee, Esther Duflo, and Michael Kremer jointly won the 51st Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel "for their experimental approach to alleviating global poverty." But what is the exact scope of their experimental method, known as randomized control trials (RCTs)? Which sorts of questions are RCTs able to address, and which do they fail to answer? The first of its kind, Randomized Control Trials in the Field of Development: A Critical Perspective provides answers to these questions, explaining how RCTs work, what they can achieve, why they sometimes fail, how they can be improved and why other methods are both useful and necessary. Bringing together leading specialists in the field from a range of backgrounds and disciplines (economics, econometrics, mathematics, statistics, political economy, socioeconomics, anthropology, philosophy, global health, epidemiology, and medicine), it presents a full and coherent picture of the main strengths and weaknesses of RCTs in the field of development. Looking beyond the epistemological, political, and ethical differences underlying many of the disagreements surrounding RCTs, it explores the implementation of RCTs on the ground, outside their ideal theoretical conditions, and reveals some unsuspected uses and effects: their disruptive potential, but also their political uses. The contributions uncover the implicit worldview that many RCTs draw on and disseminate, and probe the gap between the method's narrow scope and its success, while also proposing improvements and alternatives.
Without disputing the contribution of RCTs to scientific knowledge, Randomized Control Trials in the Field of Development warns against the potential dangers of their excessive use, arguing that the best use for RCTs is not necessarily that which immediately springs to mind. Written in plain language, this book offers experts and laypeople alike a unique opportunity to come to an informed and reasoned judgement on RCTs and what they can bring to development.
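The core logic of an RCT can be sketched in a few lines: because treatment is assigned at random, a simple difference in group means is an unbiased estimate of the average treatment effect. The sketch below is illustrative only and is not taken from the book; the outcome model, effect size, and sample size are all invented for the example.

```python
import random
import statistics

random.seed(42)

# Hypothetical trial: a programme raises an outcome by 0.5 units on average.
TRUE_EFFECT = 0.5
N = 2000  # participants per arm

def outcome(treated: bool) -> float:
    """Potential outcome: baseline noise plus the treatment effect if treated."""
    return random.gauss(0.0, 1.0) + (TRUE_EFFECT if treated else 0.0)

# Random assignment: one simulated arm of treated units, one of controls.
treatment = [outcome(True) for _ in range(N)]
control = [outcome(False) for _ in range(N)]

# Difference in means estimates the average treatment effect (ATE).
ate_hat = statistics.mean(treatment) - statistics.mean(control)

# Large-sample standard error for a difference in means.
se = (statistics.variance(treatment) / N + statistics.variance(control) / N) ** 0.5

print(f"estimated ATE = {ate_hat:.3f} (SE = {se:.3f}, true effect = {TRUE_EFFECT})")
```

With randomization, the estimate recovers the true effect up to sampling noise; much of the debate the book surveys concerns what happens when the idealized conditions assumed here fail in the field.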
This book brings together cutting edge contributions in the fields of international economics, micro theory, welfare economics and econometrics, with contributions from Donald R. Davis, Avinash K. Dixit, Tadashi Inoue, Ronald W. Jones, Dale W. Jorgenson, K. Rao Kadiyala, Murray C. Kemp, Kenneth M. Kletzer, Anne O. Krueger, Mukul Majumdar, Daniel McFadden, Lionel McKenzie, James R. Melvin, James C. Moore, Takashi Negishi, Yoshihiko Otani, Raymond Riezman, Paul A. Samuelson, Joaquim Silvestre and Marie Thursby.
In the theory and practice of econometrics the model, the method and the data are all interdependent links in information recovery: estimation and inference. Seldom, however, are the economic and statistical models correctly specified, the data complete or capable of being replicated, the estimation rules 'optimal' and the inferences free of distortion. Faced with these problems, Maximum Entropy Econometrics provides a new basis for learning from economic and statistical models that may be non-regular in the sense that they are ill-posed or underdetermined and the data are partial or incomplete. By extending the maximum entropy formalisms used in the physical sciences, the authors present a new set of generalized entropy techniques designed to recover information about economic systems. The authors compare the generalized entropy techniques with the performance of the relevant traditional methods of information recovery and clearly demonstrate the theory with applications.
Turbulence in Economics presents the economy as an evolutionary process, economics as a realistic science, and reintroduces history as fundamental to understanding economic processes. It examines cycles and fluctuations in economic history from the point of view of turbulence in the physical sciences (specifically hydrodynamics), and argues that an evolutionary approach is required for a better understanding of historical economic processes. Economic time is marked by a succession of long periods of economic expansion and depression, separated by deep structural changes. These periods represent distinct forms of organization of social relations, science and technology, cultural trends and political and social institutions. This is accepted by historians but rejected in orthodox economics. In this book the author challenges this and argues that the divorce between economics and history limits the ability of economics to explain reality. Within this inquiry into the crisis of orthodox economics the author considers Keynes's, Mitchell's and Schumpeter's critiques of neoclassical economics. The author then compares these to the contributions of Frisch and Wicksell, and examines recent studies of chaos, nonlinear and complex dynamics to explain the historical development of modern economics. This book will be welcomed by economic historians, historians of economic thought, institutional and evolutionary economists and those interested in chaos, complexity and modern methodology.
Portfolio theory and much of asset pricing, as well as many empirical applications, depend on the use of multivariate probability distributions to describe asset returns. Traditionally, this has meant the multivariate normal (or Gaussian) distribution. More recently, theoretical and empirical work in financial economics has employed the multivariate Student (and other) distributions which are members of the elliptically symmetric class. There is also a growing body of work which is based on skew-elliptical distributions. These probability models all exhibit the property that the marginal distributions differ only by location and scale parameters or are restrictive in other respects. Very often, such models are not supported by the empirical evidence that the marginal distributions of asset returns can differ markedly. Copula theory is a branch of statistics which provides powerful methods to overcome these shortcomings. This book provides a synthesis of the latest research in the area of copulae as applied to finance and related subjects such as insurance. Multivariate non-Gaussian dependence is a fact of life for many problems in financial econometrics. This book describes the state of the art in tools required to deal with these observed features of financial data. This book was originally published as a special issue of the European Journal of Finance.
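The copula idea described above can be made concrete with a small sketch (illustrative only, not code from the book): a Gaussian copula couples two very different marginals, here an exponential and a logistic, while borrowing the dependence structure of a bivariate normal. All parameter values are invented for the example.

```python
import math
import random

random.seed(0)

def norm_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

rho = 0.7   # dependence parameter of the Gaussian copula
n = 5000

xs, ys = [], []
for _ in range(n):
    # Step 1: a pair of correlated standard normals.
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho**2) * random.gauss(0.0, 1.0)
    # Step 2: map to uniforms through the normal CDF (this is the copula).
    u1, u2 = norm_cdf(z1), norm_cdf(z2)
    # Step 3: apply *different* marginal quantile functions.
    xs.append(-math.log(1.0 - u1))          # Exp(1) quantile function
    ys.append(math.log(u2 / (1.0 - u2)))    # standard logistic quantile function

def pearson(a, b):
    """Sample Pearson correlation coefficient."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

# The marginals differ markedly, yet the dependence survives.
print(f"sample correlation = {pearson(xs, ys):.2f}")
```

This is exactly the freedom the blurb highlights: unlike elliptical families, the copula separates the dependence structure from the choice of marginals, so asset returns with markedly different marginal behaviour can still be modelled jointly.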
In Capital Theory, Equilibrium Analysis and Recursive Utility, Robert Becker and John Boyd have synthesized their previously unpublished work on recursive models. The use of recursive utility emphasizes time-consistent decision making. This permits a unified and systematic account of economic dynamics based on neoclassical growth theory. The book provides extensive coverage of optimal growth (including endogenous growth), dynamic competitive equilibria, nonlinear dynamics, and monotone comparative dynamics. It is addressed to all researchers in economic growth, and will be useful to professional economists and graduate students alike.
This book studies information spillover among financial markets and explores the intraday effect and ACD models using high-frequency data. It also makes a theoretical contribution, providing a new statistical methodology, with comparative advantages, for analyzing co-movements between two time series. The method is applied to test for information spillover between the Chinese stock market and the international market, and between the futures market and the spot market. Using high-frequency data, the book investigates the intraday effect and examines which type of ACD model is particularly suited to capturing financial duration dynamics. It will be of invaluable use to scholars and graduate students interested in co-movements among different financial markets and in financial market microstructure, and to investors and regulation departments looking to improve their risk management.
This volume gathers peer-reviewed contributions that address a wide range of recent developments in the methodology and applications of data analysis and classification tools in micro and macroeconomic problems. The papers were originally presented at the 29th Conference of the Section on Classification and Data Analysis of the Polish Statistical Association, SKAD 2020, held in Sopot, Poland, September 7-9, 2020. Providing a balance between methodological contributions and empirical papers, the book is divided into five parts focusing on methodology, finance, economics, social issues and applications dealing with COVID-19 data. It is aimed at a wide audience, including researchers at universities and research institutions, graduate and doctoral students, practitioners, data scientists and employees in public statistical institutions.
The analysis, prediction and interpolation of economic and other time series has a long history and many applications. Major new developments are taking place, driven partly by the need to analyze financial data. The five papers in this book describe those new developments from various viewpoints and are intended to be an introduction accessible to readers from a range of backgrounds. The book arises out of the second Séminaire Européen de Statistique (SEMSTAT), held in Oxford in December 1994. This brought together young statisticians from across Europe, and a series of introductory lectures were given on topics at the forefront of current research activity. The lectures form the basis for the five papers contained in the book. The papers by Shephard and Johansen deal respectively with time series models for volatility, i.e. variance heterogeneity, and with cointegration. Clements and Hendry analyze the nature of prediction errors. A complementary review paper by Laird gives a biometrical view of the analysis of short time series. Finally, Astrup and Nielsen give a mathematical introduction to the study of option pricing. Whilst the book draws its primary motivation from financial series and from multivariate econometric modelling, the applications are potentially much broader.
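The option-pricing topic mentioned in the last paper can be illustrated with the classic Black-Scholes formula for a European call. This is a minimal sketch, not drawn from the book, and the parameter values below are invented for the example.

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Black-Scholes price of a European call option.

    S: spot price, K: strike, r: risk-free rate (continuous compounding),
    sigma: annualized volatility, T: time to expiry in years.
    """
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call: 20% volatility, 5% rate, one year to expiry.
print(round(bs_call(100, 100, 0.05, 0.2, 1.0), 4))  # ~10.45
```

The formula prices the option as the discounted risk-neutral expectation of its payoff, which is the starting point for the more general treatments of option pricing surveyed in the volume.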