This trusted textbook returns in its 4th edition with even more exercises to help consolidate understanding, and a companion website featuring additional materials, including a solutions manual for instructors. Offering a unique blend of theory and practical application, it provides ideal preparation for doing applied econometric work, taking students from a basic level to an advanced understanding in an intuitive, step-by-step fashion. Clear presentation of economic tests and methods of estimation is paired with practical guidance on using several types of software package. Using real-world data throughout, the authors emphasize the interpretation of results and the conclusions to be drawn from them in econometric work. This book will be essential reading for economics undergraduate and master's students taking a course in applied econometrics, and its practical nature makes it ideal for modules requiring a research project.

New to this edition:
- Additional practical exercises throughout to help consolidate understanding
- A freshly updated companion website featuring a new solutions manual for instructors
This book grew out of a conference on the state of the art and recent advances in efficiency and productivity analysis. Papers were commissioned from leading researchers in the field, and include eight explorations into the analytical foundations of efficiency and productivity analysis. Chapters on modeling advances include a reverse directional distance function, a new method for estimating technological production possibilities, a new distance function called a loss distance function, an analysis of productivity and price recovery indices, the relation of technical efficiency measures to productivity measures, the implications for benchmarking and target setting of imposing weight restrictions on DEA models, weight restrictions in a regulatory setting, and the Principle of Least Action. Chapters on empirical applications include a study of innovative firms that use innovation inputs to produce innovation outputs; a study of the impact of potential "coopetition", or cooperation among competitors, on the financial performance of European automobile plants; the use of SFA to estimate the eco-efficiency of dairy farms in Spain; a DEA bankruptcy prediction model; a combined stochastic cost frontier/mixture hazard model; the evolution of energy intensity in nine Spanish manufacturing industries; and the productivity of US farmers as they age.
The three volumes of the "Collected Scientific Works of David Cass" are ordered chronologically, which happens to coincide with the development of the three major advances in Cass' research agenda: the neoclassical growth model, the discovery of sunspot equilibria, and the analysis of models of market incompleteness. This volume consists of Cass' early work from his time in graduate school at Stanford University studying under Hirofumi Uzawa, as an assistant professor at Yale's Cowles Foundation, and during his tenure at Carnegie Mellon University's Graduate School of Industrial Administration. The work in this volume focuses primarily on Cass' contributions to what is now known as the Ramsey-Cass-Koopmans neoclassical growth model, and the development of what is now known as the Cass criterion for determining whether intertemporal allocations are efficient. This period also includes Cass' early work on overlapping generations models, asset pricing models, and methodological contributions in dynamic systems applications in economics.
The three volumes of the "Collected Scientific Works of David Cass" are ordered chronologically, which happens to coincide with the development of the three major advances in Cass' research agenda: the neoclassical growth model, the discovery of sunspot equilibria, and the analysis of models of market incompleteness. This volume consists of the work Cass completed after leaving Carnegie Mellon for the University of Pennsylvania's Economics Department (where he remained for the rest of his career). The work during this period encompasses his well-known collaboration with Karl Shell and Yves Balasko on overlapping generations models, and his development with Shell of the notion of 'sunspot equilibria': rational expectations equilibria which are essentially self-fulfilling prophecies. This period also saw the beginnings of Cass' pioneering research into the theory of incomplete markets, which grew naturally from his early interest in models of asset pricing, and includes the paper which developed what is now known as the Cass trick for analyzing incomplete markets models.
The three volumes of the "Collected Scientific Works of David Cass" are ordered chronologically, which happens to coincide with the development of the three major advances in Cass' research agenda: the neoclassical growth model, the discovery of sunspot equilibria, and the analysis of models of market incompleteness. This volume covers the period from the mid-1980s through the end of Cass' life in 2008. Cass' research during this period included definitive papers showing that competitive equilibrium is generically indeterminate when markets are incomplete, and on the relationship between market incompleteness and the existence of sunspot equilibria. This period also saw follow-on papers addressing how financial innovation affects economic welfare, showing in particular that innovation can lead to welfare losses as well as gains, depending on the nature of the innovation.
During the subprime crisis of 2007 and the global financial crisis of 2008-2009, most US and European financial markets saw significant declines, corrections and structural changes. Furthermore, the crisis was rapidly transmitted to developed and emerging countries alike and strongly affected the whole economy. This volume presents recent research in linear and nonlinear modelling of economic and financial time series. The discussions of empirical results in its chapters help to improve understanding of the financial mechanisms inherent to this crisis, and they also provide an important overview of the sources of the financial crisis and its main economic and financial consequences. The book gives the audience a comprehensive understanding of financial and economic dynamics in various aspects using modern financial econometric methods. It addresses the empirical techniques needed by economic agents to analyze the dynamics of these markets and illustrates how they can be applied to actual data. It also presents and discusses new research findings and their implications.
This volume is a collection of methodological developments and applications of simulation-based methods presented at a workshop at Louisiana State University in November 2009. The first two papers are extensions of the GHK simulator: one reconsiders the computation of the probabilities in a discrete choice model, while the other uses an adaptive version of sparse-grids integration (SGI) instead of simulation. Two studies focus specifically on methodology: the first compares the performance of the maximum-simulated likelihood (MSL) approach with a proposed composite marginal likelihood (CML) approach in multivariate ordered-response situations, while the second examines methods of testing for the presence of heterogeneity in the heterogeneity model. Further topics examined include: education savings accounts, parent contributions and education attainment; estimating the effect of exchange rate flexibility on financial account openness; estimating a fractional response model with a count endogenous regressor; and Bayesian modelling and forecasting of volatility.
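For readers unfamiliar with the GHK simulator mentioned above, here is a minimal, hedged sketch of the core idea: estimating an orthant probability P(Y1 < b1, ..., Yd < bd) for a mean-zero multivariate normal by recursive truncated-normal sampling. The function name and test values are illustrative, not taken from the volume.

```python
import numpy as np
from scipy.stats import norm

def ghk_orthant_prob(b, Sigma, n_draws=10_000, seed=0):
    """Minimal GHK estimate of P(Y1 < b1, ..., Yd < bd) for Y ~ N(0, Sigma).

    Draws each latent factor from a truncated standard normal and
    averages the products of the truncation weights.
    """
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(Sigma)          # Y = L @ eta, eta ~ iid N(0, 1)
    d = len(b)
    eta = np.zeros((n_draws, d))
    weights = np.ones(n_draws)
    for j in range(d):
        # Upper truncation point for eta_j given the earlier draws
        upper = (b[j] - eta[:, :j] @ L[j, :j]) / L[j, j]
        p = norm.cdf(upper)                # mass below the truncation point
        weights *= p                       # accumulate the GHK weight
        u = rng.uniform(0.0, p)            # uniform on (0, Phi(upper))
        eta[:, j] = norm.ppf(u)            # inverse-CDF truncated draw
    return weights.mean()

# Example: bivariate normal with correlation 0.5;
# the exact value of P(Y1 < 0, Y2 < 0) is 1/4 + arcsin(0.5)/(2*pi) = 1/3
Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
print(ghk_orthant_prob(np.array([0.0, 0.0]), Sigma))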
Volume 1 covers statistical methods related to unit roots, trend breaks and their interplay. Testing for unit roots has been a topic of wide interest, and the author was at the forefront of this research. The book covers important topics such as the Phillips-Perron unit root test and theoretical analyses of its properties, how this and other tests could be improved, the ingredients needed to achieve better tests, and the proposal of a new class of tests. Also included are theoretical studies related to time series models with unit roots and the effect of span versus sampling interval on the power of the tests. Moreover, this volume deals with the issue of trend breaks and their effect on unit root tests. The research agenda fostered by the author showed that trend breaks and unit roots can easily be confused, hence the need for the new testing procedures that are covered.

Volume 2 is about statistical methods related to structural change in time series models. The approach adopted is off-line, whereby one tests for structural change using a historical dataset. A distinctive feature is the allowance for multiple structural changes. The methods discussed have been, and continue to be, applied in a variety of fields including economics, finance, life science, physics and climate change. The articles included address issues of estimation, testing and/or inference in a variety of models: short-memory regressors and errors, trends with integrated and/or stationary errors, autoregressions, cointegrated models, multivariate systems of equations, endogenous regressors, long-memory series, among others. Other issues covered include the problem of non-monotonic power and the pitfalls of adopting a local asymptotic framework. Empirical analyses are provided for the US real interest rate, US GDP, the volatility of asset returns and climate change.
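As a hedged illustration of the kind of test covered in Volume 1, the sketch below runs a Phillips-Perron unit root test on a simulated random walk using the third-party arch package (an assumed dependency; statsmodels' adfuller is a common alternative for the related ADF test).

```python
import numpy as np
from arch.unitroot import PhillipsPerron

# Simulate a random walk, which has a unit root by construction
rng = np.random.default_rng(42)
y = np.cumsum(rng.standard_normal(500))

pp = PhillipsPerron(y)       # test statistic with automatic lag selection
print(pp.summary())          # statistic, p-value, critical values
# A large p-value means the unit-root null cannot be rejected here
```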
This volume presents new developments in the methodology and practice of computable general equilibrium (CGE) techniques as they apply to recent issues in international trade policy. It will be of interest to academic researchers working in trade policy analysis and applied general equilibrium, advanced graduate students in international economics, applied researchers in multilateral organizations, and policymakers who need to work with and interpret the results of CGE analysis.
This book develops a machine-learning framework for predicting economic growth. It can also be considered a primer for using machine learning (also known as data mining or data analytics) to answer economic questions. While machine learning itself is not a new idea, advances in computing technology, combined with a dawning realization of its applicability to economic questions, make it a new tool for economists.
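As a minimal, hedged sketch of what such a prediction pipeline can look like (an illustrative example, not the book's own framework), the following fits a random forest to synthetic country-level features; all variable names and data are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Synthetic features: investment share, schooling, trade openness (illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(500, 3))
growth = 2.0 * X[:, 0] + 1.0 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.3, 500)

X_train, X_test, y_train, y_test = train_test_split(X, growth, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```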
Presenting an economic perspective on deforestation in the Brazilian Amazon, this study utilizes economic and ecological data from 1970 to 1996. It examines the extent to which land clearing promotes economic activity and growth and analyzes policies such as road building and subsidized credit. It explores whether the economic benefits of land clearing surpass the ecological costs and considers the viability of extractivism as an alternative to deforestation.
This book presents essential tools for modelling non-linear time series. The first part describes the main standard tools of probability and statistics that apply directly in the time series context to obtain a wide range of modelling possibilities. Functional estimation and the bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity through polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shift time series models. Finally, the volume focuses on limit theory, starting with the ergodic theorem, which is seen as the first step for the statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long- or short-range dependence (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit theorems) are described under SRD; mixing and weak dependence are also reviewed. In closing, it describes moment techniques together with their relations to cumulant sums, as well as an application to kernel-type estimation.

The appendix reviews basic facts from probability theory, discusses useful laws stemming from the Gaussian law, and is completed by the R scripts used for the figures. Richly illustrated with examples and simulations, the book is recommended for advanced master's courses for mathematicians just entering the field of time series, and for statisticians who want more mathematical insight into the background of non-linear time series.
This second edition retains the positive features of being clearly written, well organized, and incorporating calculus in the text, while adding expanded coverage of game theory, experimental economics, and behavioural economics. It remains more focused and manageable than similar textbooks, providing a concise yet comprehensive treatment of the core topics of microeconomics, including theories of the consumer and of the firm, market structure, partial and general equilibrium, and market failures caused by public goods, externalities and asymmetric information. The book includes helpful solved problems in all the substantive chapters, as well as over seventy new mathematical exercises and enhanced versions of those in the first edition. The authors make full use of the book's color format with sharp and helpful graphs and illustrations. This mathematically rigorous textbook is meant for students at the intermediate level who have already had introductory courses in microeconomics and calculus.
This book seeks new perspectives on the growing inequalities that our societies face, putting forward Structured Additive Distributional Regression as a means of statistical analysis that circumvents the common problem of analytical reduction to simple point estimators. This new approach allows the observed discrepancy between individuals' realities and the abstract representation of those realities by the arithmetic mean alone to be explicitly taken into consideration. The method is then applied to the question of economic inequality in Germany.
The book addresses the problem of calculating d-dimensional integrals (conditional expectations) in filtering problems. It develops new methods of deterministic numerical integration, which can be used to speed up and stabilize filter algorithms. With the help of these methods, better estimates and predictions of latent variables are made possible in the fields of economics, engineering and physics. The resulting procedures are tested in four detailed simulation studies.
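As a hedged, generic illustration of deterministic integration in this spirit (a standard textbook rule, not necessarily the book's own method), the sketch below approximates a conditional expectation E[f(X)] for Gaussian X with Gauss-Hermite quadrature.

```python
import numpy as np

def gauss_hermite_expectation(f, mu, sigma, n=20):
    """Approximate E[f(X)] for X ~ N(mu, sigma^2) by Gauss-Hermite quadrature.

    The change of variables x = mu + sqrt(2) * sigma * t turns the Gaussian
    integral into the Hermite weight form, with a 1/sqrt(pi) factor.
    """
    t, w = np.polynomial.hermite.hermgauss(n)   # nodes and weights
    return (w * f(mu + np.sqrt(2.0) * sigma * t)).sum() / np.sqrt(np.pi)

# Check: E[X^2] for X ~ N(1, 2^2) is mu^2 + sigma^2 = 5
print(gauss_hermite_expectation(lambda x: x**2, mu=1.0, sigma=2.0))
```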
Central Bank Balance Sheet and Real Business Cycles argues that a deeper comprehension of changes to the central bank balance sheet can lead to more effective policymaking. Any transaction the central bank engages in (issuing currency, conducting foreign exchange operations, investing its own funds, intervening to provide emergency liquidity assistance, or carrying out monetary policy operations) influences its balance sheet. Despite this, many central banks throughout the world have largely ignored balance sheet movements and have instead focused on setting interest rates. In this book, Mustapha Abiodun Akinkunmi highlights the challenges and controversies faced by central banks in the past and present when implementing policies, and analyzes the links between these policies, the central bank balance sheet, and the consequences for economies as a whole. He argues that the composition and evolution of the central bank balance sheet provide a valuable basis for understanding the needs of an economy, and are an important tool in developing strategies that would most effectively achieve policy goals. This book is an important resource for anyone interested in monetary policy or whose work is affected by the policies of central banks.
This is the second volume in a two-part series on frontiers in regional research. It identifies methodological advances as well as trends and future developments in regional systems modelling and open science. Building on recent methodological and modelling advances, as well as on extensive policy-analysis experience, top international regional scientists identify and evaluate emerging conceptual and methodological trends and directions in regional research. Topics such as dynamic interindustry modelling, computable general equilibrium models, exploratory spatial data analysis, geographic information science, spatial econometrics and other advanced methods are the central focus of this book. The volume provides insights into the latest developments in object orientation, open source, and workflow systems, all in support of open science. It will appeal to a wide readership, from regional scientists and economists to geographers, quantitatively oriented regional planners and researchers in related disciplines. It offers a source of relevant information for academic researchers and policy analysts in government, and is also suitable for advanced courses on regional and spatial science, economics and political science.
Principles of Econometrics, 4th Edition, is an introductory econometrics text for students in economics and finance, designed to provide an understanding of why econometrics is necessary and a working knowledge of basic econometric tools. This latest edition is updated to reflect the current state of economic and financial markets and provides new content on kernel density fitting and the analysis of treatment effects. It offers new end-of-chapter questions and problems in each chapter, an updated comprehensive glossary of terms, and a summary of probability and statistics. The text applies basic econometric tools to modeling, estimation, inference, and forecasting through real-world problems, and helps readers critically evaluate the results and conclusions of others who use basic econometric tools. Furthermore, it provides a foundation for further study of econometrics and more advanced techniques.
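As a hedged aside on one of the new topics, kernel density fitting, here is a minimal sketch using scipy's gaussian_kde; the data and evaluation grid are illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Illustrative sample, e.g. residuals or returns
rng = np.random.default_rng(1)
sample = rng.normal(loc=0.0, scale=1.0, size=1_000)

kde = gaussian_kde(sample)          # Gaussian kernel, Scott's-rule bandwidth
grid = np.linspace(-4, 4, 9)
print(kde(grid))                    # estimated density on the grid
```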
This book provides the tools and concepts necessary to study the behavior of econometric estimators and test statistics in large samples. An econometric estimator is a solution to an optimization problem; that is, a problem that requires a body of techniques to determine the specific solution, within a defined set of possible alternatives, that best satisfies a selected objective function or set of constraints. Thus, this highly mathematical book investigates large-sample situations in which the assumptions of the classical linear model fail. Economists, of course, face these situations often.
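To make the "estimator as the solution to an optimization problem" idea concrete, here is a hedged minimal sketch (my example, not the book's) that recovers OLS coefficients by numerically minimizing a sum-of-squares objective.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data from y = 1 + 2x + noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 1.0 + 2.0 * x + rng.normal(0, 0.1, 200)

def ssr(beta):
    """Objective function: sum of squared residuals."""
    return ((y - beta[0] - beta[1] * x) ** 2).sum()

# The OLS estimator is the minimizer of this objective
result = minimize(ssr, x0=np.zeros(2))
print(result.x)   # approximately [1.0, 2.0]
```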
Microeconometrics Using Stata, Second Edition is an invaluable reference for researchers and students interested in applied microeconometric methods. Like previous editions, this text covers all the classic microeconometric techniques, ranging from linear models to instrumental-variables regression to panel-data estimation to nonlinear models such as probit, tobit, Poisson, and choice models. Each of these discussions has been updated to show the most modern implementation in Stata, and many include additional explanation of the underlying methods. In addition, the authors introduce readers to performing simulations in Stata and then use simulations to illustrate methods in other parts of the book. They even teach you how to code your own estimators in Stata. The second edition is greatly expanded: the new material is so extensive that the text now comprises two volumes. In addition to the classics, the book now teaches recently developed econometric methods and the methods newly added to Stata. Specifically, the book includes entirely new chapters on:
- duration models
- randomized control trials and exogenous treatment effects
- endogenous treatment effects
- models for endogeneity and heterogeneity, including finite mixture models, structural equation models, and nonlinear mixed-effects models
- spatial autoregressive models
- semiparametric regression
- lasso for prediction and inference
- Bayesian analysis

Anyone interested in learning classic and modern econometric methods will find this the perfect companion. And those who apply these methods to their own data will return to this reference over and over as they need to implement the various techniques described in this book.
Today, most money is credit money, created by commercial banks. While credit can finance innovation, excessive credit can lead to boom/bust cycles, such as the recent financial crisis. This highlights how the organization of our monetary system is crucial to stability. One way to achieve such stability is by separating the unit of account from the medium of exchange; in pre-modern Europe, such a separation existed. This volume examines the idea of monetary separation and the history of monetary arrangements in the North and Baltic Seas region, from the Hanseatic League onwards. The book provides a theoretical analysis of four historical cases in the Baltic and North Seas region, with a view to examining the evolution of monetary arrangements from a new monetary economics perspective. Since the objective exchange value of money (its purchasing power) reflects subjective individual valuations of commodities, the author assesses these historical cases by means of exchange rates. Using theories from new monetary economics, the book explores how units of account and their media of exchange evolved as social conventions, and offers new insight into the separation between the two. Through this exploration, it puts forward that money is a social institution, a clearing device for the settlement of accounts, and that the value of money, or of a separate unit of account, ultimately results from the size of its network of users. The History of Money and Monetary Arrangements offers a highly original insight into monetary arrangements as an evolutionary process. It will be of great interest to an international audience of scholars and students, including those with an interest in economic history, evolutionary economics and new monetary economics.
"Bayesian Econometrics" illustrates the scope and diversity of modern applications, reviews some recent advances, and highlights many desirable aspects of inference and computations. It begins with an historical overview by Arnold Zellner who describes key contributions to development and makes predictions for future directions. In the second paper, Giordani and Kohn makes suggestions for improving Markov chain Monte Carlo computational strategies. The remainder of the book is categorized according to microeconometric and time-series modeling. Models considered include an endogenous selection ordered probit model, a censored treatment-response model, equilibrium job search models and various other types. These are used to study a variety of applications for example dental insurance and care, educational attainment, voter opinions and the marketing share of various brands and an aggregate cross-section production function. Models and topics considered include the potential problem of improper posterior densities in a variety of dynamic models, selection and averaging for forecasting with vector autoregressions, a consumption capital-asset pricing model and various others. Applications involve U.S. macroeconomic variables, exchange rates, an investigation of purchasing power parity, data from London Metals Exchange, international automobile production data, and data from the Asian stock market.
Nonlinear models have been used extensively in economics and finance. Recent literature on the topic has shown that a large number of series exhibit nonlinear dynamics rather than linear dynamics. Capturing these features involves specifying and estimating nonlinear time series models, which have typically taken the form of Threshold Autoregression (TAR) models, Exponential Smooth Transition Autoregression (ESTAR) models, and Markov Switching (MS) models, among several others. This edited volume provides a timely overview of nonlinear estimation techniques, offering new methods and insights into nonlinear time series analysis. It features cutting-edge research from leading academics in economics, finance, and business management, focusing on such topics as Zero-Information-Limit-Conditions, using Markov Switching models to analyze economic series, and how best to distinguish between competing nonlinear models. The principles and techniques in this book will appeal to econometricians, finance professors teaching quantitative finance, researchers, and graduate students interested in learning how to apply advances in nonlinear time series modeling to solve complex problems in economics and finance.
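To make the TAR idea concrete, here is a hedged minimal sketch (illustrative parameters, not taken from the volume) simulating a two-regime threshold autoregression in which the dynamics switch on the sign of the previous observation.

```python
import numpy as np

# Two-regime TAR(1): the AR coefficient depends on the sign of y[t-1]
rng = np.random.default_rng(7)
n = 500
y = np.zeros(n)
for t in range(1, n):
    phi = 0.8 if y[t - 1] >= 0 else -0.4    # regime-dependent coefficient
    y[t] = phi * y[t - 1] + rng.normal(0, 1)

# Regime-wise persistence differs, the signature of threshold dynamics
pos = y[:-1] >= 0
print("share of time above threshold:", pos.mean())
```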
Davidson and MacKinnon have written an outstanding textbook for graduate students in econometrics, covering both basic and advanced topics and using geometrical proofs throughout for clarity of exposition. The book offers a unified theoretical perspective and emphasizes the practical applications of modern theory.