This book explores the role of national fiscal policies in a selected group of Euro-area countries under the European Economic and Monetary Union (EMU). In particular, the authors characterize the response of output to fiscal consolidations and expansions in the small Euro-area open economies affected by high public and private debt. It is shown that the macroeconomic outcome of fiscal shocks is strongly related to debt levels. The Euro-area countries included in the investigation are Greece, Ireland, Italy, the Netherlands, Spain, and Portugal, over the sample period 1999-2016, i.e., the EMU period. The main econometric tools used in this research are structural vector autoregressive (VAR) models, including panel VAR models. The available literature on the subject is also fully reviewed. A further closely investigated topic is the potential spillover effect of German fiscal policies on the selected small Euro-area economies. Moreover, with a view to the evolution of the Euro Area towards a full Monetary and Fiscal Union, the authors study the effects of area-wide government spending shocks on aggregate output and other macroeconomic variables during the EMU period. The closing chapter of the book considers evidence on the consequences of austerity policies for European labour markets during recent years.
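The central tool named in this blurb, the VAR model, can be illustrated in miniature. The sketch below (Python with numpy only; all data are synthetic, and the book's structural and panel VAR analyses are considerably richer) simulates a bivariate VAR(1), estimates its coefficient matrix by OLS, and traces impulse responses to a unit shock:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1): y_t = A y_{t-1} + e_t
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.4]])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# OLS estimate of the coefficient matrix: regress y_t on y_{t-1}
A_hat = np.linalg.lstsq(y[:-1], y[1:], rcond=None)[0].T

# Impulse responses to a unit shock in the first variable
irf = [np.linalg.matrix_power(A_hat, h)[:, 0] for h in range(8)]
```

Structural identification (e.g. ordering shocks via a Cholesky factor of the residual covariance) and panel extensions build on exactly this least-squares step.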
This book highlights the latest research findings from the 46th International Meeting of the Italian Statistical Society (SIS) in Rome, during which both methodological and applied statistical research was discussed. This selection of fully peer-reviewed papers, originally presented at the meeting, addresses a broad range of topics, including the theory of statistical inference; data mining and multivariate statistical analysis; survey methodologies; analysis of social, demographic and health data; and economic statistics and econometrics.
This book covers all the topics found in introductory descriptive statistics courses, including simple linear regression and time series analysis, the fundamentals of inferential statistics (probability theory, random sampling and estimation theory), and inferential statistics itself (confidence intervals, testing). Each chapter starts with the necessary theoretical background, which is followed by a variety of examples. The core examples are based on the content of the respective chapter, while the advanced examples, designed to deepen students' knowledge, also draw on information and material from previous chapters. The enhanced online version helps students grasp the complexity and the practical relevance of statistical analysis through interactive examples and is suitable for undergraduate and graduate students taking their first statistics courses, as well as for undergraduate students in non-mathematical fields, e.g. economics, the social sciences etc.
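The confidence-interval material such a first course covers can be shown in a few lines. The snippet below computes a 95% normal-approximation confidence interval for a mean from simulated data; the sample and its parameters are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
sample = rng.normal(loc=10.0, scale=2.0, size=200)  # hypothetical measurements

mean = sample.mean()
se = sample.std(ddof=1) / np.sqrt(len(sample))      # standard error of the mean
z = 1.96                                            # 95% normal critical value
ci_low, ci_high = mean - z * se, mean + z * se
```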
This book grew out of a conference on the state of the art and recent advances in efficiency and productivity analysis. Papers were commissioned from leading researchers in the field, and include eight explorations into the analytical foundations of efficiency and productivity analysis. Chapters on modeling advances include the reverse directional distance function, a new method for estimating technological production possibilities, a new distance function called the loss distance function, an analysis of productivity and price recovery indices, the relation of technical efficiency measures to productivity measures, the implications for benchmarking and target setting of imposing weight restrictions on DEA models, weight restrictions in a regulatory setting, and the Principle of Least Action. Chapters on empirical applications include a study of innovative firms that use innovation inputs to produce innovation outputs, a study of the impact of potential "coopetition" or cooperation among competitors on the financial performance of European automobile plants, the use of SFA to estimate the eco-efficiency of dairy farms in Spain, a DEA bankruptcy prediction model, a combined stochastic cost frontier analysis/mixture hazard model, the evolution of energy intensity in nine Spanish manufacturing industries, and the productivity of US farmers as they age.
This trusted textbook returns in its 4th edition with even more exercises to help consolidate understanding - and a companion website featuring additional materials, including a solutions manual for instructors. Offering a unique blend of theory and practical application, it provides ideal preparation for doing applied econometric work, taking students from a basic level up to an advanced understanding in an intuitive, step-by-step fashion. Clear presentation of economic tests and methods of estimation is paired with practical guidance on using several types of software packages. Using real-world data throughout, the authors place emphasis on the interpretation of results and the conclusions to be drawn from them in econometric work. This book will be essential reading for economics undergraduate and master's students taking a course in applied econometrics. Its practical nature makes it ideal for modules requiring a research project. New to this edition:
- Additional practical exercises throughout to help consolidate understanding
- A freshly updated companion website featuring a new solutions manual for instructors
The three volumes of the "Collected Scientific Works of David Cass" are ordered chronologically, which happens to coincide with the development of the three major advances in Cass' research agenda: the development of the neoclassical growth model, the discovery of sunspot equilibria, and the analysis of models of market incompleteness. This volume consists of Cass' early work from his time in graduate school at Stanford University studying under Hirofumi Uzawa, as an assistant professor at Yale's Cowles Foundation, and during his tenure at Carnegie Mellon University's Graduate School of Industrial Administration. The work in this volume focuses primarily on Cass' contributions to what is now known as the Ramsey-Cass-Koopmans neoclassical growth model, and on the development of what is now known as the Cass criterion for determining whether intertemporal allocations are efficient. This period also includes Cass' early work on overlapping generations models, asset pricing models, and methodological contributions to dynamic systems applications in economics.
The three volumes of "The Collected Scientific Works of David Cass" are ordered chronologically, which happens to coincide with the development of the three major advances in Cass' research agenda: the development of the neoclassical growth model, the discovery of sunspot equilibria, and the analysis of models of market incompleteness. This volume consists of the work Cass completed after leaving Carnegie Mellon for the University of Pennsylvania's Economics Department (where he remained for the rest of his career). The work from this period encompasses his well-known collaboration with Karl Shell and Yves Balasko on overlapping generations models, and his development with Shell of the notion of 'sunspot equilibria' - rational expectations equilibria that are essentially self-fulfilling prophecies. This period also saw the beginnings of Cass' pioneering research into the theory of incomplete markets, which grew naturally from his early interest in models of asset pricing, and includes the paper that developed what is now known as the Cass trick for analyzing incomplete-markets models.
The three volumes of the "Collected Scientific Works of David Cass" are ordered chronologically, which happens to coincide with the development of the three major advances in Cass' research agenda: the development of the neoclassical growth model, the discovery of sunspot equilibria, and the analysis of models of market incompleteness. This volume covers the period from the mid-1980s through the end of Cass' life in 2008. Cass' research during this period included definitive papers showing that competitive equilibrium is generically indeterminate when markets are incomplete, and on the relationship between market incompleteness and the existence of sunspot equilibria. This period also saw follow-on papers addressing the issue of how financial innovation affects economic welfare, showing in particular that innovation can lead to welfare losses as well as gains, depending on the nature of the innovation.
This book presents a systematic overview of cutting-edge research in the field of parametric modeling of personal income and wealth distribution, which allows one to represent how income/wealth is distributed within a given population. The estimated parameters may be used to gain insights into the causes of the evolution of income/wealth distribution over time, or to interpret the differences between distributions across countries. Moreover, once a given parametric model has been fitted to a data set, one can straightforwardly compute inequality and poverty measures. Finally, estimated parameters may be used in empirical modeling of the impact of macroeconomic conditions on the evolution of personal income/wealth distribution. In reviewing the state of the art in the field, the authors provide a thorough discussion of parametric models belonging to the "κ-generalized" family, a new and fruitful set of statistical models for the size distribution of income and wealth that they have developed over several years of collaborative and multidisciplinary research. This book will be of interest to all who share the belief that problems of income and wealth distribution merit detailed conceptual and methodological attention.
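The fit-then-measure workflow this blurb describes can be sketched with a deliberately simpler stand-in for the book's own family of distributions: fit a lognormal to simulated income data by maximum likelihood, then read off the implied Gini coefficient from its closed form, G = 2Φ(σ/√2) − 1 = erf(σ/2). All numbers here are synthetic:

```python
import math
import numpy as np

rng = np.random.default_rng(1)
# Simulated "incomes" (a stand-in for real survey data)
sigma_true = 0.6
incomes = rng.lognormal(mean=10.0, sigma=sigma_true, size=5000)

# Lognormal MLE: fit a normal distribution to log-incomes
logs = np.log(incomes)
mu_hat, sigma_hat = logs.mean(), logs.std()

# Gini coefficient implied by the fitted lognormal, in closed form:
# G = 2 * Phi(sigma / sqrt(2)) - 1, which simplifies to erf(sigma / 2)
gini = math.erf(sigma_hat / 2.0)
```

The same two steps (estimate parameters, then evaluate an inequality measure analytically) carry over to richer families, with the closed form replaced accordingly.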
During the subprime crisis of 2007 and the global financial crisis of 2008-2009, most US and European financial markets experienced significant declines, corrections and structural changes. Moreover, the crisis was rapidly transmitted to developed and emerging countries alike and strongly affected the whole economy. This volume presents recent research on linear and nonlinear modelling of economic and financial time series. The discussions of empirical results in its chapters help to improve the understanding of the financial mechanisms inherent to the crisis, and also give an important overview of the sources of the financial crisis and its main economic and financial consequences. The book provides the reader with a comprehensive understanding of financial and economic dynamics in various aspects using modern financial econometric methods. It addresses the empirical techniques needed by economic agents to analyze the dynamics of these markets and illustrates how they can be applied to actual data. It also presents and discusses new research findings and their implications.
Across the globe, every nation wrestles with how it will pay for, provide, regulate and administer its healthcare system. Health economics is the field of economics that deals with each of these issues, as well as with the difficult problem of allocating resources where the allocation can literally mean life or death, suffering alleviated or not. A key issue that is always mentioned, but little acted on, is the role that preventive measures play in the battle against disease and in using limited healthcare resources more effectively. This book brings together leading researchers in health economics presenting new research on some of these key issues, such as the impact of obesity on health, children's healthcare policies, and education and health, among many others.
This volume is a collection of methodological developments and applications of simulation-based methods presented at a workshop at Louisiana State University in November 2009. The first two papers are extensions of the GHK simulator: one reconsiders the computation of the probabilities in a discrete choice model, while the other uses an adaptive version of sparse-grids integration (SGI) instead of simulation. Two studies focus specifically on methodology: the first compares the performance of the maximum simulated likelihood (MSL) approach with a proposed composite marginal likelihood (CML) approach in multivariate ordered-response situations, while the second examines methods of testing for the presence of heterogeneity in the heterogeneity model. Further topics examined include: education savings accounts, parent contributions and educational attainment; estimating the effect of exchange rate flexibility on financial account openness; estimating a fractional response model with a count endogenous regressor; and modelling and forecasting volatility in a Bayesian approach.
Volume 1 covers statistical methods related to unit roots, trend breaks and their interplay. Testing for unit roots has been a topic of wide interest and the author was at the forefront of this research. The book covers important topics such as the Phillips-Perron unit root test and theoretical analyses of its properties, how this and other tests could be improved, the ingredients needed to achieve better tests, and the proposal of a new class of tests. Also included are theoretical studies related to time series models with unit roots and the effect of span versus sampling interval on the power of the tests. Moreover, this volume deals with the issue of trend breaks and their effect on unit root tests. This research agenda, fostered by the author, showed that trend breaks and unit roots can easily be confused; hence the need for new testing procedures, which are also covered.

Volume 2 is about statistical methods related to structural change in time series models. The approach adopted is off-line, whereby one tests for structural change using a historical dataset and performs hypothesis testing. A distinctive feature is the allowance for multiple structural changes. The methods discussed have been, and continue to be, applied in a variety of fields including economics, finance, life science, physics and climate change. The articles included address issues of estimation, testing and/or inference in a variety of models: short-memory regressors and errors, trends with integrated and/or stationary errors, autoregressions, cointegrated models, multivariate systems of equations, endogenous regressors, long-memory series, among others. Other issues covered include the problems of non-monotonic power and the pitfalls of adopting a local asymptotic framework. Empirical analyses are provided for the US real interest rate, US GDP, the volatility of asset returns and climate change.
Bayesian Statistical Methods provides data scientists with the foundational and computational tools needed to carry out a Bayesian analysis. This book focuses on Bayesian methods applied routinely in practice, including multiple linear regression, mixed effects models and generalized linear models (GLM). The authors include many examples with complete R code and comparisons with analogous frequentist procedures. In addition to the basic concepts of Bayesian inferential methods, the book covers many general topics:
- Advice on selecting prior distributions
- Computational methods including Markov chain Monte Carlo (MCMC)
- Model-comparison and goodness-of-fit measures, including sensitivity to priors
- Frequentist properties of Bayesian methods
Case studies covering advanced topics illustrate the flexibility of the Bayesian approach:
- Semiparametric regression
- Handling of missing data using predictive distributions
- Priors for high-dimensional regression models
- Computational techniques for large datasets
- Spatial data analysis
The advanced topics are presented with sufficient conceptual depth that the reader will be able to carry out such analyses and argue the relative merits of Bayesian and classical methods. A repository of R code, motivating data sets, and complete data analyses are available on the book's website. Brian J. Reich, Associate Professor of Statistics at North Carolina State University, is currently the editor-in-chief of the Journal of Agricultural, Biological, and Environmental Statistics and was awarded the LeRoy & Elva Martin Teaching Award. Sujit K. Ghosh, Professor of Statistics at North Carolina State University, has over 22 years of research and teaching experience in conducting Bayesian analyses, received the Cavell Brownie mentoring award, and served as the Deputy Director at the Statistical and Applied Mathematical Sciences Institute.
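The simplest case underlying Bayesian linear regression has a closed-form posterior and needs no MCMC at all. The numpy sketch below (the book itself works in R and covers far more general settings) treats the noise variance as known and uses a conjugate Normal prior, so the posterior mean and covariance come out of one linear solve; all data are simulated:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated data: y = X beta + noise, with noise variance assumed known
n, p = 100, 2
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -0.5])
sigma2 = 0.25
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Conjugate Normal prior: beta ~ N(0, tau2 * I)
tau2 = 10.0
prior_prec = np.eye(p) / tau2

# The posterior is Normal with closed-form mean and covariance
post_cov = np.linalg.inv(prior_prec + X.T @ X / sigma2)
post_mean = post_cov @ (X.T @ y / sigma2)
```

MCMC enters precisely when such closed forms are unavailable, e.g. for GLMs or hierarchical priors.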
This volume presents new developments in the methodology and practice of computable general equilibrium (CGE) techniques as they apply to recent issues in international trade policy. It will be of interest to academic researchers working in trade policy analysis and applied general equilibrium, advanced graduate students in international economics, applied researchers in multilateral organizations, and policymakers who need to work with and interpret the results of CGE analysis.
This book develops a machine-learning framework for predicting economic growth. It can also be considered as a primer for using machine learning (also known as data mining or data analytics) to answer economic questions. While machine learning itself is not a new idea, advances in computing technology combined with a dawning realization of its applicability to economic questions makes it a new tool for economists.
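The kind of pipeline such a framework rests on can be sketched in a few lines: features, a train/test split, a regularized fit, and out-of-sample evaluation. Below is a hypothetical numpy-only version using ridge regression on synthetic "country-level" data; the feature names and parameters are illustrative assumptions, not the book's actual data or method:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical country-level features (investment rate, schooling, openness, ...)
n, p = 300, 5
X = rng.normal(size=(n, p))
w_true = np.array([0.8, 0.3, -0.2, 0.0, 0.5])
growth = X @ w_true + rng.normal(scale=0.2, size=n)   # synthetic growth rates

# Hold out the last 60 observations, then fit ridge regression in closed form
X_tr, X_te = X[:240], X[240:]
y_tr, y_te = growth[:240], growth[240:]
lam = 1.0
w_hat = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(p), X_tr.T @ y_tr)

# Out-of-sample R^2 as a simple measure of predictive accuracy
pred = X_te @ w_hat
r2 = 1.0 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
```

Swapping the ridge step for a tree ensemble or neural network changes the fit, but the split-train-evaluate discipline is the part that makes it machine learning rather than in-sample curve fitting.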
Presenting an economic perspective on deforestation in the Brazilian Amazon, this study utilizes economic and ecological data from 1970 to 1996. It examines the extent to which land clearing promotes economic activity and growth and analyzes policies such as road building and subsidized credit. It explores whether the economic benefits of land clearing surpass the ecological costs and considers the viability of extractivism as an alternative to deforestation.
This volume presents classical results of the theory of enlargement of filtration. The focus is on the behavior of martingales with respect to the enlarged filtration and related objects. The study is conducted in various contexts, including immersion, progressive enlargement with a random time and initial enlargement with a random variable. The aim of this book is to collect the main mathematical results (with proofs) previously spread among numerous papers, a great part of which are available only in French. Many examples and applications to finance, in particular to credit risk modelling and the study of asymmetric information, are provided to illustrate the theory. A detailed summary of further connections and applications is given in the bibliographic notes, which enable readers to deepen their study of the topic. This book fills a gap in the literature and serves as a guide for graduate students and researchers interested in the role of information in financial mathematics and in econometrics. A basic knowledge of the general theory of stochastic processes is assumed as a prerequisite.
This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shift time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long- or short-range dependence (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit theorems) are described under SRD; mixing and weak dependence are also reviewed. In closing, it describes moment techniques together with their relations to cumulant sums, as well as an application to kernel-type estimation. The appendix reviews basic probability theory facts and discusses useful laws stemming from the Gaussian laws as well as the basic principles of probability, and is completed by the R scripts used for the figures. Richly illustrated with examples and simulations, the book is recommended for advanced master courses for mathematicians just entering the field of time series, and for statisticians who want more mathematical insight into the background of non-linear time series.
This contributed volume applies spatial and space-time econometric methods to spatial interaction modeling. The first part of the book addresses general cutting-edge methodological questions in spatial econometric interaction modeling, which concern aspects such as coefficient interpretation, constrained estimation, and scale effects. The second part deals with technical solutions to particular estimation issues, such as intraregional flows, Bayesian PPML and VAR estimation. The final part presents a number of empirical applications, ranging from interregional tourism competition and domestic trade to space-time migration modeling and residential relocation.
This second edition retains the positive features of being clearly written, well organized, and incorporating calculus in the text, while adding expanded coverage on game theory, experimental economics, and behavioural economics. It remains more focused and manageable than similar textbooks, and provides a concise yet comprehensive treatment of the core topics of microeconomics, including theories of the consumer and of the firm, market structure, partial and general equilibrium, and market failures caused by public goods, externalities and asymmetric information. The book includes helpful solved problems in all the substantive chapters, as well as over seventy new mathematical exercises and enhanced versions of the ones in the first edition. The authors make use of the book's full color with sharp and helpful graphs and illustrations. This mathematically rigorous textbook is meant for students at the intermediate level who have already had an introductory course in microeconomics, and a calculus course.
Four years ago, "Research in Experimental Economics" published experimental evidence on fundraising and charitable contributions. This new volume, "Charity with Choice," returns to that intriguing topic, employing a mixture of laboratory and field experiments as well as theoretical research. New waves of experiments take advantage of the well-calibrated environments established by past efforts to add new features, such as endogeneity and self-selection. Adventurous new research programs are emerging, and some of them are captured in this volume. Among the major questions to which the tools of choice, endogeneity, and self-selection are applied are: What increases or decreases charitable activity? And how do organizational and managerial issues affect the performance of non-profit organizations?
This book seeks new perspectives on the growing inequalities that our societies face, putting forward Structured Additive Distributional Regression as a means of statistical analysis that circumvents the common problem of analytical reduction to simple point estimators. This new approach makes it possible to take explicit account of the discrepancy between individuals' realities and the abstract representation of those realities that arises when the arithmetic mean alone is used. In turn, the method is applied to the question of economic inequality in Germany.
The book addresses the problem of calculation of d-dimensional integrals (conditional expectations) in filter problems. It develops new methods of deterministic numerical integration, which can be used to speed up and stabilize filter algorithms. With the help of these methods, better estimates and predictions of latent variables are made possible in the fields of economics, engineering and physics. The resulting procedures are tested within four detailed simulation studies.
This book presents a comprehensive study of multivariate time series with linear state space structure. The emphasis is put on both the clarity of the theoretical concepts and on efficient algorithms for implementing the theory. In particular, it investigates the relationship between VARMA and state space models, including canonical forms. It also highlights the relationship between Wiener-Kolmogorov and Kalman filtering both with an infinite and a finite sample. The strength of the book also lies in the numerous algorithms included for state space models that take advantage of the recursive nature of the models. Many of these algorithms can be made robust, fast, reliable and efficient. The book is accompanied by a MATLAB package called SSMMATLAB and a webpage presenting implemented algorithms with many examples and case studies. Though it lays a solid theoretical foundation, the book also focuses on practical application, and includes exercises in each chapter. It is intended for researchers and students working with linear state space models, and who are familiar with linear algebra and possess some knowledge of statistics.
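The Kalman filtering this book treats in full multivariate generality reduces, for the scalar local-level model, to a very short recursion. The numpy sketch below (synthetic data; the book's SSMMATLAB package handles the general state space case) filters noisy observations of a random walk:

```python
import numpy as np

rng = np.random.default_rng(5)

# Local-level model: x_t = x_{t-1} + w_t,  y_t = x_t + v_t
Q, R = 0.01, 0.25            # state and observation noise variances
T = 200
x = np.cumsum(rng.normal(scale=np.sqrt(Q), size=T))   # latent state
y = x + rng.normal(scale=np.sqrt(R), size=T)          # noisy observations

# Scalar Kalman filter recursion
m, P = 0.0, 1.0              # prior mean and variance of the state
est = np.empty(T)
for t in range(T):
    P = P + Q                # predict: variance grows by the state noise
    K = P / (P + R)          # Kalman gain
    m = m + K * (y[t] - m)   # update with the new observation
    P = (1.0 - K) * P
    est[t] = m
```

The recursive structure, where each step reuses the previous mean and variance, is exactly what the book's algorithms exploit in the multivariate case.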