This Handbook provides up-to-date coverage of both new developments and well-established fields in the sphere of economic forecasting. The chapters are written by world experts in their respective fields, and provide authoritative yet accessible accounts of the key concepts, subject matter, and techniques in a number of diverse but related areas. It covers the ways in which the availability of ever more plentiful data and computational power has been used in forecasting, whether in terms of the frequency of observations, the number of variables, or the use of multiple data vintages. Greater data availability has been coupled with developments in statistical and economic theory to allow more elaborate and complicated models to be entertained; the volume provides explanations and critiques of these developments. These include factor models, DSGE models, restricted vector autoregressions, and non-linear models, as well as models for handling data observed at mixed frequencies, high-frequency data, and multiple data vintages, and methods for forecasting when there are structural breaks, including how breaks might be forecast. Also covered are areas less commonly associated with economic forecasting, such as climate change, health economics, long-horizon growth forecasting, and political elections. Econometric forecasting has important contributions to make in these areas, and developments there in turn inform the mainstream. In the early 21st century, climate change and the forecasting of health expenditures and population are topics of pressing importance.
Back in the good old days on the fourth floor of the Altbau of Bonn's Juridicum, Werner Hildenbrand put an end to a debate about a festschrift in honor of an economist on the occasion of his turning 60 with a laconic: "Much too early." Remembering his position five years ago, we did not dare to think about one for him. But now he has turned 65. If consulted, he would most likely still answer: "Much too early." However, he has to take his official retirement, and we believe that this is the right moment for such an endeavor. No doubt Werner Hildenbrand will not really retire. As professor emeritus, free from the constraints of a rigid teaching schedule and the burden of committee meetings, he will be able to indulge his passions. We expect him to pursue, with undiminished enthusiasm, his research, travel, golfing, the arts, and culinary pleasures - escaping real retirement.
Factor models have become the most successful tool in the analysis and forecasting of high-dimensional time series. This monograph provides an extensive account of the so-called General Dynamic Factor Model methods. The topics covered include: asymptotic representation problems, estimation, forecasting, identification of the number of factors, identification of structural shocks, volatility analysis, and applications to macroeconomic and financial data.
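As a hedged illustration of the principal-components machinery underlying such factor methods (this is the editor's sketch, not code from the monograph), the following Python snippet extracts two common factors from a synthetic 50-variable panel; all names and dimensions are invented:

```python
# Sketch: factor extraction by principal components on a synthetic panel.
import numpy as np

rng = np.random.default_rng(0)
T, N, r = 200, 50, 2                 # time points, series, true number of factors

F = rng.standard_normal((T, r))      # latent factors
L = rng.standard_normal((N, r))      # loadings
X = F @ L.T + 0.5 * rng.standard_normal((T, N))   # observed panel

Xc = X - X.mean(axis=0)              # center each series
U, s, Vt = np.linalg.svd(Xc, full_matrices=False) # principal components via SVD
factors = U[:, :r] * s[:r]           # estimated factors (identified up to rotation)
loadings = Vt[:r].T

explained = (s**2)[:r].sum() / (s**2).sum()
print(f"share of variance explained by {r} factors: {explained:.2f}")
```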
The aim of this book is to bridge the gap between standard textbook models and a range of models where the dynamic structure of the data manifests itself fully. The common denominator of such models is stochastic processes. The authors show how counting processes, martingales, and stochastic integrals fit very nicely with censored data. Beginning with standard analyses such as Kaplan-Meier plots and Cox regression, the presentation progresses to the additive hazard model and recurrent event data. Stochastic processes are also used as natural models for individual frailty; they allow sensible interpretations of a number of surprising artifacts seen in population data. The stochastic process framework is naturally connected to causality. The authors show how dynamic path analyses can incorporate many modern causality ideas in a framework that takes the time aspect seriously. To make the material accessible to the reader, a large number of practical examples, mainly from medicine, are developed in detail. Stochastic processes are introduced in an intuitive and non-technical manner. The book is aimed at investigators who use event history methods and want a better understanding of the statistical concepts. It is suitable as a textbook for graduate courses in statistics and biostatistics.
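For readers unfamiliar with the starting point of such analyses, here is a minimal Kaplan-Meier product-limit estimator in plain Python/numpy; the follow-up times and censoring indicators are invented for illustration and are not drawn from the book:

```python
# Kaplan-Meier: S(t) = prod over event times t_i <= t of (1 - d_i / n_i).
import numpy as np

time = np.array([3, 5, 5, 8, 12, 12, 15, 20])   # follow-up times (made up)
event = np.array([1, 1, 0, 1, 1, 1, 0, 1])      # 1 = event observed, 0 = censored

S = 1.0
for t in np.unique(time[event == 1]):           # distinct observed event times
    n_at_risk = np.sum(time >= t)               # subjects still under observation
    d = np.sum((time == t) & (event == 1))      # events occurring at t
    S *= 1 - d / n_at_risk                      # product-limit step
    print(f"t={t:>2}  at risk={n_at_risk}  events={d}  S(t)={S:.3f}")
```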
Spatial econometrics deals with spatial dependence and spatial heterogeneity, critical aspects of the data used by regional scientists. These characteristics may cause standard econometric techniques to become inappropriate. In this book, I combine several recent research results to construct a comprehensive approach to the incorporation of spatial effects in econometrics. My primary focus is to demonstrate how these spatial effects can be considered as special cases of general frameworks in standard econometrics, and to outline how they necessitate a separate set of methods and techniques, encompassed within the field of spatial econometrics. My viewpoint differs from that taken in the discussion of spatial autocorrelation in spatial statistics - e.g., most recently by Cliff and Ord (1981) and Upton and Fingleton (1985) - in that I am mostly concerned with the relevance of spatial effects on model specification, estimation and other inference, in what I call a model-driven approach, as opposed to a data-driven approach in spatial statistics. I attempt to combine a rigorous econometric perspective with a comprehensive treatment of methodological issues in spatial analysis.
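A hedged sketch, not taken from the book, of the basic object behind such spatial models: the spatial lag Wy formed from a row-standardized contiguity matrix, here for an invented four-region layout:

```python
# Build a row-standardized spatial weights matrix and compute the spatial lag.
import numpy as np

# Binary contiguity among 4 regions (1 = shares a border); layout invented.
W = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
W /= W.sum(axis=1, keepdims=True)   # row-standardize: each row sums to one

y = np.array([10.0, 20.0, 30.0, 40.0])
Wy = W @ y                          # each region's neighborhood average
print(Wy)                           # the lag entering models like y = rho*W y + X b + e
```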
Methods for Estimation and Inference in Modern Econometrics provides a comprehensive introduction to a wide range of emerging topics, such as generalized empirical likelihood estimation and alternative asymptotics under drifting parameterizations, which have not been discussed in detail outside of highly technical research papers. The book also addresses several problems often arising in the analysis of economic data, including weak identification, model misspecification, and possible nonstationarity. The book's appendix provides a review of basic concepts and results from linear algebra, probability theory, and statistics that are used throughout the book.
Offering a unified approach to studying econometric problems, Methods for Estimation and Inference in Modern Econometrics links most of the existing estimation and inference methods in a general framework to help readers synthesize all aspects of modern econometric theory. Various theoretical exercises and suggested solutions are included to facilitate understanding.
This book analyses the dynamics of the Indian stock market, with special emphasis on the period following the emergence of Covid-19. Prompted by the instability in the stock market following Covid-19, it delves into those dynamics and unfolds the causal relationship between various economic fundamentals and stock prices. Observing short-term herding in the stock market following Covid-19, the book's findings suggest that investors in the Indian stock market made investment choices irrationally during the Covid-19 crisis periods. It also shows how the stock market became inefficient following the emergence of the pandemic and did not follow the fundamentals. Interestingly, the findings suggest no relationship between stock returns and real economic activity in India. The format of presentation makes the book well suited not only for students, academics, policy makers and investors in the stock markets, but also for people engaged or interested in business and finance; it should thus interest both specialists and lay readers. The analysis contained in this book will help different readership groups in different ways. Researchers from the economics and finance disciplines will be able to learn about the frontiers of the theoretical paradigms discussed in the book; the advanced econometric techniques applied here will also be useful for their own research. The macroeconomic insights, and the insights from behavioural economics, can expand the knowledge of the corporate sector and prove useful in making real-life decisions. Finally, the book will help policy makers, such as SEBI (Securities and Exchange Board of India), to formulate appropriate regulatory policies so as to minimize the possibility of speculative bubbles like those experienced during the pandemic period in the Indian stock markets.
Analyze key indicators more accurately to make smarter market moves. The Economic Indicator Handbook helps investors more easily evaluate economic trends, to better inform investment decision making and other key strategic financial planning. Written by a Bloomberg Senior Economist, this book presents a visual distillation of the indicators every investor should follow, with clear explanations of how they're measured, what they mean, and how that should inform investment thinking. The focus on graphics, professional application, Bloomberg terminal functionality, and practicality makes this guide a quick, actionable read that could immediately start improving investment outcomes. Coverage includes gross domestic product, employment data, industrial production, new residential construction, consumer confidence, retail and food service sales, and commodities, plus guidance on the secret indicators few economists know or care about. Past performance can predict future results if you know how to read the indicators. Modern investing requires a careful understanding of the macroeconomic forces that lift and topple markets on a regular basis, and how they shift to move entire economies. This book is a visual guide to recognizing these forces and tracking their behavior, helping investors identify entry and exit points that maximize profit and minimize loss.

- Quickly evaluate economic trends
- Make more informed investment decisions
- Understand the most essential indicators
- Translate predictions into profitable actions

Savvy market participants know how critical certain indicators are to the formulation of a profitable, effective market strategy. A daily indicator check can inform day-to-day investing, and long-term tracking can result in a stronger, more robust portfolio. For the investor who knows that better information leads to better outcomes, The Economic Indicator Handbook is an exceptionally useful resource.
This groundbreaking textbook combines straightforward explanations with a wealth of practical examples to offer an innovative approach to teaching linear algebra. Requiring no prior knowledge of the subject, it covers the aspects of linear algebra - vectors, matrices, and least squares - that are needed for engineering applications, discussing examples across data science, machine learning and artificial intelligence, signal and image processing, tomography, navigation, control, and finance. The numerous practical exercises throughout allow students to test their understanding and translate their knowledge into solving real-world problems, with lecture slides, additional computational exercises in Julia and MATLAB®, and data sets accompanying the book online. Suitable for both one-semester and one-quarter courses, as well as self-study, this self-contained text provides beginning students with the foundation they need to progress to more advanced study.
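As a small illustration of the least-squares problem at the book's core (the book's own companion code is in Julia and MATLAB; this Python analogue on synthetic data is the editor's sketch):

```python
# Fit y = b0 + b1*x by least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=x.size)  # noisy line

A = np.column_stack([np.ones_like(x), x])               # design matrix [1, x]
coef, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(f"intercept={coef[0]:.2f}, slope={coef[1]:.2f}")  # close to (2, 3)
```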
Petri Nets were defined for the study of discrete event systems and later extended for many purposes, including dependability assessment. To our knowledge, no book deals specifically with the use of different types of Petri Nets for dependability. In addition, we bring a focus on the adequacy of Petri Net types to the study of various problems related to dependability, such as risk analysis and probabilistic assessment. In the first part, the basic models of PN and some useful extensions are briefly recalled. In the second part, PN are used as a formal model to describe the evolution process of critical systems in the frame of an ontological approach. The third part focuses on stochastic Petri Nets (SPN) and their use in dependability assessment. Different formal models of SPN are formally presented (semantics, evolution rules...) along with their equivalence with the corresponding class of Markov processes, so as to obtain an analytical assessment of dependability. Simplification methods are proposed in order to reduce the size of the analytical model and make it more tractable. The introduction of some concepts specific to high-level PN also allows the consideration of complex systems. A few applications in the field of instrumentation and control (I&C) systems and safety instrumented systems (SIS) emphasize the benefits of SPN for dependability assessment.
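To make the basic formalism concrete, here is a toy Python sketch, not taken from the book, of a Petri net represented by pre- and post-incidence matrices, with a transition firing whenever all of its input places hold enough tokens; the two-place, two-transition net is invented:

```python
# Minimal Petri net simulation: places x transitions incidence matrices.
import numpy as np

pre  = np.array([[1, 0],    # tokens consumed by each transition, per place
                 [0, 1]])
post = np.array([[0, 1],    # tokens produced by each transition, per place
                 [1, 0]])
marking = np.array([1, 0])  # initial tokens in (p1, p2)

for step in range(4):
    enabled = np.all(marking[:, None] >= pre, axis=0)  # transitions able to fire
    t = int(np.flatnonzero(enabled)[0])                # fire the first enabled one
    marking = marking - pre[:, t] + post[:, t]         # update the marking
    print(f"step {step}: fired t{t+1}, marking={marking}")
```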
This book proposes new solutions to the problem of poverty, and begins with providing analyses. It bases most of the analyses and solutions in the context of the digital era. The book also follows, in addition to a scientific distribution, a spatial-geographical one: analyses of countries of the European Union as well as South Africa, while it referring to two main variables, television and art, as agents of poverty alleviation. The book places particular focus on how poverty is understood in the framework of Industry 4.0. It introduces a new expanded Multidimensional Poverty Index with more than 20 dimensions; moreover, it provides a mathematically based solution for the disposal of perishable food. Finally, it does not disregard the crucial aspect of the issue of poverty: that of education planning. This book is of interest to specialists in poverty research, from students to professionals and from professors to activists, without excluding engineers.
This third edition capitalizes on the success of the previous editions and leverages the important advancements in visualization, data analysis, and sharing capabilities that have emerged in recent years. It serves as an accelerated guide to decision support designs for consultants, service professionals and students. This 'fast track' enables those who may never have used Excel to ramp up their skills to a level of mastery that will allow them to integrate Excel with widely available associated applications, make use of intelligent data visualization and analysis techniques, automate activity through basic VBA designs, and develop easy-to-use interfaces for customizing use. The content of this edition has been completely restructured and revised, with updates that correspond to the latest versions of software and references to contemporary add-in development across platforms. It also features best practices in design and analytical consideration, including methodical discussions of problem structuring and evaluation, as well as numerous case examples from practice.
This advanced undergraduate/graduate textbook teaches students in finance and economics how to use R to analyse financial data and implement financial models. It demonstrates how to take publicly available data, manipulate it, implement models, and generate outputs typical of particular analyses. A wide spectrum of timely and practical issues in financial modelling is covered, including return and risk measurement, portfolio management, option pricing and fixed income analysis. This new edition updates and expands upon the existing material, providing updated examples and new chapters on equities, simulation and trading strategies, including machine learning techniques. Select data sets are available online.
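As a hedged illustration of the basic return and risk measurement the blurb mentions (the book itself works in R; this Python analogue on made-up prices is the editor's sketch):

```python
# Compute log returns and annualized return/volatility from a price series.
import numpy as np

prices = np.array([100.0, 101.5, 99.8, 102.3, 103.1, 101.9])  # invented prices
log_ret = np.diff(np.log(prices))             # daily log returns

ann_ret = log_ret.mean() * 252                # annualized mean return
ann_vol = log_ret.std(ddof=1) * np.sqrt(252)  # annualized volatility
print(f"annualized return={ann_ret:.2%}, volatility={ann_vol:.2%}")
```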
The University of Oxford has been and continues to be one of the most important global centres for economics. With six chapters on themes in Oxford economics and 24 chapters on the lives and work of Oxford economists, this volume shows how economics became established at the University, how it produced some of the world's best-known economists, including Francis Ysidro Edgeworth, Roy Harrod and David Hendry, and how it remains a global force for the very best in teaching and research in economics. With original contributions from a stellar cast, this volume provides economists - especially those interested in macroeconomics and the history of economic thought - with the first in-depth analysis of Oxford economics.
This textbook provides a self-contained presentation of the theory and models of time series analysis. Putting an emphasis on weakly stationary processes and linear dynamic models, it describes the basic concepts, ideas, methods and results in a mathematically well-founded form and includes numerous examples and exercises. The first part presents the theory of weakly stationary processes in time and frequency domain, including prediction and filtering. The second part deals with multivariate AR, ARMA and state space models, which are the most important model classes for stationary processes, and addresses the structure of AR, ARMA and state space systems, Yule-Walker equations, factorization of rational spectral densities and Kalman filtering. Finally, there is a discussion of Granger causality, linear dynamic factor models and (G)ARCH models. The book provides a solid basis for advanced mathematics students and researchers in fields such as data-driven modeling, forecasting and filtering, which are important in statistics, control engineering, financial mathematics, econometrics and signal processing, among other subjects.
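A small worked sketch of the Yule-Walker equations the blurb mentions (the editor's illustration, not the book's code): for an AR(p) process, the coefficients solve R a = r, where R and r are built from autocovariances. Here they are recovered from a simulated AR(2) series:

```python
# Estimate AR(2) coefficients from sample autocovariances via Yule-Walker.
import numpy as np

rng = np.random.default_rng(2)
n, p = 5000, 2
x = np.zeros(n)
for t in range(2, n):                        # simulate x_t = 0.5 x_{t-1} - 0.3 x_{t-2} + e_t
    x[t] = 0.5 * x[t-1] - 0.3 * x[t-2] + rng.standard_normal()

def acov(x, k):                              # sample autocovariance at lag k
    xc = x - x.mean()
    return (xc[:len(x)-k] * xc[k:]).mean()

R = np.array([[acov(x, abs(i - j)) for j in range(p)] for i in range(p)])
r = np.array([acov(x, k) for k in range(1, p + 1)])
print(np.linalg.solve(R, r))                 # approximately [0.5, -0.3]
```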
This book focuses on economic inequality, its measurement, and its relationship with economic growth and development. The current literature uses multiple points of view, ranging from ethical, legal, philosophical, to political and economic, to understand the nature of (in)equality. Presenting the problem objectively, this book shows how to measure the phenomenon statistically along with an international comparison of the level of income inequality and economic growth and of their complex relationship. The book also analyzes three decades of theoretical and empirical evidence to understand this phenomenon and discusses a number of political measures to reduce economic disparities while stimulating economic growth.
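For concreteness, here is a compact implementation of the Gini coefficient, the most common scalar inequality measure in this literature; the income vector is invented and the snippet is the editor's sketch, not the book's:

```python
# Gini coefficient via the sorted-index formula:
# G = (2 * sum_i i*x_(i)) / (n * sum x) - (n + 1) / n, with i = 1..n.
import numpy as np

def gini(income):
    x = np.sort(np.asarray(income, dtype=float))
    n = x.size
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

print(gini([10, 20, 30, 40, 100]))   # 0.40: moderate inequality
```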
Across the social sciences there has been increasing focus on reproducibility, i.e., the ability to examine a study's data and methods to ensure accuracy by reproducing the study. Reproducible Econometrics Using R combines an overview of key issues and methods with an introduction to how to use them using open source software (R) and recently developed tools (R Markdown and bookdown) that allow the reader to engage in reproducible econometric research. Jeffrey S. Racine provides a step-by-step approach and covers five sets of topics: i) linear time series models, ii) robust inference, iii) robust estimation, iv) model uncertainty, and v) advanced topics. The time series material highlights the difference between time series analysis, where the focus is on forecasting, and cross-sectional analysis, where the focus is typically on model parameters that have economic interpretations. For the time series material, the reader begins with a discussion of random walks, white noise, and non-stationarity. The reader is next exposed to the pitfalls of using standard inferential procedures, popular in cross-sectional settings, when modelling time series data, and is introduced to alternative procedures that form the basis for linear time series analysis. For the robust inference material, the reader is introduced to the potential advantages of bootstrapping and jackknifing versus the use of asymptotic theory, and a range of numerical approaches is presented. For the robust estimation material, the reader is presented with a discussion of issues surrounding outliers in data and methods for addressing their presence. Finally, the model uncertainty material outlines two dominant approaches for dealing with model uncertainty, namely model selection and model averaging. Throughout the book there is an emphasis on the benefits of using R and other open source tools for ensuring reproducibility. The advanced material covers machine learning methods (support vector machines, which are useful for classification) and nonparametric kernel regression, which provides the reader with more advanced methods for confronting model uncertainty. The book is well suited for advanced undergraduate and graduate students alike. Assignments, exams, slides, and a solution manual are available for instructors.
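A hedged demonstration of the time-series pitfall the blurb describes (the book uses R; this is a Python analogue on synthetic data): regressing one independent random walk on another routinely yields a high R-squared even though the series are unrelated, the classic "spurious regression" problem:

```python
# Spurious regression: two independent random walks, yet a large R^2.
import numpy as np

rng = np.random.default_rng(3)
n = 500
y = np.cumsum(rng.standard_normal(n))        # random walk 1
x = np.cumsum(rng.standard_normal(n))        # independent random walk 2

A = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
r2 = 1 - resid.var() / y.var()
print(f"R^2 = {r2:.2f} despite no true relationship")
```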
This valuable text provides a comprehensive introduction to VAR modelling and how it can be applied. In particular, the author focuses on the properties of the Cointegrated VAR model and its implications for macroeconomic inference when data are non-stationary. The text provides a number of insights into the links between statistical econometric modelling and economic theory and gives a thorough treatment of identification of the long-run and short-run structure as well as of the common stochastic trends and the impulse response functions, providing in each case illustrations of applicability. The book presents the main ingredients of the Copenhagen School of Time-Series Econometrics in a transparent and coherent framework. The distinguishing feature of this school is that econometric theory and applications have been developed in close cooperation. The guiding principle is that good econometric work should take econometrics, institutions, and economics seriously. The author uses a single data set throughout most of the book to guide the reader through the econometric theory while also revealing the full implications for the underlying economic model. To ensure full understanding, the book concludes with the introduction of two new data sets that combine readers' understanding of econometric theory and economic models with economic reality.
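A rough illustration of the cointegration idea behind the CVAR (the editor's sketch, not the Johansen procedure the book develops): two series share a stochastic trend, so the residual from their estimated long-run relation mean-reverts rather than wandering:

```python
# Cointegrated pair: common stochastic trend, mean-reverting equilibrium error.
import numpy as np

rng = np.random.default_rng(4)
n = 1000
trend = np.cumsum(rng.standard_normal(n))     # common stochastic trend
x = trend + rng.standard_normal(n)
y = 2.0 * trend + rng.standard_normal(n)      # long-run relation y ~ 2x

A = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # estimate long-run coefficients
u = y - A @ beta                              # equilibrium error

rho = np.sum(u[1:] * u[:-1]) / np.sum(u[:-1]**2)   # residual AR(1) coefficient
print(f"slope~{beta[1]:.2f}, residual AR(1)~{rho:.2f} (well below 1: mean-reverting)")
```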
The aim of this publication is to identify and apply suitable methods for analysing and predicting the time series of gold prices, together with acquainting the reader with the history and characteristics of the methods and with time series issues in general. Both statistical and econometric methods, and especially artificial intelligence methods, are used in the case studies. The publication presents both traditional and innovative methods at the theoretical level, always accompanied by a case study, i.e. their specific use in practice. Furthermore, a comprehensive comparative analysis of the individual methods is provided. The book is intended for academic staff and students of universities of economics, as well as scientists and practitioners dealing with time series prediction. From the point of view of practical application, it could provide useful information for speculators and traders on financial markets, especially the commodity markets.
This new textbook by Urs Birchler and Monika Butler is an introduction to the study of how information affects economic relations. The authors provide a narrative treatment of the more formal concepts of Information Economics, using easy-to-understand and lively illustrations from film and literature and nutshell examples. The book also comes with a supporting website (www.alicebob.info), maintained by the authors.
This book provides an accessible presentation of the standard statistical techniques used by labor economists. It emphasises both the input and the output of empirical analysis and covers five major topics concerning econometric methods used in labor economics: regression and related methods, choice modelling, selectivity issues, duration analysis, and policy evaluation techniques. Each of these is presented in terms of model specification, possible estimation problems, diagnostic checking, and interpretation of the output. It aims to provide guidance to practitioners on how to use the techniques and how to make sense of the results that are produced. It covers methods that are considered to be "standard" tools in labor economics, but which are often given only a brief and highly technical treatment in econometrics textbooks.
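As a concrete taste of one of the policy evaluation techniques listed above, here is a minimal difference-in-differences calculation; the four group means are invented and the snippet is the editor's sketch, not the book's:

```python
# Difference-in-differences: (treated after - before) - (control after - before).
import numpy as np

# mean outcome:        before  after
treated = np.array([10.0, 14.0])   # group exposed to the policy
control = np.array([ 9.0, 11.0])   # comparison group

did = (treated[1] - treated[0]) - (control[1] - control[0])
print(f"diff-in-diff estimate of the policy effect: {did:.1f}")   # 4 - 2 = 2
```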
"Structural Models of Wage and Employment Dynamics" contains
selected papers from a conference held in honour of Professor Dale
T. Mortensen upon the occasion of his 65th birthday. The papers are
on some of Professor Dale T. Mortensen's current research topics:
The development of equilibrium dynamic models designed to account
for wage dispersion and the time series behaviour of job and worker
flows. The conference is the sixth in a series. From the beginning
there has been a close interplay among economic theorists,
econometricians, and applied economists. This book also has a
section with theoretical papers as well as sections wtih micro- and
macro-econometric papers. These conferences have had significant
influence on how we think about public policy in the labour market,
and what kinds of data would be needed to answer questions about
these policies.
Stochastic differential equations are differential equations whose solutions are stochastic processes. They exhibit appealing mathematical properties that are useful in modeling uncertainties and noisy phenomena in many disciplines. This book is motivated by applications of stochastic differential equations in target tracking and medical technology and, in particular, their use in methodologies such as filtering, smoothing, parameter estimation, and machine learning. It builds an intuitive hands-on understanding of what stochastic differential equations are all about, but also covers the essentials of Ito calculus, the central theorems in the field, and such approximation schemes as stochastic Runge-Kutta. Greater emphasis is given to solution methods than to analysis of theoretical properties of the equations. The book's practical approach assumes only prior understanding of ordinary differential equations. The numerous worked examples and end-of-chapter exercises include application-driven derivations and computational assignments. MATLAB/Octave source code is available for download, promoting hands-on work with the methods.
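In the book's hands-on spirit (its own downloadable code is MATLAB/Octave), the following Python sketch applies the Euler-Maruyama scheme to an Ornstein-Uhlenbeck equation dx = theta*(mu - x) dt + sigma dW; all parameters are invented:

```python
# Euler-Maruyama simulation of an Ornstein-Uhlenbeck process.
import numpy as np

rng = np.random.default_rng(5)
theta, mu, sigma = 1.0, 0.0, 0.5
T, n = 10.0, 1000
dt = T / n

x = np.empty(n + 1)
x[0] = 2.0                                        # start away from the mean
for k in range(n):
    dW = rng.normal(scale=np.sqrt(dt))            # Brownian increment
    x[k + 1] = x[k] + theta * (mu - x[k]) * dt + sigma * dW

print(f"x(T) = {x[-1]:.3f}; the path reverts toward mu = {mu}")
```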
Financial models are an inescapable feature of modern financial markets, yet over-reliance on these models and the failure to test them properly is now widely recognized as one of the main causes of the financial crisis of 2007-2011. Since this crisis there has been an increase in the scrutiny and testing applied to such models, and validation has become an essential part of model risk management at financial institutions. The book covers all of the major risk areas that a financial institution is exposed to and uses models for, including market risk, interest rate risk, retail credit risk, wholesale credit risk, compliance risk, and investment management. It discusses current practices and pitfalls that model risk users need to be aware of and identifies areas where validation can be advanced in the future, providing the first unified framework for validating risk management models.
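As an invented mini-example of one common validation check in this area (the editor's sketch, not the book's method), a 99% VaR model can be backtested by counting days on which losses exceeded the model's VaR:

```python
# VaR backtest: count exceptions against the expected frequency.
import numpy as np

rng = np.random.default_rng(6)
returns = rng.normal(0, 0.01, 250)        # one year of synthetic daily P&L
var99 = -np.quantile(rng.normal(0, 0.01, 100_000), 0.01)  # model's 99% VaR

exceptions = np.sum(returns < -var99)     # days the loss exceeded VaR
print(f"exceptions: {exceptions} (expect about 2.5 of 250 at the 99% level)")
```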
Since the 1980s, and especially since the Rio Earth Summit in 1992, there has been a substantial extension in the adoption and use of Environmental Assessment (EA) procedures in developing countries and countries in transition (low and middle income countries). Until this publication, however, few existing texts in environmental assessment or development studies had reflected this trend sufficiently. The book is divided into two main parts.