Drawing on improved and more detailed administrative databases, this open access book provides statistical tools for evaluating the effects of public policies advocated by governments and public institutions. Experts from academia, national statistics offices and various research centers present modern econometric methods for efficient data-driven policy evaluation and monitoring, assess the causal effects of policy measures, and report on best practices of successful data management and usage. Topics include data confidentiality, data linkage, and national practices in policy areas such as public health, education and employment. The book offers scholars as well as practitioners from public administrations, consultancy firms and nongovernmental organizations insights into counterfactual impact evaluation methods and the potential of data-based policy and program evaluation.
This book analyzes how the choice of a particular disclosure limitation method, namely additive and multiplicative measurement error, affects the quality of the data and limits its usefulness for empirical research. Generally, a disclosure limitation method can be regarded as a data filter that transforms the true data generating process. This book focuses explicitly on the consequences of additive and multiplicative measurement error for the properties of nonlinear econometric estimators. It investigates the extent to which appropriate econometric techniques can yield consistent and unbiased estimates of the true data generating process in the case of disclosure limitation. Sandra Nolte received her PhD in Economics from the University of Konstanz, Germany, in 2008 and has been a postdoctoral researcher at the Financial Econometric Research Centre at Warwick Business School, UK, since 2009. Her research areas include microeconometrics and financial econometrics.
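To make the core mechanism concrete, here is a minimal sketch, not drawn from the book itself: additive measurement error in a regressor biases the ordinary least squares slope toward zero (attenuation bias). The simulated data and the ols_slope helper are invented for the example.

```python
import numpy as np

# Illustrative sketch: additive measurement error in a regressor
# attenuates the OLS slope toward zero ("attenuation bias").
rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)                  # true regressor
y = 2.0 * x + rng.normal(size=n)        # true slope is 2.0
x_noisy = x + rng.normal(size=n)        # disclosure-limited regressor

def ols_slope(regressor, outcome):
    """Simple OLS slope, cov(x, y) / var(x)."""
    return np.cov(regressor, outcome)[0, 1] / np.var(regressor, ddof=1)

print(ols_slope(x, y))        # close to 2.0
print(ols_slope(x_noisy, y))  # close to 1.0 = 2.0 * var(x) / (var(x) + var(u))
```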
Two main purposes of econometrics are to give empirical content to economic theory by formulating economic models in testable form, and to estimate those models and test them for acceptance or rejection. Topics discussed in this compilation include an assessment of econometric methods for program evaluation and a proposal to extend the difference-in-differences estimator to dynamic treatment; empirical estimations of FDI spillovers; econometric modelling of time-to-event data; regression and data envelopment analysis methods to assess practice efficiency; and the implications of instability for econometric and financial time series modelling.
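As a pointer to what the difference-in-differences estimator mentioned above does, here is a minimal sketch of the canonical two-period, two-group version; the simulated data and column names are invented for illustration.

```python
import numpy as np
import pandas as pd

# Canonical 2x2 difference-in-differences on simulated data.
rng = np.random.default_rng(1)
n = 2000
treated = rng.integers(0, 2, size=n)     # group indicator
post = rng.integers(0, 2, size=n)        # time indicator
y = 0.5 * treated + 0.8 * post + 1.5 * treated * post + rng.normal(size=n)
df = pd.DataFrame({"y": y, "treated": treated, "post": post})

cell_means = df.groupby(["treated", "post"])["y"].mean()
did = (cell_means.loc[(1, 1)] - cell_means.loc[(1, 0)]) \
    - (cell_means.loc[(0, 1)] - cell_means.loc[(0, 0)])
print(did)  # close to the true treatment effect of 1.5
```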
The increasing volume of available data, rapid advances in telecommunications, and the strengthening interrelationships of global markets create new challenges for researchers and practitioners in economics and finance. At the same time, regulatory authorities continually attempt to establish effective rules for monitoring and regulating the global markets. As a result, the decision-making process in economics and finance becomes more and more complex, requiring new, more advanced and sophisticated analysis methodologies to be developed and implemented. The contents of this volume cover a wide range of topics, including among others portfolio optimisation, stock market prediction and trading, auditing, investment decisions, banking management and corporate performance.
This book focuses on economic inequality, its measurement, and its relationship with economic growth and development. The current literature approaches (in)equality from multiple points of view, ranging from the ethical, legal and philosophical to the political and economic. Presenting the problem objectively, this book shows how to measure the phenomenon statistically, along with an international comparison of levels of income inequality and economic growth and of their complex relationship. The book also analyzes three decades of theoretical and empirical evidence to understand this phenomenon and discusses a number of political measures to reduce economic disparities while stimulating economic growth.
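For readers unfamiliar with how income inequality is measured statistically, the following is a generic sketch of the Gini coefficient, the most common single-number summary; it illustrates the standard formula, not the specific measures developed in the book.

```python
import numpy as np

def gini(incomes):
    """Gini coefficient of a sample of incomes, 0 = equality, ~1 = inequality."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    # G = 2 * sum_i(i * x_i) / (n * sum_i(x_i)) - (n + 1) / n, with i = 1..n
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * np.sum(x)) - (n + 1) / n

print(gini([1, 1, 1, 1]))   # 0.0, perfect equality
print(gini([0, 0, 0, 10]))  # 0.75, near-maximal inequality for n = 4
```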
Seasonality in economic time series can "obscure" movements of other components in a series that are operationally more important for economic and econometric analyses. In practice, one often prefers to work with seasonally adjusted data to assess the current state of the economy and its future course. This book presents a seasonal adjustment program called CAMPLET, an acronym of its tuning parameters, which consists of a simple adaptive procedure to extract the seasonal and the non-seasonal component from an observed series. Once this process is carried out, there is no need to revise these components at a later stage when new observations become available. The authors describe the main features of CAMPLET, evaluate the outcomes of CAMPLET and X-13ARIMA-SEATS in a controlled simulation framework using a variety of data generating processes, and illustrate CAMPLET and X-13ARIMA-SEATS with three time series: US non-farm payroll employment, operational income of Ahold and real GDP in the Netherlands. Furthermore, they show how CAMPLET performs under the COVID-19 crisis and its attractiveness in dealing with daily data. This book appeals to scholars and students of econometrics and statistics interested in the application of statistical methods for empirical economic modeling.
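CAMPLET itself is not reproduced here; as background, this sketch shows the classical decomposition idea such programs are benchmarked against: detrend a monthly series, estimate a fixed seasonal pattern as per-calendar-month means, and subtract it. All data are simulated.

```python
import numpy as np

# Textbook-style seasonal adjustment of a simulated monthly series.
rng = np.random.default_rng(2)
t = np.arange(120)                                        # ten years, monthly
y = 0.1 * t + 5.0 * np.sin(2 * np.pi * t / 12) + rng.normal(0.0, 0.5, 120)

# Remove a linear trend, then average deviations by calendar month.
detrended = y - np.poly1d(np.polyfit(t, y, 1))(t)
seasonal = np.array([detrended[t % 12 == m].mean() for m in range(12)])
adjusted = y - seasonal[t % 12]                           # seasonally adjusted
print(np.round(seasonal, 2))  # recovers the sine-shaped seasonal pattern
```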
This Handbook provides up-to-date coverage of both new developments and well-established fields in the sphere of economic forecasting. The chapters are written by world experts in their respective fields, and provide authoritative yet accessible accounts of the key concepts, subject matter and techniques in a number of diverse but related areas. It covers the ways in which the availability of ever more plentiful data and computational power have been used in forecasting, whether in terms of the frequency of observations, the number of variables, or the use of multiple data vintages. Greater data availability has been coupled with developments in statistical theory and economic theory to allow more elaborate and complicated models to be entertained; the volume provides explanations and critiques of these developments. These include factor models, DSGE models, restricted vector autoregressions, and non-linear models, as well as models for handling data observed at mixed frequencies, high-frequency data, multiple data vintages, and methods for forecasting when there are structural breaks, and how breaks might be forecast. Also covered are areas less commonly associated with economic forecasting, such as climate change, health economics, long-horizon growth forecasting, and political elections. Econometric forecasting has important contributions to make in these areas, and developments there in turn inform the mainstream. In the early 21st century, climate change and the forecasting of health expenditures and population are topics of pressing importance.
A complete resource for finance students, this textbook presents the most common empirical approaches in finance in a comprehensive and well-illustrated manner that shows how econometrics is used in practice, and includes detailed case studies to explain how the techniques are used in relevant financial contexts. Maintaining the accessible prose and clear examples of previous editions, the new edition of this best-selling textbook provides support for the main industry-standard software packages, expands the coverage of introductory mathematical and statistical techniques into two chapters for students without prior econometrics knowledge, and includes a new chapter on advanced methods. Learning outcomes, key concepts and end-of-chapter review questions (with full solutions online) highlight the main chapter takeaways and allow students to self-assess their understanding. Online resources include extensive teacher and student support materials, including EViews, Stata, R, and Python software guides.
Back in the good old days on the fourth floor of the Altbau of Bonn's Juridicum, Werner Hildenbrand put an end to a debate about a festschrift in honor of an economist on the occasion of his turning 60 with a laconic: "Much too early." Remembering his position five years ago, we did not dare to think about one for him. But now he has turned 65. If consulted, he would most likely still answer: "Much too early." However, he has to take his official retirement, and we believe that this is the right moment for such an endeavor. No doubt Werner Hildenbrand will not really retire. As professor emeritus, free from the constraints of a rigid teaching schedule and the burden of committee meetings, he will be able to indulge his passions. We expect him to pursue, with undiminished enthusiasm, his research, travel, golfing, the arts, and culinary pleasures - escaping real retirement.
The aim of this book is to bridge the gap between standard textbook models and a range of models where the dynamic structure of the data manifests itself fully. The common denominator of such models is stochastic processes. The authors show how counting processes, martingales, and stochastic integrals fit very nicely with censored data. Beginning with standard analyses such as Kaplan-Meier plots and Cox regression, the presentation progresses to the additive hazard model and recurrent event data. Stochastic processes are also used as natural models for individual frailty; they allow sensible interpretations of a number of surprising artifacts seen in population data. The stochastic process framework is naturally connected to causality. The authors show how dynamic path analyses can incorporate many modern causality ideas in a framework that takes the time aspect seriously. To make the material accessible to the reader, a large number of practical examples, mainly from medicine, are developed in detail. Stochastic processes are introduced in an intuitive and non-technical manner. The book is aimed at investigators who use event history methods and want a better understanding of the statistical concepts. It is suitable as a textbook for graduate courses in statistics and biostatistics.
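As a concrete anchor for the censored-data setting described above, here is a minimal plain-Python sketch of the Kaplan-Meier estimator; the tiny survival data set is invented for illustration, and the book's stochastic-process machinery is not reproduced.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve for right-censored data.

    times: observed times; events: 1 if the event occurred, 0 if censored.
    Returns a list of (event time, estimated survival probability) pairs.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    surv = 1.0
    curve = []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                     # still under observation
        deaths = np.sum((times == t) & (events == 1))    # events exactly at t
        surv *= 1.0 - deaths / at_risk
        curve.append((t, surv))
    return curve

print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
# [(2.0, 0.8), (3.0, 0.6), (5.0, 0.3)]
```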
Methods for Estimation and Inference in Modern Econometrics provides a comprehensive introduction to a wide range of emerging topics, such as generalized empirical likelihood estimation and alternative asymptotics under drifting parameterizations, which have not been discussed in detail outside of highly technical research papers. The book also addresses several problems often arising in the analysis of economic data, including weak identification, model misspecification, and possible nonstationarity. The book's appendix provides a review of basic concepts and results from linear algebra, probability theory, and statistics that are used throughout the book.
Offering a unified approach to studying econometric problems, Methods for Estimation and Inference in Modern Econometrics links most of the existing estimation and inference methods in a general framework to help readers synthesize all aspects of modern econometric theory. Various theoretical exercises and suggested solutions are included to facilitate understanding.
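To make the estimation framework concrete, the following is a toy sketch of a method-of-moments estimator with an identity weighting matrix, matching the first two moments of a sample; it is purely illustrative of the general approach the book unifies, not code from the book.

```python
import numpy as np
from scipy.optimize import minimize

# Toy moment-based estimation: recover (mu, sigma^2) of a sample by
# minimizing a quadratic form in the sample moment conditions.
rng = np.random.default_rng(3)
data = rng.normal(loc=2.0, scale=1.5, size=5000)

def gmm_objective(theta):
    mu, sigma2 = theta
    g1 = np.mean(data - mu)                   # E[x - mu] = 0
    g2 = np.mean((data - mu) ** 2 - sigma2)   # E[(x - mu)^2 - sigma^2] = 0
    g = np.array([g1, g2])
    return g @ g                              # identity weighting matrix

res = minimize(gmm_objective, x0=[0.0, 1.0], method="Nelder-Mead")
print(res.x)  # approximately [2.0, 2.25]
```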
Complex dynamics constitute a growing and increasingly important area, as they offer strong potential to explain and formalize natural, physical, financial and economic phenomena. This book pursues the ambitious goal of bringing together an extensive body of knowledge regarding complex dynamics from various academic disciplines. Beyond its focus on economics and finance, including for instance the evolution of macroeconomic growth models towards nonlinear structures as well as signal processing applications to stock markets, fundamental parts of the book are devoted to the use of nonlinear dynamics in mathematics, statistics, signal theory and processing. Numerous examples and applications, almost 700 illustrations and numerical simulations based on the use of Matlab make the book an essential reference for researchers and students from many different disciplines who are interested in the nonlinear field. An appendix recapitulates the basic mathematical concepts required to use the book.
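The book's simulations are in Matlab; to give a flavour of the nonlinear dynamics involved, here is a generic Python sketch of the logistic map, a classic one-line system that becomes chaotic as its parameter approaches 4. The example is standard material, not drawn from the book.

```python
import numpy as np

def logistic_orbit(r, x0=0.2, n=50):
    """Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t)."""
    orbit = [x0]
    for _ in range(n - 1):
        orbit.append(r * orbit[-1] * (1.0 - orbit[-1]))
    return np.array(orbit)

print(logistic_orbit(2.8)[-3:])  # settles near the fixed point 1 - 1/r
print(logistic_orbit(3.9)[-3:])  # wanders chaotically
```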
This book analyses the dynamics of the Indian stock market, with special emphasis on the period following the emergence of Covid-19. Starting from the instability in the stock market following Covid-19, it delves into those dynamics and unfolds the causal relationship between various economic fundamentals and stock prices. Observing short-term herding in the stock market following Covid-19, the book's findings suggest that investors in the Indian stock market made investment choices irrationally during the Covid-19 crisis period. It also shows how the stock market became inefficient following the emergence of the pandemic and ceased to follow fundamentals. Interestingly, the findings suggest no relationship between stock returns and real economic activity in India. The presentation makes the book well suited not only for students, academics, policy makers and stock market investors, but also for people engaged or interested in business and finance; it will thus interest both specialists and lay readers. The analysis will help different readerships in different ways. Researchers in economics and finance will be able to learn about the theoretical frontiers discussed in the book, and the advanced econometric techniques applied will be useful for their own research. The macroeconomic insights, and the insights from behavioural economics, can expand the knowledge of the corporate sector and inform real-life decisions. Finally, the book will help policy makers, such as SEBI (the Securities and Exchange Board of India), formulate appropriate regulatory policies to minimize the possibility of speculative bubbles like those experienced in the Indian stock markets during the pandemic.
This collection of original articles, 8 years in the making, shines a bright light on recent advances in financial econometrics. From a survey of mathematical and statistical tools for understanding nonlinear Markov processes to an exploration of the time-series evolution of the risk-return tradeoff for stock market investment, noted scholars Yacine Ait-Sahalia and Lars Peter Hansen benchmark the current state of knowledge while contributors build a framework for its growth. Whether in the presence of statistical uncertainty or the proven advantages and limitations of value at risk models, readers will discover that they can set few constraints on the value of this long-awaited volume.
This book explores the novel uses and potentials of Data Envelopment Analysis (DEA) under big data. These areas are of widespread interest to researchers and practitioners alike. Considering the vast literature on DEA, one could say that DEA has been, and continues to be, a widely used technique in both performance and productivity measurement, having covered a plethora of challenges and debates within the modelling framework.
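For readers new to DEA, the following is a minimal sketch of the standard input-oriented CCR model solved as a linear program; the tiny two-input, one-output data set is made up, and this generic model is not the book's big-data formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Inputs (rows) for three decision-making units (columns), and one output.
X = np.array([[2.0, 4.0, 8.0],
              [3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0]])

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of unit o: minimize theta subject to
    X @ lam <= theta * x_o and Y @ lam >= y_o, with lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]              # decision vars: [theta, lam]
    A_in = np.c_[-X[:, [o]], X]              # X @ lam - theta * x_o <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]      # -Y @ lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

for o in range(3):
    print(f"DMU {o}: efficiency {ccr_efficiency(o):.3f}")  # 1.0, 1.0, 0.5
```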
This book proposes new solutions to the problem of poverty, beginning with analyses set mostly in the context of the digital era. In addition to its scientific organisation, the book follows a spatial-geographical one, analysing countries of the European Union as well as South Africa, while referring to two main variables, television and art, as agents of poverty alleviation. The book places particular focus on how poverty is understood in the framework of Industry 4.0. It introduces a new expanded Multidimensional Poverty Index with more than 20 dimensions; moreover, it provides a mathematically based solution for the disposal of perishable food. Finally, it does not disregard a crucial aspect of the issue of poverty: education planning. This book is of interest to specialists in poverty research, from students to professionals and from professors to activists, without excluding engineers.
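The book's expanded index itself is not reproduced here; as background, this sketch shows the standard Alkire-Foster counting approach that a Multidimensional Poverty Index builds on, using an invented deprivation matrix, equal weights, and a poverty cutoff of 0.5.

```python
import numpy as np

# Alkire-Foster counting approach: a person is multidimensionally poor
# if their weighted deprivation score meets the poverty cutoff k.
deprivations = np.array([  # rows: people, columns: dimensions (1 = deprived)
    [1, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 1, 1, 1],
])
weights = np.full(4, 0.25)             # equal weights summing to 1
scores = deprivations @ weights        # weighted deprivation share per person
poor = scores >= 0.5                   # poverty cutoff k = 0.5
H = poor.mean()                        # headcount ratio
A = scores[poor].mean()                # average intensity among the poor
print(f"MPI = H * A = {H * A:.3f}")    # adjusted headcount ratio
```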
Across the social sciences there has been increasing focus on reproducibility, i.e., the ability to examine a study's data and methods to ensure accuracy by reproducing the study. Reproducible Econometrics Using R combines an overview of key issues and methods with an introduction to their use in open source software (R) and recently developed tools (R Markdown and bookdown) that allow the reader to engage in reproducible econometric research. Jeffrey S. Racine provides a step-by-step approach and covers five sets of topics: i) linear time series models, ii) robust inference, iii) robust estimation, iv) model uncertainty, and v) advanced topics. The time series material highlights the difference between time-series analysis, which focuses on forecasting, and cross-sectional analysis, where the focus is typically on model parameters that have economic interpretations. For the time series material, the reader begins with a discussion of random walks, white noise, and non-stationarity, is next exposed to the pitfalls of applying standard inferential procedures popular in cross-sectional settings to time series data, and is then introduced to alternative procedures that form the basis for linear time series analysis. For the robust inference material, the reader is introduced to the potential advantages of bootstrapping and jackknifing over the use of asymptotic theory, and a range of numerical approaches are presented. For the robust estimation material, the reader is presented with a discussion of issues surrounding outliers in data and methods for addressing their presence. The model uncertainty material outlines two dominant approaches for dealing with model uncertainty, namely model selection and model averaging. Throughout the book there is an emphasis on the benefits of using R and other open source tools for ensuring reproducibility. The advanced material covers machine learning methods (support vector machines, which are useful for classification) and nonparametric kernel regression, which provides the reader with more advanced methods for confronting model uncertainty. The book is well suited for advanced undergraduate and graduate students alike. Assignments, exams, slides, and a solution manual are available for instructors.
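The book's own examples are in R; as a language-neutral illustration of the bootstrap idea it advocates, here is a Python sketch that resamples with replacement to approximate the sampling distribution of a mean without appealing to asymptotic theory. The data are simulated.

```python
import numpy as np

# Nonparametric bootstrap: resample the data with replacement and
# recompute the statistic to approximate its sampling distribution.
rng = np.random.default_rng(4)
sample = rng.exponential(scale=2.0, size=200)   # skewed data

boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(5000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"95% bootstrap percentile interval for the mean: [{lo:.2f}, {hi:.2f}]")
```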
This advanced undergraduate/graduate textbook teaches students in finance and economics how to use R to analyse financial data and implement financial models. It demonstrates how to take publicly available data, manipulate it, implement models, and generate outputs typical of particular analyses. A wide spectrum of timely and practical issues in financial modelling is covered, including return and risk measurement, portfolio management, option pricing and fixed income analysis. This new edition updates and expands upon the existing material, providing updated examples and new chapters on equities, simulation and trading strategies, including machine learning techniques. Select data sets are available online.
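The textbook itself works in R; as a parallel in Python, this sketch performs the most basic return-and-risk step the blurb mentions: computing log returns and annualized return and volatility from a daily price series. The prices here are simulated, not real market data.

```python
import numpy as np

# Return and risk measurement on a simulated year of daily prices.
rng = np.random.default_rng(8)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0003, 0.012, size=252)))

log_returns = np.diff(np.log(prices))
ann_return = log_returns.mean() * 252              # annualized mean return
ann_vol = log_returns.std(ddof=1) * np.sqrt(252)   # annualized volatility
print(f"annualized return {ann_return:.2%}, volatility {ann_vol:.2%}")
```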
The University of Oxford has been and continues to be one of the most important global centres for economics. With six chapters on themes in Oxford economics and 24 chapters on the lives and work of Oxford economists, this volume shows how economics became established at the University, how it produced some of the world's best-known economists, including Francis Ysidro Edgeworth, Roy Harrod and David Hendry, and how it remains a global force for the very best in teaching and research in economics. With original contributions from a stellar cast, this volume provides economists - especially those interested in macroeconomics and the history of economic thought - with the first in-depth analysis of Oxford economics.
This textbook provides a self-contained presentation of the theory and models of time series analysis. Putting an emphasis on weakly stationary processes and linear dynamic models, it describes the basic concepts, ideas, methods and results in a mathematically well-founded form and includes numerous examples and exercises. The first part presents the theory of weakly stationary processes in time and frequency domain, including prediction and filtering. The second part deals with multivariate AR, ARMA and state space models, which are the most important model classes for stationary processes, and addresses the structure of AR, ARMA and state space systems, Yule-Walker equations, factorization of rational spectral densities and Kalman filtering. Finally, there is a discussion of Granger causality, linear dynamic factor models and (G)ARCH models. The book provides a solid basis for advanced mathematics students and researchers in fields such as data-driven modeling, forecasting and filtering, which are important in statistics, control engineering, financial mathematics, econometrics and signal processing, among other subjects.
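To give a concrete taste of the Yule-Walker equations the text derives, here is a plain-numpy sketch that estimates the coefficients of a simulated AR(2) process; it illustrates the standard method rather than the book's own presentation.

```python
import numpy as np

# Simulate a stationary AR(2) process x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t.
rng = np.random.default_rng(5)
n, phi = 10_000, np.array([0.6, -0.3])
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi[0] * x[t - 1] + phi[1] * x[t - 2] + rng.normal()

def autocov(series, k):
    """Sample autocovariance at lag k."""
    s = series - series.mean()
    return np.dot(s[: len(s) - k], s[k:]) / len(s)

# Yule-Walker system: [[g0, g1], [g1, g0]] @ phi = [g1, g2].
g = np.array([autocov(x, k) for k in range(3)])
R = np.array([[g[0], g[1]], [g[1], g[0]]])
print(np.linalg.solve(R, g[1:]))  # approximately [0.6, -0.3]
```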
This textbook for master programs in economics offers a comprehensive overview of microeconomics. It employs a carefully graded approach where basic game theory concepts are already explained within the simpler decision framework. The unavoidable mathematical content is supplied when needed, not in an appendix. The book covers a lot of ground, from decision theory to game theory, from bargaining to auction theory, from household theory to oligopoly theory, and from the theory of general equilibrium to regulation theory. Additionally, cooperative game theory is introduced. This textbook has been recommended and developed for university courses in Germany, Austria and Switzerland.
This valuable text provides a comprehensive introduction to VAR modelling and how it can be applied. In particular, the author focuses on the properties of the Cointegrated VAR model and its implications for macroeconomic inference when data are non-stationary. The text provides a number of insights into the links between statistical econometric modelling and economic theory, and gives a thorough treatment of identification of the long-run and short-run structure as well as of the common stochastic trends and the impulse response functions, providing in each case illustrations of applicability. This book presents the main ingredients of the Copenhagen School of Time-Series Econometrics in a transparent and coherent framework. The distinguishing feature of this school is that econometric theory and applications have been developed in close cooperation. The guiding principle is that good econometric work should take econometrics, institutions, and economics seriously. The author uses a single data set throughout most of the book to guide the reader through the econometric theory while also revealing the full implications for the underlying economic model. To ensure full understanding, the book concludes by introducing two new data sets that combine readers' understanding of econometric theory and economic models with economic reality.
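The book develops the Johansen-style Cointegrated VAR; as a deliberately lighter-weight stand-in, this sketch simulates a pair of series sharing one stochastic trend and applies the Engle-Granger two-step cointegration test from statsmodels, a simpler technique than the book's.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

# Two I(1) series driven by a common random-walk trend are cointegrated:
# a linear combination of them is stationary.
rng = np.random.default_rng(6)
n = 500
trend = np.cumsum(rng.normal(size=n))               # common stochastic trend
y1 = trend + rng.normal(scale=0.5, size=n)
y2 = 2.0 * trend + rng.normal(scale=0.5, size=n)    # 2*y1 - y2 is stationary

t_stat, p_value, _ = coint(y1, y2)                  # Engle-Granger test
print(f"Engle-Granger t-statistic {t_stat:.2f}, p-value {p_value:.4f}")
```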
This new textbook by Urs Birchler and Monika Butler is an introduction to the study of how information affects economic relations. The authors provide a narrative treatment of the more formal concepts of Information Economics, using easy-to-understand and lively illustrations from film and literature, as well as nutshell examples. The book also comes with a supporting website (www.alicebob.info), maintained by the authors.
The aim of this publication is to identify and apply suitable methods for analysing and predicting the time series of gold prices, while acquainting the reader with the history and characteristics of those methods and with time series issues in general. Both statistical and econometric methods, and especially artificial intelligence methods, are used in the case studies. The publication presents both traditional and innovative methods at the theoretical level, always accompanied by a case study, i.e. their specific use in practice. Furthermore, a comprehensive comparative analysis of the individual methods is provided. The book is intended for academic staff, university students of economics, and scientists and practitioners dealing with time series prediction. From the point of view of practical application, it could provide useful information for speculators and traders on financial markets, especially the commodity markets.
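The book's case studies use real gold-price data, which are not reproduced here; this sketch fits the simplest candidate model in that family, an AR(1) on log returns, to simulated prices and produces a one-step-ahead forecast.

```python
import numpy as np

# Fit y_t = c + phi * y_{t-1} on log returns of a simulated price series,
# then forecast the next log price one step ahead.
rng = np.random.default_rng(7)
log_price = np.cumsum(rng.normal(loc=0.0002, scale=0.01, size=1000)) + 7.0
returns = np.diff(log_price)

x, y = returns[:-1], returns[1:]
phi, c = np.polyfit(x, y, 1)             # slope phi, intercept c
next_return = c + phi * returns[-1]
print(f"forecast next log price: {log_price[-1] + next_return:.4f}")
```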