This book is an introductory exposition of topics that have emerged in the literature as unifying themes between two fields of time-series econometrics, namely nonlinearity and nonstationarity. Papers on these topics have proliferated over the last two decades, but they are rarely examined together. There is, undoubtedly, a variety of arguments that justify such a separation, but there are also good reasons to combine the two. Those reluctant to pursue a combined analysis might argue that nonlinearity and nonstationarity each raise non-trivial problems, so that combining them only compounds the difficulties. This argument can, however, be balanced by others of an economic nature. A predominant idea today is that a nonstationary series exhibits persistent deviations from its long-run components (either deterministic or stochastic trends). These persistent deviations are modelled in various ways: unit root models, fractionally integrated processes, models with shifts in the time trend, and so on. However, there are many other behaviours inherent in nonstationary processes that are not captured by linear models. For instance, economic variables with mixture distributions, or processes that are state-dependent, undergo episodes of changing dynamics. In models with multiple long-run equilibria, moving from one equilibrium to another sometimes implies hysteresis. It is also known that certain shocks can change the economic fundamentals, thereby reducing the possibility that an initial position is re-established after a shock (irreversibility).
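As a rough illustration of the persistence idea (not taken from the book), the following Python sketch simulates a stationary AR(1) process alongside a unit-root random walk: in the first, deviations from the mean die out, while in the second every shock is permanent.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
eps = rng.standard_normal(T)

# Stationary AR(1): deviations from the mean decay geometrically.
ar1 = np.zeros(T)
for t in range(1, T):
    ar1[t] = 0.6 * ar1[t - 1] + eps[t]

# Unit-root (random-walk) process: shocks are permanent, so deviations
# from any "long-run" level persist indefinitely.
rw = np.cumsum(eps)

print("AR(1) sample variance of levels:      ", round(ar1.var(), 2))
print("Random-walk sample variance of levels:", round(rw.var(), 2))
```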
This book combines both a comprehensive analytical framework and economic statistics that enable business decision makers to anticipate developing economic trends. The author blends recent and historical economic data with economic theory to provide important benchmarks or rules of thumb that give both economists and noneconomists enhanced understanding of unfolding economic data and their interrelationships. Through the matrix system, a disciplined approach is described for integrating readily available economic data into a comprehensive analysis without complex formulas. The extensive appendix of monthly key economic factors for 1978-1991 makes this an important reference source for economic and financial trend analysis. A new and practical method for economic trend analysis is introduced that provides more advanced knowledge than available from economic newsletters. Schaeffer begins with a general description of the business cycle and the typical behavior and effect of the credit markets, commercial banks, and the Federal Reserve. Next, fourteen key economic factors regularly reported by the business press are described, such as the capacity utilization rate and yield on three-month Treasury bills. Benchmarks for each of these key economic factors are set forth, together with an insightful discussion of the interrelationships indicating economic trends. A detailed discussion of the 1978-1991 American economy, incorporating monthly data from the historical matrix, demonstrates the practical application of the matrix system. Executives, investors, financial officers, and government policymakers will find this book useful in decision making.
Major transport infrastructures are increasingly in the news as the engineering and financing possibilities come together. However, these projects have also demonstrated the inadequacy of most existing approaches to forecasting their impacts and to their overall evaluation. This collection, drawn from a conference organized by the Association d'Econometrie Appliquee, offers a state-of-the-art look at forecasting traffic, developing pricing strategies and estimating impacts, in papers by leading authorities from Europe, North America and Japan.
How might one determine if a financial institution is taking risk in a balanced and productive manner? A powerful tool to address this question is economic capital, which is a model-based measure of the amount of equity that an entity must hold to satisfactorily offset its risk-generating activities. This book, with a particular focus on the credit-risk dimension, pragmatically explores real-world economic-capital methodologies and applications. It begins with the thorny practical issues surrounding the construction of an (industrial-strength) credit-risk economic-capital model, defensibly determining its parameters, and ensuring its efficient implementation. It then broadens its gaze to examine various critical applications and extensions of economic capital; these include loan pricing, the computation of loan impairments, and stress testing. Along the way, typically working from first principles, various possible modelling choices and related concepts are examined. The end result is a useful reference for students and practitioners wishing to learn more about a centrally important financial-management device.
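The blurb does not spell out the book's models, but a common starting point for credit-risk economic capital is a one-factor Gaussian (Vasicek-style) portfolio simulation; the sketch below, with purely illustrative parameters, estimates capital as a high loss quantile in excess of the expected loss.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Illustrative, made-up homogeneous portfolio (not from the book).
n_obligors, pd_, rho, lgd, exposure = 500, 0.02, 0.15, 0.45, 1.0
n_sims = 20_000

threshold = norm.ppf(pd_)                          # latent-variable default threshold
z = rng.standard_normal(n_sims)                    # one systematic factor draw per scenario
eps = rng.standard_normal((n_sims, n_obligors))    # idiosyncratic factors
assets = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps
losses = (assets < threshold).sum(axis=1) * lgd * exposure

expected_loss = losses.mean()
var_999 = np.quantile(losses, 0.999)               # 99.9% loss quantile
economic_capital = var_999 - expected_loss         # unexpected loss to be backed by equity
print(f"EL = {expected_loss:.1f}, VaR(99.9%) = {var_999:.1f}, EC = {economic_capital:.1f}")
```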
A timely work which represents a major reappraisal of business cycle theory. It revives, with the help of modern analytical techniques, an old theme of Keynesian macroeconomics, namely that "market psychology" (i.e., volatile expectations) may be a significant cause of economic fluctuations. It is of interest not only to economists, but also to mathematicians and physicists.
This Festschrift is dedicated to Goetz Trenkler on the occasion of his 65th birthday. As can be seen from the long list of contributions, Goetz has had and still has an enormous range of interests, and colleagues to share these interests with. He is a leading expert in linear models with a particular focus on matrix algebra in its relation to statistics. He has published in almost all major statistics and matrix theory journals. His research activities also include other areas (such as nonparametrics, statistics and sports, combination of forecasts, and magic squares, just to mention a few). Goetz Trenkler was born in Dresden in 1943. After his school years in East Germany and West Berlin, he obtained a Diploma in Mathematics from the Free University of Berlin (1970), where he also discovered his interest in Mathematical Statistics. In 1973, he completed his Ph.D. with a thesis titled "On a distance-generating function of probability measures". He then moved on to the University of Hannover to become a Lecturer and to write a habilitation thesis (submitted 1979) on alternatives to the Ordinary Least Squares estimator in the Linear Regression Model, a topic that would become his predominant field of research in the years to come.
This book contains an extensive, up-to-date overview of nonlinear time series models and their application to modelling economic relationships. It considers nonlinear models in stationary and nonstationary frameworks, and both parametric and nonparametric models are discussed. The book contains examples of nonlinear models in economic theory and presents the most common nonlinear time series models. Importantly, it shows the reader how to apply these models in practice. For this purpose, the building of nonlinear models is discussed in detail through its three stages of specification, estimation and evaluation, and is illustrated by several examples involving both economic and non-economic data. Since estimation of nonlinear time series models is carried out using numerical algorithms, the book contains a chapter on estimating parametric nonlinear models and another on estimating nonparametric ones.
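As an illustration of the specification-estimation-evaluation cycle described above (not the book's own code), the sketch below estimates a simple two-regime self-exciting threshold autoregression, one of the common nonlinear specifications, by grid search over the threshold with least squares within each regime.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a two-regime SETAR(1): dynamics depend on the previous value's sign.
T = 600
y = np.zeros(T)
for t in range(1, T):
    phi = 0.8 if y[t - 1] <= 0.0 else -0.4
    y[t] = phi * y[t - 1] + rng.standard_normal()

def setar_ssr(y, threshold):
    """Sum of squared residuals of a SETAR(1) with the given threshold."""
    y_lag, y_now = y[:-1], y[1:]
    ssr = 0.0
    for regime in (y_lag <= threshold, y_lag > threshold):
        if regime.sum() <= 10:          # require enough observations per regime
            return np.inf
        phi_hat = (y_lag[regime] @ y_now[regime]) / (y_lag[regime] @ y_lag[regime])
        ssr += ((y_now[regime] - phi_hat * y_lag[regime]) ** 2).sum()
    return ssr

grid = np.quantile(y, np.linspace(0.15, 0.85, 71))   # candidate thresholds
best = min(grid, key=lambda c: setar_ssr(y, c))
print("estimated threshold:", round(float(best), 3))  # true threshold is 0.0
```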
A new approach to explaining the existence of firms and markets, focusing on variability and coordination. It stands in contrast to the emphasis on transaction costs, and on monitoring and incentive structures, which are prominent in most of the modern literature in this field. This approach, called the variability approach, allows us to: show why both the need for communication and the coordination costs increase when the division of labor increases; explain why, while the firm relies on direction, the market does not; rigorously formulate the optimum divisionalization problem; better understand the relationship between technology and organization; show why the size of the firm is limited; and refine the analysis of whether the existence of a sharable input or the presence of an external effect leads to the emergence of a firm. The book provides a wealth of insights for students and professionals in economics, business, law and organization.
This book overviews the latest ideas and developments in financial econometrics, with an emphasis on how best to use prior knowledge (e.g., in a Bayesian way) and how best to use successful data-processing techniques from other application areas (e.g., from quantum physics). The book also covers applications to economy-related phenomena ranging from traditionally analyzed phenomena such as manufacturing, the food industry, and taxes, to newer phenomena such as cryptocurrencies, influencer marketing, the COVID-19 pandemic, financial fraud detection, corruption, and the shadow economy. The book will inspire practitioners to apply state-of-the-art Bayesian, quantum, and related techniques to economic and financial problems, and will inspire researchers to improve the existing techniques and to develop new ones for studying economic and financial phenomena. It will also be of interest to students interested in the latest ideas and results.
Time Series: Theory and Methods is a systematic account of linear time series models and their application to the modelling and prediction of data collected sequentially in time. The aim is to provide specific techniques for handling data and, at the same time, a thorough understanding of the mathematical basis for those techniques. Both time and frequency domain methods are discussed, but the book is written in such a way that either approach could be emphasized. The book is intended as a text for graduate students in statistics, mathematics, engineering, and the natural or social sciences. It contains substantial chapters on multivariate series and state-space models (including applications of the Kalman recursions to missing-value problems) and shorter accounts of special topics including long-range dependence, infinite variance processes and non-linear models. Most of the programs used in the book are available on diskettes for the IBM-PC. These diskettes, with the accompanying manual, ITSM: The Interactive Time Series Modelling Package for the PC, also by Brockwell and Davis, can be purchased from Springer-Verlag.
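As a small, generic illustration of the linear modelling workflow such a text covers, and using the statsmodels Python library rather than the ITSM package mentioned above, the sketch below fits an ARMA(1,1) model to simulated data; statsmodels evaluates the Gaussian likelihood through a state-space (Kalman filter) representation.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Simulate an ARMA(1,1) process: y_t = 0.7*y_{t-1} + e_t + 0.3*e_{t-1}
T = 400
e = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.7 * y[t - 1] + e[t] + 0.3 * e[t - 1]

# ARIMA(1,0,1) is an ARMA(1,1); the likelihood is computed via the Kalman filter.
res = ARIMA(y, order=(1, 0, 1)).fit()
print(res.params)             # estimated constant, AR, MA and variance parameters
print(res.forecast(steps=5))  # out-of-sample point predictions
```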
The more generous social welfare system in Europe is one of the most important differences between European and US society. Defenders of the European welfare state argue that it improves social cohesion and prevents crime. On the other hand, the US economy is performing quite well, so that crime rates might come down due to better legal income opportunities. This book takes this trade-off as a point of departure and contributes to a better interdisciplinary understanding of the interactions between crime, economic performance and social exclusion. It evaluates the existing economic and criminological research and provides innovative empirical investigations on the basis of international panel data sets from different levels of regional aggregation. Among other aspects, the results clearly reveal the crime-reducing potential of intact families and the link between crime and the labour market. A special focus is on estimating the consequences of crime, a topic rarely analysed in the literature.
The issue of unfunded public pension systems has moved to the center of public debate all over the world. Unfortunately, a large part of the discussion has remained at a qualitative level. This book seeks to address this by providing detailed knowledge on modelling pension systems.
Spatial econometrics deals with spatial dependence and spatial heterogeneity, critical aspects of the data used by regional scientists. These characteristics may cause standard econometric techniques to become inappropriate. In this book, I combine several recent research results to construct a comprehensive approach to the incorporation of spatial effects in econometrics. My primary focus is to demonstrate how these spatial effects can be considered as special cases of general frameworks in standard econometrics, and to outline how they necessitate a separate set of methods and techniques, encompassed within the field of spatial econometrics. My viewpoint differs from that taken in the discussion of spatial autocorrelation in spatial statistics - e.g., most recently by Cliff and Ord (1981) and Upton and Fingleton (1985) - in that I am mostly concerned with the relevance of spatial effects for model specification, estimation and other inference, in what I call a model-driven approach, as opposed to the data-driven approach of spatial statistics. I attempt to combine a rigorous econometric perspective with a comprehensive treatment of methodological issues in spatial analysis.
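As a toy illustration of spatial dependence (not code from the book), the sketch below builds a rook-contiguity spatial weights matrix for a regular grid of regions and computes Moran's I, a standard measure of spatial autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy example: observations on a 10x10 grid of regions.
n_side = 10
n = n_side * n_side
coords = np.array([(i, j) for i in range(n_side) for j in range(n_side)])

# Rook-contiguity spatial weights: 1 if two cells share an edge, then row-standardized.
dist = np.abs(coords[:, None, :] - coords[None, :, :]).sum(axis=2)
W = (dist == 1).astype(float)
W /= W.sum(axis=1, keepdims=True)

# A spatially smooth variable: an east-west gradient plus noise.
x = coords[:, 1] + rng.standard_normal(n)

# Moran's I = (n / S0) * (z' W z) / (z' z), with z the deviations from the mean.
z = x - x.mean()
moran_I = (n / W.sum()) * (z @ W @ z) / (z @ z)
print("Moran's I:", round(moran_I, 3))  # clearly positive: neighbouring values are similar
```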
Dynamics and Income Distribution brings together Irma Adelman's pioneering applications of econometrics, as well as papers on the poverty and income distribution implications of growth and development. The volume combines some early papers on business cycles and long swings with other pieces that focus on economic development. With a firm emphasis on the dynamics of income inequality, this volume includes an empirical study of how inequality changes with economic development and the conceptual development of dynamic indices of income inequality. Professor Adelman's papers draw on quantitative simulation models and the experience of specific countries to discuss policies to alleviate poverty and reduce inequality. The author argues that trickle-down processes are unlikely to reduce poverty sufficiently rapidly. Land reform and equal access to education need to be the focus in order to generate the initial conditions for equalizing economic development. Economic development and poverty reduction, she suggests, require an emphasis on education, on institutions determining access to jobs and resources, and on labour-intensive types of economic growth. With its companion volume, Institutions and Development Strategies, this collection of selected essays makes a significant contribution by improving access to Irma Adelman's pioneering work on the economics and policy of development.
Shows the application of some of the developments in the mathematics of optimization, including the concepts of invexity and quasimax, to models of economic growth, and to finance and investment. The book also introduces a computational package called SCOM for solving optimal control problems in MATLAB.
The rich, multi-faceted and multi-disciplinary field of matching-based market design is an active and important one due to its highly successful applications with economic and sociological impact. Its home is economics, but with intimate connections to algorithm design and operations research. With chapters contributed by over fifty top researchers from all three disciplines, this volume is unique in its breadth and depth, while still being a cohesive and unified picture of the field, suitable for the uninitiated as well as the expert. It explains the dominant ideas from computer science and economics underlying the most important results on market design and introduces the main algorithmic questions and combinatorial structures. Methodologies and applications from both the pre-Internet and post-Internet eras are covered in detail. Key chapters discuss the basic notions of efficiency, fairness and incentives, and the way market design seeks solutions guided by normative criteria borrowed from social choice theory.
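One of the central algorithms in this literature is Gale and Shapley's deferred acceptance; the following minimal Python sketch (with made-up preferences, not an example from the volume) computes the proposer-optimal stable matching for a toy one-to-one market.

```python
def deferred_acceptance(prop_prefs, recv_prefs):
    """Gale-Shapley deferred acceptance; proposers obtain their best stable match."""
    rank = {r: {p: i for i, p in enumerate(prefs)} for r, prefs in recv_prefs.items()}
    free = list(prop_prefs)                  # proposers not yet (tentatively) matched
    next_choice = {p: 0 for p in prop_prefs}
    match = {}                               # receiver -> proposer, tentative matches
    while free:
        p = free.pop()
        r = prop_prefs[p][next_choice[p]]    # best receiver not yet proposed to
        next_choice[p] += 1
        if r not in match:
            match[r] = p                     # receiver tentatively accepts
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])            # receiver trades up, old proposer is freed
            match[r] = p
        else:
            free.append(p)                   # proposal rejected
    return {p: r for r, p in match.items()}

# Toy preferences (purely illustrative).
students = {"s1": ["A", "B"], "s2": ["A", "B"]}
schools = {"A": ["s2", "s1"], "B": ["s1", "s2"]}
print(deferred_acceptance(students, schools))  # {'s2': 'A', 's1': 'B'}
```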
Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation-based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process itself to the modelling of the volatility of the process. In less than 15 years, it has become one of the most successful fields of applied econometric research, with hundreds of published papers. As an alternative to the ARCH modelling of the volatility, Taylor (1986) introduced the stochastic volatility model, whose features are quite similar to the ARCH specification but which involves an unobserved or latent component for the volatility. While being more difficult to estimate than the usual GARCH models, stochastic volatility models have found numerous applications in the modelling of volatility, and more particularly in the econometric part of option pricing formulas. Although modelling volatility is one of the best-known examples of applied financial econometrics, other topics (factor models, present value relationships, term structure models) were also successfully tackled.
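To make the contrast concrete (a generic illustration, not material from the volume), the sketch below simulates a GARCH(1,1) process, in which the conditional variance is an exact function of past returns, and a stochastic volatility process, in which the log-variance follows its own latent AR(1).

```python
import numpy as np

rng = np.random.default_rng(7)
T = 1000

# GARCH(1,1): today's conditional variance is a deterministic function of past data.
omega, alpha, beta = 0.05, 0.08, 0.90
r_garch = np.zeros(T)
h = np.full(T, omega / (1 - alpha - beta))     # start at the unconditional variance
for t in range(1, T):
    h[t] = omega + alpha * r_garch[t - 1] ** 2 + beta * h[t - 1]
    r_garch[t] = np.sqrt(h[t]) * rng.standard_normal()

# Stochastic volatility: the log-variance follows its own latent AR(1), so the
# variance is unobserved even given the full return history.
mu, phi, sigma_eta = -1.0, 0.95, 0.2
log_var = np.full(T, mu)
for t in range(1, T):
    log_var[t] = mu + phi * (log_var[t - 1] - mu) + sigma_eta * rng.standard_normal()
r_sv = np.exp(log_var / 2) * rng.standard_normal(T)

def kurtosis(r):
    return ((r - r.mean()) ** 4).mean() / r.var() ** 2

# Both models produce the fat tails typical of financial returns (kurtosis > 3).
print("GARCH return kurtosis:", round(kurtosis(r_garch), 2))
print("SV return kurtosis:   ", round(kurtosis(r_sv), 2))
```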
Many economic and social surveys are designed as panel studies, which provide important data for describing social changes and testing causal relations between social phenomena. This textbook shows how to manage, describe, and model these kinds of data. It presents models for continuous and categorical dependent variables, focusing either on the level of these variables at different points in time or on their change over time. It covers fixed and random effects models, models for change scores and event history models. All statistical methods are explained in an application-centered style using research examples from scholarly journals, which can be replicated by the reader through data provided on the accompanying website. As all models are compared to each other, the book provides valuable assistance with choosing the right model in applied research. The textbook is directed at master's and doctoral students as well as applied researchers in the social sciences, psychology, business administration and economics. Readers should be familiar with linear regression and have a good understanding of ordinary least squares estimation.
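As a minimal sketch of the fixed-effects logic covered in such texts (with simulated data, not the book's examples), the following compares pooled OLS with the within (demeaning) estimator when unit-specific intercepts are correlated with the regressor.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy panel: n units observed over t periods, with unit-specific intercepts
# that are correlated with the regressor (so pooled OLS is biased).
n, t, beta = 200, 6, 1.5
alpha_i = rng.standard_normal(n)                     # unobserved unit effects
x = alpha_i[:, None] + rng.standard_normal((n, t))   # regressor correlated with the effects
y = alpha_i[:, None] + beta * x + rng.standard_normal((n, t))

def ols_slope(x, y):
    """Slope of a simple OLS regression of y on x (with intercept)."""
    xc = x - x.mean()
    return (xc @ (y - y.mean())) / (xc @ xc)

# Pooled OLS ignores the unit effects and overstates the slope.
b_pooled = ols_slope(x.ravel(), y.ravel())

# Fixed effects: demean within each unit, which sweeps out alpha_i.
xd = (x - x.mean(axis=1, keepdims=True)).ravel()
yd = (y - y.mean(axis=1, keepdims=True)).ravel()
b_fe = (xd @ yd) / (xd @ xd)

print("pooled OLS slope:", round(b_pooled, 3), " fixed-effects slope:", round(b_fe, 3))
```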
Introduction to Computational Economics Using Fortran is the essential guide to conducting economic research on a computer. Aimed at students of all levels of education as well as advanced economic researchers, it facilitates the first steps into writing programs using Fortran. Introduction to Computational Economics Using Fortran assumes no prior experience as it introduces the reader to this programming language. It shows the reader how to apply the most important numerical methods conducted by computational economists using the toolbox that accompanies this text. It offers various examples from economics and finance organized in self-contained chapters that speak to a diverse range of levels and academic backgrounds. Each topic is supported by an explanation of the theoretical background, a demonstration of how to implement the problem on the computer, and a discussion of simulation results. Readers can work through various exercises that promote practical experience and deepen their economic and technical insights. This textbook is accompanied by a website from which readers can download all program codes as well as a numerical toolbox, and receive technical information on how to install Fortran on their computer.
This unorthodox book derives and tests a simple theory of economic time series using several well-known empirical economic puzzles, from stock market bubbles to the failure of conventional economic theory to explain the low levels of inflation and unemployment in the US. Professor Stanley develops a new econometric methodology which demonstrates the explanatory power of the behavioral inertia hypothesis and resolves the pretest/specification dilemma. He then applies this to important measures of the world's economies, including GDP, prices and consumer spending. The behavioral inertia hypothesis claims that inertia and randomness (or 'caprice') are the most important factors in representing and forecasting many economic time series. The development of this new model integrates well-known patterns in economic time series data with well-accepted ideas in contemporary philosophy of science. Academic economists will find this book interesting as it presents a unified approach to economic time series, solves a number of important empirical puzzles and introduces a new econometric methodology. Business and financial analysts will also find it useful because it offers a simple, yet powerful, framework in which to study and predict financial market movements.
This second edition of Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. An observational study is an empirical investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is organized into five parts. Chapters 2, 3, and 5 of Part I cover concisely many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates, and includes an updated chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV is new to this edition; it discusses evidence factors and the computerized construction of more than one comparison group. Part V discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies: "make your theories elaborate." This new edition features an updated exploration of causal inference, with four new chapters, a new R package DOS2 designed as a companion for the book, and discussion of several of the latest matching packages for R. In particular, DOS2 allows readers to reproduce many analyses from Design of Observational Studies.
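The book's matched comparisons are built in R (including the DOS2 companion package); purely to illustrate the propensity-score matching idea in Part II, here is a minimal Python sketch, with simulated data, that matches each treated unit to the control with the nearest estimated propensity score.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)

# Simulated observational data: treatment assignment depends on the covariates.
n = 2000
X = rng.standard_normal((n, 3))
p_treat = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
treated = rng.random(n) < p_treat
y = X[:, 0] + 0.5 * X[:, 1] + 2.0 * treated + rng.standard_normal(n)  # true effect = 2

# Estimate propensity scores, then match each treated unit to its nearest control.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
ps_t, ps_c = ps[treated], ps[~treated]
y_t, y_c = y[treated], y[~treated]
nearest = np.abs(ps_t[:, None] - ps_c[None, :]).argmin(axis=1)  # closest control per treated unit

att = (y_t - y_c[nearest]).mean()
print("naive difference in means:        ", round(y_t.mean() - y_c.mean(), 2))
print("matched estimate of the effect:   ", round(att, 2))
```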
Nonlinear and non-normal filters are introduced and developed. Traditional nonlinear filters such as the extended Kalman filter and the Gaussian sum filter give biased filtering estimates, and therefore several nonlinear and non-normal filters have been derived from the underlying probability density functions. The density-based nonlinear filters introduced in this book use numerical integration, Monte Carlo integration with importance sampling, or rejection sampling, and the resulting filtering estimates are asymptotically unbiased and efficient. All the nonlinear filters are compared in Monte Carlo simulation studies. Finally, as an empirical application, consumption functions based on the rational expectations model are estimated with the nonlinear filters, and the US, UK and Japanese economies are compared.
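The book derives its filters from the underlying densities; as a generic illustration of Monte Carlo filtering with importance sampling and resampling (a bootstrap particle filter, not necessarily the book's exact algorithm), consider the following sketch for a standard nonlinear state-space benchmark model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Nonlinear state-space model (illustrative, not from the book):
#   state:       x_t = 0.5*x_{t-1} + 25*x_{t-1}/(1 + x_{t-1}^2) + v_t,  v_t ~ N(0, 1)
#   observation: y_t = x_t^2 / 20 + w_t,                                w_t ~ N(0, 1)
T, N = 100, 2000
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = (0.5 * x_true[t - 1] + 25 * x_true[t - 1] / (1 + x_true[t - 1] ** 2)
                 + rng.standard_normal())
    y[t] = x_true[t] ** 2 / 20 + rng.standard_normal()

# Bootstrap particle filter: propagate, weight by the observation density, resample.
particles = rng.standard_normal(N)
x_filt = np.zeros(T)
for t in range(1, T):
    particles = (0.5 * particles + 25 * particles / (1 + particles ** 2)
                 + rng.standard_normal(N))
    w = np.exp(-0.5 * (y[t] - particles ** 2 / 20) ** 2) + 1e-300  # Gaussian likelihood
    w /= w.sum()                                                   # importance weights
    x_filt[t] = w @ particles                                      # filtered mean E[x_t | y_1..t]
    particles = rng.choice(particles, size=N, p=w)                 # multinomial resampling

print("RMSE of filtered state:", round(np.sqrt(((x_filt - x_true) ** 2).mean()), 2))
```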
You may like...
Pricing Decisions in the Euro Area - How… (Silvia Fabiani, Claire Loupias, …), Hardcover, R2,179 (Discovery Miles 21 790)
Design and Analysis of Time Series… (Richard McCleary, David McDowall, …), Hardcover, R3,326 (Discovery Miles 33 260)
Financial and Macroeconomic… (Francis X. Diebold, Kamil Yilmaz), Hardcover, R3,612 (Discovery Miles 36 120)
Introductory Econometrics - A Modern… (Jeffrey Wooldridge), Hardcover
Handbook of Research Methods and… (Nigar Hashimzade, Michael A. Thornton), Hardcover, R7,916 (Discovery Miles 79 160)
Tax Policy and Uncertainty - Modelling… (Christopher Ball, John Creedy, …), Hardcover, R2,656 (Discovery Miles 26 560)