Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation-based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process in itself to the modelling of the volatility of the process. In less than 15 years, it has become one of the most successful fields of applied econometric research with hundreds of published papers. As an alternative to the ARCH modelling of the volatility, Taylor (1986) introduced the stochastic volatility model, whose features are quite similar to the ARCH specification but which involves an unobserved or latent component for the volatility. While being more difficult to estimate than usual GARCH models, stochastic volatility models have found numerous applications in the modelling of volatility and more particularly in the econometric part of option pricing formulas. Although modelling volatility is one of the best known examples of applied financial econometrics, other topics (factor models, present value relationships, term structure models) were also successfully tackled.
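For readers new to the distinction drawn in this description, the following is a minimal sketch (not taken from the book) of the two specifications: in a GARCH(1,1) model the conditional variance is a deterministic function of past returns, whereas in Taylor's stochastic volatility model the log-variance h_t is an unobserved latent process driven by its own innovation.

```latex
% Illustrative specifications only; parameter names are generic.
% GARCH(1,1): conditional variance is a function of observed past data.
% Stochastic volatility: log-variance h_t is a latent AR(1) process.
\begin{align}
  r_t &= \sigma_t \varepsilon_t, \qquad
  \sigma_t^2 = \omega + \alpha r_{t-1}^2 + \beta \sigma_{t-1}^2
  && \text{(GARCH(1,1))}\\
  r_t &= e^{h_t/2} \varepsilon_t, \qquad
  h_t = \mu + \phi\,(h_{t-1} - \mu) + \sigma_\eta \eta_t
  && \text{(stochastic volatility)}
\end{align}
```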
Many economic and social surveys are designed as panel studies, which provide important data for describing social changes and testing causal relations between social phenomena. This textbook shows how to manage, describe, and model these kinds of data. It presents models for continuous and categorical dependent variables, focusing either on the level of these variables at different points in time or on their change over time. It covers fixed and random effects models, models for change scores and event history models. All statistical methods are explained in an application-centered style using research examples from scholarly journals, which can be replicated by the reader through data provided on the accompanying website. Because all models are compared with each other, the book provides valuable assistance in choosing the right model in applied research. The textbook is directed at master's and doctoral students as well as applied researchers in the social sciences, psychology, business administration and economics. Readers should be familiar with linear regression and have a good understanding of ordinary least squares estimation.
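As a rough sketch (not from the book), the fixed and random effects models mentioned above can be written for a panel of units i observed at times t as follows; the difference lies in whether the unit-specific intercept is eliminated by demeaning (fixed effects) or treated as a random component uncorrelated with the regressors (random effects).

```latex
% Minimal sketch, assuming a linear panel model with unit effect alpha_i.
\begin{align}
  y_{it} &= x_{it}'\beta + \alpha_i + u_{it}
    && \text{(common starting point)}\\
  y_{it} - \bar{y}_i &= (x_{it} - \bar{x}_i)'\beta + (u_{it} - \bar{u}_i)
    && \text{(fixed effects: within transformation)}\\
  \alpha_i &\sim (0, \sigma_\alpha^2), \qquad \operatorname{Cov}(\alpha_i, x_{it}) = 0
    && \text{(random effects assumption)}
\end{align}
```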
This unorthodox book derives and tests a simple theory of economic time series using several well-known empirical economic puzzles, from stock market bubbles to the failure of conventional economic theory to explain low levels of inflation and unemployment in the US. Professor Stanley develops a new econometric methodology which demonstrates the explanatory power of the behavioral inertia hypothesis and solves the pretest/specification dilemma. He then applies this to important measures of the world's economies including GDP, prices and consumer spending. The behavioral inertia hypothesis claims that inertia and randomness (or 'caprice') are the most important factors in representing and forecasting many economic time series. The development of this new model integrates well-known patterns in economic time series data with well-accepted ideas in contemporary philosophy of science. Academic economists will find this book interesting as it presents a unified approach to economic time series, solves a number of important empirical puzzles and introduces a new econometric methodology. Business and financial analysts will also find it useful because it offers a simple, yet powerful, framework in which to study and predict financial market movements.
Nonlinear and nonnormal filters are introduced and developed. Traditional nonlinear filters such as the extended Kalman filter and the Gaussian sum filter give biased filtering estimates, and therefore several nonlinear and nonnormal filters have been derived from the underlying probability density functions. The density-based nonlinear filters introduced in this book utilize numerical integration, Monte Carlo integration with importance sampling, or rejection sampling, and the resulting filtering estimates are asymptotically unbiased and efficient. All the nonlinear filters are compared in Monte Carlo simulation studies. Finally, as an empirical application, consumption functions based on the rational expectations model are estimated with the nonlinear filters, and the US, UK and Japanese economies are compared.
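One widely used member of the family of density-based, simulation-driven filters described above is the bootstrap particle filter (sequential importance sampling with resampling). The sketch below is purely illustrative and is not taken from the book; the toy state-space model and all function names are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear state-space model (illustrative only):
#   state:       x_t = 0.5*x_{t-1} + 25*x_{t-1}/(1 + x_{t-1}^2) + v_t,  v_t ~ N(0, 1)
#   observation: y_t = x_t^2 / 20 + w_t,                                w_t ~ N(0, 1)

def simulate(T=100):
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = 0.5 * x[t - 1] + 25 * x[t - 1] / (1 + x[t - 1] ** 2) + rng.normal()
    y = x ** 2 / 20 + rng.normal(size=T)
    return x, y

def bootstrap_particle_filter(y, n_particles=2000):
    """Sequential importance sampling with resampling (a density-based nonlinear filter)."""
    T = len(y)
    particles = rng.normal(size=n_particles)          # initial particle cloud
    filtered_mean = np.zeros(T)
    for t in range(T):
        # Propagate particles through the nonlinear state equation.
        particles = (0.5 * particles
                     + 25 * particles / (1 + particles ** 2)
                     + rng.normal(size=n_particles))
        # Importance weights from the observation density p(y_t | x_t).
        weights = np.exp(-0.5 * (y[t] - particles ** 2 / 20) ** 2)
        weights /= weights.sum()
        filtered_mean[t] = np.sum(weights * particles)
        # Multinomial resampling to avoid weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return filtered_mean

if __name__ == "__main__":
    x_true, y_obs = simulate()
    x_hat = bootstrap_particle_filter(y_obs)
    print("RMSE of filtered state:", np.sqrt(np.mean((x_hat - x_true) ** 2)))
```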
This book is a result of recent developments in several fields. Mathematicians, statisticians, finance theorists, and economists found several interconnections in their research. The emphasis was on common methods, although the applications were also interrelated. The main topic is dynamic stochastic models, in which information arrives and decisions are made sequentially. This gives rise to what finance theorists call option value and what some economists label quasi-option value. Some papers extend the mathematical theory, some deal with new methods of economic analysis, while others present important applications, to natural resources in particular.
Halbert White has made a major contribution to key areas of econometrics including specification analysis, specification testing, encompassing and Cox tests, and model selection. This book presents his most important published work supplemented with new material setting his work in context. Together with new introductions to each of the chapters, the articles cover work from the early 1980s to 1996 and provide an excellent overview of the breadth of Professor White's work and the evolution of his ideas. Using rigorous mathematical techniques, Halbert White develops many of the central themes in econometrics concerning models, data generating processes and estimation procedures. Throughout the book the unifying vision is that econometric models are only imperfect approximations to the processes generating economic data and that this has implications for the interpretation of estimates, inference and selection of econometric models. This unique collection of some of Halbert White's important work, not otherwise readily accessible, will be welcomed by researchers, graduates and academics in econometrics and statistics.
Zvi Griliches has made many seminal contributions to econometrics during the course of a long and distinguished career. His work has focused primarily on the economics of technological change and the econometric problems that arise in trying to study it. This major collection presents Professor Griliches's most important essays and papers on method, applied econometrics and specification problems. It reflects his interests in data-instigated contributions to econometric methodology, developments in and exposition of specification analysis, statistical aggregation, distributed lag models, sample selection bias and measurement error and other unobservable variance component models. These methods are applied to important substantive questions such as the estimation of the returns to education, the measurement of quality change, and productivity and economies of scale. Practicing Econometrics provides an essential reference source to the work of one of the most influential econometricians of the late 20th century.
This book discusses the market microstructure environment in the context of the global financial crisis. In the first part, market microstructure theory is recalled and the main microstructure models and hypotheses are discussed. The second part focuses on the main effects of the financial downturn through an examination of market microstructure dynamics. In particular, the effects of market imperfections and the limitations associated with microstructure models are discussed. Finally, the book turns to the new regulations and recent developments for financial markets that aim to improve the market microstructure. Well-known experts on the subject contribute to the chapters in the book. A must-read for academic researchers, students and quantitative practitioners.
This second edition of Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. An observational study is an empiric investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is organized into five parts. Chapters 2, 3, and 5 of Part I cover concisely many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates, and includes an updated chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV is new to this edition; it discusses evidence factors and the computerized construction of more than one comparison group. Part V discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies: "make your theories elaborate." This new edition features updated exploration of causal inference, with four new chapters, a new R package DOS2 designed as a companion for the book, and discussion of several of the latest matching packages for R. In particular, DOS2 allows readers to reproduce many analyses from Design of Observational Studies.
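As a rough illustration of the matching workflow described in Part II (not the book's R-based approach, and independent of the DOS2 package), propensity score matching can be sketched as follows: estimate each unit's probability of treatment from covariates, then pair each treated unit with the nearest control on that score and check covariate balance. The data and variable names below are purely hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)

# Synthetic example data (illustrative only): covariates X, treatment indicator d.
n = 500
X = rng.normal(size=(n, 3))
propensity_true = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
d = rng.binomial(1, propensity_true)

# Step 1: estimate propensity scores e(x) = P(treated | covariates).
model = LogisticRegression().fit(X, d)
e_hat = model.predict_proba(X)[:, 1]

# Step 2: match each treated unit to the control with the closest propensity score.
treated_idx = np.where(d == 1)[0]
control_idx = np.where(d == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(e_hat[control_idx].reshape(-1, 1))
_, match = nn.kneighbors(e_hat[treated_idx].reshape(-1, 1))
matched_controls = control_idx[match.ravel()]

# Step 3: check covariate balance in the matched sample (means should be close).
print("treated covariate means:        ", X[treated_idx].mean(axis=0).round(2))
print("matched control covariate means:", X[matched_controls].mean(axis=0).round(2))
```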
Coverage has been extended to include recent topics. The book again presents a unified treatment of economic theory, with the method of maximum likelihood playing a key role in both estimation and testing. Exercises are included and the book is suitable as a general text for final-year undergraduate and postgraduate students.
This book presents the last quarterly econometric model of the United States economy produced by Professor Lawrence R. Klein and his group at the University of Pennsylvania. It is the last econometric model that Lawrence Klein and his disciples left after some 50 years of cumulative effort in modelling the US economy, carried up to around 2000. Widely known as the WEFA Econometric Model Mark 10, it is the culmination of Professor Klein's research spanning more than 70 years, and it will please not only Professor Klein's old students and colleagues but also younger students who have heard so much of Klein models yet have not seen the latest model in its complete and printed form.
The maximum principle and dynamic programming are the two most commonly used approaches for solving optimal control problems, and they have been developed independently of each other. The theme of this book is to unify these two approaches and to demonstrate that viscosity solution theory provides the framework for doing so.
Financial market volatility plays a crucial role in financial decision making, as volatility forecasts are important input parameters in areas such as option pricing, hedging strategies, portfolio allocation and Value-at-Risk calculations. The fact that financial innovations arrive at an ever-increasing rate has motivated both academic researchers and practitioners, and advances in this field have been considerable. The use of Stochastic Volatility (SV) models is one of the latest developments in this area. Empirical Studies on Volatility in International Stock Markets describes the existing techniques for the measurement and estimation of volatility in international stock markets, with emphasis on the SV model and its empirical application. Eugenie Hol develops various extensions of the SV model, which allow for additional variables in both the mean and the variance equation. In addition, the forecasting performance of SV models is compared not only to that of the well-established GARCH model but also to implied volatility and so-called realised volatility models, which are based on intraday volatility measures.
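To make the comparison above concrete, here is a minimal numpy sketch (not taken from the book) of two of the volatility measures it mentions: realised variance computed from squared intraday returns, and the conditional variance from a GARCH(1,1) recursion with fixed, illustrative parameters. In practice the GARCH parameters would be estimated by maximum likelihood rather than assumed.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative intraday returns: 250 trading days x 78 five-minute intervals.
n_days, n_intraday = 250, 78
intraday_returns = rng.normal(scale=0.001, size=(n_days, n_intraday))

# Realised variance: sum of squared intraday returns within each day.
realised_variance = (intraday_returns ** 2).sum(axis=1)

# Daily returns aggregated from the intraday data.
daily_returns = intraday_returns.sum(axis=1)

# GARCH(1,1) conditional variance recursion with fixed, illustrative parameters.
omega, alpha, beta = 1e-6, 0.05, 0.90
sigma2 = np.empty(n_days)
sigma2[0] = daily_returns.var()
for t in range(1, n_days):
    sigma2[t] = omega + alpha * daily_returns[t - 1] ** 2 + beta * sigma2[t - 1]

print("mean realised variance:  ", realised_variance.mean())
print("mean GARCH(1,1) variance:", sigma2.mean())
```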
Recent economic history suggests that a key element in economic growth and development for many countries has been an aggressive export policy and a complementary import policy. Such policies can be very effective provided that resources are used wisely to encourage exports from industries that can be competitive in the international arena. Also, import protection must be used carefully so that it encourages infant industries instead of providing rents to industries that are not competitive. Policy makers may use a variety of methods of analysis in planning trade policy. As computing power has grown in recent years, increasing attention has been given to economic models as one of the most powerful aids to policy making. These models can be used on the one hand to help in selecting export industries to encourage and infant industries to protect, and on the other hand to chart the larger effects of trade policy on the entire economy. While many models have been developed in recent years, there has not been any analysis of the strengths and weaknesses of the various types of models. Therefore, this monograph provides a review and analysis of the models which can be used to analyze dynamic comparative advantage.
This book addresses the ultimate goal of economic studies: to predict how the economy develops and what will happen if we implement different policies. To be able to do that, we need a good understanding of what causes what in economics. Prediction and causality in economics are the main topics of this book's chapters; they use both more traditional and more innovative techniques, including quantum ideas, to make predictions about the world economy (international trade, exchange rates), about a country's economy (gross domestic product, stock index, inflation rate), and about individual enterprises, banks, and micro-finance institutions: their future performance (including the risk of bankruptcy), their stock prices, and their liquidity. Several papers study how COVID-19 has influenced the world economy. This book helps practitioners and researchers learn more about prediction and causality in economics and further develop this important research direction.
Updated to textbook form by popular demand, this second edition discusses diverse mathematical models used in economics, ecology, and the environmental sciences, with emphasis on control and optimization. It is intended for graduate and upper-undergraduate course use; applied mathematicians, industry practitioners, and interdisciplinary academics will also find the presentation highly useful. Core topics of this text are:
* Economic growth and technological development
* Population dynamics and human impact on the environment
* Resource extraction and scarcity
* Air and water contamination
* Rational management of the economy and environment
* Climate change and global dynamics
The step-by-step approach taken is problem-based and easy to follow. The authors aptly demonstrate that the same models may be used to describe different economic and environmental processes and that similar investigation techniques are applicable to analyze various models. Instructors will appreciate the substantial flexibility that this text allows while designing their own syllabus. Chapters are essentially self-contained and may be covered in full, in part, and in any order. Appropriate one- and two-semester courses include, but are not limited to, Applied Mathematical Modeling, Mathematical Methods in Economics and Environment, Models of Biological Systems, Applied Optimization Models, and Environmental Models. Prerequisites for the courses are Calculus and, preferably, Differential Equations.
In recent years there has been a growing interest in and concern for the development of a sound spatial statistical body of theory. This work has been undertaken by geographers, statisticians, regional scientists, econometricians, and others (e.g., sociologists). It has led to the publication of a number of books, including Cliff and Ord's Spatial Processes (1981), Bartlett's The Statistical Analysis of Spatial Pattern (1975), Ripley's Spatial Statistics (1981), Paelinck and Klaassen's Spatial Econometrics (1979), Ahuja and Schachter's Pattern Models (1983), and Upton and Fingleton's Spatial Data Analysis by Example (1985). The first of these books presents a useful introduction to the topic of spatial autocorrelation, focusing on autocorrelation indices and their sampling distributions. The second of these books is quite brief, but nevertheless furnishes an eloquent introduction to the relationship between spatial autoregressive and two-dimensional spectral models. Ripley's book virtually ignores autoregressive and trend surface modelling, and focuses almost solely on point pattern analysis. Paelinck and Klaassen's book closely follows an econometric textbook format, and as a result overlooks much of the important material necessary for successful spatial data analysis. It almost exclusively addresses distance and gravity models, with some treatment of autoregressive modelling. Pattern Models supplements Cliff and Ord's book, which in combination provide a good introduction to spatial data analysis. Its basic limitation is a preoccupation with the geometry of planar patterns, and hence it is very narrow in scope.
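For readers unfamiliar with the autocorrelation indices mentioned above, the following is a minimal numpy sketch (purely illustrative, not drawn from any of the cited books) of Moran's I, the most commonly used spatial autocorrelation statistic; the example weights matrix and data are hypothetical.

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I spatial autocorrelation index.

    values  : 1-D array of observations at n locations
    weights : n x n spatial weights matrix (w_ij > 0 if i and j are neighbours)
    """
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    n = len(values)
    z = values - values.mean()        # deviations from the mean
    s0 = weights.sum()                # sum of all spatial weights
    return (n / s0) * (z @ weights @ z) / (z @ z)

# Tiny example: four locations on a line, adjacent locations are neighbours.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i([1.0, 2.0, 3.0, 4.0], w))   # positive: similar values cluster
print(morans_i([1.0, 4.0, 1.0, 4.0], w))   # negative: neighbours are dissimilar
```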
Parallel Algorithms for Linear Models provides a complete and detailed account of the design, analysis and implementation of parallel algorithms for solving large-scale linear models. It investigates and presents efficient, numerically stable algorithms for computing the least-squares estimators and other quantities of interest on massively parallel systems. The monograph is in two parts. The first part consists of four chapters and deals with the computational aspects for solving linear models that have applicability in diverse areas. The remaining two chapters form the second part, which concentrates on numerical and computational methods for solving various problems associated with seemingly unrelated regression equations (SURE) and simultaneous equations models. The practical issues of the parallel algorithms and the theoretical aspects of the numerical methods will be of interest to a broad range of researchers working in the areas of numerical and computational methods in statistics and econometrics, parallel numerical algorithms, parallel computing and numerical linear algebra. The aim of this monograph is to promote research in the interface of econometrics, computational statistics, numerical linear algebra and parallelism.
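As a small, purely illustrative example of the kind of numerically stable least-squares computation the monograph is concerned with (without the parallel implementation it develops), the QR decomposition can be used in place of the normal equations; the data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative linear model y = X b + noise.
n, k = 1000, 5
X = rng.normal(size=(n, k))
beta_true = np.arange(1.0, k + 1.0)
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Numerically stable least squares via the thin QR decomposition:
# X = Q R, so the estimator solves R beta_hat = Q' y (R is upper triangular).
Q, R = np.linalg.qr(X)                  # avoids forming X'X explicitly
beta_hat = np.linalg.solve(R, Q.T @ y)

print("true beta:     ", beta_true)
print("estimated beta:", beta_hat.round(3))
```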
This book presents the effects of integrating information and communication technologies (ICT) and economic processes in macroeconomic dynamics, finance, marketing, industrial policies, and in government economic strategy. The text explores modeling and applications in these fields and also describes, in a clear and accessible manner, the theories that guide the integration among information technology (IT), telecommunications, and the economy, while presenting examples of their applications. Current trends such as artificial intelligence, machine learning, and big data technologies used in economics are also included. This volume is suitable for researchers, practitioners, and students working in economic theory and the computational social sciences.
The first edition of this book has been described as a landmark, being the first of its kind in applied econometrics. This second edition is thoroughly revised and updated and explains how to use many recent technical developments in time series econometrics. The main objective of the book is to help applied economists with a limited background in econometric estimation theory to understand and apply widely used time series econometric techniques.
Volatility ranks among the most active and successful areas of research in econometrics and empirical asset pricing over the past three decades. This research review studies and analyses some of the most influential published works from this burgeoning literature, both classic and contemporary. Topics covered include GARCH, stochastic and multivariate volatility models, as well as forecasting, evaluation and high-frequency data. This insightful review presents and discusses the most important milestones and contributions that helped pave the way to today's understanding of volatility.
Franz Ferschl is seventy. According to his birth certificate it is true, but it is unbelievable. Two of the three editors remember very well the Golden Age of Operations Research at Bonn, when Franz Ferschl worked together with Wilhelm Krelle, Martin Beckmann and Horst Albach. The importance of this fruitful cooperation is reflected by the fact that half of the contributors to this book were strongly influenced by Franz Ferschl and his colleagues at the University of Bonn. Clearly, Franz Ferschl left his traces at all the other places of his professional activities, in Vienna and Munich. This is demonstrated by the present volume as well. Born in 1929 in the Upper Austrian Mühlviertel, his scientific education brought him to Vienna, where he studied mathematics. In his early years he was attracted by Statistics and Operations Research. During his employment at the Österreichische Bundeskammer für Gewerbliche Wirtschaft in Vienna he prepared his famous book on queueing theory and stochastic processes in economics. This work was achieved during the scarce time left by his duties at the Bundeskammer, mostly between 6 a.m. and midnight. All those troubles were, however, soon rewarded by the chair of statistics at Bonn University. As a real Austrian, the amenities of the Rhineland could not prevent him from returning to Vienna, where he took the chair of statistics.
Through analysis of the European Union Emissions Trading Scheme (EU ETS) and the Clean Development Mechanism (CDM), this book demonstrates how to use a variety of econometric techniques to analyze the evolving and expanding carbon markets sphere, techniques that can be extrapolated to the worldwide marketplace. It features stylized facts about carbon markets from an economics perspective, as well as covering key aspects of pricing strategies, risk and portfolio management.