Financial econometrics is one of the great ongoing success stories of recent decades, having become one of the most active areas of research in econometrics. In this book, Michael Clements presents a clear and logical explanation of the key concepts and ideas behind forecasts of economic and financial variables. He shows that forecasts of the single most likely outcome of an economic or financial variable are of limited value; forecasts that provide more information on the expected range of likely outcomes are more relevant. This book provides a comprehensive treatment of the evaluation of different types of forecasts and draws out the parallels between the different approaches. It describes the methods for evaluating these more complex forecasts, which provide a fuller description of the range of possible future outcomes.
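A quick way to see what evaluating interval forecasts involves is a calibration check. The sketch below (my own illustration on simulated data, not an example from the book) computes the empirical coverage of nominal 90% interval forecasts: if the intervals are well calibrated, roughly 90% of realized outcomes should fall inside them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a forecaster issues nominal 90% interval forecasts for a
# standard-normal series (all numbers invented for illustration).
n = 500
y = rng.normal(size=n)                  # realized outcomes
center = np.zeros(n)                    # point forecasts
half_width = 1.645                      # 90% band half-width for N(0, 1)
lower, upper = center - half_width, center + half_width

# Unconditional coverage: the share of outcomes falling inside the interval
# should be close to the nominal 90% if the intervals are well calibrated.
hits = (y >= lower) & (y <= upper)
print(f"empirical coverage: {hits.mean():.3f} (nominal 0.90)")
```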
This is the perfect (and essential) supplement for all econometrics classes--from a rigorous first undergraduate course, to a first master's, to a PhD course.
This volume presents a collection of readings that give the reader an idea of the nature and scope of unobserved components (UC) models and the methods used to deal with them. The book is intended to give a self-contained presentation of the methods and of the issues that arise in applications. Harvey has made major contributions to this field and provides substantial introductions throughout the book to form a unified view of the literature. About the Series: Advanced Texts in Econometrics is a distinguished and rapidly expanding series in which leading econometricians assess recent developments in such areas as stochastic probability, panel and time series data analysis, modeling, and cointegration. In both hardback and affordable paperback, each volume explains the nature and applicability of a topic in greater depth than is possible in introductory textbooks or single journal articles. Each definitive work is formatted to be as accessible and convenient as possible for those who are not familiar with the detailed primary literature.
This book has been written as a doctoral dissertation at the Department of Economics at the University of Konstanz. I am indebted to my supervisor Winfried Pohlmeier for providing a stimulating and pleasant research environment and for his continuous support during my doctoral studies. I strongly benefitted from inspiring discussions with him, his valuable advice and helpful comments regarding the contents and the exposition of this book. I am grateful to Luc Bauwens for refereeing my work as a second supervisor. Moreover, I wish to thank him for offering me the possibility of a research visit at the Center for Operations Research and Econometrics (CORE) at the Université Catholique de Louvain. Important parts of this book were conceived during this period. Similarly, I am grateful to Tony Hall, who invited me for a research visit at the University of Technology, Sydney, and provided me access to an excellent database from the Australian Stock Exchange. I would like to thank him for his valuable support and the permission to use this data for empirical studies in this book. I wish to thank my colleagues at the University of Konstanz, Frank Gerhard, Dieter Hess, Joachim Inkmann, Markus Jochmann, Stefan Klotz, Sandra Lechner and Ingmar Nolte, who offered me advice, inspiration, friendship and successful co-operations. Moreover, I am grateful to the student research assistants at the Chair of Econometrics at the University of Konstanz, particularly Magdalena Ramada Sarasola, Danielle Tucker and Nadine Warmuth, who did a lot of editing work.
This book investigates the consequences of speculative trading based on private information in financial asset markets. It presents an extensive and thorough discussion of the theoretical and empirical methods used in previous studies of sequential trade models. The text also introduces a new framework for estimation and hypothesis testing that substantially extends earlier work in the field. Several market microstructure models in the spirit of Easley, Kiefer, O'Hara and Paperman (Journal of Finance, 1996) are reviewed. The common theme of these papers is the focus on the consequences of information-based trading for the price-setting behaviour of the market maker. Assuming that some traders have private information about a security's true value, a number of relations between observable quantities such as the spread, the volume, the timing of trades and the volatility of asset prices can be established. The authors introduce a number of improved methods for estimation and hypothesis testing for sequential trade models and apply this econometric framework to a high-frequency transaction data set for a number of stocks traded on the New York Stock Exchange during August 1996. All results necessary for understanding the empirical framework are derived step by step. The text is ideally suited as a reference work on old and new results as well as a textbook for graduate courses on Market Microstructure Theory, Empirical Methods in Finance or Econometrics.
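The central quantity in this family of models is the probability of informed trading (PIN). In the Easley-Kiefer-O'Hara-Paperman setup it is the expected arrival rate of informed trades divided by the expected arrival rate of all trades. The sketch below computes PIN from arrival-rate parameters that are assumed to have been estimated already; the numbers are hypothetical, not estimates from the book.

```python
def pin(alpha: float, mu: float, eps_buy: float, eps_sell: float) -> float:
    """Probability of informed trading in the EKOP sequential trade model.

    alpha:    probability that an information event occurs on a given day
    mu:       arrival rate of informed traders on event days
    eps_buy:  arrival rate of uninformed buyers
    eps_sell: arrival rate of uninformed sellers
    """
    return alpha * mu / (alpha * mu + eps_buy + eps_sell)

# Illustrative parameter values (hypothetical, not estimates from the book):
print(f"PIN = {pin(alpha=0.3, mu=40, eps_buy=25, eps_sell=25):.3f}")
```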
This book has been written as a practical guide for financial markets professionals, to explain US monetary policy and to make forecasts of future interest rate levels. Aimed at market players familiar with US policy instruments, Explaining and Forecasting the US Federal Funds Rates provides a means of making independent interest rate forecasts as well as explaining current rate levels.
Computational Models in the Economics of Environment and Development provides a step-by-step guide to designing, developing, and solving non-linear environment-development models. It accomplishes this by focusing on applied models, using real examples as case studies. Additionally, it gives examples of developing policy interventions based on quantitative model results. Finally, it uses a simple computer program, GAMS, to develop and solve models. This book is targeted towards university lecturers and students in economic modeling and sustainable development, but is also of particular interest to researchers at sustainable development research institutes and policy makers at international sustainable development policy institutions such as the World Bank, UNDP, and UNEP.
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the series:
T. W. Anderson, Statistical Analysis of Time Series
T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics
Emil Artin, Geometric Algebra
Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences
George E. P. Box & George C. Tiao, Bayesian Inference in Statistical Analysis
R. W. Carter, Simple Groups of Lie Type
William G. Cochran & Gertrude M. Cox, Experimental Designs, Second Edition
Richard Courant, Differential and Integral Calculus, Volume I
Richard Courant, Differential and Integral Calculus, Volume II
Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume I
Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume II
D. R. Cox, Planning of Experiments
Harold M. S. Coxeter, Introduction to Modern Geometry, Second Edition
Charles W. Curtis & Irving Reiner, Representation Theory of Finite Groups and Associative Algebras
Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I
Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II
Bruno de Finetti, Theory of Probability, Volume 1
Bruno de Finetti, Theory of Probability, Volume 2
W. Edwards Deming, Sample Design in Business Research
Amos de Shalit & Herman Feshbach, Theoretical Nuclear Physics, Volume 1—Nuclear Structure
J. L. Doob, Stochastic Processes
Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part One, General Theory
Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part Two, Spectral Theory—Self Adjoint Operators in Hilbert Space
Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part Three, Spectral Operators
Herman Feshbach, Theoretical Nuclear Physics: Nuclear Reactions
Bernard Friedman, Lectures on Applications-Oriented Mathematics
Gerald J. Hahn & Samuel S. Shapiro, Statistical Models in Engineering
Morris H. Hansen, William N. Hurwitz & William G. Madow, Sample Survey Methods and Theory, Volume I—Methods and Applications
Morris H. Hansen, William N. Hurwitz & William G. Madow, Sample Survey Methods and Theory, Volume II—Theory
Peter Henrici, Applied and Computational Complex Analysis, Volume 1—Power Series—Integration—Conformal Mapping—Location of Zeros
Peter Henrici, Applied and Computational Complex Analysis, Volume 2—Special Functions—Integral Transforms—Asymptotics—Continued Fractions
Peter Henrici, Applied and Computational Complex Analysis, Volume 3—Discrete Fourier Analysis—Cauchy Integrals—Construction of Conformal Maps—Univalent Functions
Peter Hilton & Yel-Chiang Wu, A Course in Modern Algebra
Harry Hochstadt, Integral Equations
Erwin O. Kreyszig, Introductory Functional Analysis with Applications
William H. Louisell, Quantum Statistical Properties of Radiation
Ali Hasan Nayfeh, Introduction to Perturbation Techniques
Emanuel Parzen, Modern Probability Theory and Its Applications
P. M. Prenter, Splines and Variational Methods
Walter Rudin, Fourier Analysis on Groups
C. L. Siegel, Topics in Complex Function Theory, Volume I—Elliptic Functions and Uniformization Theory
C. L. Siegel, Topics in Complex Function Theory, Volume II—Automorphic Functions and Abelian Integrals
C. L. Siegel, Topics in Complex Function Theory, Volume III—Abelian Functions & Modular Functions of Several Variables
J. J. Stoker, Differential Geometry
J. J. Stoker, Water Waves: The Mathematical Theory with Applications
J. J. Stoker, Nonlinear Vibrations in Mechanical and Electrical Systems
An insightful and up-to-date study of the use of periodic models in the description and forecasting of economic data. Incorporating recent developments in the field, the authors investigate such areas as seasonal time series; periodic time series models; periodic integration; and periodic cointegration. The analysis benefits from the inclusion of many new empirical examples and results. Advanced Texts in Econometrics is a distinguished and rapidly expanding series in which leading econometricians assess recent developments in such areas as stochastic probability, panel and time series data analysis, modeling, and cointegration. In both hardback and affordable paperback, each volume explains the nature and applicability of a topic in greater depth than is possible in introductory textbooks or single journal articles. Each definitive work is formatted to be as accessible and convenient as possible for those who are not familiar with the detailed primary literature.
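The defining feature of a periodic autoregression is that its coefficients vary with the season. A minimal simulated sketch (my illustration, not taken from the book): a quarterly PAR(1) process, y_t = phi_{s(t)} y_{t-1} + eps_t, estimated by OLS with the lag interacted with quarterly dummies.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a quarterly PAR(1) process: the AR coefficient varies by season,
# which a constant-coefficient AR(1) would miss entirely.
phi = np.array([0.9, 0.4, 0.7, 0.2])       # one coefficient per quarter
n = 400
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi[t % 4] * y[t - 1] + rng.normal()

# Estimate by OLS, interacting the lagged value with quarterly dummies.
X = np.zeros((n - 1, 4))
for t in range(1, n):
    X[t - 1, t % 4] = y[t - 1]
phi_hat, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print("estimated seasonal AR coefficients:", phi_hat.round(2))
```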
Get up to speed on the application of machine learning approaches in macroeconomic research. This book brings together economics and data science. Author Tshepo Chris Nokeri begins by introducing you to covariance analysis, correlation analysis, cross-validation, hyperparameter optimization, regression analysis, and residual analysis. In addition, he presents an approach for contending with multicollinearity. He then demystifies a time series model known as the additive model. He reveals a technique for binarizing an economic feature to perform classification analysis using logistic regression. He brings in the Hidden Markov Model, used to discover hidden patterns and growth in the world economy. The author demonstrates unsupervised machine learning techniques such as principal component analysis and cluster analysis. Key deep learning concepts and ways of structuring artificial neural networks are explored, along with training them and assessing their performance. The Monte Carlo simulation technique is applied to simulate the purchasing power of money in an economy. Lastly, the Structural Equation Model (SEM) is considered to integrate correlation analysis, factor analysis, multivariate analysis, causal analysis, and path analysis. After reading this book, you should be able to recognize the connection between econometrics and data science. You will know how to apply a machine learning approach to modeling complex economic problems and others beyond this book. You will know how to circumvent model weaknesses and enhance model performance, understand the practical implications of a machine learning approach in econometrics, and be able to deal with pressing economic problems.
What You Will Learn:
Examine complex, multivariate, linear-causal structures through the path and structural analysis technique, including non-linearity and hidden states
Be familiar with practical applications of machine learning and deep learning in econometrics
Understand theoretical frameworks and hypothesis development, and techniques for selecting appropriate models
Develop, test, validate, and improve key supervised (i.e., regression and classification) and unsupervised (i.e., dimension reduction and cluster analysis) machine learning models, alongside neural networks, Markov, and SEM models
Represent and interpret data and models
Who This Book Is For: Beginning and intermediate data scientists, economists, machine learning engineers, statisticians, and business executives
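As a flavour of the Monte Carlo exercise mentioned above, the following sketch simulates the purchasing power of 100 currency units over ten years under random annual inflation. The distributional assumptions are mine for illustration, not the author's calibration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Monte Carlo sketch: real value of 100 currency units after 10 years,
# with annual inflation drawn from a hypothetical N(5%, 2%) distribution.
n_paths, years = 10_000, 10
inflation = rng.normal(loc=0.05, scale=0.02, size=(n_paths, years))
purchasing_power = 100 / np.prod(1 + inflation, axis=1)

print(f"mean real value after {years} years: {purchasing_power.mean():.1f}")
print(f"5th-95th percentile: {np.percentile(purchasing_power, [5, 95]).round(1)}")
```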
The global financial crisis highlighted the impact on macroeconomic outcomes of recurrent events like business and financial cycles, highs and lows in volatility, and crashes and recessions. At the most basic level, such recurrent events can be summarized using binary indicators showing if the event will occur or not. These indicators are constructed either directly from data or indirectly through models. Because they are constructed, they have different properties than those arising in microeconometrics, and how one is to use them depends a lot on the method of construction. This book presents the econometric methods necessary for the successful modeling of recurrent events, providing valuable insights for policymakers, empirical researchers, and theorists. It explains why it is inherently difficult to forecast the onset of a recession in a way that provides useful guidance for active stabilization policy, with the consequence that policymakers should place more emphasis on making the economy robust to recessions. The book offers a range of econometric tools and techniques that researchers can use to measure recurrent events, summarize their properties, and evaluate how effectively economic and statistical models capture them. These methods also offer insights for developing models that are consistent with observed financial and real cycles. This book is an essential resource for students, academics, and researchers at central banks and institutions such as the International Monetary Fund.
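To make the idea of a constructed binary indicator concrete, the sketch below applies one common rule of thumb, marking a "technical recession" after two consecutive quarters of negative GDP growth. This illustrates only the construction step, and is not presented as the book's preferred definition.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy quarterly GDP growth series, in percent (invented data).
growth = rng.normal(loc=0.5, scale=1.0, size=80)

# Rule of thumb: two consecutive negative quarters flag a technical recession.
negative = growth < 0
recession = np.zeros_like(negative)
recession[1:] = negative[1:] & negative[:-1]   # true from the 2nd negative quarter on

print("recession quarters:", np.flatnonzero(recession))
```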
PPP is one of the most widely researched areas in international finance and one of the most controversial in the theory of exchange rate determination. This book demonstrates the applications of purchasing power parity in exchange rate determination as well as more practical applications such as salary comparison and the cost of living across borders. It uses The Economist's annual Big Mac Index in place of the traditional basket of goods and services used in PPP research. The author demonstrates that this is a good solution to the index-number problem, since the index is readily available and more appealing as an international monetary standard. The book also shows how the Big Mac Index could have been used to predict the Asian currency crisis and the Mexican peso stand-off where more traditional economic measures failed.
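The underlying arithmetic is simple: the implied PPP exchange rate is the ratio of the local to the US Big Mac price, and comparing it with the actual exchange rate signals over- or undervaluation. The prices and rate below are hypothetical, not taken from any Economist issue.

```python
# Implied PPP exchange rate from Big Mac prices: the rate that would
# equalize the burger's price across the two countries.
price_local = 65.0        # Big Mac price in local currency (hypothetical)
price_us = 5.0            # Big Mac price in US dollars (hypothetical)
actual_rate = 15.0        # actual local-currency-per-dollar exchange rate

implied_ppp = price_local / price_us          # PPP-consistent exchange rate
valuation = implied_ppp / actual_rate - 1     # < 0 means the local currency is undervalued

print(f"implied PPP rate: {implied_ppp:.2f}, valuation vs USD: {valuation:+.1%}")
```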
Korea, one of the original 'Tiger Economies', experienced a traumatic and largely unanticipated economic crisis in 1997-98 from which the country is still recovering. Despite having achieved spectacular economic advances from the early 1960s, the crisis laid bare numerous structural, economic and policy weaknesses. Charles Harvie and Hyun-Hoon Lee chronicle and analyze the key factors behind Korea's economic miracle from 1962-1989 and the causes that contributed to the economic downturn and ensuing crisis of 1997-98. Is the Korean economy still fading or is its revival underway? As the country undertakes a series of recovery measures, the authors consider the importance of the ongoing restructuring efforts in the corporate and banking sectors, the development of the 'new economy', and the potential economic advantages to be derived from reunification with the North.
Modelling trends and cycles in economic time series has a long history, with the use of linear trends and moving averages forming the basic tool kit of economists until the 1970s. Several developments in econometrics then led to an overhaul of the techniques used to extract trends and cycles from time series. Terence Mills introduces these various approaches to allow students and researchers to appreciate the variety of techniques and the considerations that underpin their choice for modelling trends and cycles.
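As a reference point for the "basic tool kit" the blurb mentions, the sketch below extracts a trend from a simulated series with a centered moving average and treats what is left over as the cycle (my own illustration on invented data, not an example from the book).

```python
import numpy as np

rng = np.random.default_rng(3)

# Classical decomposition in miniature: trend via a centered moving average,
# cycle as the remainder. The series mixes a linear trend, a cycle, and noise.
t = np.arange(120)
series = 0.05 * t + np.sin(2 * np.pi * t / 24) + rng.normal(scale=0.3, size=120)

window = 13                                  # odd window so the average is centered
kernel = np.ones(window) / window
trend = np.convolve(series, kernel, mode="valid")
cycle = series[window // 2 : -(window // 2)] - trend

print("trend and cycle lengths:", len(trend), len(cycle))
```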
Economic indicators provide invaluable insights into how different economies and financial markets are performing, enabling practitioners to adjust their investment strategies, gain knowledge about markets, and achieve higher returns. However, in order to make the right decisions, you must know how to interpret the relevant indicators. Using Economic Indicators in Analysing Financial Markets provides this important guidance. The first and second parts of the book focus on short-term analysis, explaining exactly what the indicators are, why they are significant, where and when they are published, and how reliable they are. In the third part, author Bernd Krampen highlights medium and long-term economic trends, showing how some of the previously discussed indicators and additional market indicators such as stocks, bond yields, and commodities can be employed as a basis for forecasting both GDP growth and inflation, including the estimation of possible future recessions. The fourth part illustrates the predominantly good forecasting properties of sentiment indicators by examining the real estate market, and is rounded off by an introduction to psychology and behavioural finance providing further tips and tricks for analysing financial markets. Using Economic Indicators in Analysing Financial Markets is an invaluable resource for investors, strategists, policymakers, students, and private investors worldwide who want to understand the true meaning of the latest economic trends and make the best decisions for future profits on financial markets.
Designed to promote students' understanding of econometrics and to build a more operational knowledge of economics through a meaningful combination of words, symbols and ideas. Each chapter commences the way economists begin new empirical projects--with a question and an economic model--then proceeds to develop a statistical model, select an estimator and outline inference procedures. Contains copious problems, experimental exercises and case studies.
In March 1998 the European Union formally launched the accession process that will lead to a significant enlargement of the Union. So far, ten countries from Central Europe--Bulgaria, the Czech Republic, Estonia, Hungary, Latvia, Lithuania, Poland, Romania, the Slovak Republic and Slovenia--have submitted their applications for EU membership. This unique process immediately attracted the attention of economists and policy makers. Nevertheless, among the numerous results already published there is a distinctive shortage of books and papers in which quantitative research methods are applied. This is to a large extent justified by the fact that the transition and accession processes are new to the economic sciences, their methodology is not well researched, statistical data for the Central and East European countries are scarce and not always reliable and, generally, a quantitative approach seems to be a risky and uncertain business. All these problems can also be seen as a challenge rather than an obstacle. With this in mind, we decided to clarify the status quo by organising a research seminar focused on the methodology and quantitative analysis of the Central and East European transition and pre-accession processes. The seminar, East European Transition and EU Enlargement: a Quantitative Approach, organised by the Macroeconomic and Financial Data Centre (University of Gdansk and University of Leicester), took place in Gdansk in June 2001. This edited volume contains papers developed from that seminar.
This book develops and analyzes dynamic decision models (DDM) with one trajectorial objective according to the methodology of multi-criteria decision making (MCDM). Moreover, DDMs which concomitantly pursue multiple objectives are analyzed, with special emphasis given to hybrid models with scalar and trajectorial objectives as well as models with multiple trajectorial objectives. Introducing the method of distance maximization crucially augments MCDM and proves to be invaluable for DDMs with a nonexistent utopia trajectory or with sustainability as an objective. The notions of efficiency and sustainability are formally developed and counterposed by means of the construct of the trajectorial objective, which is presented here, along with its implications, as a natural advance upon the classical scalar objective.
For some seven decades, econometrics has dealt almost exclusively with constructing and applying econometric equation systems, which constitute constraints in econometric optimization models. The second major component, the scalar-valued objective function, has attracted more attention only in recent years, and some progress has been made. This book is devoted to theories, models and methods for constructing scalar-valued objective functions for econometric optimization models, to their applications, and to some related topics such as historical issues surrounding pioneering contributions by Ragnar Frisch and Jan Tinbergen.
An Introduction to Wavelets and Other Filtering Methods in Finance and Economics presents a unified view of filtering techniques with a special focus on wavelet analysis in finance and economics. It emphasizes the methods and the explanations of the theory that underlies them, and concentrates on exactly what wavelet analysis (and filtering methods in general) can reveal about a time series. It also covers testing issues that can be addressed with wavelets in conjunction with multi-resolution analysis. The descriptive focus of the book avoids proofs and provides easy access to a wide spectrum of parametric and nonparametric filtering methods. Examples and empirical applications show readers the capabilities, advantages, and disadvantages of each method.
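To make the multi-resolution idea concrete, here is a minimal sketch using the PyWavelets library, my choice of tool rather than one prescribed by the book. It splits a noisy series into approximation and detail coefficients at several scales and checks that they reconstruct the original series.

```python
import numpy as np
import pywt  # PyWavelets -- my choice of library; the book does not prescribe software

rng = np.random.default_rng(5)

# Multi-resolution analysis in miniature: split a noisy series into a smooth
# approximation plus detail coefficients at several scales.
t = np.linspace(0, 1, 512)
series = np.sin(8 * np.pi * t) + 0.4 * rng.normal(size=t.size)

coeffs = pywt.wavedec(series, wavelet="db4", level=4)
approx, details = coeffs[0], coeffs[1:]          # details run coarsest to finest
for band, d in enumerate(details, start=1):
    print(f"detail band {band}: {d.size} coefficients, energy {np.sum(d**2):.1f}")

# The transform is invertible: the coefficients carry all the information.
reconstructed = pywt.waverec(coeffs, wavelet="db4")
print("max reconstruction error:", float(np.max(np.abs(reconstructed - series))))
```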
In this book, time use behavior within households is modeled as the outcome of a bargaining process between family members who bargain over household resource allocation and the intrafamily distribution of welfare. In view of trends such as rising female employment along with falling fertility rates and increasing divorce rates, a strategic aspect of female employment is analyzed in a dynamic family bargaining framework. The division of housework between spouses and the observed leisure differential between women and men are investigated within non-cooperative bargaining settings. The models developed are tested empirically using data from the German Socio-Economic Panel and the German Time Budget Survey.
Empirical measurement of the impacts of active labour market programmes has become a central task for economic researchers. New and improved econometric methods have been developed that will probably influence future empirical work in various other fields of economics as well. This volume contains a selection of original papers from leading experts, among them James J. Heckman, winner of the 2000 Nobel Prize in economics, addressing these econometric issues at the theoretical and empirical level. The theoretical part contains papers on tight bounds of average treatment effects, instrumental variables estimators, impact measurement with multiple programme options, and statistical profiling. The empirical part provides the reader with econometric evaluations of active labour market programmes in Canada, Germany, France, Italy, the Slovak Republic and Sweden.
Design and Analysis of Time Series Experiments presents the elements of statistical time series analysis while also addressing recent developments in research design and causal modeling. A distinguishing feature of the book is its integration of the design and analysis of time series experiments. Drawing examples from criminology, economics, education, pharmacology, public policy, program evaluation, public health, and psychology, Design and Analysis of Time Series Experiments is addressed to researchers and graduate students in a wide range of behavioral, biomedical and social sciences. Readers learn not only how-to skills but also the underlying rationales for the design features and the analytical methods. ARIMA algebra, Box-Jenkins-Tiao models and model-building strategies, forecasting, and Box-Tiao impact models are developed in separate chapters. The presentation of the models and model-building assumes only exposure to an introductory statistics course, with more difficult mathematical material relegated to appendices. Separate chapters cover threats to statistical conclusion validity, internal validity, construct validity, and external validity, with an emphasis on how these threats arise in time series experiments. Design structures for controlling the threats are presented and illustrated through examples. The chapters on statistical conclusion validity and internal validity introduce Bayesian methods, counterfactual causality and synthetic control group designs. Building on the authors' earlier work, Design and Analysis of Time Series Experiments includes more recent developments in modeling and considers design issues in greater detail than any existing work. Additionally, the book appeals to those who want to conduct or interpret time series experiments, as well as to those interested in research designs for causal inference.
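As a hint of what an impact model looks like in practice, the sketch below fits an interrupted time series model with statsmodels: an ARIMA specification with a step dummy that switches on at the intervention point. The library choice and the simulated data are mine for illustration, not taken from the book.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA  # statsmodels, one possible tool

rng = np.random.default_rng(11)

# Simulate an AR(1) background process with a level shift of +2 at time t0.
n, t0 = 200, 120
noise = rng.normal(size=n)
y = np.empty(n)
y[0] = noise[0]
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + noise[t]
y[t0:] += 2.0                               # the "intervention" effect

# Fit an ARIMA(1,0,0) with a step dummy as exogenous regressor; the exog
# coefficient estimates the size of the intervention impact (about 2 here).
step = (np.arange(n) >= t0).astype(float)
fit = ARIMA(y, exog=step, order=(1, 0, 0)).fit()
print(fit.params)
```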
Generalized method of moments (GMM) estimation of nonlinear systems has two important advantages over conventional maximum likelihood (ML) estimation: GMM estimation usually requires less restrictive distributional assumptions and remains computationally attractive when ML estimation becomes burdensome or even impossible. This book presents an in-depth treatment of the conditional moment approach to GMM estimation of models frequently encountered in applied microeconometrics. It covers both large sample and small sample properties of conditional moment estimators and provides an application to empirical industrial organization. With its comprehensive and up-to-date coverage of the subject, which includes topics like bootstrapping and empirical likelihood techniques, the book addresses scientists, graduate students and professionals in applied econometrics.
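For orientation, the sketch below implements two-step GMM in the simplest overidentified setting, a linear model with one endogenous regressor and two instruments. It is a textbook special case written for illustration, not code from the book.

```python
import numpy as np

rng = np.random.default_rng(2)

# DGP: y = 2*x + u, where x is endogenous (correlated with u) and
# z holds two valid instruments.
n = 2000
z = rng.normal(size=(n, 2))                                   # instruments
u = rng.normal(size=n)
x = z @ np.array([1.0, 0.5]) + 0.8 * u + rng.normal(size=n)   # endogenous regressor
y = 2.0 * x + u                                               # true beta = 2.0
X = x[:, None]

def gmm_beta(W):
    """Minimize gbar(b)' W gbar(b) with gbar(b) = Z'(y - X b)/n (closed form here)."""
    A = X.T @ z @ W @ z.T @ X
    return np.linalg.solve(A, X.T @ z @ W @ z.T @ y)

b1 = gmm_beta(np.eye(2))                       # step 1: identity weighting
resid = y - X @ b1
S = (z * (resid**2)[:, None]).T @ z / n        # step 2: efficient weight = inv(S)
b2 = gmm_beta(np.linalg.inv(S))
print(f"two-step GMM estimate of beta: {b2[0]:.3f}")
```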
Empirical studies (e.g. Audretsch and Feldman, 1996, or Audretsch and Stephan, 1996) show that unformalized knowledge may play a major role in the innovation of new products. Now if unformalized knowledge is communicated personally, distance will be an important variable in this process, since the intensity of contacts between persons can be expected to be negatively correlated with the distance between them. In the discussion of section 3.3.1 (page 42) we saw that it was this aspect of localization that Marshall had in mind when he was alluding to "local trade secrets." Note that if this spatial dimension of communication between agents exists, it is possible to transfer it to regional aggregates of agents: the closer two regions, the more they will be able to profit from the respective pool of human capital (R&D output, etc.) of the other region. This argument gives a spatial interpretation of the literature on endogenous growth. Now if these spillovers have a spatial dimension, then it follows from the discussion in chapter 3 that they will be one driving force in the dynamics of agglomeration. With the model to be developed in this chapter I will investigate the hypothesis that it is these forces of agglomeration (i.e. spatial spillovers of non-rival goods or factors) that are responsible for the inhomogeneous pattern of growth convergence. To analyze this phenomenon, I consider different types of regional aggregates and different distances in the model.
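The distance-decay mechanism described here is commonly operationalized with a spatial weight matrix. The sketch below builds a row-normalized exponential-decay matrix over hypothetical region locations; the specific decay form and all numbers are my assumptions, not the author's specification.

```python
import numpy as np

# Spatial weights with distance decay: closer regions get larger weights,
# so each region's spillover pool is a distance-weighted average of the
# other regions' R&D stocks.
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 3.0]])  # region locations
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

W = np.exp(-d)                  # exponential distance decay (an assumption)
np.fill_diagonal(W, 0.0)        # no self-spillover
W /= W.sum(axis=1, keepdims=True)

rd_stock = np.array([4.0, 1.0, 2.0, 0.5])   # hypothetical regional R&D stocks
spillover = W @ rd_stock                     # spillover pool each region can draw on
print(spillover.round(2))
```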