This report is a partial result of China's Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has focused on forecasting China's economy and analyzing macroeconomic policy, and in 2005 it began developing the CQMM for short-term forecasting, policy analysis, and simulation. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019, a conference jointly held by Xiamen University and the Economic Information Daily of Xinhua News Agency.
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the fields of commodities finance, mathematics and stochastics, international macroeconomics, and financial econometrics. Financial Mathematics, Volatility and Covariance Modelling: Volume 2 provides a key repository on the current state of knowledge, the latest debates and recent literature on financial mathematics, volatility and covariance modelling. The first section is devoted to mathematical finance, stochastic modelling and control optimization. Chapters explore the recent financial crisis and the increase of uncertainty and volatility, and propose an alternative approach to deal with these issues. The second section covers financial volatility and covariance modelling and explores proposals for dealing with recent developments in financial econometrics. This book will be useful to students and researchers in applied econometrics, and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
Now in its fourth edition, this landmark text provides a fresh, accessible and well-written introduction to the subject. With a rigorous pedagogical framework, which sets it apart from comparable texts, the latest edition features an expanded website providing numerous real-life data sets and examples.
In many applications of econometrics and economics, a large proportion of the questions of interest concern identification. An economist may want to uncover the true signal when the data are very noisy, as in time-series spurious regression and weak-instruments problems, to name a few. In this book, High-Dimensional Econometrics and Identification, we illustrate that the true signal, and hence identification, can be recovered even from noisy, high-dimensional data, e.g., large panels. High-dimensional data in econometrics are the rule rather than the exception, and one of the tools for analyzing such data is the panel data model. High-Dimensional Econometrics and Identification grew out of research on identification and high-dimensional econometrics that we have collaborated on over the years. It aims to provide an up-to-date presentation of the issues of identification and high-dimensional econometrics, as well as insights into the use of these results in empirical studies. The book is designed for high-level graduate courses in econometrics and statistics, and can also serve as a reference for researchers.
Volume 1 covers statistical methods related to unit roots, trend breaks and their interplay. Testing for unit roots has been a topic of wide interest, and the author was at the forefront of this research. The book covers important topics such as the Phillips-Perron unit root test and theoretical analyses of its properties, how this and other tests could be improved, the ingredients needed to achieve better tests, and the proposal of a new class of tests. Also included are theoretical studies related to time series models with unit roots and the effect of span versus sampling interval on the power of the tests. Moreover, this volume deals with the issue of trend breaks and their effect on unit root tests. The research agenda fostered by the author showed that trend breaks and unit roots can easily be confused; hence the need for the new testing procedures covered here.

Volume 2 concerns statistical methods related to structural change in time series models. The approach adopted is off-line, whereby one tests for structural change using a historical dataset and performs hypothesis testing. A distinctive feature is the allowance for multiple structural changes. The methods discussed have been, and continue to be, applied in a variety of fields including economics, finance, life science, physics and climate change. The articles included address issues of estimation, testing and/or inference in a variety of models: short-memory regressors and errors, trends with integrated and/or stationary errors, autoregressions, cointegrated models, multivariate systems of equations, endogenous regressors, long-memory series, among others. Other issues covered include the problem of non-monotonic power and the pitfalls of adopting a local asymptotic framework. Empirical analyses are provided for the US real interest rate, US GDP, the volatility of asset returns and climate change.
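As a hedged illustration of the unit-root testing idea discussed above (not code from the book), the following Python sketch computes the t-statistic of the basic Dickey-Fuller regression Δy_t = ρ·y_{t−1} + e_t on simulated data; the `df_tstat` helper and both series are hypothetical:

```python
import numpy as np

def df_tstat(y):
    """t-statistic from the Dickey-Fuller regression dy_t = rho * y_{t-1} + e_t (no constant)."""
    dy = np.diff(y)
    ylag = y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)      # OLS slope
    resid = dy - rho * ylag
    s2 = (resid @ resid) / (len(dy) - 1)   # residual variance
    se = np.sqrt(s2 / (ylag @ ylag))       # standard error of rho
    return rho / se

rng = np.random.default_rng(0)
e = rng.standard_normal(500)
random_walk = np.cumsum(e)                 # has a unit root: t-stat typically near zero
stationary = np.zeros(500)                 # AR(1) with coefficient 0.5: strongly negative t-stat
for t in range(1, 500):
    stationary[t] = 0.5 * stationary[t - 1] + e[t]

print(df_tstat(random_walk))   # typically well above the ~ -1.95 critical value
print(df_tstat(stationary))    # typically far below it
```

The unit-root null is rejected when the t-statistic falls below the (non-standard) Dickey-Fuller critical value, which is the point of departure for the refinements, such as the Phillips-Perron correction, surveyed in Volume 1.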
Non-market valuation has become a broadly accepted and widely practiced means of measuring the economic values of the environment and natural resources. In this book, the authors provide a guide to the statistical and econometric practices that economists employ in estimating non-market values. The authors develop the econometric models that underlie the basic methods: contingent valuation, travel cost models, random utility models and hedonic models. They treat the measurement of non-market values as a two-step procedure: estimating the parameters of demand and preference functions, and calculating benefits from the estimated models. Each of the models is carefully developed from the preference function to the behavioral or response function that researchers observe. The models are then illustrated with datasets that characterize the kinds of data researchers typically deal with. The real-world data and clarity of writing in this book will appeal to environmental economists, students, researchers and practitioners in multilateral banks and government agencies.
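The two-step procedure described above can be sketched on simulated data. This hedged Python example (not from the book) fits a semi-log travel cost demand model by OLS and then computes the welfare measure, consumer surplus per trip, as −1/β̂; all numbers and variable names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
cost = rng.uniform(5, 50, n)              # hypothetical travel cost per trip
true_beta = -0.05
trips = np.exp(2.0 + true_beta * cost + 0.1 * rng.standard_normal(n))

# Step 1: estimate the demand (response) function by OLS on log trips
X = np.column_stack([np.ones(n), cost])
alpha_hat, beta_hat = np.linalg.lstsq(X, np.log(trips), rcond=None)[0]

# Step 2: calculate the benefit measure from the estimated model;
# for a semi-log demand curve, consumer surplus per trip is -1 / beta
cs_per_trip = -1.0 / beta_hat
print(beta_hat)      # should be near the true value -0.05
print(cs_per_trip)   # should be near 20
```

The same estimate-then-calculate pattern carries over to the contingent valuation, random utility and hedonic models the book develops, with the welfare formula changing according to the assumed preference function.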
The book describes the structure of the Keynes-Leontief Model (KLM) of Japan and discusses how the Japanese economy can overcome the long-term deflation that has persisted since the mid-1990s. Large-scale econometric models and their analysis have been important for planning policy measures and examining the economic structure of a country, although developing and maintaining a model like the KLM can be very costly. The book discusses how the KLM is developed and employed for policy analyses.
It is impossible to understand modern economics without knowledge of the basic tools of game theory and mechanism design. This book provides a graduate-level introduction to the economic modeling of strategic behavior. The goal is to teach Economics doctoral students the tools of game theory and mechanism design that all economists should know.
Over the last decade, dynamical systems theory and related nonlinear methods have had a major impact on the analysis of time series data from complex systems. Recent developments in mathematical methods of state-space reconstruction, time-delay embedding, and surrogate data analysis, coupled with readily accessible and powerful computational facilities used in gathering and processing massive quantities of high-frequency data, have provided theorists and practitioners unparalleled opportunities for exploratory data analysis, modelling, forecasting, and control.
The papers collected in the two volumes of Nonlinear Models focus on the asymptotic theory of parameter estimators of nonlinear single-equation models and systems of nonlinear models, in particular weak and strong consistency, asymptotic normality, and parameter inference, for cross-sections as well as for time series. A selection of papers on testing for, and estimation and inference under, model misspecification is also included. The models under review are parametric; hence their functional form is assumed to be known up to a vector of unknown parameters, and that functional form is nonlinear in at least one of the parameters. The selection of earlier articles on nonlinear parametric models is extensive and, although they are not all equally influential, each has played a significant part in the development of the field. The more recent articles have been selected on the basis of their potential importance for the further development of this sphere of study.
Although interest in spatial regression models has surged in recent years, a comprehensive, up-to-date text on these approaches does not exist. Filling this void, Introduction to Spatial Econometrics presents a variety of regression methods used to analyze spatial data samples that violate the traditional assumption of independence between observations. It explores a wide range of topics, including maximum likelihood and Bayesian estimation, various types of spatial regression specifications, and applied modeling situations involving different circumstances. Leaders in this field, the authors clarify the often-mystifying phenomenon of simultaneous spatial dependence. By presenting new methods, they aid the interpretation of spatial regression models, especially those that include spatial lags of the dependent variable. The authors also examine the relationship between spatiotemporal processes and long-run equilibrium states characterized by simultaneous spatial dependence. MATLAB toolboxes useful for spatial econometric estimation are available on the authors' websites. The book covers spatial econometric modeling as well as numerous applied illustrations of the methods, and it encompasses many recent advances in spatial econometric models, including some previously unpublished results.
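A minimal sketch of what simultaneous spatial dependence means in a spatial lag model, assuming a hypothetical row-standardized contiguity matrix W for five regions on a line (this numpy code illustrates the standard SAR reduced form, not the authors' MATLAB toolboxes):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
# Hypothetical contiguity matrix: each region neighbors the next on a line
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            W[i, j] = 1.0
W = W / W.sum(axis=1, keepdims=True)    # row-standardize

rho, beta = 0.6, 2.0
x = rng.standard_normal(n)
eps = 0.1 * rng.standard_normal(n)

# The simultaneous system y = rho*W*y + x*beta + eps has the reduced form
# y = (I - rho*W)^{-1} (x*beta + eps), which is how SAR data are generated.
A = np.eye(n) - rho * W
y = np.linalg.solve(A, beta * x + eps)

# A shock to one region propagates to every other region through (I - rho*W)^{-1},
# not just to its immediate neighbors -- the essence of simultaneous dependence.
multiplier = np.linalg.inv(A)
print(multiplier[0])
```

The nonzero entries of the first row of the multiplier matrix beyond region 0's direct neighbor show why interpreting coefficients in models with a spatial lag of the dependent variable requires care.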
Covering a broad range of topics, this text provides a comprehensive survey of the modeling of chaotic dynamics and complexity in the natural and social sciences. Its attention to models in both the physical and social sciences and the detailed philosophical approach make this a unique text in the midst of many current books on chaos and complexity. Including an extensive index and bibliography along with numerous examples and simplified models, this is an ideal course text.
This handbook covers DEA topics that are extensively used and solidly based. Its purpose is to (1) describe and elucidate the state of the field and (2), where appropriate, extend the frontier of DEA research. It defines the state of the art of DEA methodology and its uses, and is intended to represent a milestone in the progression of DEA. Written by experts who are generally major contributors to the topics covered, it includes a comprehensive review and discussion of basic DEA models, extensions to the basic DEA methods, and a collection of DEA applications in the areas of banking, engineering, health care, and services. The handbook's chapters are organized into two categories: (i) basic DEA models, concepts, and their extensions, and (ii) DEA applications. First-edition contributors have returned to update their work, and the second edition includes updated versions of selected first-edition chapters. New chapters have been added on: approaches that avoid a priori choices of weights (called multipliers) that reflect meaningful trade-offs; construction of static and dynamic DEA technologies; the slacks-based model and its extensions; DEA models for DMUs that have internal structures, including network DEA for measuring supply chain operations; and selected DEA applications in the service sector, with a focus on building a conceptual framework, research design, and interpreting results.
Originally published in 1984. This book examines two important dimensions of efficiency in the foreign exchange market using econometric techniques. It responds to the trend in macroeconomics toward re-examining theories of exchange rate determination following the erratic behaviour of exchange rates in the late 1970s. In particular, the text looks at the relation between spot and forward exchange rates and the term structure of the forward premium, both of which require a joint test of market efficiency and the equilibrium model. The approaches used are the regression of spot rates on lagged forward rates and an explicit time series analysis of the spot and forward rates, using data from Canada, the United Kingdom, the Netherlands, Switzerland and Germany.
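The regression of spot rates on lagged forward rates mentioned above can be sketched as follows. This is a hedged illustration on simulated data (not the book's dataset): under the joint efficiency/unbiasedness null, the slope in s_{t+1} = a + b·f_t + e should be close to one:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 300
# Simulated log spot rate as a random walk; the forward rate is taken to be
# today's spot plus small noise -- a hypothetical stand-in for real data.
s = np.cumsum(0.01 * rng.standard_normal(T + 1))
f = s[:-1] + 0.001 * rng.standard_normal(T)
s_next = s[1:]

# OLS regression of next period's spot on the lagged forward rate
X = np.column_stack([np.ones(T), f])
a_hat, b_hat = np.linalg.lstsq(X, s_next, rcond=None)[0]
print(a_hat, b_hat)   # b_hat should be near 1 under the unbiasedness null
```

In the book's setting, rejecting b = 1 (jointly with a = 0) is evidence against the combination of market efficiency and the maintained equilibrium model, which is why the test is inherently a joint one.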
Originally published in 1979. This book addresses three questions regarding uncertainty in economic life: how do we define uncertainty and use the concept meaningfully to provide conclusions; how can the level of uncertainty associated with a particular variable of economic interest be measured; and does experience provide any support for the view that uncertainty really matters. It develops a theory of the effect of price uncertainty on production and trade, takes a graphical approach to look at effects of a mean preserving spread to create rules for ordering distributions, and finishes with an econometric analysis of the effects of Brazil's adoption of a crawling peg in reducing real exchange rate uncertainty. This is an important early study into the significance of uncertainty.
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary contexts, considering small-sample correction, volatility and the impact of different orders of integration. Models with expectations are considered along with alternative methods such as Singular Spectrum Analysis (SSA), the Kalman Filter and Structural Time Series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance to the now vast literature facing students and graduate economists.
Environmental risk directly affects the financial stability of banks, since banks bear the financial consequences when the entities to which they lend lose liquidity, and when financial penalties are imposed for failure to comply with regulations or for actions that harm the natural environment. This book explores the impact of environmental risk on the banking sector and analyzes strategies to mitigate this risk, with a special emphasis on the role of modelling. It argues that environmental risk modelling allows banks to estimate the patterns and consequences of environmental risk on their operations, and to take measures within the context of asset and liability management to minimize the likelihood of losses. An important role here is played by the environmental risk modelling methodology, as well as by the software and the mathematical and econometric models used. The book examines banks' responses to macroprudential risk, particularly from the point of view of their adaptation strategies; the mechanisms of its spread; risk management and modelling; and sustainable business models. It introduces the basic concepts, definitions, and regulations concerning this type of risk in the context of its influence on the banking industry. Based primarily on a combined quantitative and qualitative approach, the book proposes a new methodology for environmental risk management and modelling in the banking sector. As such, it will appeal to researchers, scholars, and students of environmental economics, finance and banking, sociology, law, and political science.
Global econometric models have a long history. From the early 1970s to the present, as modeling techniques have advanced, different modeling paradigms have emerged and been used to support national and international policy making. One purpose of this volume, based on a conference in recognition of the seminal impact of Lawrence R. Klein, winner of the Nobel Prize in Economic Sciences, whose pioneering work spawned the field of international econometric modeling, is to survey these developments from today's perspective. A second objective of the volume is to shed light on the wide range of attempts to broaden the scope of modeling on an international scale. Beyond new developments in the traditional areas of trade and financial flows, the volume reviews new approaches to modeling the linkages between macroeconomic activity and individual economic units, new research on the analysis of trends in income distribution and economic wellbeing on a global scale, and innovative ideas about modeling the interactions between economic development and the environment. With its expanded treatment of economic linkages, this volume makes an important contribution to the evolving literature on global econometric models.
The main purpose of this book is to resolve deficiencies and limitations that currently exist in the use of Technical Analysis (TA). In particular, TA is used either by academics as an "economic test" of the weak-form Efficient Market Hypothesis (EMH) or by practitioners as a main or supplementary tool for deriving trading signals. This book approaches TA systematically, utilizing all the available estimation theory and tests. This is achieved through the development of novel rule-based pattern recognizers and the implementation of statistical tests for assessing the significance of realized returns. Particular emphasis is given to technical patterns whose identification process is evidently subjective. The proposed methodology is based on algorithmic, and thus unbiased, pattern recognition. The unified methodological framework presented in this book can serve as a benchmark both for future academic studies that test the null hypothesis of the weak-form EMH and for practitioners who want to embed TA within their trading and investment decision-making processes.
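As a hedged, deliberately simple example of a rule-based signal recognizer and the kind of realized-return assessment described above (a moving-average crossover rule, not one of the book's pattern algorithms), in Python:

```python
import numpy as np

def ma_crossover_signal(prices, short=5, long=20):
    """Return 1 (long) when the short moving average exceeds the long one, else 0.
    A purely algorithmic rule: no subjective judgment enters the identification."""
    prices = np.asarray(prices, dtype=float)
    def ma(w):
        return np.convolve(prices, np.ones(w) / w, mode="valid")
    s, l = ma(short), ma(long)
    s = s[len(s) - len(l):]                     # align both averages on the same dates
    return (s > l).astype(int)

# Assess realized returns of the rule on a simulated price path
rng = np.random.default_rng(3)
prices = 100 * np.exp(np.cumsum(0.001 + 0.01 * rng.standard_normal(500)))
sig = ma_crossover_signal(prices)               # one signal per price from index long-1 on
rets = np.diff(np.log(prices))[-len(sig):]      # log-returns aligned with the signals
strat = sig[:-1] * rets[1:]                     # trade on yesterday's signal, earn today's return
print(strat.mean(), rets.mean())
```

A statistical test would then compare the rule's mean realized return with a benchmark (e.g., via a t-test or bootstrap), which is the step the book formalizes for more complex technical patterns.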
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
In these two volumes, a group of distinguished economists debate the way in which evidence, in particular econometric evidence, can and should be used to relate macroeconomic theories to the real world. Topics covered include the business cycle, monetary policy, economic growth, the impact of new econometric techniques, the IS-LM model, the labour market, new Keynesian macroeconomics, and the use of macroeconomics in official documents.
This volume investigates the accuracy and dynamic performance of a high-frequency forecast model for the Japanese and United States economies based on the Current Quarter Model (CQM) or High Frequency Model (HFM) developed by the late Professor Emeritus Lawrence R. Klein. It also presents a survey of recent developments in high-frequency forecasts and gives an example application of the CQM model in forecasting Gross Regional Products (GRPs).
This book presents the last quarterly econometric model of the United States economy produced at the University of Pennsylvania by Professor Lawrence R. Klein and his group. It is the final econometric model that Klein and his disciples left after some 50 years of accumulated effort in modeling the US economy, carried up to around 2000. Widely known as the WEFA Econometric Model Mark 10, it is the culmination of Professor Klein's research spanning more than 70 years. It will please not only Professor Klein's former students and colleagues, but also younger students who have heard much about Klein models and have yet to see the latest model in its complete, printed form.
Since the middle of the twentieth century, economists have invested great resources into using statistical evidence to relate macroeconomic theories to the real world, and many new econometric techniques have been employed. In these two volumes, a distinguished group of economic theorists, econometricians, and economic methodologists examine how evidence has been used and how it should be used to understand the real world. Volume 1 focuses on the contribution of econometric techniques to understanding the macroeconomic world. It covers the use of evidence to understand the business cycle, the operation of monetary policy, and economic growth. A further section offers assessments of the overall impact of recent econometric techniques such as cointegration and unit roots. Volume 2 focuses on the labour market and economic policy, with sections covering the IS-LM model, the labour market, new Keynesian macroeconomics, and the use of macroeconomics in official documents (in both the USA and the EU). These volumes will be valuable to advanced undergraduates, graduate students, and practitioners for their clear presentation of opposing perspectives on macroeconomics and how evidence should be used. The chapters are complemented by discussion sections revealing the perspectives of other contributors on the methodological issues raised.
Originally published in 1976, with a second edition in 1984. This book established itself as the first genuinely introductory text on econometric methods, assuming no formal background on the part of the reader. The second edition maintains this distinctive feature. Fundamental concepts are carefully explained and, where possible, techniques are developed by verbal reasoning rather than formal proof. It provides all the material for a basic course and is also ideal for a student working alone. Very little knowledge of mathematics and statistics is assumed, and the logic of statistical method is carefully stated. There are numerous exercises, designed to help the student assess individual progress. Methods are described with computer solutions in mind, and the author shows how a variety of different calculations can be performed with relatively simple programs. This new edition also includes much new material: statistical tables are now included and their use carefully explained.