This volume provides a coherent analysis of the economic, monetary and political aspects of growth dynamics in the Euro area. The different relevant aspects in this debate, presented and discussed by leading scholars and representatives of international organizations, include an assessment of the newest theoretical growth models for open economies, and empirical investigation of: the growth divergence between the US and Europe; the extent to which fiscal co-ordination is desirable in a monetary union; the role of product and labor market reforms; the complex relationships between exchange rates and growth; the contribution of monetary policy to economic growth; and the prospects for economic growth in monetary unions. Although primarily focused on the Euro area, the analysis is equally relevant to all other common currency areas and will be welcomed by academics and students with an interest in European studies and financial economics, as well as policy and decision makers in international organisations, national institutions and central banks.
This volume focuses on the analysis and measurement of business cycles in Brazil, Russia, India, China and South Africa (BRICS). Divided into five parts, it begins with an overview of the main concepts and problems involved in monitoring and forecasting business cycles. Then it highlights the role of BRICS in the global economy and explores the interrelatedness of business cycles within BRICS. In turn, part two provides studies on the historical development of business cycles in the individual BRICS countries and describes the driving forces behind those cycles. Parts three and four present national business tendency surveys and composite cyclical indices for real-time monitoring and forecasting of various BRICS economies, while the final part discusses how the lessons learned in the BRICS countries can be used for the analysis of business cycles and their socio-political consequences in other emerging countries.
This book explores widely used seasonal adjustment methods and recent developments in real time trend-cycle estimation. It discusses in detail the properties and limitations of X12ARIMA, TRAMO-SEATS and STAMP - the main seasonal adjustment methods used by statistical agencies. Several real-world cases illustrate each method and real data examples can be followed throughout the text. The trend-cycle estimation is presented using nonparametric techniques based on moving averages, linear filters and reproducing kernel Hilbert spaces, taking recent advances into account. The book provides a systematical treatment of results that to date have been scattered throughout the literature. Seasonal adjustment and real time trend-cycle prediction play an essential part at all levels of activity in modern economies. They are used by governments to counteract cyclical recessions, by central banks to control inflation, by decision makers for better modeling and planning and by hospitals, manufacturers, builders, transportation, and consumers in general to decide on appropriate action. This book appeals to practitioners in government institutions, finance and business, macroeconomists, and other professionals who use economic data as well as academic researchers in time series analysis, seasonal adjustment methods, filtering and signal extraction. It is also useful for graduate and final-year undergraduate courses in econometrics and time series with a good understanding of linear regression and matrix algebra, as well as ARIMA modelling.
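The nonparametric trend-cycle estimation described above can be illustrated with a centered moving average, one of the classical linear filters the blurb mentions. This is a minimal sketch, not taken from the book: the window length, the toy series, and the `trend_cycle` helper are all illustrative assumptions.

```python
import numpy as np

def trend_cycle(y, window=13):
    """Estimate the trend-cycle of a series with a centered moving average.

    A 13-term centered average is a classical choice for monthly data;
    both the window and the series below are illustrative, not from the book.
    """
    kernel = np.ones(window) / window
    # mode="valid" keeps only points where the full symmetric window fits,
    # so the first and last (window - 1) / 2 observations are left unestimated.
    return np.convolve(y, kernel, mode="valid")

# A noisy upward trend as toy data.
rng = np.random.default_rng(0)
y = np.linspace(0.0, 12.0, 120) + rng.normal(0.0, 1.0, 120)
tc = trend_cycle(y)
```

The filtered series is shorter than the input by `window - 1` points; end-of-sample estimation, where the symmetric window no longer fits, is precisely where the real-time methods discussed in the book become relevant.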
The present book is a collection of panel data papers, both theoretical and applied. Theoretical topics include methodology papers on panel data probit models, treatment models, error component models with an ARMA process on the time specific effects, asymptotic tests for poolability and their bootstrapped versions, confidence intervals for doubly heteroskedastic stochastic production frontiers, estimation of semiparametric dynamic panel data models, and a review of survey attrition and nonresponse in the European Community Household Panel. Applications cover topics as diverse as the impact of uncertainty on UK investment, a Tobin-q investment model using US firm data, cost efficiency of Spanish banks, immigrant integration in Canada, the dynamics of individual health in the UK, the relation between inflation and growth among OECD and APEC countries, technical efficiency of cereal farms in England, and employment effects of education for disabled workers in Norway.
This volume is dedicated to the memory and the achievements of Professor Sir Clive Granger, economics Nobel laureate and one of the great econometricians and applied economists of the twentieth and early twenty-first centuries. It comprises contributions from leading econometricians and applied economists who knew Sir Clive and interacted with him over the years, and who wished to pay tribute to him as both a great economist and econometrician, and as a great man. This book was originally published as a special issue of Applied Financial Economics.
First published in 1987, this is an analysis of the contemporary breakdown of political and economic systems within the Eastern European communist countries. Rather than passively following the developments of this crisis, the author seeks instead to identify the reasons for failure and to examine alternative policies that offer solutions to these problems. Jan Winiecki's work offers a comparative study of the Soviet-type economies of the East with the market economies of the West; providing a cause and effect analysis of each model, with possible scenarios for their future prospects.
This book is intended for second year graduate students and professionals who have an interest in linear and nonlinear simultaneous equations models. It basically traces the evolution of econometrics beyond the general linear model (GLM), beginning with the general linear structural econometric model (GLSEM) and ending with the generalized method of moments (GMM). Thus, it covers the identification problem (Chapter 3), maximum likelihood (ML) methods (Chapters 3 and 4), two and three stage least squares (2SLS, 3SLS) (Chapters 1 and 2), the general nonlinear model (GNLM) (Chapter 5), the general nonlinear simultaneous equations model (GNLSEM), the special case of GNLSEM with additive errors, nonlinear two and three stage least squares (NL2SLS, NL3SLS), the GMM for GNLSEM, and finally ends with a brief overview of causality and related issues (Chapter 6). There is no discussion either of limited dependent variables, or of unit root related topics. It also contains a number of significant innovations. In a departure from the custom of the literature, identification and consistency for nonlinear models is handled through the Kullback information apparatus, as well as the theory of minimum contrast (MC) estimators. In fact, nearly all estimation problems handled in this volume can be approached through the theory of MC estimators. The power of this approach is demonstrated in Chapter 5, where the entire set of identification requirements for the GLSEM, in an ML context, is obtained almost effortlessly, through the apparatus of Kullback information.
This book discusses the need to carefully and prudently apply various regression techniques in order to obtain the full benefits. It also describes some of the techniques developed and used by the authors, presenting their innovative ideas regarding the formulation and estimation of regression decomposition models, hidden Markov chain, and the contribution of regressors in the set-theoretic approach, calorie poverty rate, and aggregate growth rate. Each of these techniques has applications that address a number of unanswered questions; for example, regression decomposition techniques reveal intra-household gender inequalities of consumption, intra-household allocation of resources and adult equivalent scales, while Hidden Markov chain models can forecast the results of future elections. Most of these procedures are presented using real-world data, and the techniques can be applied in other similar situations. Showing how difficult questions can be answered by developing simple models with simple interpretation of parameters, the book is a valuable resource for students and researchers in the field of model building.
Many economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have supplied economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.
It is unlikely that any frontier of economics/econometrics is being pushed faster and further than that of computational techniques. The computer has become a tool for performing, as well as an environment in which to perform, economics and econometrics, taking over where theory bogs down, allowing at least approximate answers to questions that defy closed mathematical or analytical solutions. Tasks may now be attempted that were hitherto beyond human potential, and all the forces available can now be marshalled efficiently, leading to the achievement of desired goals. Computational Techniques for Econometrics and Economic Analysis is a collection of recent studies which exemplify all these elements, demonstrating the power that the computer brings to economic analysts. The book is divided into four parts: 1 -- the computer and econometric methods; 2 -- the computer and economic analysis; 3 -- computational techniques for econometrics; and 4 -- the computer and econometric studies.
Three different lines of approach have contributed to the theory of optimal planning. One approach considers the problem from the view-point of a national government and its adviser, the econometrician planning specialist. The government can, if this is thought to be desirable, stimulate investment in certain directions and discourage other economic activities. By various fiscal devices, it can influence both the total level and the distribution of investment funds over different sectors of production. Also, in many countries, a public agency plays some kind of coordinating role in the formulation of long-term plans for output by the enterprises sector; this may range from administrative direction in so-called centrally planned economies, to persuasion and advice in 'capitalist' economies. Accordingly, the public planner wishes to know what distribution of the nation's resources would be 'optimal'. This leads to the construction of various models which may be described under the general heading 'input-output type models'. This type of model has been largely developed by practitioners, among whom Sandee [B2] is probably the most outstanding and the earliest. A later, well-developed example of a model based on this approach is, for example, the Czech model by Cerny et al. [Bl]. A second approach considers the problem from the point of view of the private entrepreneur and his adviser, the manager and financial accountant.
This monograph deals with spatially dependent nonstationary time series in a way accessible to both time series econometricians wanting to understand spatial econometrics, and spatial econometricians lacking a grounding in time series analysis. After charting key concepts in both time series and spatial econometrics, the book discusses how the spatial connectivity matrix can be estimated using spatial panel data instead of assuming it to be exogenously fixed. This is followed by a discussion of spatial nonstationarity in spatial cross-section data, and a full exposition of non-stationarity in both single and multi-equation contexts, including the estimation and simulation of spatial vector autoregression (VAR) models and spatial error correction (ECM) models. The book reviews the literature on panel unit root tests and panel cointegration tests for spatially independent data, and for data that are strongly spatially dependent. It provides for the first time critical values for panel unit root tests and panel cointegration tests when the spatial panel data are weakly or strongly spatially dependent. The volume concludes with a discussion of incorporating strong and weak spatial dependence in non-stationary panel data models. All discussions are accompanied by empirical testing based on a spatial panel data of house prices in Israel.
This book consists of four parts: I. Labour demand and supply, II. Productivity slowdown and innovative activity, III. Disequilibrium and business cycle analysis, and IV. Time series analysis of output and employment. It presents a fine selection of articles in the growing field of the empirical analysis of output and employment fluctuations, with applications in a micro-econometric or a time-series framework. The time-series literature recently has emphasized the careful testing for stationarity and nonlinearity in the data, and the importance of cointegration theory. A substantial portion of the papers make use of parametric and non-parametric methods developed in this literature and mostly connect their results to the hysteresis discussion about the existence of fragile equilibria. A second set of macro approaches use the disequilibrium framework that has found so much interest in Europe in recent years. The other papers use newly developed methods for microdata, especially qualitative data or limited dependent variables, to study microeconomic models of behaviour that explain labour market and output decisions.
In the modern world of gigantic datasets, which scientists and practitioners of all fields of learning are confronted with, the availability of robust, scalable and easy-to-use methods for pattern recognition and data mining is of paramount importance, so as to be able to cope with the avalanche of data in a meaningful way. This concise and pedagogical research monograph introduces the reader to two specific aspects - clustering techniques and dimensionality reduction - in the context of complex network analysis. The first chapter provides a short introduction into relevant graph theoretical notation; chapter 2 then reviews and compares a number of cluster definitions from different fields of science. In the subsequent chapters, a first-principles approach to graph clustering in complex networks is developed using methods from statistical physics, and the reader will learn that even today this field significantly contributes to the understanding and resolution of the related statistical inference issues. Finally, an application chapter examines real-world networks from the economic realm to show how the network clustering process can be used to deal with large, sparse datasets where conventional analyses fail.
A stranger in academia cannot but be impressed by the apparent uniformity and precision of the methodology currently applied to the measurement of economic relationships. In scores of journal articles and other studies, a theoretical argument is typically presented to justify the position that a certain variable is related to certain other, possibly causal, variables. Regression or a related method is applied to a set of observations on these variables, and the conclusion often emerges that the causal variables are indeed "significant" at a certain "level," thereby lending support to the theoretical argument, an argument presumably formulated independently of the observations. A variable may be declared significant (and few doubt that this means important) at, say, the 0.05 level, but not the 0.01. The effects of the variables are calculated to many significant digits, and are often accompanied by intervals and forecasts of not quite obvious meaning but certainly of reassuring "confidence." The uniformity is also evident in the many mathematically advanced textbooks of statistics and econometrics, and in their less rigorous introductory versions for students in economics or business. It is reflected in the tools of the profession: computer programs, from the general ones addressed to the incidental researcher to the dedicated and sophisticated programs used by the experts, display the same terms and implement the same methodology. In short, there appears no visible alternative to the established methodology and no sign of reservations concerning its validity.
Testing for a Unit Root is now an essential part of time series analysis but the literature on the topic is so large that knowing where to start is difficult even for the specialist. This book provides a way into the techniques of unit root testing, explaining the pitfalls and nonstandard cases, using practical examples and simulation analysis.
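The simplest case the unit root literature starts from can be sketched in a few lines: the Dickey-Fuller regression of the differenced series on its lagged level. This is an illustrative sketch, not material from the book; the `df_tstat` helper, the seed, and the simulated series are all assumptions, and a serious application would add lag augmentation and deterministic terms.

```python
import numpy as np

def df_tstat(y):
    """t-statistic on rho in the Dickey-Fuller regression
    dy_t = alpha + rho * y_{t-1} + e_t  (no lags, no trend).

    A strongly negative statistic is evidence against a unit root. Its
    null distribution is nonstandard, which is one of the pitfalls the
    book discusses: critical values come from simulation, not t tables.
    """
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(1)
t_walk = df_tstat(np.cumsum(rng.normal(size=500)))  # random walk: unit root present
t_noise = df_tstat(rng.normal(size=500))            # white noise: stationary
```

For the stationary series the statistic is hugely negative, while for the random walk it stays near the nonstandard null distribution; comparing the latter against standard t critical values is exactly the kind of mistake the book warns about.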
Econometrics as an applied discipline attempts to use information in a most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and the minimum discrepancy are useful in several areas of statistical inference, e.g., Bayesian estimation, expected maximum likelihood principle, the fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: A critical survey of the uses of information theory in economics and econometrics; An integration of applied information theory and economic efficiency analysis; The development of a new economic hypothesis relating information theory to economic growth models; New lines of research are emphasized.
Written in honor of Emeritus Professor Georges Prat (University of Paris Nanterre, France), this book includes contributions from eminent authors on a range of topics that are of interest to researchers and graduates, as well as investors and portfolio managers. The topics discussed include the effects of information and transaction costs on informational and allocative market efficiency, bubbles and stock price dynamics, paradox of rational expectations and the principle of limited information, uncertainty and expectation hypotheses, oil price dynamics, and nonlinearity in asset price dynamics.
Capital theory is a cornerstone of modern economics. Its ideas are fundamental for dynamic equilibrium theory, and its concepts are applied in many branches of economics, such as game theory and resource and environmental economics, although this may not be recognized at first glance. In this monograph, an approach is presented which makes it possible to derive important results of capital theory in a coherent and readily accessible framework. Special emphasis is given to infinite horizon and overlapping generations economies. Irreversibility of time, or the failure of the market system, appears in a different light when an infinite horizon framework is applied. To bridge the gap between pure and applied economic theory, the structure of our theoretical approach is integrated into a computable general equilibrium model.
This volume contains an accessible discussion examining computationally-intensive techniques and bootstrap methods, providing ways to improve the finite-sample performance of well-known asymptotic tests for regression models. The book uses the linear regression model as a framework for introducing simulation-based tests to help perform econometric analyses.
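The idea of using simulation to replace an asymptotic null distribution can be illustrated with a residual bootstrap of the slope t-test in a linear regression. This is a minimal sketch under assumed names and data (`bootstrap_pvalue`, the seed, the toy sample), not the book's own procedure: resampling is done under the null (intercept-only fit), and the t-statistic is recomputed on each pseudo-sample.

```python
import numpy as np

def bootstrap_pvalue(y, x, n_boot=999, seed=0):
    """Residual-bootstrap p-value for H0: slope = 0 in y = a + b*x + e."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones_like(x), x])

    def tstat(yv):
        beta, *_ = np.linalg.lstsq(X, yv, rcond=None)
        resid = yv - X @ beta
        sigma2 = resid @ resid / (len(yv) - 2)
        cov = sigma2 * np.linalg.inv(X.T @ X)
        return beta[1] / np.sqrt(cov[1, 1])

    t_obs = tstat(y)
    # Impose the null: fit the intercept only, then resample its residuals.
    null_resid = y - y.mean()
    t_boot = np.array([
        tstat(y.mean() + rng.choice(null_resid, size=len(y), replace=True))
        for _ in range(n_boot)
    ])
    # Two-sided p-value; the +1 terms keep it bounded away from zero.
    return (1 + np.sum(np.abs(t_boot) >= abs(t_obs))) / (n_boot + 1)

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 60)
y = 2.0 * x + rng.normal(0.0, 0.5, 60)  # true slope is nonzero
p = bootstrap_pvalue(y, x)
```

The bootstrap distribution of the statistic stands in for the asymptotic one, which is the route such methods take to better finite-sample size and power.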
This outstanding collection of William Brock's essays illustrates the power of dynamic modelling to shed light on the forces for stability and instability in economic systems. The articles selected reflect his best work and are indicative both of the type of policy problem that he finds challenging and the complex methodology that he uses to solve them. Also included is an introduction by Brock to his own work, which helps tie together the main aspects of his research to date. The volume covers: * stochastic models and optimal growth * financial and macroeconomic modelling * ecology, mechanism design and regulation * nonlinearity in economics.
As well as providing a history of economic statistics, the book includes contributions by economists from a number of countries, applying economic statistics to the past and to current economic issues.
How does innovation emerge from normal economic activity? Economic Interdependence and Innovative Activity is an original new book which tries to answer this question by reconciling inter-industrial analysis with the study of innovation. This book provides a bridge between economic statics and the dynamics of growth and development. As well as offering important and original empirical data for Canada, France, Italy, Greece and China, the authors make a series of theoretical advances and propose a new way to observe the innovative process, as well as new analytical tools to examine innovative activity. Their central thesis is that innovative outputs emerge out of increased social interaction and division of labour through cooperative networks. An authoritative theoretical introduction and some thought-provoking conclusions have been prepared by Christian DeBresson. Economic Interdependence and Innovative Activity encourages input-output economists to encompass innovative activities in dynamic models, and innovation researchers to look at technical interdependencies.
In applications, and especially in mathematical finance, random time-dependent events are often modeled as stochastic processes. Assumptions are made about the structure of such processes, and serious researchers will want to justify those assumptions through the use of data. As statisticians are wont to say, "In God we trust; all others must bring data."