This book is dedicated to the study of the term structures of the yields of zero-coupon bonds. The methods it describes differ from those usually found in the literature in that the time variable is not the term to maturity but the interest rate duration, or another convenient non-linear transformation of terms. This makes it possible to consider yield curves not only for a limited interval of term values, but also for the entire positive semiaxis of terms. The main focus is the comparative analysis of yield curves and forward curves and the analytical study of their features. Generalizations of yield term structures are studied where the dimension of the state space of the financial market is increased. In cases where the analytical approach is too cumbersome, or impossible, numerical techniques are used. This book will be of interest to financial analysts, financial market researchers, graduate students and PhD students.
This major volume of essays by Kenneth F. Wallis features 28 articles published over a quarter of a century on the statistical analysis of economic time series, large-scale macroeconometric modelling, and the interface between them. The first part deals with time-series econometrics and includes significant early contributions to the development of the LSE tradition in time-series econometrics, which is the dominant British tradition and has considerable influence worldwide. Later sections discuss theoretical and practical issues in modelling seasonality and forecasting, with applications in both large-scale and small-scale models. The final section summarizes the research programme of the ESRC Macroeconomic Modelling Bureau, a unique comparison project among economy-wide macroeconometric models. Professor Wallis has written a detailed introduction to the volume in which he explains the background to these papers and comments on subsequent developments.
This volume of Advances in Econometrics contains a selection of papers presented at the "Econometrics of Complex Survey Data: Theory and Applications" conference organized by the Bank of Canada in Ottawa, Canada, on October 19-20, 2017. The papers included in this volume span a range of methodological and practical topics, including survey collection comparisons, imputation mechanisms, the bootstrap, nonparametric techniques, specification tests, and empirical likelihood estimation using complex survey data. For academics and students with an interest in econometrics and the ways in which complex survey data can be used and evaluated, this volume is essential.
Drawing on the author's extensive and varied research, this book provides readers with a firm grounding in the concepts and issues across several disciplines including economics, nutrition, psychology and public health, in the hope of improving the design of food policies in the developed and developing world. Using longitudinal (panel) data from India, Bangladesh, Kenya, the Philippines, Vietnam, and Pakistan and extending the analytical framework used in economics and biomedical sciences to include multi-disciplinary analyses, Alok Bhargava shows how rigorous and thoughtful econometric and statistical analysis can improve our understanding of the relationships among socioeconomic, nutritional, and behavioural variables in areas such as cognitive development in children and labour productivity in the developing world. These unique insights combined with a multi-disciplinary approach forge the way for a more refined and effective approach to food policy formation going forward. A chapter on the growing obesity epidemic is also included, highlighting the new set of problems facing developed and developing countries alike. The book also includes a glossary of technical terms to assist readers coming from a variety of disciplines.
This report is a partial result of the China Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has been focusing on China's economic forecasts and macroeconomic policy analysis, and in 2005 it started to develop the CQMM for the purposes of short-term forecasting, policy analysis, and simulation. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019. The conference was jointly held by Xiamen University and the Economic Information Daily of Xinhua News Agency.
In this book, different quantitative approaches to the study of electoral systems are developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization approaches. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool for detecting inconsistencies or poor performance in actual systems. Applications to concrete settings such as the EU, the American Congress, and regional and committee voting are discussed.
This is the first of two volumes containing papers and commentaries presented at the Eleventh World Congress of the Econometric Society, held in Montreal, Canada, in August 2015. These papers provide state-of-the-art guides to the most important recent research in economics. The book includes surveys and interpretations of key developments in economics and econometrics, and discussion of future directions for a wide variety of topics, covering both theory and application. These volumes provide a unique, accessible survey of progress in the discipline, written by leading specialists in their fields. The first volume includes theoretical and applied papers addressing topics such as dynamic mechanism design, agency problems, and networks.
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary context, considering small-sample correction, volatility and the impact of different orders of integration. Models with expectations are considered along with alternate methods such as Singular Spectrum Analysis (SSA), the Kalman Filter and Structural Time Series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance through the now vast literature facing students and graduate economists.
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
The core methods in today's econometric toolkit are linear regression for statistical control, instrumental variables methods for the analysis of natural experiments, and differences-in-differences methods that exploit policy changes. In the modern experimentalist paradigm, these techniques address clear causal questions such as: Do smaller classes increase learning? Should wife batterers be arrested? How much does education raise wages? "Mostly Harmless Econometrics" shows how the basic tools of applied econometrics allow the data to speak. In addition to econometric essentials, "Mostly Harmless Econometrics" covers important new extensions--regression-discontinuity designs and quantile regression--as well as how to get standard errors right. Joshua Angrist and Jorn-Steffen Pischke explain why fancier econometric techniques are typically unnecessary and even dangerous. The applied econometric methods emphasized in this book are easy to use and relevant for many areas of contemporary social science. Features include an irreverent review of econometric essentials; a focus on the tools that applied researchers use most; chapters on regression-discontinuity designs, quantile regression, and standard errors; many empirical examples; and a clear, concise presentation with wide applications.
Coverage has been extended to include recent topics. The book again presents a unified treatment of econometric theory, with the method of maximum likelihood playing a key role in both estimation and testing. Exercises are included, and the book is suitable as a general text for final-year undergraduate and postgraduate students.
Applied Time Series Modelling and Forecasting provides a relatively non-technical introduction to applied time series econometrics and forecasting involving non-stationary data. The emphasis is very much on the why and how and, as much as possible, the authors confine technical material to boxes or point to the relevant sources for more detailed information. This book is based on an earlier title, Using Cointegration Analysis in Econometric Modelling, by Richard Harris. As well as updating material covered in the earlier book, there are two major additions involving panel tests for unit roots and cointegration and forecasting of financial time series. Harris and Sollis have also incorporated as many of the latest techniques in the area as possible, including: testing for periodic integration and cointegration; GLS detrending when testing for unit roots; structural breaks and seasonal unit root testing; testing for cointegration with a structural break; asymmetric tests for cointegration; testing for super-exogeneity; seasonal cointegration in multivariate models; and approaches to structural macroeconomic modelling. In addition, the discussion of certain topics, such as testing for unique vectors, has been simplified. Applied Time Series Modelling and Forecasting has been written for students taking courses in financial economics and forecasting, applied time series, and econometrics at advanced undergraduate and postgraduate levels. It will also be useful for practitioners, such as financial brokers, who wish to understand the application of time series modelling. Data sets and econometric code for implementing some of the more recent procedures covered in the book can be found on the following web site: www.wiley.co.uk/harris
The Regression Discontinuity (RD) design is one of the most popular and credible research designs for program evaluation and causal inference. Volume 38 of Advances in Econometrics collects twelve innovative and thought-provoking contributions to the RD literature, covering a wide range of methodological and practical topics. Some chapters touch on foundational methodological issues such as identification, interpretation, implementation, falsification testing, and estimation and inference, while others focus on more recent and related topics such as identification and interpretation in a discontinuity-in-density framework, empirical structural estimation, comparative RD methods, and extrapolation. These chapters not only give new insights for current methodological and empirical research, but also provide new bases and frameworks for future work in this area. This volume contributes to the rapidly expanding RD literature by bringing together theoretical and applied econometricians, statisticians, and social, behavioural and biomedical scientists, in the hope that these interactions will further spark innovative practical developments in this important and active research area.
This introductory overview explores the methods, models and interdisciplinary links of artificial economics, a new way of doing economics in which the interactions of artificial economic agents are computationally simulated to study their individual and group behavior patterns. Conceptually and intuitively, and with simple examples, Mercado addresses the differences between the basic assumptions and methods of artificial economics and those of mainstream economics. He goes on to explore the various disciplines from which the concepts and methods of artificial economics originate: for example, cognitive science, neuroscience, artificial intelligence, evolutionary science and complexity science. Introductory discussions of several controversial issues are offered, such as the application of the concepts of evolution and complexity in economics and the relationship between artificial intelligence and the philosophies of mind. This is one of the first books to fully address artificial economics, emphasizing its interdisciplinary links and presenting its occasionally controversial aspects in a balanced way.
Non-market valuation has become a broadly accepted and widely practiced means of measuring the economic values of the environment and natural resources. In this book, the authors provide a guide to the statistical and econometric practices that economists employ in estimating non-market values. The authors develop the econometric models that underlie the basic methods: contingent valuation, travel cost models, random utility models and hedonic models. They analyze the measurement of non-market values as a procedure with two steps: the estimation of parameters of demand and preference functions and the calculation of benefits from the estimated models. Each of the models is carefully developed from the preference function to the behavioral or response function that researchers observe. The models are then illustrated with datasets that characterize the kinds of data researchers typically deal with. The real world data and clarity of writing in this book will appeal to environmental economists, students, researchers and practitioners in multilateral banks and government agencies.
'Experiments in Organizational Economics' highlights the importance of replicating previous economic experiments. Replication enables experimental findings to be subjected to rigorous scrutiny. Despite this obvious advantage, direct replication remains relatively scant in economics. One possible explanation for this situation is that publication outlets favor novel work over tests of robustness. Readers will gain a better understanding of the role that replication plays in economic discovery as well as valuable insights into the robustness of previously reported findings.
Technical analysis holds that the best source of information for beating the market is the price itself. Introducing readers to technical analysis in a succinct and practical way, Ramlall focuses on its key aspects, benefits, drawbacks, and main tools. Chart patterns, point & figure, stochastics, sentiment indicators, Elliott Wave Theory, RSI, R, candlesticks and more are covered, including both the concepts and their practical applications. Also covering the programming of technical analysis tools, this book is a valuable resource for both researchers and practitioners.
The last decade has brought dramatic changes in the way that researchers analyze economic and financial time series. This book synthesizes these recent advances and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. "Time Series Analysis" fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers.
A provocative new analysis of immigration's long-term effects on a nation's economy and culture. Over the last two decades, as economists began using big datasets and modern computing power to reveal the sources of national prosperity, their statistical results kept pointing toward the power of culture to drive the wealth of nations. In The Culture Transplant, Garett Jones documents the cultural foundations of cross-country income differences, showing that immigrants import cultural attitudes from their homelands—toward saving, toward trust, and toward the role of government—that persist for decades, and likely for centuries, in their new national homes. Full assimilation in a generation or two, Jones reports, is a myth. And the cultural traits migrants bring to their new homes have enduring effects upon a nation's economic potential. Built upon mainstream, well-reviewed academic research that hasn't pierced the public consciousness, this book offers a compelling refutation of an unspoken consensus that a nation's economic and political institutions won't be changed by immigration. Jones refutes the common view that we can discuss migration policy without considering whether migration can, over a few generations, substantially transform the economic and political institutions of a nation. And since most of the world's technological innovations come from just a handful of nations, Jones concludes, the entire world has a stake in whether migration policy will help or hurt the quality of government and thus the quality of scientific breakthroughs in those rare innovation powerhouses.
Nanak Kakwani and Hyun Hwa Son make use of social welfare functions to derive indicators of development relevant to specific social objectives, such as poverty- and inequality-reduction. Arguing that the measurement of development cannot be value-free, the authors assert that if indicators of development are to have policy relevance, they must be assessed on the basis of the social objectives in question. This study develops indicators that are sensitive to both the level and the distribution of individuals' capabilities. The idea of the social welfare function, defined in income space, is extended to the concept of the social well-being function, defined in capability space. Through empirical analysis from selected developing countries, with a particular focus on Brazil, the authors shape techniques appropriate to the analysis of development in different dimensions. The focus of this evidence-based policy analysis is to evaluate alternative policies affecting the capacities of people to enjoy a better life.
This book offers a useful combination of probabilistic and statistical tools for analyzing nonlinear time series. Key features of the book include a study of the extremal behavior of nonlinear time series and a comprehensive list of nonlinear models that address different aspects of nonlinearity. Several inferential methods, including quasi-likelihood methods, sequential Markov chain Monte Carlo methods and particle filters, are also included so as to provide an overall view of the available tools for parameter estimation in nonlinear models. A chapter on integer time series models based on several thinning operations, which brings together all recent advances made in this area, is also included. Readers should have attended a prior course on linear time series, and a good grasp of simulation-based inferential methods is recommended. This book offers a valuable resource for second-year graduate students and researchers in statistics and other scientific areas who need a basic understanding of nonlinear time series.
The generalized method of moments (GMM) estimation has emerged over the past decade as a ready-to-use, flexible tool that applies to a large number of econometric and economic models while relying on mild, plausible assumptions. The principal objective of this volume, the first devoted entirely to the GMM methodology, is to offer a complete and up-to-date presentation of the theory of GMM estimation as well as insights into the use of these methods in empirical studies. It is also designed to serve as a unified framework for teaching estimation theory in econometrics. Contributors to the volume include well-known authorities in the field based in North America, the UK/Europe, and Australia.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. From its beginning in the last century, Spatial Economics has contributed to the understanding of the economy by developing plenty of theoretical models as well as econometric techniques having "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from an applied point of view. This volume is part of a larger project including another edited volume (Spatial Economics Volume I: Theory) collecting original papers which address Spatial Economics from a theoretical perspective.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. From its beginning in the last century, Spatial Economics has contributed to the understanding of the economy by developing plenty of theoretical models as well as econometric techniques having the "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from a theoretical point of view. This volume is part of a more complex project including another edited volume (Spatial Economics Volume II: Applications) collecting original papers which address Spatial Economics from an applied perspective.
This volume of Advances in Econometrics focuses on recent developments in the use of structural econometric models in empirical economics. The papers in this volume are divided into three broad groups. The first part looks at recent developments in the estimation of dynamic discrete choice models, including new estimation methods based on Euler equations, estimation using sieve approximation of high-dimensional state spaces, the identification of Markov dynamic games with persistent unobserved state variables, and tests of monotone comparative statics in models of multiple equilibria. The second part looks at recent advances in the area of empirical matching models. The papers in this section develop estimators for matching models based on stability conditions, estimate matching surplus functions using generalized entropy functions, and solve for the fixed point in the Choo-Siow matching model using a contraction mapping formulation. While the issue of incomplete, or partial, identification of model parameters is touched upon in some of the foregoing chapters, two chapters focus on this issue, in the context of testing for monotone comparative statics in models with multiple equilibria and of estimating supermodular games under the restriction that players' strategies be rationalizable. The last group of three papers presents empirical applications of structural econometric models. Two applications use matching models: one addresses endogenous matching in the loan spread equation, and the other endogenizes marriage in the collective model of intrahousehold allocation. Another application examines the market power of condominium developers in the Japanese housing market in the 1990s.