Following the recent publication of the award-winning and much-acclaimed "The New Palgrave Dictionary of Economics," second edition, which brings together Nobel Prize winners and the brightest young scholars to survey the discipline, we are pleased to announce "The New Palgrave Economics Collection." In response to demand from the economics community, these books address key subject areas within the field. Each title comprises specially selected articles from the Dictionary and covers a fundamental theme within the discipline. All of the articles have been specifically chosen by the editors of the Dictionary, Steven N. Durlauf and Lawrence E. Blume, and are written by leading practitioners in the field. The Collections provide the reader with easy-to-access information on complex and important subject areas, and allow individual scholars and students to have their own personal reference copy.
This book presents selected peer-reviewed contributions from the International Conference on Time Series and Forecasting, ITISE 2018, held in Granada, Spain, on September 19-21, 2018. The first three parts of the book focus on the theory of time series analysis and forecasting, and discuss statistical methods, modern computational intelligence methodologies, econometric models, financial forecasting, and risk analysis. In turn, the last three parts are dedicated to applied topics and include papers on time series analysis in the earth sciences, energy time series forecasting, and time series analysis and prediction in other real-world problems. The book offers readers valuable insights into the different aspects of time series analysis and forecasting, allowing them to benefit both from its sophisticated and powerful theory, and from its practical applications, which address real-world problems in a range of disciplines. The ITISE conference series provides a valuable forum for scientists, engineers, educators and students to discuss the latest advances and implementations in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
This completely restructured, updated third edition of the volume first published in 1992 provides a general overview of the econometrics of panel data, both from a theoretical and from an applied viewpoint. Since the pioneering papers by Kuh (1959), Mundlak (1961), Hoch (1962), and Balestra and Nerlove (1966), the pooling of cross section and time series data has become an increasingly popular way of quantifying economic relationships. Each series provides information lacking in the other, so a combination of both leads to more accurate and reliable results than would be achievable by one type of series alone.
Part I is concerned with the fundamentals of panel data econometrics, both linear and nonlinear; Part II deals with more advanced topics such as dynamic models, simultaneity and measurement errors, unit roots and cointegration, incomplete panels and selectivity, duration and count models, etc. This volume also provides insights into the use of panel data in empirical studies. Part III deals with surveys in several major fields of applied economics, such as investment demand, foreign direct investment and international trade, production efficiency, labour supply, transitions on the labour market, etc. Six new chapters on R&D and innovation, wages, health economics, policy evaluation, growth empirics and the impact of monetary policy have been included.
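To give a concrete flavour of the panel methods surveyed here, the sketch below simulates a small panel and applies the fixed-effects ("within") estimator, contrasting it with pooled OLS. All variable names and data are invented for illustration; this is a minimal sketch, not code from the volume.

```python
# Minimal fixed-effects ("within") estimator on a simulated panel,
# compared with pooled OLS. All data and names are made up.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_units, n_periods = 50, 10

# Simulated panel: y_it = 2 + 1.5 * x_it + unit effect + noise
unit = np.repeat(np.arange(n_units), n_periods)
alpha = rng.normal(0, 2, n_units)[unit]                  # unobserved unit effects
x = rng.normal(size=n_units * n_periods) + 0.5 * alpha   # x correlated with the effects
y = 2 + 1.5 * x + alpha + rng.normal(size=x.size)

df = pd.DataFrame({"unit": unit, "x": x, "y": y})

# Within transformation: demean y and x by unit, then run OLS on the demeaned data.
demeaned = df.groupby("unit")[["x", "y"]].transform(lambda s: s - s.mean())
beta_fe = np.linalg.lstsq(demeaned[["x"]].to_numpy(), demeaned["y"].to_numpy(), rcond=None)[0]
print("fixed-effects slope estimate:", beta_fe[0])       # close to the true 1.5

# Pooled OLS for comparison: biased because x is correlated with the unit effects.
X = np.column_stack([np.ones_like(x), x])
beta_pooled = np.linalg.lstsq(X, y, rcond=None)[0]
print("pooled OLS slope estimate:", beta_pooled[1])
```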
Shedding light on some of the most pressing open questions in the analysis of high frequency data, this volume presents cutting-edge developments in high frequency financial econometrics. Coverage spans a diverse range of topics, including market microstructure, tick-by-tick data, bond and foreign exchange markets, and large dimensional volatility modeling. The volume is of interest to graduate students, researchers, and industry professionals.
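As a small illustration of one building block that high frequency volatility work rests on, the following sketch computes realized volatility from simulated intraday returns at two sampling frequencies. The figures are invented; this is not an example taken from the volume.

```python
# Realized variance from simulated intraday returns: sum of squared
# high-frequency returns over the day; realized vol is its square root.
import numpy as np

rng = np.random.default_rng(1)
n_seconds = 6 * 3600                       # one stylised 6-hour trading day
true_daily_vol = 0.02                      # 2% daily volatility, for the simulation
r = rng.normal(0.0, true_daily_vol / np.sqrt(n_seconds), n_seconds)  # 1-second log returns

rv_1s = np.sum(r**2)                       # realized variance on the 1-second grid

# Returns are often pre-aggregated (e.g. to 5 minutes) to limit microstructure
# noise; with simulated data that is just a reshape of the series.
r_5min = r.reshape(-1, 300).sum(axis=1)
rv_5min = np.sum(r_5min**2)

print("realized vol (1-second grid): %.4f" % np.sqrt(rv_1s))
print("realized vol (5-minute grid): %.4f" % np.sqrt(rv_5min))
```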
In January 2005, the German government enacted a substantial reform of the welfare system, the so-called "Hartz IV reform." This book evaluates key characteristics of the reform from a microeconometric perspective. It investigates whether a centralized or decentralized organization of welfare administration is more successful in integrating welfare recipients into employment. Moreover, it analyzes the employment effects of an intensified use of benefit sanctions and evaluates the effectiveness and efficiency of the most frequently assigned Active Labor Market Programs. The analyses focus on immigrants, who are highly over-represented in the German welfare system.
Focuses on the assumptions underlying the algorithms rather than their statistical properties. Presents cutting-edge analysis of factor models and finite mixture models. Uses a hands-on approach to examine the assumptions made by the models and when the models fail to estimate accurately. Utilizes interesting real-world data sets that can be used to analyze important microeconomic problems. Introduces R programming concepts throughout the book. Includes appendices that discuss many of the concepts introduced in the book, as well as measures of uncertainty in microeconometrics.
This book deals with the application of wavelet and spectral methods for the analysis of nonlinear and dynamic processes in economics and finance. It reflects some of the latest developments in the area of wavelet methods applied to economics and finance. The topics include business cycle analysis, asset prices, financial econometrics, and forecasting. An introductory paper by James Ramsey, providing a personal retrospective of a decade's research on wavelet analysis, offers an excellent overview of the field.
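The sketch below gives a minimal sense of what a wavelet decomposition of an economic time series looks like: a simulated series is split into components at different time scales and a coarse "trend/cycle" component is reconstructed. The PyWavelets (pywt) library and the simulated series are assumptions made here for illustration, not choices from the book.

```python
# Multilevel discrete wavelet decomposition of a simulated series and a crude
# scale-based trend extraction, using PyWavelets.
import numpy as np
import pywt

rng = np.random.default_rng(2)
t = np.arange(512)
# Simulated series: slow "business-cycle" swing plus higher-frequency noise.
series = np.sin(2 * np.pi * t / 128) + 0.3 * rng.normal(size=t.size)

# Daubechies-4 wavelet, 4 decomposition levels.
coeffs = pywt.wavedec(series, "db4", level=4)

# Reconstruct only the coarsest (low-frequency) component by zeroing the
# detail coefficients.
smooth = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(smooth, "db4")

print("levels:", len(coeffs) - 1, "| trend length:", len(trend))
```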
Dynamic factor models (DFMs) constitute an active and growing area of research in econometrics, macroeconomics, and finance. Many applications lie at the center of policy questions raised by the recent financial crises, such as the connections between yields on government debt, credit risk, inflation, and economic growth. This volume collects a key selection of up-to-date contributions that cover a wide range of issues in the context of dynamic factor modeling, such as the specification, estimation, and application of DFMs. Examples include further developments of DFMs for mixed-frequency data settings, extensions to time-varying parameters and structural breaks, multi-level factors associated with subsets of variables, factor-augmented error correction models, and many other related aspects. A number of contributions propose new estimation procedures for DFMs, such as spectral expectation-maximization algorithms and Bayesian approaches. Numerous applications are discussed, including the dating of business cycles, implied volatility surfaces, professional forecaster survey data, and many more.
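A minimal sketch of a common first step in dynamic factor modeling is given below: extracting an approximate factor by principal components from a simulated panel and checking its dynamics. This Stock-Watson-style principal-components estimator is a generic illustration, not the procedure of any particular chapter.

```python
# Approximate factor extraction by principal components from a simulated panel,
# followed by a crude AR(1) check of the estimated factor's dynamics.
import numpy as np

rng = np.random.default_rng(3)
T, N = 200, 30                                 # time periods, number of observed series

# One latent AR(1) factor driving all series plus idiosyncratic noise.
f = np.zeros(T)
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.normal()
lam = rng.normal(1.0, 0.3, N)                  # factor loadings
X = np.outer(f, lam) + rng.normal(size=(T, N))

# Principal-components estimate: leading eigenvector of the covariance of the
# standardized panel gives the loadings; the factor is the corresponding score.
Z = (X - X.mean(0)) / X.std(0)
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
f_hat = Z @ eigvec[:, -1]                      # estimated factor (up to sign/scale)

# AR(1) coefficient of the estimated factor.
phi_hat = np.dot(f_hat[1:], f_hat[:-1]) / np.dot(f_hat[:-1], f_hat[:-1])
print("correlation with true factor:", abs(np.corrcoef(f, f_hat)[0, 1]).round(2))
print("AR(1) coefficient of estimated factor:", round(phi_hat, 2))
```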
Originally published in 1971, this is a rigorous analysis of the economic aspects of the efficiency of public enterprises at the time. The author first restates and extends the relevant parts of welfare economics, and then illustrates its application to particular cases, drawing on the work of the National Board for Prices and Incomes, of which he was Deputy Chairman. The analysis is developed stage by stage, with the emphasis on applicability and ease of comprehension, rather than on generality or mathematical elegance. Financial performance, the second-best, the optimal degree of complexity of price structures and problems of optimal quality are first discussed in a static framework. Time is next introduced, leading to a marginal cost concept derived from a multi-period optimizing model. The analysis is then related to urban transport, shipping, gas and coal. This is likely to become a standard work of more general scope than the author's earlier book on electricity supply. It rests, however, on a similar combination of economic theory and high-level experience of the real problems of public enterprises.
This book scientifically tests the assertion that accommodative monetary policy can eliminate the "crowd out" problem, allowing fiscal stimulus programs (such as tax cuts or increased government spending) to stimulate the economy as intended. It also tests whether natural growth in the economy can cure the crowd out problem as well or better. The book is intended to be the largest scale scientific test ever performed on this topic. It includes about 800 separate statistical tests on the U.S. economy testing different parts or all of the period 1960-2010. These tests focus on whether accommodative monetary policy, which increases the pool of loanable resources, can offset the crowd out problem as well as natural growth in the economy. The book, employing the best scientific methods available to economists for this type of problem, concludes that accommodative monetary policy could have done so, but until the quantitative easing program, Federal Reserve efforts to accommodate fiscal stimulus programs were not large enough to offset more than 23% to 44% of any one year's crowd out problem. That provides the science part of the answer as to why accommodative monetary policy didn't accommodate: too little of it was tried. The book also tests whether other increases in loanable funds, occurring because of natural growth in the economy or changes in the savings rate, can also offset crowd out. It concludes they can, and that these changes tend to be several times as effective as accommodative monetary policy. This book's companion volume, Why Fiscal Stimulus Programs Fail, explores the policy implications of these results.
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics & stochastics, international macroeconomics and financial econometrics. International Financial Markets: Volume I provides a key repository on the current state of knowledge, the latest debates and recent literature on international financial markets. Against the background of the "financialization of commodities" since the 2008 sub-primes crisis, section one contains recent contributions on commodity and financial markets, pushing the frontiers of applied econometrics techniques. The second section is devoted to exchange rate and current account dynamics in an environment characterized by large global imbalances. Part three examines the latest research in the field of meta-analysis in economics and finance. This book will be useful to students and researchers in applied econometrics, and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
Microsimulation Modelling involves the application of simulation methods to micro data for the purposes of evaluating the effectiveness and improving the design of public policy. The field has existed for over 50 years, has been applied to many different policy areas, and is used within both government and academia. This handbook brings together leading authors in the field to describe and discuss the main current issues within the field. The handbook provides an overview of current developments across each of the sub-fields of microsimulation modelling, such as tax-benefit, pensions, spatial, health, labour, consumption, transport and land use policy, as well as macro-micro, environmental and demographic issues. It also covers the modelling of different micro units such as households, firms and farms. Each chapter discusses its sub-field under the following headings: the main methodologies of the sub-field; a survey of the literature in the area; a critique of that literature; and proposed future directions for research within the sub-field.
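As an illustration of what a tax-benefit microsimulation does in its simplest form, the sketch below applies a baseline and a reform policy rule to simulated household micro data and compares the outcomes. Both tax schedules, the benefit amounts, and the data are entirely hypothetical.

```python
# A toy tax-benefit microsimulation: apply baseline and reform rules to each
# household record and compare budgetary and distributional outcomes.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
households = pd.DataFrame({
    "gross_income": rng.lognormal(mean=10.3, sigma=0.6, size=5000),
    "children": rng.integers(0, 4, size=5000),
})

def disposable_income(df, flat_rate, child_benefit):
    """Apply a hypothetical flat tax plus per-child benefit to each household."""
    return df["gross_income"] * (1 - flat_rate) + child_benefit * df["children"]

baseline = disposable_income(households, flat_rate=0.30, child_benefit=1000)
reform = disposable_income(households, flat_rate=0.32, child_benefit=2500)

# Typical microsimulation outputs: average gain and distributional impact.
poverty_line = 0.6 * np.median(baseline)
print("mean gain per household:", round((reform - baseline).mean(), 1))
print("poverty rate, baseline: %.1f%%" % (100 * (baseline < poverty_line).mean()))
print("poverty rate, reform:   %.1f%%" % (100 * (reform < poverty_line).mean()))
```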
A careful basic theoretical and econometric analysis of the factors determining the real exchange rates of Canada, the U.K., Japan, France and Germany with respect to the United States is conducted. The resulting conclusion is that real exchange rates are almost entirely determined by real factors relating to growth and technology, such as oil and commodity prices, international allocations of world investment across countries, and underlying terms of trade changes. Unanticipated money supply shocks, calculated in five alternative ways, have virtually no effects. A Blanchard-Quah VAR analysis also indicates that the effects of real shocks predominate over monetary shocks by a wide margin. The implications of these facts for the conduct of monetary policy in countries outside the U.S. are then explored, leading to the conclusion that all countries, to avoid exchange rate overshooting, have tended to automatically follow the same monetary policy as the United States. The history of world monetary policy is reviewed along with the determination of real exchange rates within the Euro Area.
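For readers unfamiliar with the toolkit, the sketch below fits a small vector autoregression to simulated data and computes impulse responses with statsmodels. It uses the library's default (Cholesky-style) machinery only; the Blanchard-Quah long-run identification applied in the book is a different scheme and is not implemented here.

```python
# Fit a two-variable VAR to simulated data and compute impulse responses.
# The series are stand-ins; no structural (Blanchard-Quah) identification here.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
T = 300
e = rng.normal(size=(T, 2))
y = np.zeros((T, 2))
for t in range(1, T):
    y[t, 0] = 0.7 * y[t - 1, 0] + 0.2 * y[t - 1, 1] + e[t, 0]
    y[t, 1] = 0.5 * y[t - 1, 1] + e[t, 1]

data = pd.DataFrame(y, columns=["real_exchange_rate", "fundamentals"])
res = VAR(data).fit(2)            # VAR(2) by OLS
irf = res.irf(10)                 # impulse responses, 10 periods ahead

print(res.summary())
print(irf.irfs.shape)             # (periods + 1, n_vars, n_vars)
```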
In this book leading German econometricians in different fields present survey articles of the most important new methods in econometrics. The book gives an overview of the field, showing the progress made in recent years and the problems that remain.
This book provides the reader with user-friendly applications of the normal distribution. In several variables it is called the multinormal distribution, which is often handled using matrices for convenience. The author seeks to make the arguments less abstract and hence starts with the univariate case and moves progressively toward the vector and matrix cases. The approach used in the book is a gradual one, going from one scalar variable to a vector variable and to a matrix variable. The author presents the unified aspect of the normal distribution, and also addresses several other issues, including random matrix theory in physics. Other well-known applications are discussed as well, such as Herrnstein and Murray's argument that human intelligence is substantially influenced by both inherited and environmental factors. They argue that it is a better predictor of many personal outcomes - including financial income, job performance, birth out of wedlock, and involvement in crime - than an individual's parental socioeconomic status or education level, and this argument deserves to be mentioned and discussed.
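The following sketch illustrates the step from the univariate normal to the multivariate ("multinormal") case handled with vectors and matrices: it draws from a bivariate normal and evaluates its density. The parameter values are arbitrary.

```python
# Sample from a bivariate normal and evaluate its density.
import numpy as np
from scipy.stats import multivariate_normal

mean = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.8],
                [0.8, 1.0]])           # symmetric positive-definite covariance matrix

rng = np.random.default_rng(6)
sample = rng.multivariate_normal(mean, cov, size=10_000)

# Sample moments should be close to the population parameters.
print("sample mean:", sample.mean(axis=0).round(2))
print("sample covariance:\n", np.cov(sample, rowvar=False).round(2))

# Density of the multivariate normal at a point, via scipy.
print("pdf at the mean:", multivariate_normal(mean, cov).pdf(mean))
```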
This is Volume 24 of the monograph series International Symposia in Economic Theory and Econometrics. ISETE publishes proceedings of conferences and symposia, as well as research monographs of the highest quality and importance. All articles published in these volumes are refereed relative to the standards of the best journals, therefore not all papers presented at related symposia are published in these proceedings volumes. The topics chosen for these volumes are those of particular research importance at the time of the selection of the topic.
China's reform and opening-up have contributed to its long-term and rapid economic development, resulting in a much stronger economy and a much better life for its people. Meanwhile, the deepening economic integration between China and the world has resulted in an increasingly complex environment, a growing number of influencing factors and severe challenges to China's economic development. Under the "new normal" of the Chinese economy, accurate analysis of the economic situation is essential to scientific decision-making, to sustainable and healthy economic development, and to building a moderately prosperous society in all respects. By applying statistical and national economic accounting methods, and based on detailed statistics and national economic accounting data, this book presents an in-depth analysis of the key economic fields, such as the real estate economy, the automotive industry, the high-tech industry, investment, opening-up, income distribution of residents, economic structure, balance of payments structure and financial operation, since the reform and opening-up and especially in recent years. It aims to depict the performance and characteristics of these key economic fields and their roles in the development of the national economy, thus providing useful suggestions for economic decision-making and facilitating the sustainable and healthy development of the economy and the realization of the goal of building a moderately prosperous society in all respects.
Co-integration, equilibrium and equilibrium correction are key concepts in modern applications of econometrics to real world problems. This book provides direction and guidance to the now vast literature facing students and graduate economists. Econometric theory is linked to practical issues such as how to identify equilibrium relationships, how to deal with structural breaks associated with regime changes and what to do when variables are of different orders of integration.
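A minimal sketch of the workflow described above follows: two simulated I(1) series are tested for cointegration with an Engle-Granger-style test, and the lagged equilibrium error is used in a simple error-correction regression. The data are simulated and the library choice (statsmodels) is an assumption, not the book's own code.

```python
# Identify an equilibrium (cointegrating) relationship between two I(1) series
# and fit a simple error-correction regression.
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(7)
T = 500
x = np.cumsum(rng.normal(size=T))                    # an I(1) random walk
y = 2.0 + 0.5 * x + rng.normal(scale=0.5, size=T)    # cointegrated with x

# Each series is non-stationary on its own (ADF p-values are typically large) ...
print("ADF p-value, x:", round(adfuller(x)[1], 3))
print("ADF p-value, y:", round(adfuller(y)[1], 3))
# ... but the Engle-Granger test rejects "no cointegration".
print("Engle-Granger p-value:", round(coint(y, x)[1], 4))

# Error-correction regression: dy_t on dx_t and the lagged equilibrium error.
beta = np.polyfit(x, y, 1)                           # static cointegrating regression
ecm_term = (y - beta[1] - beta[0] * x)[:-1]          # lagged residual
dy, dx = np.diff(y), np.diff(x)
coef = np.linalg.lstsq(np.column_stack([np.ones_like(dx), dx, ecm_term]), dy, rcond=None)[0]
print("speed-of-adjustment estimate:", round(coef[2], 2))   # expected to be negative
```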
Predicting foreign exchange rates has presented a long-standing challenge for economists. However, the recent advances in computational techniques, statistical methods, newer datasets on emerging market currencies, etc., offer some hope. While we are still unable to beat a driftless random walk model, there has been serious progress in the field. This book provides an in-depth assessment of the use of novel statistical approaches and machine learning tools in predicting foreign exchange rate movement. First, it offers a historical account of how exchange rate regimes have evolved over time, which is critical to understanding turning points in a historical time series. It then presents an overview of the previous attempts at modeling exchange rates, and how different methods fared during this process. At the core sections of the book, the author examines the time series characteristics of exchange rates and how contemporary statistics and machine learning can be useful in improving predictive power, compared to previous methods used. Exchange rate determination is an active research area, and this book will appeal to graduate-level students of international economics, international finance, open economy macroeconomics, and management. The book is written in a clear, engaging, and straightforward way, and will greatly improve access to this much-needed knowledge in the field.
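The sketch below illustrates the benchmarking exercise that runs through this literature: out-of-sample forecasts from a driftless random walk are compared with a simple autoregressive alternative on a simulated exchange rate series. No claim is made about which model wins on real data; the example only shows how the comparison is set up.

```python
# Out-of-sample comparison of a driftless random walk ("no change") forecast
# against an AR(1)-in-changes forecast on a simulated log exchange rate.
import numpy as np

rng = np.random.default_rng(8)
T = 600
s = np.cumsum(rng.normal(scale=0.01, size=T))   # simulated log exchange rate

train, test = s[:500], s[500:]
errors_rw, errors_ar = [], []

for i in range(len(test) - 1):
    history = np.concatenate([train, test[: i + 1]])
    actual_next = test[i + 1]

    # Random walk forecast: tomorrow's rate equals today's rate.
    errors_rw.append(actual_next - history[-1])

    # AR(1) on changes, estimated on the expanding window.
    d = np.diff(history)
    phi = np.dot(d[1:], d[:-1]) / np.dot(d[:-1], d[:-1])
    errors_ar.append(actual_next - (history[-1] + phi * d[-1]))

rmse = lambda e: np.sqrt(np.mean(np.square(e)))
print("RMSE, random walk:", round(rmse(errors_rw), 5))
print("RMSE, AR(1) in changes:", round(rmse(errors_ar), 5))
```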
Managers are often under great pressure to improve the performance of their organizations. To improve performance, one needs to constantly evaluate the operations or processes related to producing products, providing services, and marketing and selling products. Performance evaluation and benchmarking are widely used methods for identifying and adopting best practices as a means to improve performance and increase productivity, and are particularly valuable when no objective or engineered standard is available to define efficient and effective performance. For this reason, benchmarking is often used in managing service operations, because service standards (benchmarks) are more difficult to define than manufacturing standards. Benchmarks can be established, but they are somewhat limited when they work with single measurements one at a time. It is difficult to evaluate an organization's performance when there are multiple inputs and outputs to the system. The difficulties are compounded when the relationships between the inputs and the outputs are complex and involve unknown tradeoffs. It is therefore critical to establish benchmarks where multiple measurements exist. This book introduces the methodology of data envelopment analysis (DEA) and its uses in performance evaluation and benchmarking in the context of multiple performance measures.
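To make the DEA idea concrete, the sketch below solves the input-oriented, constant-returns (CCR) envelopment problem for each decision-making unit in a tiny invented data set, using scipy's linear-programming solver. The inputs, outputs, and solver choice are illustrative assumptions, not material from the book.

```python
# Input-oriented CCR DEA: for each DMU, find the smallest proportional input
# contraction theta such that a nonnegative combination of peers produces at
# least the DMU's outputs.
import numpy as np
from scipy.optimize import linprog

# inputs: rows = input types, columns = DMUs; outputs likewise (invented data).
X = np.array([[4.0, 2.0, 3.0, 6.0],      # e.g. staff hours
              [3.0, 1.0, 4.0, 5.0]])     # e.g. operating cost
Y = np.array([[2.0, 1.0, 3.0, 4.0]])     # e.g. services delivered
n_dmu = X.shape[1]

def ccr_efficiency(o):
    """Efficiency score of DMU o under constant returns to scale."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n_dmu)]
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    bounds = [(0, None)] * (n_dmu + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for o in range(n_dmu):
    print("DMU %d efficiency: %.3f" % (o, ccr_efficiency(o)))
```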
"Students of econometrics and their teachers will find this book to be the best introduction to the subject at the graduate and advanced undergraduate level. Starting with least squares regression, Hayashi provides an elegant exposition of all the standard topics of econometrics, including a detailed discussion of stationary and non-stationary time series. The particular strength of the book is the excellent balance between econometric theory and its applications, using GMM as an organizing principle throughout. Each chapter includes a detailed empirical example taken from classic and current applications of econometrics."--Dale Jorgensen, Harvard University ""Econometrics" will be a very useful book for intermediate and advanced graduate courses. It covers the topics with an easy to understand approach while at the same time offering a rigorous analysis. The computer programming tips and problems should also be useful to students. I highly recommend this book for an up-to-date coverage and thoughtful discussion of topics in the methodology and application of econometrics."--Jerry A. Hausman, Massachusetts Institute of Technology ""Econometrics" covers both modern and classic topics without shifting gears. The coverage is quite advanced yet the presentation is simple. Hayashi brings students to the frontier of applied econometric practice through a careful and efficient discussion of modern economic theory. The empirical exercises are very useful. . . . The projects are carefully crafted and have been thoroughly debugged."--Mark W. Watson, Princeton University ""Econometrics" strikes a good balance between technical rigor and clear exposition. . . . The use of empiricalexamples is well done throughout. I very much like the use of old 'classic' examples. It gives students a sense of history--and shows that great empirical econometrics is a matter of having important ideas and good data, not just fancy new methods. . . . The style is just great, informal and engaging."--James H. Stock, John F. Kennedy School of Government, Harvard University
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics & stochastics, international macroeconomics and financial econometrics. Financial Mathematics, Volatility and Covariance Modelling: Volume 2 provides a key repository on the current state of knowledge, the latest debates and recent literature on financial mathematics, volatility and covariance modelling. The first section is devoted to mathematical finance, stochastic modelling and control optimization. Chapters explore the recent financial crisis, the increase of uncertainty and volatility, and propose an alternative approach to deal with these issues. The second section covers financial volatility and covariance modelling and explores proposals for dealing with recent developments in financial econometrics. This book will be useful to students and researchers in applied econometrics, and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
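As a small, self-contained example of the univariate volatility modelling surveyed in this area, the sketch below fits a GARCH(1,1) to simulated returns by maximizing the Gaussian likelihood directly with scipy. It is a bare-bones illustration of the model's structure, not the estimator used in any particular chapter.

```python
# Fit a GARCH(1,1) to simulated returns by direct maximization of the
# Gaussian likelihood: sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)

# Simulate a GARCH(1,1) process.
T, omega, alpha, beta = 2000, 0.05, 0.08, 0.90
r, sigma2 = np.zeros(T), np.zeros(T)
sigma2[0] = omega / (1 - alpha - beta)
for t in range(1, T):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.normal()

def neg_loglik(params, returns):
    """Gaussian negative log-likelihood of a GARCH(1,1)."""
    w, a, b = params
    s2 = np.empty_like(returns)
    s2[0] = returns.var()
    for t in range(1, len(returns)):
        s2[t] = w + a * returns[t - 1] ** 2 + b * s2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(s2) + returns**2 / s2)

res = minimize(neg_loglik, x0=[0.1, 0.1, 0.8], args=(r,),
               bounds=[(1e-6, None), (1e-6, 1), (1e-6, 1)], method="L-BFGS-B")
print("estimated (omega, alpha, beta):", np.round(res.x, 3))
```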
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on the appropriate model to apply in different contexts and how to implement it. Of particular appeal are the instructions on (i) how to write the code for different SFA models in Stata, (ii) how to write a VBA macro for repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the code for Nonparametric Convex Frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
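The sketch below shows the structure of the normal/half-normal stochastic frontier model (a frontier plus a one-sided inefficiency term), estimated by maximum likelihood with scipy on simulated data. It is a generic illustration and does not reproduce the Stata or Excel Solver code discussed in the volume.

```python
# Normal/half-normal stochastic production frontier (Aigner-Lovell-Schmidt
# style), fitted by maximum likelihood on simulated data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(10)
n = 1000
x = rng.normal(size=n)                         # log input
v = rng.normal(scale=0.2, size=n)              # symmetric noise
u = np.abs(rng.normal(scale=0.4, size=n))      # one-sided inefficiency
y = 1.0 + 0.6 * x + v - u                      # log output below the frontier

def neg_loglik(params):
    """Negative log-likelihood of the normal/half-normal SFA model."""
    b0, b1, log_sv, log_su = params
    sv, su = np.exp(log_sv), np.exp(log_su)    # log-parameterized to stay positive
    sigma = np.hypot(sv, su)                   # sqrt(sv^2 + su^2)
    lam = su / sv
    eps = y - b0 - b1 * x
    # Density of eps = v - u: f(eps) = (2/sigma) * phi(eps/sigma) * Phi(-eps*lam/sigma)
    ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -np.sum(ll)

res = minimize(neg_loglik, x0=[0.0, 0.0, np.log(0.3), np.log(0.3)], method="BFGS")
b0, b1, log_sv, log_su = res.x
print("frontier intercept, slope:", round(b0, 2), round(b1, 2))
print("sigma_v, sigma_u:", round(np.exp(log_sv), 2), round(np.exp(log_su), 2))
```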
Complex dynamics constitute a growing and increasingly important area as they offer a strong potential to explain and formalize natural, physical, financial and economic phenomena. This book pursues the ambitious goal of bringing together an extensive body of knowledge regarding complex dynamics from various academic disciplines. Beyond its focus on economics and finance, including for instance the evolution of macroeconomic growth models towards nonlinear structures as well as signal processing applications to stock markets, fundamental parts of the book are devoted to the use of nonlinear dynamics in mathematics, statistics, signal theory and processing. Numerous examples and applications, almost 700 illustrations and numerical simulations based on the use of Matlab make the book an essential reference for researchers and students from many different disciplines who are interested in the nonlinear field. An appendix recapitulates the basic mathematical concepts required to use the book.
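As a minimal illustration of the nonlinear dynamics the book formalizes, the sketch below iterates the logistic map for a few parameter values, showing the transition from a stable fixed point to cycles to apparently chaotic behaviour. The book's own numerical examples use Matlab; this plain-numpy version is only indicative.

```python
# The logistic map x_{t+1} = r * x_t * (1 - x_t): a classic example of how a
# simple nonlinear rule produces fixed points, cycles, or chaos depending on r.
import numpy as np

def logistic_trajectory(r, x0=0.2, n=200):
    """Iterate the logistic map and return the trajectory."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = r * x[t - 1] * (1 - x[t - 1])
    return x

for r in (2.8, 3.2, 3.9):
    tail = logistic_trajectory(r)[-5:]          # long-run behaviour after transients
    print("r = %.1f -> last iterates:" % r, np.round(tail, 3))
```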