Originally published in 1951, this volume reprints the classic work written by one of the leading global econometricians.
Practically all donor countries that give aid claim to do so on the basis of the recipient's good governance, but do these claims have a real impact on the allocation of aid? Are democratic, human rights-respecting countries with low levels of corruption and military expenditures actually likely to receive more aid than other countries?
In recent years econometricians have examined the problems of diagnostic testing, specification testing, semiparametric estimation and model selection. In addition, researchers have considered whether to use model testing and model selection procedures to decide which models best fit a particular dataset. This book explores both issues with application to various regression models, including arbitrage pricing theory models. It is ideal as a reference for postgraduate students in the statistical sciences, academic researchers and policy makers seeking to understand the current status of model building and testing techniques.
Now in its fourth edition, this comprehensive introduction to fundamental panel data methodologies provides insights into what is most essential in the panel literature. A capstone to the forty-year career of a pioneer of panel data analysis, this new edition's primary contribution will be its coverage of advancements in panel data analysis, a statistical method widely used to analyze two- or higher-dimensional panel data. The topics discussed in earlier editions have been reorganized and streamlined to comprehensively introduce panel econometric methodologies useful for identifying causal relationships among variables, supported by interdisciplinary examples and case studies. This book, to be featured in Cambridge's Econometric Society Monographs series, has been the leader in the field since the first edition. It is essential reading for researchers, practitioners and graduate students interested in the analysis of microeconomic behavior.
Today, most money is credit money, created by commercial banks. While credit can finance innovation, excessive credit can lead to boom/bust cycles, such as the recent financial crisis. This highlights how the organization of our monetary system is crucial to stability. One way to achieve this is by separating the unit of account from the medium of exchange, and in pre-modern Europe such a separation existed. This new volume examines this idea of monetary separation and the history of monetary arrangements in the North and Baltic Seas region, from the Hanseatic League onwards. The book provides a theoretical analysis of four historical cases in the Baltic and North Seas region, with a view to examining the evolution of monetary arrangements from a new monetary economics perspective. Since the objective exchange value of money (its purchasing power) reflects subjective individual valuations of commodities, the author assesses these historical cases by means of exchange rates. Using theories from new monetary economics, the book explores how the units of account and their media of exchange evolved as social conventions, and offers new insight into the separation between the two. Through this exploration, it puts forward that money is a social institution, a clearing device for the settlement of accounts, and so the value of money, or a separate unit of account, ultimately results from the size of its network of users. The History of Money and Monetary Arrangements offers a highly original new insight into monetary arrangements as an evolutionary process. It will be of great interest to an international audience of scholars and students, including those with an interest in economic history, evolutionary economics and new monetary economics.
This book focuses on quantitative survey methodology, data collection and cleaning methods. Providing starting tools for using and analyzing a file once a survey has been conducted, it addresses fields as diverse as advanced weighting, editing, and imputation, which are not well covered in corresponding survey books. Moreover, it presents numerous empirical examples from the author's extensive research experience, particularly real data sets from multinational surveys.
This concise textbook presents students with all they need for advancing in mathematical economics. Detailed yet student-friendly, Vohra's book contains chapters on, amongst other topics, feasibility. Higher level undergraduates as well as postgraduate students in mathematical economics will find this book extremely useful in their development as economists.
The volume examines the state of the art of productivity and efficiency analysis. It brings together a selection of the best papers from the 10th North American Productivity Workshop. By analyzing worldwide perspectives on challenges that local economies and institutions may face when changes in productivity are observed, readers can quickly assess the impact of productivity measurement, productivity growth, dynamics of productivity change, measures of labor productivity, measures of technical efficiency in different sectors, frontier analysis, measures of performance, industry instability and spillover effects. The contributions in this volume focus on the theory and application of economics, econometrics, statistics, management science and operational research related to problems in the areas of productivity and efficiency measurement. Popular techniques and methodologies, including stochastic frontier analysis and data envelopment analysis, are represented. Chapters also cover broader issues related to measuring, understanding, incentivizing and improving the productivity and performance of firms, public services, and industries.
Computationally intensive tools play an increasingly important role in financial decisions. Many financial problems, ranging from asset allocation to risk management and from option pricing to model calibration, can be efficiently handled using modern computational techniques. Numerical Methods and Optimization in Finance presents such computational techniques, with an emphasis on simulation and optimization, particularly so-called heuristics. This book treats quantitative analysis as an essentially computational discipline in which applications are put into software form and tested empirically. This revised edition includes two new chapters: a self-contained tutorial on implementing and using heuristics, and an explanation of software used for testing portfolio-selection models. Postgraduate students, researchers in programs on quantitative and computational finance, and practitioners in banks and other financial companies can benefit from this second edition of Numerical Methods and Optimization in Finance.
Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization. The finance industry is seeing increased interest in new risk measures and techniques for portfolio optimization when parameters of the model are uncertain. This groundbreaking book extends traditional approaches of risk measurement and portfolio optimization by combining distributional models with risk or performance measures into one framework. Throughout these pages, the expert authors explain the fundamentals of probability metrics, outline new approaches to portfolio optimization, and discuss a variety of essential risk measures. Using numerous examples, they illustrate a range of applications to optimal portfolio choice and risk theory, as well as applications to the area of computational finance that may be useful to financial engineers. They also clearly show how stochastic models, risk assessment, and optimization are essential to mastering risk, uncertainty, and performance measurement. Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization provides quantitative portfolio managers (including hedge fund managers), financial engineers, consultants, and academic researchers with answers to the key question of which risk measure is best for any given problem.
This book systematically provides a prospective integrated approach for complexity social science in its view of statistical physics and mathematics, with an impressive collection of the knowledge and expertise of leading researchers from all over the world. The book mainly covers both finitary methods of statistical equilibrium and data-driven analysis by econophysics. The late Professor Masanao Aoki of UCLA, who passed away at the end of July 2018, in his later years dedicated himself to the reconstruction of macroeconomics mainly in terms of statistical physics. Professor Aoki, who was already an IEEE fellow, was also named an Econometric Society Fellow in 1979. Until the early 1990s, however, his contributions focused on the development of novel algorithms for time series models and their applications to economic data. Those contributions were undoubtedly equivalent to the Nobel Prize-winning work of Granger's "co-integration method". After the publications of his New Approaches to Macroeconomic Modeling and Modeling Aggregate Behavior and Fluctuations in Economics, both published by Cambridge University Press, in 1996 and 2002, respectively, his contributions rapidly became known and spread throughout the field. In short, these new works challenged econophysicists to develop evolutionary stochastic dynamics, multiple equilibria, and externalities as field effects, and revolutionized the stochastic views of interacting agents. In particular, the publication of Reconstructing Macroeconomics, also by Cambridge University Press (2007), in cooperation with Hiroshi Yoshikawa, further sharpened the process of embodying "a perspective from statistical physics and combinatorial stochastic processes" in economic modeling. Interestingly, almost concurrently with Prof. Aoki's newest developments, similar approaches were appearing elsewhere. Thus, those who were working in the same context around the world at that time came together, exchanging their results over the past decade. In memory of Prof. Aoki, this book has been planned by authors who followed him, to present the most advanced outcomes of his legacy.
This title was first published in 2003. This book provides a much-needed comprehensive and up-to-date treatise on financial distress modelling. Since many of the challenges facing researchers of financial distress can only be addressed by a totally new research design and modelling methodology, this book concentrates on extending the potential for bankruptcy analysis from single-equation modelling to multi-equation analysis. Essentially, the work provides an innovative new approach by comparing each firm with itself over time rather than testing specific hypotheses or improving predictive and classificatory accuracy. Added to this new design, a whole new methodology, or way of modelling the process, is applied in the form of a family of models of which the traditional single-equation logit or MDA models are just special cases. Preliminary two-equation and three-equation models are presented and tested in the final chapters as a taste of things to come. The groundwork for a full treatise on these sorts of multi-equation systems is laid for further study: this family of models could be used as a basis for more specific applications to different industries and to test hypotheses concerning variables influencing bankruptcy risk.
The analysis, prediction and interpolation of economic and other time series has a long history and many applications. Major new developments are taking place, driven partly by the need to analyze financial data. The five papers in this book describe those new developments from various viewpoints and are intended to be an introduction accessible to readers from a range of backgrounds. The book arises out of the second Séminaire Européen de Statistique (SEMSTAT), held in Oxford in December 1994. This brought together young statisticians from across Europe, and a series of introductory lectures were given on topics at the forefront of current research activity. The lectures form the basis for the five papers contained in the book. The papers by Shephard and Johansen deal respectively with time series models for volatility, i.e. variance heterogeneity, and with cointegration. Clements and Hendry analyze the nature of prediction errors. A complementary review paper by Laird gives a biometrical view of the analysis of short time series. Finally, Astrup and Nielsen give a mathematical introduction to the study of option pricing. Whilst the book draws its primary motivation from financial series and from multivariate econometric modelling, the applications are potentially much broader.
This is the very first book to offer seven substantial econometric models of the Chinese economy with the statistical data used, so that the reader will be able to reproduce them all and test them for any policy alternatives. The book presents up-to-date models produced both inside and outside China, so that readers can understand most of the advanced studies of the Chinese economy by Chinese experts at the present time. This is an invaluable reference for graduate students and scholars working on Chinese economic problems.
It has been held that when economic policy makers use economic models, there is a one-way flow of information from the models to policy analysis. This text challenges this assumption, recognizing that in practice the requirements and questions of policy makers play an important role in the development and revision of those very models. Written by highly placed practitioners and academic economists, it provides, with depth, insight and conviction, a picture of how modellers and policy makers interact. It offers international case studies of particular interactions between models and policy making, exploring questions such as: how does interaction work? What roles do different professional groups play in interaction? What strategies make the use of models in policy preparation successful? What insights can sociologists and historians give on the interaction between models and policy makers?
First published in 2000. Routledge is an imprint of Taylor & Francis, an informa company.
Originally published in 1984. This book addresses the economics of the changing mineral industry, which is highly affected by energy economics. The study estimates, in quantitative terms, the short- to mid-term consequences of rising energy prices alongside falling ore quality for the copper and aluminum industries. The effects of changing cost factors on substitution between metals are assessed, as is the potential for relying on increased recycling. Copper and aluminum industry problems should be representative of those faced by the mineral processing sector as a whole. Two complex econometric models presented here produce forecasts for the industries, and the book discusses and reviews other econometric commodity models.