Welcome to Loot.co.za!
Books > Business & Economics > Economics > Econometrics
Practically all donor countries that give aid claim to do so on the basis of the recipient's good governance, but do these claims have a real impact on the allocation of aid? Are democratic, human rights-respecting countries with low levels of corruption and military expenditure actually likely to receive more aid than other countries?
In recent years econometricians have examined the problems of diagnostic testing, specification testing, semiparametric estimation and model selection. In addition, researchers have considered whether to use model testing and model selection procedures to decide which models best fit a particular dataset. This book explores both issues with application to various regression models, including arbitrage pricing theory models. It is ideal as a reference for postgraduate students in the statistical sciences, academic researchers and policy makers seeking to understand the current status of model building and testing techniques.
Now in its fourth edition, this comprehensive introduction to fundamental panel data methodologies provides insights into what is most essential in the panel literature. A capstone to the forty-year career of a pioneer of panel data analysis, this new edition's primary contribution is its coverage of advancements in panel data analysis, a statistical method widely used to analyze two- or higher-dimensional panel data. The topics discussed in earlier editions have been reorganized and streamlined to comprehensively introduce panel econometric methodologies useful for identifying causal relationships among variables, supported by interdisciplinary examples and case studies. This book, to be featured in Cambridge's Econometric Society Monographs series, has been the leader in the field since the first edition. It is essential reading for researchers, practitioners and graduate students interested in the analysis of microeconomic behavior.
Today, most money is credit money, created by commercial banks. While credit can finance innovation, excessive credit can lead to boom/bust cycles, such as the recent financial crisis. This highlights how the organization of our monetary system is crucial to stability. One way to achieve such stability is by separating the unit of account from the medium of exchange, and in pre-modern Europe such a separation existed. This new volume examines the idea of monetary separation and the history of monetary arrangements in the North and Baltic Seas region, from the Hanseatic League onwards. The book provides a theoretical analysis of four historical cases in the region, with a view to examining the evolution of monetary arrangements from a new monetary economics perspective. Since the objective exchange value of money (its purchasing power) reflects subjective individual valuations of commodities, the author assesses these historical cases by means of exchange rates. Using theories from new monetary economics, the book explores how the units of account and their media of exchange evolved as social conventions, and offers new insight into the separation between the two. Through this exploration, it puts forward that money is a social institution, a clearing device for the settlement of accounts, and so the value of money, or a separate unit of account, ultimately results from the size of its network of users. The History of Money and Monetary Arrangements offers a highly original new insight into monetary arrangements as an evolutionary process. It will be of great interest to an international audience of scholars and students, including those with an interest in economic history, evolutionary economics and new monetary economics.
Machine learning (ML) is progressively reshaping the fields of quantitative finance and algorithmic trading. ML tools are increasingly adopted by hedge funds and asset managers, notably for alpha signal generation and stock selection. The technicality of the subject can make it hard for non-specialists to join the bandwagon, as the jargon and coding requirements may seem out of reach. Machine Learning for Factor Investing: R Version bridges this gap. It provides a comprehensive tour of modern ML-based investment strategies that rely on firm characteristics. The book covers a wide array of subjects, ranging from economic rationales to rigorous portfolio back-testing, and encompasses both data processing and model interpretability. Common supervised learning algorithms such as tree models and neural networks are explained in the context of style investing, and the reader can also dig into more complex techniques like autoencoders for asset returns, Bayesian additive trees, and causal models. All topics are illustrated with self-contained R code samples and snippets applied to a large public dataset that contains over 90 predictors. The material, along with the content of the book, is available online so that readers can reproduce and enhance the examples at their convenience. If you have even a basic knowledge of quantitative finance, this combination of theoretical concepts and practical illustrations will help you learn quickly and deepen your financial and technical expertise.
This book focuses on quantitative survey methodology, data collection and cleaning methods. Providing starting tools for using and analyzing a file once a survey has been conducted, it addresses fields as diverse as advanced weighting, editing, and imputation, which are not well-covered in corresponding survey books. Moreover, it presents numerous empirical examples from the author's extensive research experience, particularly real data sets from multinational surveys.
This concise textbook presents students with all they need for advancing in mathematical economics. Detailed yet student-friendly, Vohra's book contains chapters on, amongst other topics, feasibility. Higher-level undergraduates as well as postgraduate students in mathematical economics will find this book extremely useful in their development as economists.
Quantitative Modeling of Derivative Securities demonstrates how to take the basic ideas of arbitrage theory and apply them, in a very concrete way, to the design and analysis of financial products. Based primarily (but not exclusively) on the analysis of derivatives, the book emphasizes relative-value and hedging ideas applied to different financial instruments. Using a "financial engineering approach," the theory is developed progressively, focusing on specific aspects of pricing and hedging and on problems that the technical analyst or trader has to consider in practice. More than just an introductory text: the reader who has mastered the contents of this one book will have bridged the gap separating the novice from the technical and research literature.
Originally published in 1951, this volume reprints the classic work written by one of the leading global econometricians.
The book examines the development and the dynamics of the personal distribution of income in Germany, Great Britain, Sweden, the United States and some other OECD countries. Starting with the distribution of labour income, the issue is then expanded to include all monetary incomes of private households and to adjust for household size by an equivalence scale. Some authors analyse one country in detail by decomposing aggregate inequality measures, while others focus on direct comparisons of some features of the income distribution in Germany with those in Great Britain or in the United States. The results suggest dominant influences of unemployment as well as of tax and transfer policies and different welfare regimes, respectively, but also show that our knowledge about distributional processes is still limited.
Collecting and analyzing data on unemployment, inflation, and inequality helps describe the complex world around us. When published by the government, such data are called official statistics. They are reported by the media, used by politicians to lend weight to their arguments, and cited by economic commentators to opine about the state of society. Despite such widescale use, explanations of how these measures are constructed are seldom provided for a non-technical reader. Measuring Society is a short, accessible guide to six topics: jobs, house prices, inequality, prices for goods and services, poverty, and deprivation. Each relates to concepts we use on a personal level to form an understanding of the society in which we live: we need a job, a place to live, and food to eat. Using data from the United States, the book answers three basic questions: why, how, and for whom these statistics have been constructed. It adds context and flavor by discussing the historical background. This book provides the reader with a good grasp of these measures. Chaitra H. Nagaraja is an Associate Professor of Statistics at the Gabelli School of Business at Fordham University in New York. Her research interests include house price indices and inequality measurement. Prior to Fordham, Dr. Nagaraja was a researcher at the U.S. Census Bureau. While there, she worked on projects relating to the American Community Survey.
The volume examines the state of the art of productivity and efficiency analysis. It brings together a selection of the best papers from the 10th North American Productivity Workshop. By analyzing worldwide perspectives on the challenges that local economies and institutions may face when changes in productivity are observed, readers can quickly assess the impact of productivity measurement, productivity growth, dynamics of productivity change, measures of labor productivity, measures of technical efficiency in different sectors, frontier analysis, measures of performance, industry instability and spillover effects. The contributions in this volume focus on the theory and application of economics, econometrics, statistics, management science and operational research related to problems in the areas of productivity and efficiency measurement. Popular techniques and methodologies, including stochastic frontier analysis and data envelopment analysis, are represented. Chapters also cover broader issues related to measuring, understanding, incentivizing and improving the productivity and performance of firms, public services, and industries.
Computationally intensive tools play an increasingly important role in financial decisions. Many financial problems, ranging from asset allocation to risk management and from option pricing to model calibration, can be efficiently handled using modern computational techniques. Numerical Methods and Optimization in Finance presents such computational techniques, with an emphasis on simulation and optimization, particularly so-called heuristics. This book treats quantitative analysis as an essentially computational discipline in which applications are put into software form and tested empirically. This revised edition includes two new chapters: a self-contained tutorial on implementing and using heuristics, and an explanation of software used for testing portfolio-selection models. Postgraduate students, researchers in programs on quantitative and computational finance, and practitioners in banks and other financial companies can benefit from this second edition of Numerical Methods and Optimization in Finance.
Key Topics in Clinical Research aims to provide a short, clear, highlighted reference to guide trainees and trainers through research and audit projects, from first idea, through to data collection and statistical analysis, to presentation and publication. This book is also designed to assist trainees in preparing for their specialty examinations by providing comprehensive, concise, easily accessible and easily understandable information on all aspects of clinical research and audit.
The finance industry is seeing increased interest in new risk measures and techniques for portfolio optimization when the parameters of the model are uncertain. This groundbreaking book extends traditional approaches of risk measurement and portfolio optimization by combining distributional models with risk or performance measures into one framework. Throughout these pages, the expert authors explain the fundamentals of probability metrics, outline new approaches to portfolio optimization, and discuss a variety of essential risk measures. Using numerous examples, they illustrate a range of applications to optimal portfolio choice and risk theory, as well as applications to the area of computational finance that may be useful to financial engineers. They also clearly show how stochastic models, risk assessment, and optimization are essential to mastering risk, uncertainty, and performance measurement. Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization provides quantitative portfolio managers (including hedge fund managers), financial engineers, consultants, and academic researchers with answers to the key question of which risk measure is best for any given problem.
This book systematically provides a prospective integrated approach for complexity social science in its view of statistical physics and mathematics, with an impressive collection of the knowledge and expertise of leading researchers from all over the world. The book mainly covers both finitary methods of statistical equilibrium and data-driven analysis by econophysics. The late Professor Masanao Aoki of UCLA, who passed away at the end of July 2018, in his later years dedicated himself to the reconstruction of macroeconomics mainly in terms of statistical physics. Professor Aoki, who was already an IEEE fellow, was also named an Econometric Society Fellow in 1979. Until the early 1990s, however, his contributions were focused on the development of a novel algorithm for the time series model and its applications to economic data. Those contributions were comparable to Granger's Nobel Prize-winning work on the co-integration method. After the publications of his New Approaches to Macroeconomic Modeling and Modeling Aggregate Behavior and Fluctuations in Economics, both published by Cambridge University Press, in 1996 and 2002, respectively, his contributions rapidly became known and spread throughout the field. In short, these new works challenged econophysicists to develop evolutionary stochastic dynamics, multiple equilibria, and externalities as field effects, and revolutionized the stochastic views of interacting agents. In particular, the publication of Reconstructing Macroeconomics, also by Cambridge University Press (2007), in cooperation with Hiroshi Yoshikawa, further sharpened the process of embodying "a perspective from statistical physics and combinatorial stochastic processes" in economic modeling. Interestingly, almost concurrently with Prof. Aoki's newest developments, similar approaches were appearing elsewhere. Thus, those who were working in the same context around the world at that time came together, exchanging their results during the past decade. In memory of Prof. Aoki, this book has been planned by authors who followed him, to present the most advanced outcomes of his heritage.
This title was first published in 2003. This book provides a much-needed comprehensive and up-to-date treatise on financial distress modelling. Since many of the challenges facing researchers of financial distress can only be addressed by a totally new research design and modelling methodology, this book concentrates on extending the potential for bankruptcy analysis from single-equation modelling to multi-equation analysis. Essentially, the work provides an innovative new approach by comparing each firm with itself over time rather than testing specific hypotheses or improving predictive and classificatory accuracy. Added to this new design, a whole new methodology, or way of modelling the process, is applied in the form of a family of models of which the traditional single-equation logit or MDA models are just special cases. Preliminary two-equation and three-equation models are presented and tested in the final chapters as a taste of things to come. The groundwork for a full treatise on these sorts of multi-equation systems is laid for further study: this family of models could be used as a basis for more specific applications to different industries and to test hypotheses concerning variables that influence bankruptcy risk.
The analysis, prediction and interpolation of economic and other time series has a long history and many applications. Major new developments are taking place, driven partly by the need to analyze financial data. The five papers in this book describe those new developments from various viewpoints and are intended to be an introduction accessible to readers from a range of backgrounds. The book arises out of the second Séminaire Européen de Statistique (SEMSTAT), held in Oxford in December 1994. This brought together young statisticians from across Europe, and a series of introductory lectures were given on topics at the forefront of current research activity. The lectures form the basis for the five papers contained in the book. The papers by Shephard and Johansen deal respectively with time series models for volatility, i.e. variance heterogeneity, and with cointegration. Clements and Hendry analyze the nature of prediction errors. A complementary review paper by Laird gives a biometrical view of the analysis of short time series. Finally, Astrup and Nielsen give a mathematical introduction to the study of option pricing. Whilst the book draws its primary motivation from financial series and from multivariate econometric modelling, the applications are potentially much broader.