Computationally-intensive tools play an increasingly important role in financial decisions. Many financial problems, ranging from asset allocation to risk management and from option pricing to model calibration, can be efficiently handled using modern computational techniques. Numerical Methods and Optimization in Finance presents such computational techniques, with an emphasis on simulation and optimization, particularly so-called heuristics. This book treats quantitative analysis as an essentially computational discipline in which applications are put into software form and tested empirically. This revised edition includes two new chapters, a self-contained tutorial on implementing and using heuristics, and an explanation of software used for testing portfolio-selection models. Postgraduate students, researchers in programs on quantitative and computational finance, and practitioners in banks and other financial companies can benefit from this second edition of Numerical Methods and Optimization in Finance.
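As a rough illustration of the kind of optimization heuristic the blurb refers to (this is not code from the book, and all names and parameters are illustrative assumptions), a minimal perturbation-based local search for a long-only minimum-variance portfolio might look like this:

```python
import random

def portfolio_variance(w, cov):
    # w' * Cov * w for a list of weights and a covariance matrix
    n = len(w)
    return sum(w[i] * cov[i][j] * w[j] for i in range(n) for j in range(n))

def local_search(cov, steps=5000, seed=42):
    """Minimise portfolio variance over long-only, fully invested weights
    by repeatedly shifting a small amount of weight between two assets."""
    rng = random.Random(seed)
    n = len(cov)
    w = [1.0 / n] * n                  # start from equal weights
    best = portfolio_variance(w, cov)
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        delta = rng.uniform(0.0, 0.05)
        if w[i] < delta:               # keep weights non-negative
            continue
        cand = w[:]
        cand[i] -= delta
        cand[j] += delta
        v = portfolio_variance(cand, cov)
        if v < best:                   # greedy accept; heuristics such as
            w, best = cand, v          # threshold accepting also accept
    return w, best                     # small deteriorations to escape
                                       # local minima
```

A full heuristic in the book's spirit would add a rule for occasionally accepting worse candidates; the greedy variant above is only the simplest sketch of the search loop.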
Key Topics in Clinical Research aims to provide a short, clear, highlighted reference to guide trainees and trainers through research and audit projects, from first idea, through to data collection and statistical analysis, to presentation and publication. This book is also designed to assist trainees in preparing for their specialty examinations by providing comprehensive, concise, easily accessible and easily understandable information on all aspects of clinical research and audit.
In How to Make the World Add Up, Tim Harford draws on his experience as both an economist and presenter of the BBC's radio show 'More or Less' to take us deep into the world of disinformation and obfuscation, bad research and misplaced motivation to find those priceless jewels of data and analysis that make communicating with numbers so rewarding. Through vivid storytelling he reveals how we can evaluate the claims that surround us with confidence, curiosity and a healthy level of scepticism. It is a must-read for anyone who cares about understanding the world around them.
Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization. The finance industry is seeing increased interest in new risk measures and techniques for portfolio optimization when parameters of the model are uncertain. This groundbreaking book extends traditional approaches of risk measurement and portfolio optimization by combining distributional models with risk or performance measures into one framework. Throughout these pages, the expert authors explain the fundamentals of probability metrics, outline new approaches to portfolio optimization, and discuss a variety of essential risk measures. Using numerous examples, they illustrate a range of applications to optimal portfolio choice and risk theory, as well as applications to the area of computational finance that may be useful to financial engineers. They also clearly show how stochastic models, risk assessment, and optimization are essential to mastering risk, uncertainty, and performance measurement. Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization provides quantitative portfolio managers (including hedge fund managers), financial engineers, consultants, and academic researchers with answers to the key question of which risk measure is best for any given problem.
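To make the phrase "risk measures" concrete, here is a minimal sketch (not the authors' code) of one widely used measure, Conditional Value-at-Risk (expected shortfall), computed from a simulated sample of losses; the function name and tail convention are illustrative assumptions:

```python
def cvar(losses, alpha=0.95):
    """Conditional Value-at-Risk (expected shortfall): the mean loss in the
    worst (1 - alpha) tail of the empirical loss distribution."""
    srt = sorted(losses)               # ascending: largest losses last
    k = int(alpha * len(srt))          # index of the VaR quantile
    tail = srt[k:] or [srt[-1]]        # guard against an empty tail
    return sum(tail) / len(tail)
```

Unlike plain Value-at-Risk, which reports only the quantile itself, this measure averages over the tail beyond it, which is one reason it features in the optimization frameworks the book discusses.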
The Handbook is a definitive reference source and teaching aid for
This book systematically develops an integrated approach to complexity social science from the viewpoint of statistical physics and mathematics, with an impressive collection of the knowledge and expertise of leading researchers from around the world. It covers both finitary methods of statistical equilibrium and data-driven analysis in econophysics. The late Professor Masanao Aoki of UCLA, who passed away at the end of July 2018, in his later years dedicated himself to the reconstruction of macroeconomics largely in terms of statistical physics. Professor Aoki, already an IEEE Fellow, was also named an Econometric Society Fellow in 1979. Until the early 1990s, however, his contributions focused on developing novel algorithms for time series models and their applications to economic data. Those contributions were undoubtedly comparable to Granger's Nobel Prize-winning work on the "co-integration method". After the publication of his New Approaches to Macroeconomic Modeling and Modeling Aggregate Behavior and Fluctuations in Economics, both published by Cambridge University Press, in 1996 and 2002 respectively, his contributions rapidly became known and spread throughout the field. In short, these new works challenged econophysicists to develop evolutionary stochastic dynamics, multiple equilibria, and externalities as field effects, and revolutionized the stochastic view of interacting agents. In particular, the publication of Reconstructing Macroeconomics, also by Cambridge University Press (2007), in cooperation with Hiroshi Yoshikawa, further sharpened the process of embodying "a perspective from statistical physics and combinatorial stochastic processes" in economic modeling. Interestingly, almost concurrently with Prof. Aoki's newest developments, similar approaches were appearing elsewhere.
Thus, those who were working in the same context around the world at that time came together, exchanging their results during the past decade. In memory of Prof. Aoki, this book has been planned by authors who followed him to present the most advanced outcomes of his heritage.
This title was first published in 2003. This book provides a much-needed comprehensive and up-to-date treatise on financial distress modelling. Since many of the challenges facing researchers of financial distress can only be addressed by a totally new research design and modelling methodology, this book concentrates on extending the potential for bankruptcy analysis from single-equation modelling to multi-equation analysis. Essentially, the work provides an innovative new approach by comparing each firm with itself over time rather than testing specific hypotheses or improving predictive and classificatory accuracy. Added to this new design, a whole new methodology, or way of modelling the process, is applied in the form of a family of models of which the traditional single-equation logit or MDA model is just a special case. Preliminary two-equation and three-equation models are presented and tested in the final chapters as a taste of things to come. The groundwork for a full treatise on these sorts of multi-equation systems is laid for further study; this family of models could be used as a basis for more specific applications to different industries and to test hypotheses concerning variables influencing bankruptcy risk.
The analysis, prediction and interpolation of economic and other time series has a long history and many applications. Major new developments are taking place, driven partly by the need to analyze financial data. The five papers in this book describe those new developments from various viewpoints and are intended to be an introduction accessible to readers from a range of backgrounds. The book arises out of the second Séminaire Européen de Statistique (SEMSTAT), held in Oxford in December 1994. This brought together young statisticians from across Europe, and a series of introductory lectures were given on topics at the forefront of current research activity; these lectures form the basis for the five papers contained in the book. The papers by Shephard and Johansen deal respectively with time series models for volatility, i.e. variance heterogeneity, and with cointegration. Clements and Hendry analyze the nature of prediction errors. A complementary review paper by Laird gives a biometrical view of the analysis of short time series. Finally, Astrup and Nielsen give a mathematical introduction to the study of option pricing. Whilst the book draws its primary motivation from financial series and from multivariate econometric modelling, the applications are potentially much broader.
This is the very first book to offer seven substantial econometric models of the Chinese economy with the statistical data used, so that the reader will be able to reproduce them all and test them for any policy alternatives. The book presents up-to-date models produced both inside and outside China, so that readers can understand most of the advanced studies of the Chinese economy by Chinese experts at the present time. This is an invaluable reference for graduate students and scholars working on Chinese economic problems.
This book presents a critical review of the empirical literature that studies the efficiency of the forward and futures markets for foreign exchange. It provides a useful foundation for research in developing quantitative measures of risk and expected return in international finance.
It has been held that when economic policy makers use economic models, there is a one-way flow of information from the models to policy analysis. This text challenges this assumption, recognizing that in practice the requirements and questions of policy makers play an important role in the development and revision of those very models. Written by highly-placed practitioners and academic economists, it provides a picture, with depth, insight and conviction, of how modellers and policy makers interact. It offers international case studies of particular interactions between models and policy making, exploring questions such as: how does interaction work? What roles do different professional groups play in interaction? What strategies make the use of models in policy preparation successful? What insights can sociologists and historians give on the interaction between models and policy makers?
Have configurations of labour-management practices become embedded in the British economy? Did the dramatic decline in trade union representation in the 1980s continue throughout the 1990s, leaving more employees without a voice? Were the vestiges of union organization at the workplace a hollow shell? These and other contemporary issues of employee relations are addressed in this report. The book reports the results from the series of workplace surveys conducted by the Department of Trade and Industry, the Economic and Social Research Council, the Advisory, Conciliation and Arbitration Service, and the Policy Studies Institute. Its focus is on change, captured by gathering together the enormous bank of data from all four of the large-scale and highly respected surveys, and plotting trends from 1980 to 1999. In addition, a special panel of workplaces, surveyed in both 1990 and 1998, reveals the complex processes of change. Comprehensive in scope, the results are statistically reliable and reveal the nature and extent of change in all bar the smallest British workplaces.
Distributional issues may not have always been among the main concerns of the economics profession. Today, at the beginning of the 2000s, the position is different. During the last quarter of a century, economic growth proved to be unsteady and rather slow on average. The situation of those at the bottom ceased to improve regularly as it had in the preceding period of fast growth and full employment. Europe has seen prolonged unemployment, and there has been widening wage dispersion in a number of OECD countries. Rising affluence in rich countries coexists, in a number of such countries, with the persistence of poverty. As a consequence, it is difficult nowadays to think of an issue ranking high in the public economic debate without some strong explicit distributive implications. Monetary policy, fiscal policy, taxes, trade unions, privatisation, price and competition regulation, and the future of the Welfare State are all issues which are now often perceived as conflictual because of their strong redistributive content.
For more information on the Handbooks in Economics series, please see our home page on http://www.elsevier.nl/locate/hes
First Published in 2000. Routledge is an imprint of Taylor & Francis, an informa company.
Business Statistics of the United States is a comprehensive and practical collection of data from as early as 1913 that reflects the nation's economic performance. It provides several years of annual, quarterly, and monthly data in industrial and demographic detail, including key indicators such as gross domestic product, personal income, spending, saving, employment, unemployment, the capital stock, and more. Business Statistics of the United States is the best place to find historical perspectives on the U.S. economy. Of equal importance to the data are the introductory highlights, extensive notes, and figures for each chapter that help users to understand the data, use them appropriately, and, if desired, seek additional information from the source agencies. Business Statistics of the United States provides a rich and deep picture of the American economy and contains approximately 3,500 time series in all. The data are predominantly from federal government sources, including the Board of Governors of the Federal Reserve System, the Bureau of Economic Analysis, the Bureau of Labor Statistics, the Census Bureau, the Employment and Training Administration, the Energy Information Administration, the Federal Housing Finance Agency, and the U.S. Department of the Treasury.
Professor Cheng-Few Lee ranks #1 based on his publications in the 26 core finance journals, and #163 based on publications in the 7 leading finance journals (source: "Most Prolific Authors in the Finance Literature: 1959-2008" by Jean L. Heck and Philip L. Cooley, Saint Joseph's University and Trinity University). This is an extensively revised edition of a popular statistics textbook for business and economics students. The first edition has been adopted by universities and colleges worldwide, including New York University, Carnegie Mellon University and UCLA. Designed for upper-level undergraduates, MBA and other graduate students, this book closely integrates various statistical techniques with concepts from business, economics and finance, and clearly demonstrates the power of statistical methods in the real world of business. While maintaining the essence of the first edition, the new edition places more emphasis on finance, economics and accounting concepts, with updated sample data. Students will find this book very accessible with its straightforward language, ample cases, examples, illustrations and real-life applications. The book is also useful for financial analysts and portfolio managers.
The need for analytics skills has driven burgeoning growth in the number of analytics and decision science programs in higher education, developed to feed the need for capable employees in this area. The very size and continuing growth of this need means that there is still space for new program development. Schools wishing to pursue business analytics programs should intentionally assess the maturity level of their offerings and take steps to close the gap. Teaching Data Analytics: Pedagogy and Program Design is a reference for faculty and administrators seeking direction about adding or enhancing analytics offerings at their institutions. It provides guidance by examining best practices from the perspectives of faculty and practitioners. By emphasizing the connection of data analytics to organizational success, it reviews the position of analytics and decision science programs in higher education and the critical connection between this area of study and career opportunities. The book features: a variety of perspectives, ranging from the scholarly and theoretical to the practitioner-applied; an in-depth look into a wide breadth of skills, from closely technology-focused to robustly soft human-connection skills; and resources for existing faculty to acquire and maintain additional analytics-relevant skills that can enrich their current course offerings. Acknowledging the dichotomy between data analytics and data science, this book emphasizes data analytics, although it does touch upon the data science realm. Starting with industry perspectives, the book covers the applied world of data analytics, covering necessary skills and applications as well as developing compelling visualizations. It then dives into pedagogical and program design approaches in data analytics education and concludes with ideas for program design tactics.
This reference is a launching point for discussions about how to connect industry's need for skilled data analysts to higher education's need to design a rigorous curriculum that promotes student critical thinking, communication, and ethical skills. It also provides insight into adding new elements to existing data analytics courses and into taking the next step in adding data analytics offerings, whether that means incorporating additional analytics assignments into existing courses, offering one course designed for undergraduates, or offering an integrated program designed for graduate students.
Originally published in 1984. This book addresses the economics of the changing mineral industry, which is highly affected by energy economics. The study estimates, in quantitative terms, the short- to mid-term consequences of rising energy prices alongside falling ore quality for the copper and aluminum industries. The effects of changing cost factors on substitution between metals are assessed, as is the potential for relying on increased recycling. Copper and aluminum industry problems should be representative of those faced by the mineral processing sector as a whole. Two complex econometric models presented here produce forecasts for the industries, and the book discusses and reviews other econometric commodity models.
Originally published in 1970, with a second edition in 1989. Empirical Bayes methods use some of the apparatus of the pure Bayes approach, but an actual prior distribution is assumed to generate the data sequence; this prior can be estimated from the data, producing empirical Bayes estimates or decision rules. In this second edition, details are provided of the derivation and the performance of empirical Bayes rules for a variety of special models. Attention is given to the problem of assessing the goodness of an empirical Bayes estimator for a given set of prior data. Chapters also focus on alternatives to the empirical Bayes approach and on actual applications of empirical Bayes methods.
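The idea the blurb describes, estimating the prior from the data itself, can be sketched for the simplest normal-normal case (this is an illustrative example, not code from the book; the function name and the method-of-moments estimates are assumptions): each observation X_i ~ N(theta_i, sigma2), the theta_i are assumed drawn from N(mu, tau2), and mu and tau2 are estimated from the sample before shrinking each observation toward the common mean.

```python
def eb_normal_shrinkage(xs, sigma2):
    """Empirical Bayes point estimates under X_i ~ N(theta_i, sigma2) with an
    assumed N(mu, tau2) prior whose parameters are estimated from the data."""
    n = len(xs)
    mu = sum(xs) / n                                   # estimate of prior mean
    var = sum((x - mu) ** 2 for x in xs) / (n - 1)     # sample variance of the X_i
    tau2 = max(var - sigma2, 0.0)                      # method-of-moments prior variance
    b = tau2 / (tau2 + sigma2)                         # shrinkage factor in [0, 1]
    return [mu + b * (x - mu) for x in xs]             # posterior means
```

When the sampling noise sigma2 dominates, b approaches 0 and every estimate collapses toward the pooled mean; when the true effects vary much more than the noise, b approaches 1 and the raw observations are kept nearly unchanged.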
Originally published in 1929. This balanced combination of fieldwork, statistical measurement, and realistic applications shows a synthesis of economics and political science in a conception of an organic relationship between the two sciences that involves functional analysis, institutional interpretation, and a more workmanlike approach to questions of organization such as division of labour and the control of industry. The treatise applies the test of fact through statistical analysis to economic and political theories for the quantitative and institutional approach in solving social and industrial problems. It constructs a framework of concepts, combining both economic and political theory, to systematically produce an original statement in general terms of the principles and methods for statistical fieldwork. The separation into Parts allows selective reading on the methods of statistical measurement; the principles and fallacies of applying these measures to economic and political fields; and the resultant construction of a statistical economics and politics. Basic statistical concepts are described for application, with each method of statistical measurement illustrated with instances relevant to the economic and political theory discussed, and a statistical glossary is included.
You may like...
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover, R3,286)
Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,567)
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
Quantitative statistical techniques by Swanepoel, Vivier, … (Paperback, R751)
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover, R2,970)
Linear and Non-Linear Financial… by Mehmet Kenan Terzioglu, Gordana Djurovic (Hardcover, R3,581)
Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,160)
The Leading Indicators - A Short History… by Zachary Karabell (Paperback)