Machine learning (ML) is progressively reshaping the fields of quantitative finance and algorithmic trading. ML tools are increasingly adopted by hedge funds and asset managers, notably for alpha signal generation and stock selection. The technicality of the subject can make it hard for non-specialists to jump on the bandwagon, as the jargon and coding requirements may seem out of reach. Machine Learning for Factor Investing: R Version bridges this gap. It provides a comprehensive tour of modern ML-based investment strategies that rely on firm characteristics. The book covers a wide array of subjects, which range from economic rationales to rigorous portfolio back-testing and encompass both data processing and model interpretability. Common supervised learning algorithms such as tree models and neural networks are explained in the context of style investing, and the reader can also dig into more complex techniques like autoencoder asset returns, Bayesian additive trees, and causal models. All topics are illustrated with self-contained R code samples and snippets that are applied to a large public dataset containing over 90 predictors. The material, along with the content of the book, is available online so that readers can reproduce and enhance the examples at their convenience. If you have even a basic knowledge of quantitative finance, this combination of theoretical concepts and practical illustrations will help you learn quickly and deepen your financial and technical expertise.
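As a rough illustration of the kind of workflow described above (this is not code from the book: the model choice, the synthetic data, and the simple top-10 portfolio rule below are assumptions made purely for the sketch), a tree-based model can be fitted to firm characteristics and used to score stocks:

```python
# Illustrative sketch only: predict next-period stock returns from firm
# characteristics with a tree ensemble, then form a simple long portfolio
# from the top-ranked names. All data here are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n_stocks, n_features = 500, 90          # e.g. ~90 firm characteristics
X = rng.normal(size=(n_stocks, n_features))
# Synthetic "future returns": a weak linear signal plus noise.
y = 0.05 * X[:, 0] - 0.03 * X[:, 1] + rng.normal(scale=1.0, size=n_stocks)

# Train on the first 400 stocks, score the remaining 100.
model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
model.fit(X[:400], y[:400])
scores = model.predict(X[400:])

# Equal-weight long portfolio of the 10 highest-scoring stocks.
top = np.argsort(scores)[-10:]
print("Selected (synthetic) stock indices:", top)
```

In practice the characteristics would come from a real cross-sectional dataset and the exercise would be repeated period by period in a back-test, which is the kind of machinery the blurb describes.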
Quantitative Modeling of Derivative Securities demonstrates how to take the basic ideas of arbitrage theory and apply them - in a very concrete way - to the design and analysis of financial products. Based primarily (but not exclusively) on the analysis of derivatives, the book emphasizes relative-value and hedging ideas applied to different financial instruments. Using a "financial engineering approach," the theory is developed progressively, focusing on specific aspects of pricing and hedging and on problems that the technical analyst or trader has to consider in practice. More than just an introductory text, the reader who has mastered the contents of this one book will have bridged the gap separating the novice from the technical and research literature.
The book examines the development and the dynamics of the personal distribution of income in Germany, Great Britain, Sweden, the United States, and some other OECD countries. Starting with the distribution of labour income, the issue is then expanded to include all monetary incomes of private households and to adjust for household size by an equivalence scale. Some authors analyse one country in detail by decomposing aggregate inequality measures, while others focus on direct comparisons of some features of the income distribution in Germany with those in Great Britain or in the United States. The results suggest dominant influences of unemployment as well as of tax and transfer policies and different welfare regimes, but also show that our knowledge about distributional processes is still limited.
This book contains the most complete set of Chinese national income figures and their components based on the System of National Accounts. It points out some fundamental issues concerning the estimation of China's national income and is intended for students in the field of China studies around the world.
Collecting and analyzing data on unemployment, inflation, and inequality help describe the complex world around us. When published by the government, such data are called official statistics. They are reported by the media, used by politicians to lend weight to their arguments, and by economic commentators to opine about the state of society. Despite such widescale use, explanations about how these measures are constructed are seldom provided for a non-technical reader. Measuring Society is a short, accessible guide to six topics: jobs, house prices, inequality, prices for goods and services, poverty, and deprivation. Each relates to concepts we use on a personal level to form an understanding of the society in which we live: We need a job, a place to live, and food to eat. Using data from the United States, we answer three basic questions: why, how, and for whom these statistics have been constructed. We add some context and flavor by discussing the historical background. This book provides the reader with a good grasp of these measures. Chaitra H. Nagaraja is an Associate Professor of Statistics at the Gabelli School of Business at Fordham University in New York. Her research interests include house price indices and inequality measurement. Prior to Fordham, Dr. Nagaraja was a researcher at the U.S. Census Bureau. While there, she worked on projects relating to the American Community Survey.
Originally published in 1970, with a second edition in 1989. Empirical Bayes methods use some of the apparatus of the pure Bayes approach, but an actual prior distribution is assumed to generate the data sequence. It can be estimated, thus producing empirical Bayes estimates or decision rules. In this second edition, details are provided of the derivation and the performance of empirical Bayes rules for a variety of special models. Attention is given to the problem of assessing the goodness of an empirical Bayes estimator for a given set of prior data. Chapters also focus on alternatives to the empirical Bayes approach and actual applications of empirical Bayes methods.
Originally published in 1960 and 1966. This is an elementary introduction to the sources of economic statistics and their uses in answering economic questions. No mathematical knowledge is assumed, and no mathematical symbols are used. The book shows - by asking and answering a number of typical questions of applied economics - what the most useful statistics are, where they are found, and how they are to be interpreted and presented. The reader is introduced to the major British, European and American official sources, to the social accounts, to index numbers and averaging, and to elementary aids to inspection such as moving averages and scatter diagrams.
The need for analytics skills is driving the burgeoning growth in the number of analytics and decision science programs in higher education, developed to supply capable employees in this area. The very size and continuing growth of this need mean that there is still space for new program development. Schools wishing to pursue business analytics programs should intentionally assess the maturity level of their programs and take steps to close the gap. Teaching Data Analytics: Pedagogy and Program Design is a reference for faculty and administrators seeking direction about adding or enhancing analytics offerings at their institutions. It provides guidance by examining best practices from the perspectives of faculty and practitioners. Emphasizing the connection of data analytics to organizational success, it reviews the position of analytics and decision science programs in higher education and the critical connection between this area of study and career opportunities. The book features: a variety of perspectives, ranging from the scholarly and theoretical to the applied and practitioner-focused; an in-depth look into a wide breadth of skills, from the closely technology-focused to robustly soft human-connection skills; and resources for existing faculty to acquire and maintain additional analytics-relevant skills that can enrich their current course offerings. Acknowledging the dichotomy between data analytics and data science, this book emphasizes data analytics rather than data science, although it does touch upon the data science realm. Starting with industry perspectives, the book covers the applied world of data analytics, covering necessary skills and applications, as well as developing compelling visualizations. It then dives into pedagogical and program design approaches in data analytics education and concludes with ideas for program design tactics. This reference is a launching point for discussions about how to connect industry's need for skilled data analysts to higher education's need to design a rigorous curriculum that promotes students' critical thinking, communication, and ethical skills. It also provides insight into adding new elements to existing data analytics courses and into taking the next step in adding data analytics offerings, whether that means incorporating additional analytics assignments into existing courses, offering one course designed for undergraduates, or building an integrated program designed for graduate students.
Key Topics in Clinical Research aims to provide a short, clear, highlighted reference to guide trainees and trainers through research and audit projects, from first idea, through to data collection and statistical analysis, to presentation and publication. This book is also designed to assist trainees in preparing for their specialty examinations by providing comprehensive, concise, easily accessible and easily understandable information on all aspects of clinical research and audit.
This book presents a critical review of the empirical literature that studies the efficiency of the forward and futures markets for foreign exchange. It provides a useful foundation for research in developing quantitative measures of risk and expected return in international finance.
Business Statistics of the United States is a comprehensive and practical collection of data from as early as 1913 that reflects the nation's economic performance. It provides several years of annual, quarterly, and monthly data in industrial and demographic detail, including key indicators such as gross domestic product, personal income, spending, saving, employment, unemployment, the capital stock, and more. Business Statistics of the United States is the best place to find historical perspectives on the U.S. economy. Of equal importance to the data are the introductory highlights, extensive notes, and figures for each chapter that help users to understand the data, use them appropriately, and, if desired, seek additional information from the source agencies. Business Statistics of the United States provides a rich and deep picture of the American economy and contains approximately 3,500 time series in all. The data are predominantly from federal government sources, including the Board of Governors of the Federal Reserve System, the Bureau of Economic Analysis, the Bureau of Labor Statistics, the Census Bureau, the Employment and Training Administration, the Energy Information Administration, the Federal Housing Finance Agency, and the U.S. Department of the Treasury.
First published in 1995. In the current, increasingly global economy, investors require quick access to a wide range of financial and investment-related statistics to assist them in better understanding the macroeconomic environment in which their investments will operate. The International Financial Statistics Locator eliminates the need to search through a number of sources to identify those that contain much of this statistical information. It is intended for use by librarians, students, individual investors, and the business community and provides access to twenty-two resources, print and electronic, that contain the current and historical financial and economic statistics investors need to appreciate and profit from evolving and established international markets.
Originally published in 1929. This balanced combination of fieldwork, statistical measurement, and realistic applications shows a synthesis of economics and political science in a conception of an organic relationship between the two sciences that involves functional analysis, institutional interpretation, and a more workmanlike approach to questions of organization such as the division of labour and the control of industry. The treatise applies the test of fact through statistical analysis to economic and political theories, taking a quantitative and institutional approach to solving social and industrial problems. It constructs a framework of concepts, combining both economic and political theory, to systematically produce an original statement in general terms of the principles and methods for statistical fieldwork. The separation into Parts allows selective reading for the methods of statistical measurement; the principles and fallacies of applying these measures to economic and political fields; and the resultant construction of a statistical economics and politics. Basic statistical concepts are described for application, with each method of statistical measurement illustrated with instances relevant to the economic and political theory discussed, and a statistical glossary is included.
Originally published in 1984. This book brings together a reasonably complete set of results regarding the use of Constraint Item estimation procedures under the assumption of accurate specification. The analysis covers the case of all explanatory variables being non-stochastic as well as the case of identified simultaneous equations, with error terms known and unknown. Particular emphasis is given to the derivation of criteria for choosing the Constraint Item. Part 1 looks at the best CI estimators and Part 2 examines equation by equation estimation, considering forecasting accuracy.
Have configurations of labour-management practices become embedded in the British economy? Did the dramatic decline in trade union representation in the 1980s continue throughout the 1990s, leaving more employees without a voice? Were the vestiges of union organization at the workplace a hollow shell? These and other contemporary issues of employee relations are addressed in this report. The book reports the results from the series of workplace surveys conducted by the Department of Trade and Industry, the Economic and Social Research Council, the Advisory, Conciliation and Arbitration Service, and the Policy Studies Institute. Its focus is on change, captured by gathering together the enormous bank of data from all four of the large-scale and highly respected surveys and plotting trends from 1980 to 1999. In addition, a special panel of workplaces, surveyed in both 1990 and 1998, reveals the complex processes of change. Comprehensive in scope, the results are statistically reliable and reveal the nature and extent of change in all bar the smallest British workplaces.
This book provides a broad, mature, and systematic introduction to current financial econometric models and their applications to modeling and prediction of financial time series data. It utilizes real-world examples and real financial data throughout to apply the models and methods described. The author begins with basic characteristics of financial time series data before covering three main topics: analysis and application of univariate financial time series; the return series of multiple assets; and Bayesian inference in finance methods. Key features of the new edition include additional coverage of modern-day topics such as arbitrage, pair trading, realized volatility, and credit risk modeling; a smooth transition from S-Plus to R; and expanded empirical financial data sets. The overall objective of the book is to provide some knowledge of financial time series, introduce some statistical tools useful for analyzing these series, and help readers gain experience in financial applications of various econometric methods.
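As a rough, self-contained illustration of the first of those topics (not code from the book: the price series below is synthetic, and the two statistics shown are simply common first diagnostics for a univariate return series), the basic steps might look like this in Python:

```python
# Illustrative sketch only: compute log returns from a (synthetic) price
# series, then the lag-1 sample autocorrelation and a rolling volatility.
import numpy as np

rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, size=1000)))  # random walk
returns = np.diff(np.log(prices))                                 # log returns

# Lag-1 sample autocorrelation of the returns.
r = returns - returns.mean()
acf1 = np.dot(r[:-1], r[1:]) / np.dot(r, r)

# 21-day rolling volatility, annualized with 252 trading days.
window = 21
roll_vol = np.array([returns[i - window:i].std(ddof=1)
                     for i in range(window, len(returns))]) * np.sqrt(252)

print(f"lag-1 autocorrelation: {acf1:.3f}")
print(f"latest annualized rolling volatility: {roll_vol[-1]:.3f}")
```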
Human behavior often violates the predictions of rational choice theory. This realization has caused many social psychologists and experimental economists to attempt to develop an experimentally-based variant of game theory as an alternative descriptive model. The impetus for this book is the interest in the development of such a theory that combines elements from both disciplines and appeals to both.
This work examines theoretical issues as well as practical developments in statistical inference related to econometric models and analysis. It offers discussions of such areas as the function of statistics in aggregation, income inequality, poverty, health, spatial econometrics, panel and survey data, bootstrapping, and time series.
This is a classic reprint of the original 1971 edition of An Introduction to Bayesian Inference in Econometrics. This historical volume is an early introduction to Bayesian inference and methodology that still has lasting value for today's statisticians and students. The coverage ranges from the fundamental concepts and operations of Bayesian inference to applications in specific econometric problems and the testing of hypotheses and models.
How we pay is so fundamental that it underpins everything – from trade to taxation, stocks and savings to salaries, pensions and pocket money. Rich or poor, criminal, communist or capitalist, we all rely on the same payments system, day in, day out. It sits between us and not just economic meltdown, but a total breakdown in law and order. Why then do we know so little about how that system really works? Leibbrandt and de Terán shine a light on the hidden workings of the humble payment – and reveal both how our payment habits are determined by history and where we might go next. From national customs to warring nation states, geopolitics will shape the future of payments every bit as much as technology. Challenging our understanding about where financial power really lies, The Pay Off shows us that the most important thing about money is the way we move it.
Originally published in 1987. This collection of original papers deals with various issues of specification in the context of the linear statistical model. The volume honours the early econometric work of Donald Cochrane, late Dean of Economics and Politics at Monash University in Australia. The chapters focus on problems associated with autocorrelation of the error term in the linear regression model and include appraisals of early work on this topic by Cochrane and Orcutt. The book includes an extensive survey of autocorrelation tests; some exact finite-sample tests; and some issues in preliminary test estimation. A wide range of other specification issues is discussed, including the implications of random regressors for Bayesian prediction; modelling with joint conditional probability functions; and results from duality theory. There is a major survey chapter dealing with specification tests for non-nested models, and some of the applications discussed by the contributors deal with the British National Accounts and with Australian financial and housing markets.
Business students need the ability to think statistically about how to deal with uncertainty and its effect on decision-making in business and management. Traditional statistics courses and textbooks tend to focus on probability, mathematical detail, and heavy computation, and thus fail to meet the needs of future managers. Statistical Thinking in Business, Second Edition responds to the growing recognition that we must change the way business statistics is taught. It shows how statistics is important in all aspects of business and equips students with the skills they need to make sensible use of data and other information. The authors take an interactive, scenario-based approach and use almost no mathematical formulas, opting to use Excel for the technical work. This allows them to focus on using statistics to aid decision-making rather than on how to perform routine calculations. New in the second edition: a completely revised chapter on forecasting; a rearrangement of the material on data presentation, with the inclusion of histograms and cumulative line plots; a more thorough discussion of the analysis of attribute data; coverage of variable selection and model building in multiple regression; end-of-chapter summaries; more end-of-chapter problems; and a variety of case studies throughout the book. The second edition also comes with a wealth of ancillary materials provided on downloadable resources packaged with the book. These include automatically marked multiple-choice questions, answers to questions in the text, data sets, Excel experiments and demonstrations, an introduction to Excel, and the StiBstat Add-In for stem-and-leaf plots, box plots, distribution plots, control charts, and summary statistics.