This book is based on two Sir Richard Stone lectures at the Bank of England and the National Institute for Economic and Social Research. Largely non-technical, the first part of the book covers some of the broader issues involved in Stone's and others' work in statistics. It explores the more philosophical issues attached to statistics, econometrics and forecasting and describes the paradigm shift back to the Bayesian approach to scientific inference. The first part concludes with simple examples from the different worlds of educational management and golf clubs. The second, more technical part covers in detail the structural econometric time series analysis (SEMTSA) approach to statistical and econometric modeling.
Introduction to Financial Mathematics: Option Valuation, Second Edition is a well-rounded primer to the mathematics and models used in the valuation of financial derivatives. The book consists of fifteen chapters, the first ten of which develop option valuation techniques in discrete time, the last five describing the theory in continuous time. The first half of the textbook develops basic finance and probability. The author then treats the binomial model as the primary example of discrete-time option valuation. The final part of the textbook examines the Black-Scholes model. The book is written to provide a straightforward account of the principles of option pricing and examines these principles in detail using standard discrete and stochastic calculus models. Additionally, the second edition has new exercises and examples, and includes many tables and graphs generated by over 30 MS Excel VBA modules available on the author's webpage https://home.gwu.edu/~hdj/.
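The binomial model the blurb mentions can be sketched briefly. This is not the author's code, just a minimal illustration of Cox-Ross-Rubinstein-style discrete-time valuation of a European call: build up and down factors from volatility, take the discounted expected payoff under the risk-neutral probability. Function and parameter names are my own.

```python
import math

def crr_call_price(S0, K, r, sigma, T, n):
    """European call price from an n-step Cox-Ross-Rubinstein binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))      # up factor per step
    d = 1.0 / u                              # down factor per step
    q = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    # Discounted expected payoff over the binomial distribution of up-moves
    price = 0.0
    for k in range(n + 1):
        prob = math.comb(n, k) * q**k * (1 - q)**(n - k)
        payoff = max(S0 * u**k * d**(n - k) - K, 0.0)
        price += prob * payoff
    return math.exp(-r * T) * price
```

As the number of steps grows, this discrete-time price approaches the continuous-time Black-Scholes value, which is the bridge between the two halves of the textbook described above.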
Informed decisions rest on three important elements: intuition, trust, and analytics. Intuition is based on experiential learning, and recent research has shown that those who rely on their "gut feelings" may do better than those who don't. Analytics, in turn, are important in a data-driven environment for informing decisions. The third element, trust, is critical for knowledge sharing to take place. Together, these three elements (intuition, analytics, and trust) make a powerful combination for decision making. This book gathers leading researchers who explore the role of each element in the process of decision making.
Offering a unique balance between applications and calculations, Monte Carlo Methods and Models in Finance and Insurance incorporates the application background of finance and insurance with the theory and applications of Monte Carlo methods. It presents recent methods and algorithms, including the multilevel Monte Carlo method, the statistical Romberg method, and the Heath-Platen estimator, as well as recent financial and actuarial models, such as the Cheyette and dynamic mortality models. The authors separately discuss Monte Carlo techniques, stochastic process basics, and the theoretical background and intuition behind financial and actuarial mathematics, before bringing the topics together to apply the Monte Carlo methods to areas of finance and insurance. This allows for the easy identification of standard Monte Carlo tools and for a detailed focus on the main principles of financial and insurance mathematics. The book describes high-level Monte Carlo methods for standard simulation and the simulation of stochastic processes with continuous and discontinuous paths. It also covers a wide selection of popular models in finance and insurance, from Black-Scholes to stochastic volatility to interest rate to dynamic mortality. Through its many numerical and graphical illustrations and simple, insightful examples, this book provides a deep understanding of the scope of Monte Carlo methods and their use in various financial situations. The intuitive presentation encourages readers to implement and further develop the simulation methods.
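The basic Monte Carlo idea behind the book above can be shown in a few lines. This is an illustrative sketch only (names and parameters are my own, not from the book): simulate terminal stock prices under geometric Brownian motion with the risk-neutral drift, average the discounted call payoffs.

```python
import math
import random

def mc_call_price(S0, K, r, sigma, T, n_paths, seed=0):
    """Plain Monte Carlo estimate of a European call under geometric Brownian motion."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        # Terminal price under the risk-neutral measure
        ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n_paths
```

Variance-reduction and multilevel schemes such as those the book covers refine exactly this estimator, trading extra structure for a much smaller error at the same computational cost.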
Originally published in 1970, with a second edition in 1989. Empirical Bayes methods use some of the apparatus of the pure Bayes approach, but an actual prior distribution is assumed to generate the data sequence; that prior can be estimated, producing empirical Bayes estimates or decision rules. This second edition provides details of the derivation and the performance of empirical Bayes rules for a variety of special models. Attention is given to the problem of assessing the goodness of an empirical Bayes estimator for a given set of prior data. Chapters also focus on alternatives to the empirical Bayes approach and on actual applications of empirical Bayes methods.
Originally published in 1985. Mathematical methods and models for understanding the processes of economic dynamics and prediction had been refined considerably over the period before this book was written. The field had grown, and many of the techniques involved had become extremely complicated. Areas of particular interest include optimal control, non-linear models, game-theoretic approaches, demand analysis and time-series forecasting. This book presents a critical appraisal of these developments and identifies potentially productive new directions for research. It synthesises work from mathematics, statistics and economics and includes a thorough analysis of the relationship between system understanding and predictability.
Originally published in 1960 and 1966. This is an elementary introduction to the sources of economic statistics and their uses in answering economic questions. No mathematical knowledge is assumed, and no mathematical symbols are used. The book shows - by asking and answering a number of typical questions of applied economics - what the most useful statistics are, where they are found, and how they are to be interpreted and presented. The reader is introduced to the major British, European and American official sources, to the social accounts, to index numbers and averaging, and to elementary aids to inspection such as moving averages and scatter diagrams.
Originally published in 1929. This balanced combination of fieldwork, statistical measurement, and realistic applications shows a synthesis of economics and political science in a conception of an organic relationship between the two sciences that involves functional analysis, institutional interpretation, and a more workmanlike approach to questions of organization such as division of labour and the control of industry. The treatise applies the test of fact through statistical analysis to economic and political theories for the quantitative and institutional approach in solving social and industrial problems. It constructs a framework of concepts, combining both economic and political theory, to systematically produce an original statement in general terms of the principles and methods for statistical fieldwork. The separation into Parts allows selective reading on the methods of statistical measurement; on the principles and fallacies of applying these measures to economic and political fields; and on the resultant construction of a statistical economics and politics. Basic statistical concepts are described for application, each method of statistical measurement is illustrated with instances relevant to the economic and political theory discussed, and a statistical glossary is included.
Originally published in 1978. This book is designed to enable students on main courses in economics to comprehend literature which employs econometric techniques as a method of analysis, to use econometric techniques themselves to test hypotheses about economic relationships and to understand some of the difficulties involved in interpreting results. While the book is mainly aimed at second-year undergraduates undertaking courses in applied economics, its scope is sufficiently wide to take in students at postgraduate level who have no background in econometrics - it integrates fully the mathematical and statistical techniques used in econometrics with micro- and macroeconomic case studies.
A theft amounting to GBP1 was a capital offence in 1260, and a judge in 1610 affirmed the law could no longer be applied since GBP1 was no longer what it had been. Such association of money with a date is well recognized for its importance in very many connections. Thus arises the need to know how to convert an amount at one date into the right amount at another date: in other words, a price index. The longstanding question of how such an index should be constructed is known as 'The Index Number Problem'. The ordinary consumer price index represents a practical response to this need. The search for a true price index has given rise to extensive thought and theory, to which an impressive number of economists have each contributed a word, or a volume. There have, however, been hold-ups at a basic level, which are addressed in this book. The approach brings the subject into involvement with utility construction on the basis of finite data, in a form referred to as 'Afriat's Theorem' but now with utility subject to constant (and also possibly approximate) returns.
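The conversion the blurb describes is a single ratio once an index is in hand (constructing the index is the hard part the book addresses). A minimal sketch, with hypothetical index values:

```python
def convert(amount, index_from, index_to):
    """Re-express an amount at one date in the prices of another date,
    given the price index level at each date."""
    return amount * index_to / index_from

# Hypothetical example: index 100 at the base date, 250 today,
# so 40 units of base-date money corresponds to 100 units today.
print(convert(40, 100, 250))  # 100.0
```

The Index Number Problem is precisely that different ways of constructing `index_from` and `index_to` from price and quantity data give different answers to this conversion.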
The rapidly increasing importance of China, India, Indonesia, Japan, South Korea and Taiwan both in Asia and in the world economy, represents a trend that is set to continue into the 21st century. This book provides an authoritative assessment of the 20th century performance of these countries, and in particular the factors contributing to the acceleration of Asian growth in the latter part of the century. The contributors look at Asia within a global perspective and detailed comparisons are drawn with Australia and the USA. Contributions from leading experts offer a comprehensive review of the procedures necessary to establish valid international comparisons for countries with very different economic histories and levels of development. These include methods of growth performance measurement and techniques of growth accounting. The Asian Economies in the Twentieth Century will be an indispensable new tool for policy analysts, international agencies and academic researchers.
This important three-volume set is a collection of Edgeworth's published writings in the areas of statistics and probability. There is a newly emerging interest in probability theory as a basis for economic thought, and this collection makes the writings of Edgeworth more accessible. A new introduction written by the editor covers the biographical details and gives a brief abstract of each of the articles and the basis of their selection.
- Up-to-date with cutting-edge topics
- Suitable for professional quants and as a library reference for students of finance and financial mathematics
Factor Analysis and Dimension Reduction in R provides coverage, with worked examples, of a large number of dimension reduction procedures along with model performance metrics to compare them. Factor analysis in the form of principal components analysis (PCA) or principal factor analysis (PFA) is familiar to most social scientists. However, what is less familiar is understanding that factor analysis is a subset of the more general statistical family of dimension reduction methods. The social scientist's toolkit for factor analysis problems can be expanded to include the range of solutions this book presents. In addition to covering FA and PCA with orthogonal and oblique rotation, this book's coverage includes higher-order factor models, bifactor models, models based on binary and ordinal data, models based on mixed data, generalized low-rank models, cluster analysis with GLRM, models involving supplemental variables or observations, Bayesian factor analysis, regularized factor analysis, testing for unidimensionality, and prediction with factor scores. The second half of the book deals with other procedures for dimension reduction. These include coverage of kernel PCA, factor analysis with multidimensional scaling, locally linear embedding models, Laplacian eigenmaps, diffusion maps, force directed methods, t-distributed stochastic neighbor embedding, independent component analysis (ICA), dimensionality reduction via regression (DRR), non-negative matrix factorization (NNMF), Isomap, Autoencoder, uniform manifold approximation and projection (UMAP) models, neural network models, and longitudinal factor analysis models. In addition, a special chapter covers metrics for comparing model performance. 
Features of this book include:
- Numerous worked examples with replicable R code
- Explicit, comprehensive coverage of data assumptions
- Adaptation of factor methods to binary, ordinal, and categorical data
- Residual and outlier analysis
- Visualization of factor results
- Final chapters that treat integration of factor analysis with neural network and time series methods
Presented in color with R code and an introduction to R and RStudio, this book will be suitable for graduate-level and optional module courses for social scientists, and for courses on quantitative methods and multivariate statistics.
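The core idea behind PCA as dimension reduction, which the book above treats in R, can be sketched in a few lines. This illustration (names are my own; the book's own examples use R) finds the first principal component of 2-D data directly from the eigendecomposition of the 2x2 sample covariance matrix:

```python
import math

def pca_first_component(points):
    """First principal component of 2-D points: the unit direction of maximal
    variance, from the largest eigenvalue of the sample covariance matrix."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in points) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in points) / (n - 1)
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # A corresponding eigenvector (axis-aligned case handled separately)
    if abs(sxy) < 1e-12:
        v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    else:
        v = (lam - syy, sxy)
    norm = math.hypot(v[0], v[1])
    return (v[0] / norm, v[1] / norm), lam
```

Projecting the data onto this direction is dimension reduction in its simplest form; the kernel, manifold, and neural methods in the book's second half generalize exactly this step to nonlinear structure.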
A Handbook of Statistical Analyses Using SPSS clearly describes how to conduct a range of univariate and multivariate statistical analyses using the latest version of the Statistical Package for the Social Sciences, SPSS 11. Each chapter addresses a different type of analytical procedure applied to one or more data sets, primarily from the social and behavioral sciences areas. Each chapter also contains exercises relating to the data sets introduced, providing readers with a means to develop both their SPSS and statistical skills. Model answers to the exercises are also provided. Readers can download all of the data sets from a companion Web site furnished by the authors.
This book teaches the principles of sampling with examples from social sciences, public opinion research, public health, business, agriculture, and ecology. It has been thoroughly revised to incorporate recent research and applications, includes a new chapter on nonprobability samples, and adds more than 200 new examples and exercises.
The methodological needs of environmental studies are unique in the breadth of research questions that can be posed, calling for a textbook that covers a broad swath of approaches to conducting research with potentially many different kinds of evidence. Fully updated to address new developments such as the effects of the internet, recent trends in the use of computers, remote sensing, and large data sets, this new edition of Research Methods for Environmental Studies is written specifically for social science-based research into the environment. This revised edition contains new chapters on coding, focus groups, and an extended treatment of hypothesis testing. The textbook covers the best-practice research methods most used to study the environment and its connections to societal and economic activities and objectives. Over five key parts, Kanazawa introduces quantitative and qualitative approaches, mixed methods, and the special requirements of interdisciplinary research, emphasizing that methodological practice should be tailored to the specific needs of the project. Within these parts, detailed coverage is provided on key topics including the identification of a research project, hypothesis testing, spatial analysis, the case study method, ethnographic approaches, discourse analysis, mixed methods, survey and interview techniques, focus groups, and ethical issues in environmental research. Drawing on a variety of extended and updated examples to encourage problem-based learning and fully addressing the challenges associated with interdisciplinary investigation, this book will be an essential resource for students embarking on courses exploring research methods in environmental studies.
Shedding new light on the recent worldwide banking debacle, The Banking Crisis Handbook explores the origin of the crisis, how future crises might be precluded, and what should have been done prior to, during, and after the collapse. With contributions from well-known academics and professionals, the book contains exclusive, new research that will assist bank executives, risk management departments, and other financial professionals in attaining a clear picture of the banking crisis and preventing future banking collapses. The first part of the book explains how the crisis originated. It discusses the role of subprime mortgages, shadow banks, ineffective risk management, poor financial regulations, and hedge funds in causing the collapse of financial systems. The second section examines how the crisis affected the global market as well as individual countries and regions, such as Asia and Greece. In the final part, the book explores short- and long-term solutions, including government intervention, financial regulations, efficient bank default risk approaches, and methods to evaluate credit risk. It also looks at when government intervention in financial markets can be ethically justified.
Economic history is the most quantitative branch of history, reflecting the interests and profiting from the techniques and concepts of economics. This essay, first published in 1977, provides an extensive contribution to quantitative historiography by delivering a critical guide to the sources of the numerical data of the period 1700 to 1850. This title will be of interest to students of history, finance and economics.
Thijs ten Raa, author of the acclaimed text The Economics of Input-Output Analysis, now takes the reader to the forefront of the field. This volume collects and unifies his and his co-authors' research papers on national accounting, input-output coefficients, economic theory, dynamic models, stochastic analysis, and performance analysis. The research is driven by the task to analyze national economies. The final part of the book scrutinizes the emerging Asian economies in the light of international competition.
The purpose of this book is to introduce novice researchers to the tools of meta-analysis and meta-regression analysis and to summarize the state of the art for existing practitioners. Meta-regression analysis addresses the rising "Tower of Babel" that current economics and business research has become. Meta-analysis is the statistical analysis of previously published, or reported, research findings on a given hypothesis, empirical effect, phenomenon, or policy intervention. It is a systematic review of all the relevant scientific knowledge on a specific subject and is an essential part of the evidence-based practice movement in medicine, education and the social sciences. However, research in economics and business is often fundamentally different from what is found in the sciences and thereby requires a different method for its synthesis: meta-regression analysis. This book develops, summarizes, and applies these meta-analytic methods.
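The simplest meta-analytic pooling step the blurb alludes to can be sketched directly. This is an illustrative fixed-effect (inverse-variance) estimator, not the book's own code; the function and variable names are my own:

```python
import math

def fixed_effect_meta(effects, variances):
    """Fixed-effect meta-analysis: pool study effect estimates by
    inverse-variance weighting; return the pooled effect and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se
```

Meta-regression analysis, the book's focus, extends this by regressing the study effects on study characteristics (sample size, data period, model specification) to explain why reported estimates differ, rather than only averaging them.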
Introduction to Statistics with SPSS offers an introduction to statistics that can be used before, during or after a course on statistics. Covering a wide range of terms and techniques, including simple and multiple regressions, this book guides the student to enter data from a simple research project into a computer, provide an adequate analysis of the data and present a report on the findings.
This short book introduces the main ideas of statistical inference in a way that is both user friendly and mathematically sound. Particular emphasis is placed on the common foundation of many models used in practice. In addition, the book focuses on the formulation of appropriate statistical models to study problems in business, economics, and the social sciences, as well as on how to interpret the results from statistical analyses. The book will be useful to students who are interested in rigorous applications of statistics to problems in business, economics and the social sciences, as well as students who have studied statistics in the past, but need a more solid grounding in statistical techniques to further their careers. Jacco Thijssen is professor of finance at the University of York, UK. He holds a PhD in mathematical economics from Tilburg University, Netherlands. His main research interests are in applications of optimal stopping theory, stochastic calculus, and game theory to problems in economics and finance. Professor Thijssen has earned several awards for his statistics teaching.
Professor Cheng-Few Lee ranks #1 by publications in the 26 core finance journals, and #163 by publications in the 7 leading finance journals (source: "Most Prolific Authors in the Finance Literature: 1959-2008" by Jean L. Heck and Philip L. Cooley, Saint Joseph's University and Trinity University). This is an extensively revised edition of a popular statistics textbook for business and economics students. The first edition has been adopted by universities and colleges worldwide, including New York University, Carnegie Mellon University and UCLA. Designed for upper-level undergraduates, MBA and other graduate students, this book closely integrates various statistical techniques with concepts from business, economics and finance and clearly demonstrates the power of statistical methods in the real world of business. While maintaining the essence of the first edition, the new edition places more emphasis on finance, economics and accounting concepts with updated sample data. Students will find this book very accessible with its straightforward language, ample cases, examples, illustrations and real-life applications. The book is also useful for financial analysts and portfolio managers.