'A statistical national treasure' Jeremy Vine, BBC Radio 2
'Required reading for all politicians, journalists, medics and anyone who tries to influence people (or is influenced) by statistics. A tour de force' Popular Science
Do busier hospitals have higher survival rates? How many trees are there on the planet? Why do old men have big ears? David Spiegelhalter reveals the answers to these and many other questions - questions that can only be addressed using statistical science. Statistics has played a leading role in our scientific understanding of the world for centuries, yet we are all familiar with the way statistical claims can be sensationalised, particularly in the media. In the age of big data, as data science becomes established as a discipline, a basic grasp of statistical literacy is more important than ever. In The Art of Statistics, David Spiegelhalter guides the reader through the essential principles we need in order to derive knowledge from data. Drawing on real-world problems to introduce conceptual issues, he shows us how statistics can help us determine the luckiest passenger on the Titanic, whether serial killer Harold Shipman could have been caught earlier, and if screening for ovarian cancer is beneficial.
'Shines a light on how we can use the ever-growing deluge of data to improve our understanding of the world' Nature
This study analyses the newly available statistical evidence on
income distribution in the former Soviet Union both by social group
and by republic, and considers the significance of inequalities as
a factor contributing to the demise of the Communist regime. Among
the topics covered are wage distribution (interbranch and skill
differentials and distribution in terms of gender, education, and
age), income distribution for the former USSR as a whole, and wage
and income distribution patterns for each republic, with analysis
of regional differences.
Originally published in 1984. This book brings together a
reasonably complete set of results regarding the use of Constraint
Item estimation procedures under the assumption of accurate
specification. The analysis covers the case of all explanatory
variables being non-stochastic as well as the case of identified
simultaneous equations, with error terms known and unknown.
Particular emphasis is given to the derivation of criteria for
choosing the Constraint Item. Part 1 looks at the best CI
estimators and Part 2 examines equation by equation estimation,
considering forecasting accuracy.
Originally published in 1987. This collection of original papers
deals with various issues of specification in the context of the
linear statistical model. The volume honours the early econometric
work of Donald Cochrane, late Dean of Economics and Politics at
Monash University in Australia. The chapters focus on problems
associated with autocorrelation of the error term in the linear
regression model and include appraisals of early work on this topic
by Cochrane and Orcutt. The book includes an extensive survey of
autocorrelation tests; some exact finite-sample tests; and some
issues in preliminary test estimation. A wide range of other
specification issues is discussed, including the implications of
random regressors for Bayesian prediction; modelling with joint
conditional probability functions; and results from duality theory.
There is a major survey chapter dealing with specification tests
for non-nested models, and some of the applications discussed by
the contributors deal with the British National Accounts and with
Australian financial and housing markets.
Since most datasets contain a number of variables, multivariate
methods are helpful in answering a variety of research questions.
Accessible to students and researchers without a substantial
background in statistics or mathematics, Essentials of Multivariate
Data Analysis explains the usefulness of multivariate methods in
applied research. Unlike most books on multivariate methods, this
one makes straightforward analyses easy to perform for those who
are unfamiliar with advanced mathematical formulae. An easily
understood dataset is used throughout to illustrate the techniques.
The accompanying add-in for Microsoft Excel can be used to carry
out the analyses in the text. The dataset and Excel add-in are
available for download on the book's CRC Press web page. Providing
a firm foundation in the most commonly used multivariate
techniques, this text helps readers choose the appropriate method,
learn how to apply it, and understand how to interpret the results.
It prepares them for more complex analyses using software such as
Minitab, R, SAS, SPSS, and Stata.
This short book introduces the main ideas of statistical inference
in a way that is both user-friendly and mathematically sound.
Particular emphasis is placed on the common foundation of many
models used in practice. In addition, the book focuses on the
formulation of appropriate statistical models to study problems in
business, economics, and the social sciences, as well as on how to
interpret the results from statistical analyses. The book will be
useful to students who are interested in rigorous applications of
statistics to problems in business, economics and the social
sciences, as well as students who have studied statistics in the
past, but need a more solid grounding in statistical techniques to
further their careers. Jacco Thijssen is professor of finance at
the University of York, UK. He holds a PhD in mathematical
economics from Tilburg University, Netherlands. His main research
interests are in applications of optimal stopping theory,
stochastic calculus, and game theory to problems in economics and
finance. Professor Thijssen has earned several awards for his
statistics teaching.
Informed decision-making rests on three important elements: intuition, trust, and analytics. Intuition is based on experiential learning, and recent research has shown that those who rely on their "gut feelings" may do better than those who don't. Analytics, however, are also important for informing decisions in a data-driven environment. The third element, trust, is critical for knowledge sharing to take place. Together, intuition, analytics, and trust make a powerful combination for decision-making. This book gathers leading researchers who explore the role of these three elements in the process of decision-making.
Written in a highly accessible style, A Factor Model Approach to Derivative Pricing lays a clear and structured foundation for the pricing of derivative securities based upon simple, factor-model-related absence-of-arbitrage ideas. This unique and unifying approach provides for a broad treatment of topics and models, including equity, interest-rate, and credit derivatives, as well as hedging and tree-based computational methods, but without reliance on the heavy prerequisites that often accompany such topics. Key features:
- A single fundamental absence-of-arbitrage relationship based on factor models is used to motivate all the results in the book.
- A structured three-step procedure is used to guide the derivation of absence-of-arbitrage equations and illuminate core underlying concepts.
- Brownian motion and Poisson process driven models are treated together, allowing for a broad and cohesive presentation of topics.
- The final chapter provides a new approach to risk-neutral pricing that introduces the topic as a seamless and natural extension of the factor model approach.
Whether used as a text for an intermediate-level course in derivatives, or by researchers and practitioners seeking a better understanding of the fundamental ideas that underlie derivative pricing, readers will appreciate the book's ability to unify many disparate topics and models under a single conceptual theme. James A. Primbs is an Associate Professor of Finance at the Mihaylo College of Business and Economics at California State University, Fullerton.
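The absence-of-arbitrage relationship at the heart of this approach can be sketched in conventional factor-model notation (a standard formulation for the diffusion case, not a quotation from the book): if asset returns are driven by a set of common factors, expected excess returns must be spanned by the factor exposures.

```latex
% Factor-driven return dynamics (diffusion case)
\frac{dP_i}{P_i} = \mu_i \, dt + \sum_j \sigma_{ij} \, dz_j
% Absence of arbitrage: excess return = exposures times factor risk premia
\mu_i - r = \sum_j \sigma_{ij} \lambda_j
```

Here $r$ is the risk-free rate and $\lambda_j$ is the market price of risk for factor $j$; the book's three-step procedure derives pricing equations of this kind for each model class.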
Customer and Business Analytics: Applied Data Mining for Business
Decision Making Using R explains and demonstrates, via the
accompanying open-source software, how advanced analytical tools
can address various business problems. It also gives insight into
some of the challenges faced when deploying these tools.
Extensively classroom-tested, the text is ideal for students in
customer and business analytics or applied data mining as well as
professionals in small- to medium-sized organizations. The book
offers an intuitive understanding of how different analytics
algorithms work. Where necessary, the authors explain the
underlying mathematics in an accessible manner. Each technique
presented includes a detailed tutorial that enables hands-on
experience with real data. The authors also discuss issues often
encountered in applied data mining projects and present the
CRISP-DM process model as a practical framework for organizing
these projects. Showing how data mining can improve the performance
of organizations, this book and its R-based software provide the
skills and tools needed to successfully develop advanced analytics
capabilities.
'Refreshingly clear and engaging' Tim Harford
'Delightful . . . full of unique insights' Prof Sir David Spiegelhalter
There's no getting away from statistics. We encounter them every day. We are all users of statistics whether we like it or not. Do missed appointments really cost the NHS GBP1bn per year? What's the difference between the mean gender pay gap and the median gender pay gap? How can we work out if a claim that we use 42 billion single-use plastic straws per year in the UK is accurate? What did the Vote Leave campaign's GBP350m bus really mean? How can we tell if the headline 'Public pensions cost you GBP4,000 a year' is correct? Does snow really cost the UK economy GBP1bn per day? How do we distinguish statistical fact from fiction? What can we do to decide whether a number, claim or news story is accurate? Without an understanding of data, we cannot truly understand what is going on in the world around us. Written by Anthony Reuben, the BBC's first head of statistics, Statistical is an accessible and empowering guide to challenging the numbers all around us.
This book is about learning from data using the Generalized
Additive Models for Location, Scale and Shape (GAMLSS). GAMLSS
extends the Generalized Linear Models (GLMs) and Generalized
Additive Models (GAMs) to accommodate large complex datasets, which
are increasingly prevalent. In particular, the GAMLSS statistical
framework enables flexible regression and smoothing models to be
fitted to the data. The GAMLSS model assumes that the response
variable has any parametric (continuous, discrete or mixed)
distribution which might be heavy- or light-tailed, and positively
or negatively skewed. In addition, all the parameters of the
distribution (location, scale, shape) can be modelled as linear or
smooth functions of explanatory variables. Key Features:
- Provides a broad overview of flexible regression and smoothing techniques to learn from data, whilst also focusing on the practical application of the methodology using the GAMLSS software in R.
- Includes a comprehensive collection of real data examples, which reflect the range of problems addressed by GAMLSS models and provide a practical illustration of the process of using flexible GAMLSS models for statistical learning.
- R code integrated into the text for ease of understanding and replication.
- Supplemented by a website with code, data and extra materials.
This book aims to help readers understand how to learn from data encountered in many fields. It will be useful for practitioners and researchers who wish to understand and use the GAMLSS models to learn from data, and also for students who wish to learn GAMLSS through practical examples.
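The GAMLSS structure described above can be summarised compactly (a standard presentation of the model, not drawn from this particular text): the response follows a distribution with up to four parameters, each linked to its own additive predictor.

```latex
y \sim \mathcal{D}(\mu, \sigma, \nu, \tau)
% each distribution parameter gets its own (possibly smooth) predictor
g_k(\theta_k) = X_k \beta_k + \sum_{j} s_{jk}(x_{jk}),
\qquad \theta_k \in \{\mu, \sigma, \nu, \tau\}
```

where $g_k$ are link functions, $X_k \beta_k$ are the linear terms, and $s_{jk}$ are smooth functions of explanatory variables.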
A classic text, known for accuracy and statistical precision. Statistics
for Business and Economics enables readers to conduct serious
analysis of applied problems rather than running simple "canned"
applications. This text is also at a mathematically higher level
than most business statistics texts and provides readers with the
knowledge they need to become stronger analysts for future
managerial positions. The eighth edition of this book has been
revised and updated to provide readers with improved problem
contexts for learning how statistical methods can improve their
analysis and understanding of business and economics.
Despite the many books on research methodology, few offer a complete, hands-on, practical guide to lead college classes or individuals through the research process. More and more scientific papers, across all research fields, fail to meet basic criteria in terms of research methods as well as structure, writing style and presentation of results. This book aims to address this gap by providing an authoritative, easy-to-follow guide to research methods and how to apply them. Qualitative Methods in Economics focuses not only on research methods and techniques but also on methodology. The main objective of this book is to discuss qualitative methods and their use in economics and social science research. Chapters identify several of the research approaches commonly used in social studies, from the role of science through to the techniques of data collection. Using an example research paper to examine the methods used to present the research, the second half of the book breaks down how to present and format your results successfully. This book will be of use to students and researchers who want to improve their research methods, read up on cutting-edge advances in research methods, and study ways to improve the research process.
Mastering the basic concepts of mathematics is the key to
understanding other subjects such as Economics, Finance,
Statistics, and Accounting. Mathematics for Finance, Business and
Economics is written informally for easy comprehension. Unlike
traditional textbooks, it provides a combination of explanations,
exploration and real-life applications of major concepts.
Mathematics for Finance, Business and Economics discusses
elementary mathematical operations, linear and non-linear functions
and equations, differentiation and optimization, economic
functions, summation, percentages and interest, arithmetic and
geometric series, present and future values of annuities, matrices
and Markov chains. Aided by the discussion of real-world problems
and solutions, students across the business and economics
disciplines will find this textbook perfect for gaining an
understanding of a core plank of their studies.
Economic evaluation has become an essential component of clinical
trial design to show that new treatments and technologies offer
value to payers in various healthcare systems. Although many books
exist that address the theoretical or practical aspects of
cost-effectiveness analysis, this book differentiates itself from
the competition by detailing how to apply health economic
evaluation techniques in a clinical trial context, from both
academic and pharmaceutical/commercial perspectives. It also
includes a special chapter on clinical trials in cancer. Design & Analysis of Clinical Trials for Economic Evaluation & Reimbursement is not just about performing cost-effectiveness analyses. It also emphasizes the strategic importance of economic evaluation and offers guidance and advice on the complex factors at play before, during, and after an economic evaluation. Filled with detailed examples, the book bridges the gap between applications of economic evaluation in industry (mainly pharmaceutical) and what students may learn in university courses. It provides readers with access to SAS and Stata code. In addition, Windows-based software for sample size and value of information analysis is available free of charge, making it a valuable resource for students considering a career in this field or for those who simply wish to know more
about applying economic evaluation techniques. The book includes
coverage of trial design, case report form design, quality of life
measures, sample sizes, submissions to regulatory authorities for
reimbursement, Markov models, cohort models, and decision trees.
Examples and case studies are provided at the end of each chapter.
Presenting first-hand insights into how economic evaluations are
performed from a drug development perspective, the book supplies
readers with the foundation required to succeed in an environment
where clinical trials and cost-effectiveness of new treatments are
central. It also includes thought-provoking exercises for use in
classroom and seminar discussions.
Principles of Copula Theory explores the state of the art on
copulas and provides you with the foundation to use copulas in a
variety of applications. Throughout the book, historical remarks
and further readings highlight active research in the field,
including new results, streamlined presentations, and new proofs of
old results. After covering the essentials of copula theory, the
book addresses the issue of modeling dependence among components of
a random vector using copulas. It then presents copulas from the
point of view of measure theory, compares methods for the
approximation of copulas, and discusses the Markov product for
2-copulas. The authors also examine selected families of copulas
that possess appealing features from both theoretical and applied
viewpoints. The book concludes with in-depth discussions on two
generalizations of copulas: quasi- and semi-copulas. Although
copulas are not the solution to all stochastic problems, they are
an indispensable tool for understanding several problems about
stochastic dependence. This book gives you the solid and formal
mathematical background to apply copulas to a range of mathematical
areas, such as probability, real analysis, measure theory, and
algebraic structures.
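The modelling idea the blurb refers to rests on Sklar's theorem (a standard result, stated here for orientation rather than quoted from the book): any joint distribution can be decomposed into its marginals and a copula that captures the dependence.

```latex
H(x_1, \dots, x_d) = C\bigl(F_1(x_1), \dots, F_d(x_d)\bigr)
```

where $H$ is the joint distribution function, $F_1, \dots, F_d$ are the marginal distribution functions, and $C$ is the copula; $C$ is unique when the marginals are continuous.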
A unique and comprehensive source of information, this book is the
only international publication providing economists, planners,
policymakers and business people with worldwide statistics on
current performance and trends in the manufacturing sector. The
Yearbook is designed to facilitate international comparisons
relating to manufacturing activity and industrial development and
performance. It provides data which can be used to analyse patterns
of growth and related long term trends, structural change and
industrial performance in individual industries. Statistics on
employment patterns, wages, consumption and gross output and other
key indicators are also presented.
Maximising economic growth is a central objective of politicians, and
it heavily drives political policy and decision-making. Critics
of the maximisation of growth as the central aim of economic policy
have argued that growth in itself is not necessarily a good thing,
particularly for the environment; however, what would replace the
system and how it would be measured are questions that have been
rarely answered satisfactorily. First published in 1991, this book
was the first to lay out an entirely new set of practical proposals
for developing new economic measurement tools, with the aim of
being sustainable, 'green' and human-centred. Victor Anderson
proposes that a whole set of indicators, rather than a single one,
should play all the roles that GNP (Gross National Product) is
responsible for. With a detailed overview of the central debates
between the advocates and opponents of continued economic growth
and an analysis of the various proposals for modification, this
title will be of particular value to students interested in the
diversity of measurement tools and the notion that economies should
also be evaluated by their social and environmental consequences.
In How to Make the World Add Up, Tim Harford draws on his experience as both an economist and presenter of the BBC's radio show 'More or Less' to take us deep into the world of disinformation and obfuscation, bad research and misplaced motivation to find those priceless jewels of data and analysis that make communicating with numbers so rewarding. Through vivid storytelling he reveals how we can evaluate the claims that surround us with confidence, curiosity and a healthy level of scepticism. It is a must-read for anyone who cares about understanding the world around them.
Economic Time Series: Modeling and Seasonality is a focused
resource on analysis of economic time series as pertains to
modeling and seasonality, presenting cutting-edge research that
would otherwise be scattered throughout diverse peer-reviewed
journals. This compilation of 21 chapters showcases the
cross-fertilization between the fields of time series modeling and
seasonal adjustment, as is reflected both in the contents of the
chapters and in their authorship, with contributors coming from
academia and government statistical agencies. For easier perusal
and absorption, the contents have been grouped into seven topical sections:
- Section I deals with periodic modeling of time series, introducing, applying, and comparing various seasonally periodic models.
- Section II examines the estimation of time series components when the models for the series are misspecified in some sense, and the broader implications this has for seasonal adjustment and business cycle estimation.
- Section III examines the quantification of error in X-11 seasonal adjustments, with comparisons to error in model-based seasonal adjustments.
- Section IV discusses some practical problems that arise in seasonal adjustment: developing asymmetric trend-cycle filters, dealing with both temporal and contemporaneous benchmark constraints, detecting trading-day effects in monthly and quarterly time series, and using diagnostics in conjunction with model-based seasonal adjustment.
- Section V explores outlier detection and the modeling of time series containing extreme values, developing new procedures and extending previous work.
- Section VI examines some alternative models and inference procedures for the analysis of seasonal economic time series.
- Section VII deals with aspects of modeling, estimation, and forecasting for nonseasonal economic time series.
By presenting new methodological developments as well as pertinent empirical analyses and reviews of established methods, the book provides much that is stimulating and practically useful for the serious researcher and analyst of economic time series.
Delivering cutting-edge coverage that includes the latest thinking
and practices from the field, QUALITY AND PERFORMANCE EXCELLENCE,
8e presents the basic principles and tools associated with quality
and performance excellence. Packed with relevant, real-world
examples, the text thoroughly illustrates how these principles and
methods have been put into effect in a variety of organizations. It
also highlights the relationship between basic principles and the
popular theories and models studied in management courses. The
eighth edition reflects the 2015-16 Baldrige criteria and includes
new boxed features, experiential exercises, and up-to-date case
studies that give you practical experience working with real-world
issues. Many cases focus on large and small companies in
manufacturing and service industries in North and South America,
Europe, and Asia-Pacific. In addition, chapters now open with a
"Performance Excellence Profile" highlighting a recent Baldrige
recipient.
The chapters in this book describe various aspects of the
application of statistical methods in finance. It will interest and
attract statisticians to this area, illustrate some of the many
ways that statistical tools are used in financial applications, and
give some indication of problems which are still outstanding. Statisticians will be stimulated to learn more about the kinds of models and techniques outlined in the book; both the domain of finance and the science of statistics will benefit from increased awareness among statisticians of the problems, models, and techniques used in financial applications. For this reason, extensive references are given. The level of technical detail varies between the chapters: some present broad non-technical overviews of an area, while others describe the mathematical niceties. This illustrates both the range of possibilities the area offers statisticians and the different kinds of mathematical and statistical skills required.
Whether you favour data analysis or mathematical manipulation, if
you are a statistician there are problems in finance which are
appropriate to your skills.