During the past half-century, exponential families have attained a
position at the center of parametric statistical inference.
Theoretical advances have been matched, and more than matched, in
the world of applications, where logistic regression by itself has
become the go-to methodology in medical statistics, computer-based
prediction algorithms, and the social sciences. This book is based
on a one-semester graduate course for first-year Ph.D. and advanced
master's students. After presenting the basic structure of
univariate and multivariate exponential families, the book describes
in detail their application to generalized linear models, including
logistic and Poisson regression, emphasizing geometrical ideas,
computational practice, and the analogy with ordinary linear
regression. Connections are made with a variety of current
statistical methodologies: missing data, survival analysis and
proportional hazards, false discovery rates, bootstrapping, and
empirical Bayes analysis. The book connects exponential family
theory with its applications in a way that doesn't require advanced
mathematical preparation.
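
To make the "computational practice" concrete, here is a minimal sketch (my own illustration, not code from the book) of fitting a logistic regression by iteratively reweighted least squares, the standard fitting algorithm for generalized linear models. Each iteration is just a weighted ordinary least-squares fit, which is exactly the analogy with linear regression the blurb mentions; all names and the toy data are assumptions for illustration.

```python
import numpy as np

def logistic_irls(X, y, n_iter=25, tol=1e-8):
    """Fit P(y = 1 | x) = 1 / (1 + exp(-x @ beta)) by IRLS."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta                    # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))   # fitted probabilities
        w = mu * (1.0 - mu)               # GLM working weights
        z = eta + (y - mu) / w            # working response
        # One weighted least-squares step: solve (X'WX) beta = X'Wz
        beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
        delta = np.max(np.abs(beta_new - beta))
        beta = beta_new
        if delta < tol:
            break
    return beta

# Illustrative synthetic data with intercept 0.5 and slope 2.0
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
p = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * X[:, 1])))
y = (rng.uniform(size=500) < p).astype(float)
print(logistic_irls(X, y))  # approximately [0.5, 2.0]
```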
Statistics is a subject of many uses and surprisingly few effective practitioners. The traditional road to statistical knowledge is blocked, for most, by a formidable wall of mathematics. The approach in An Introduction to the Bootstrap avoids that wall. It arms scientists and engineers, as well as statisticians, with the computational techniques they need to analyze and understand complicated data sets.
The twenty-first century has seen a breathtaking expansion of
statistical methodology, both in scope and influence. 'Data
science' and 'machine learning' have become familiar terms in the
news, as statistical methods are brought to bear upon the enormous
data sets of modern science and commerce. How did we get here? And
where are we going? How does it all fit together? Now in paperback
and fortified with exercises, this book delivers a concentrated
course in modern statistical thinking. Beginning with classical
inferential theories - Bayesian, frequentist, Fisherian -
individual chapters take up a series of influential topics:
survival analysis, logistic regression, empirical Bayes, the
jackknife and bootstrap, random forests, neural networks, Markov
chain Monte Carlo, inference after model selection, and dozens
more. The distinctly modern approach integrates methodology and
algorithms with statistical inference. Each chapter ends with
class-tested exercises, and the book concludes with speculation on
the future direction of statistics and data science.
We live in a new age for statistical inference, where modern
scientific technology such as microarrays and fMRI machines
routinely produce thousands and sometimes millions of parallel data
sets, each with its own estimation or testing problem. Doing
thousands of problems at once is more than repeated application of
classical methods. Taking an empirical Bayes approach, Bradley
Efron, inventor of the bootstrap, shows how information accrues
across problems in a way that combines Bayesian and frequentist
ideas. Estimation, testing, and prediction blend in this framework,
producing opportunities for new methodologies of increased power.
New difficulties also arise, easily leading to flawed inferences.
This book takes a careful look at both the promise and pitfalls of
large-scale statistical inference, with particular attention to
false discovery rates, the most successful of the new statistical
techniques. Emphasis is on the inferential ideas underlying
technical developments, illustrated using a large number of real
examples.
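
To make the false-discovery-rate idea concrete, here is a minimal sketch (my own, not taken from the book) of the Benjamini-Hochberg step-up procedure applied to a batch of simulated test statistics; the level q and the null/signal mixture are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm  # only used to generate two-sided p-values

def benjamini_hochberg(pvals, q=0.10):
    """Boolean mask of rejections controlling the FDR at level q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)                      # indices of sorted p-values
    thresholds = q * np.arange(1, m + 1) / m   # step-up cutoffs i*q/m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()         # largest i with p_(i) <= i*q/m
        reject[order[:k + 1]] = True           # reject the k+1 smallest p-values
    return reject

# 1000 null z-values plus 50 shifted "signal" cases, a toy large-scale problem
rng = np.random.default_rng(1)
z = np.concatenate([rng.normal(size=1000), rng.normal(loc=3.0, size=50)])
pvals = 2.0 * norm.sf(np.abs(z))               # two-sided p-values
print(int(benjamini_hochberg(pvals).sum()), "discoveries at q = 0.10")
```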
The twenty-first century has seen a breathtaking expansion of
statistical methodology, both in scope and in influence. 'Big
data', 'data science', and 'machine learning' have become familiar
terms in the news, as statistical methods are brought to bear upon
the enormous data sets of modern science and commerce. How did we
get here? And where are we going? This book takes us on an
exhilarating journey through the revolution in data analysis
following the introduction of electronic computation in the 1950s.
Beginning with classical inferential theories - Bayesian,
frequentist, Fisherian - individual chapters take up a series of
influential topics: survival analysis, logistic regression,
empirical Bayes, the jackknife and bootstrap, random forests,
neural networks, Markov chain Monte Carlo, inference after model
selection, and dozens more. The distinctly modern approach
integrates methodology and algorithms with statistical inference.
The book ends with speculation on the future direction of
statistics and data science.
The jackknife and the bootstrap are nonparametric methods for
assessing the errors in a statistical estimation problem. They
provide several advantages over the traditional parametric
approach: the methods are easy to describe and they apply to
arbitrarily complicated situations; distribution assumptions, such
as normality, are never made. This monograph connects the
jackknife, the bootstrap, and many other related ideas such as
cross-validation, random subsampling, and balanced repeated
replications into a unified exposition. The theoretical development
is at an easy mathematical level and is supplemented by a large
number of numerical examples. The methods described in this
monograph form a useful set of tools for the applied statistician.
They are particularly useful in problem areas where complicated
data structures are common, for example, with censored data, missing
data, and highly multivariate situations.
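
For readers who want to see the two central methods in miniature, here is a sketch (mine, not the monograph's) of bootstrap and jackknife standard errors for the sample median; the statistic, sample size, and replication count are arbitrary choices.

```python
import numpy as np

def bootstrap_se(data, statistic, B=2000, seed=0):
    """Bootstrap standard error: resample the data with replacement B times."""
    rng = np.random.default_rng(seed)
    reps = np.array([statistic(rng.choice(data, size=len(data), replace=True))
                     for _ in range(B)])
    return reps.std(ddof=1)

def jackknife_se(data, statistic):
    """Jackknife standard error: recompute with each observation left out."""
    n = len(data)
    loo = np.array([statistic(np.delete(data, i)) for i in range(n)])
    return np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))

x = np.random.default_rng(2).exponential(size=100)  # skewed toy sample
print("bootstrap SE of median:", bootstrap_se(x, np.median))
print("jackknife SE of median:", jackknife_se(x, np.median))
```

The two estimates need not agree: a well-known caution is that the jackknife can fail for non-smooth statistics such as the median, one reason the bootstrap is often preferred in such cases.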