Discover New Methods for Dealing with High-Dimensional Data A
sparse statistical model has only a small number of nonzero
parameters or weights; therefore, it is much easier to estimate and
interpret than a dense model. Statistical Learning with Sparsity:
The Lasso and Generalizations presents methods that exploit
sparsity to help recover the underlying signal in a set of data.
Top experts in this rapidly evolving field, the authors describe
the lasso for linear regression and a simple coordinate descent
algorithm for its computation. They discuss the application of ℓ1
penalties to generalized linear models and support vector machines,
cover generalized penalties such as the elastic net and group
lasso, and review numerical methods for optimization. They also
present statistical inference methods for fitted (lasso) models,
including the bootstrap, Bayesian methods, and recently developed
approaches. In addition, the book examines matrix decomposition,
sparse multivariate analysis, graphical models, and compressed
sensing. It concludes with a survey of theoretical results for the
lasso. In this age of big data, the number of features measured on
a person or object can be large and might be larger than the number
of observations. This book shows how the sparsity assumption allows
us to tackle these problems and extract useful and reproducible
patterns from big datasets. Data analysts, computer scientists, and
theorists will appreciate this thorough and up-to-date treatment of
sparse statistical modeling.
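To give a concrete sense of the algorithm the blurb mentions, here is a minimal Python sketch of coordinate descent for the lasso objective (1/(2n))||y - Xb||^2 + lambda*||b||_1, built on the soft-thresholding update. It is an illustrative toy, not the book's (or glmnet's) implementation, and the function names are invented for this example:

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator: the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate descent for the lasso objective
    (1/(2n)) * ||y - X @ beta||^2 + lam * ||beta||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove column j's current contribution.
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j / n
            # Closed-form one-dimensional update for coordinate j.
            beta[j] = soft_threshold(rho, lam) / (X[:, j] @ X[:, j] / n)
    return beta
```

Production solvers such as glmnet add warm starts, active-set screening, and convergence checks; the sketch above keeps only the core update for clarity.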
An Introduction to Statistical Learning provides an accessible
overview of the field of statistical learning, an essential toolset
for making sense of the vast and complex data sets that have
emerged in fields ranging from biology to finance to marketing to
astrophysics in the past twenty years. This book presents some of
the most important modeling and prediction techniques, along with
relevant applications. Topics include linear regression,
classification, resampling methods, shrinkage approaches,
tree-based methods, support vector machines, clustering, deep
learning, survival analysis, multiple testing, and more. Color
graphics and real-world examples are used to illustrate the methods
presented. Since the goal of this textbook is to facilitate the use
of these statistical learning techniques by practitioners in
science, industry, and other fields, each chapter contains a
tutorial on implementing the analyses and methods presented in R,
an extremely popular open source statistical software platform. Two
of the authors co-wrote The Elements of Statistical Learning
(Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular
reference book for statistics and machine learning researchers. An
Introduction to Statistical Learning covers many of the same
topics, but at a level accessible to a much broader audience. This
book is targeted at statisticians and non-statisticians alike who
wish to use cutting-edge statistical learning techniques to analyze
their data. The text assumes only a previous course in linear
regression and no knowledge of matrix algebra. This Second Edition
features new chapters on deep learning, survival analysis, and
multiple testing, as well as expanded treatments of naive Bayes,
generalized linear models, Bayesian additive regression trees, and
matrix completion. R code has been updated throughout to ensure
compatibility.
The twenty-first century has seen a breathtaking expansion of
statistical methodology, both in scope and influence. 'Data
science' and 'machine learning' have become familiar terms in the
news, as statistical methods are brought to bear upon the enormous
data sets of modern science and commerce. How did we get here? And
where are we going? How does it all fit together? Now in paperback
and fortified with exercises, this book delivers a concentrated
course in modern statistical thinking. Beginning with classical
inferential theories - Bayesian, frequentist, Fisherian -
individual chapters take up a series of influential topics:
survival analysis, logistic regression, empirical Bayes, the
jackknife and bootstrap, random forests, neural networks, Markov
chain Monte Carlo, inference after model selection, and dozens
more. The distinctly modern approach integrates methodology and
algorithms with statistical inference. Each chapter ends with
class-tested exercises, and the book concludes with speculation on
the future direction of statistics and data science.
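As a taste of one topic on that list, here is a minimal nonparametric bootstrap in Python, estimating the standard error of a sample median from synthetic data. This is an illustrative sketch on made-up data, not an excerpt from the book:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=100)  # toy sample

# Nonparametric bootstrap: resample with replacement and
# recompute the statistic to estimate its sampling variability.
boot_medians = np.array([
    np.median(rng.choice(data, size=data.size, replace=True))
    for _ in range(2000)
])
print(f"bootstrap SE of the median: {boot_medians.std(ddof=1):.3f}")
```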
An Introduction to Statistical Learning provides an
accessible overview of the field of statistical learning, an
essential toolset for making sense of the vast and complex data
sets that have emerged in fields ranging from biology to finance to
marketing to astrophysics in the past twenty years. This
book presents some of the most important modeling and prediction
techniques, along with relevant applications. Topics include linear
regression, classification, resampling methods, shrinkage
approaches, tree-based methods, support vector machines,
clustering, deep learning, survival analysis, multiple testing, and
more. Color graphics and real-world examples are used to illustrate
the methods presented. This book is targeted at statisticians and
non-statisticians alike, who wish to use cutting-edge statistical
learning techniques to analyze their data. Four of the authors
co-wrote An Introduction to Statistical Learning, With
Applications in R (ISLR), which has become a mainstay of
undergraduate and graduate classrooms worldwide, as well as an
important reference book for data scientists. One of the keys to
its success was that each chapter contains a tutorial on
implementing the analyses and methods presented in the R scientific
computing environment. However, in recent years Python has become a
popular language for data science, and there has been increasing
demand for a Python-based alternative to ISLR. Hence, this book
(ISLP) covers the same materials as ISLR but with labs implemented
in Python. These labs will be useful both for Python novices and
for experienced users.
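To give a flavor of what such a lab looks like, here is a short, hypothetical Python example in the same spirit. It uses scikit-learn rather than the book's own ISLP package, so none of this code is taken from ISLP itself:

```python
# Fit a cross-validated lasso to a built-in regression dataset.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LassoCV
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# LassoCV chooses the penalty strength by 5-fold cross-validation.
model = LassoCV(cv=5).fit(X_train, y_train)
print("selected alpha:", model.alpha_)
print("test R^2:", model.score(X_test, y_test))
```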
The twenty-first century has seen a breathtaking expansion of
statistical methodology, both in scope and in influence. 'Big
data', 'data science', and 'machine learning' have become familiar
terms in the news, as statistical methods are brought to bear upon
the enormous data sets of modern science and commerce. How did we
get here? And where are we going? This book takes us on an
exhilarating journey through the revolution in data analysis
following the introduction of electronic computation in the 1950s.
Beginning with classical inferential theories - Bayesian,
frequentist, Fisherian - individual chapters take up a series of
influential topics: survival analysis, logistic regression,
empirical Bayes, the jackknife and bootstrap, random forests,
neural networks, Markov chain Monte Carlo, inference after model
selection, and dozens more. The distinctly modern approach
integrates methodology and algorithms with statistical inference.
The book ends with speculation on the future direction of
statistics and data science.