Showing 1 - 8 of 8 matches in All Departments

The Science of Bradley Efron - Selected Papers (Hardcover, 2008 ed.)
Carl N. Morris, Robert Tibshirani
R4,879 R4,312 Discovery Miles 43 120 Save R567 (12%) Ships in 12 - 17 working days

Nature didn't design human beings to be statisticians, and in fact our minds are more naturally attuned to spotting the saber-toothed tiger than seeing the jungle he springs from. Yet scientific discovery in practice is often more jungle than tiger. Those of us who devote our scientific lives to the deep and satisfying subject of statistical inference usually do so in the face of a certain under-appreciation from the public, and also (though less so these days) from the wider scientific world. With this in mind, it feels very nice to be over-appreciated for a while, even at the expense of weathering a 70th birthday. (Are we certain that some terrible chronological error hasn't been made?) Carl Morris and Rob Tibshirani, the two colleagues I've worked most closely with, both fit my ideal profile of the statistician as a mathematical scientist working seamlessly across wide areas of theory and application. They seem to have chosen the papers here in the same catholic spirit, and then cajoled an all-star cast of statistical savants to comment on them.

Statistical Learning with Sparsity - The Lasso and Generalizations (Paperback)
Trevor Hastie, Robert Tibshirani, Martin Wainwright
R1,285 Discovery Miles 12 850 Ships in 12 - 17 working days

Discover New Methods for Dealing with High-Dimensional Data
A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data. Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso. In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
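As a rough illustration of the coordinate descent idea mentioned in the blurb (a sketch assuming standardized features, not code from the book), each lasso update reduces to soft-thresholding a univariate least-squares coefficient:

```python
import numpy as np

def soft_threshold(z, gamma):
    # Soft-thresholding: closed-form solution of the one-dimensional lasso.
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    # Cyclic coordinate descent for (1/(2n))||y - Xb||^2 + lam * ||b||_1,
    # assuming the columns of X are standardized (mean 0, variance 1).
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with feature j removed from the current fit
            r_j = y - X @ beta + X[:, j] * beta[j]
            # Univariate least-squares coefficient for feature j, then shrink
            z_j = X[:, j] @ r_j / n
            beta[j] = soft_threshold(z_j, lam)
    return beta

# Tiny synthetic check: two true nonzero coefficients out of ten.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
X = (X - X.mean(axis=0)) / X.std(axis=0)
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.standard_normal(100)
print(lasso_coordinate_descent(X, y, lam=0.1))
```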

An Introduction to Statistical Learning - with Applications in R (Paperback, 2nd ed. 2021)
Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani
R1,563 Discovery Miles 15 630 Ships in 12 - 17 working days

An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, deep learning, survival analysis, multiple testing, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra. This Second Edition features new chapters on deep learning, survival analysis, and multiple testing, as well as expanded treatments of naive Bayes, generalized linear models, Bayesian additive regression trees, and matrix completion. R code has been updated throughout to ensure compatibility.
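The book's labs are written in R; purely as a hedged analogue of that style of exercise (scikit-learn on synthetic data, nothing taken from the text), a resampling-plus-shrinkage step might look like this in Python:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LassoCV
from sklearn.model_selection import cross_val_score

# Synthetic data standing in for a textbook dataset (illustrative only).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.standard_normal(200)

# Ordinary least squares, scored by 5-fold cross-validated R^2 (resampling).
print(cross_val_score(LinearRegression(), X, y, cv=5).mean())

# Lasso with the penalty strength chosen by cross-validation (shrinkage).
lasso = LassoCV(cv=5).fit(X, y)
print(lasso.alpha_, lasso.coef_)
```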

The Elements of Statistical Learning - Data Mining, Inference, and Prediction, Second Edition (Hardcover, 2nd ed. 2009, Corr. 9th printing 2017)
Trevor Hastie, Robert Tibshirani, Jerome Friedman
R1,858 Discovery Miles 18 580 Ships in 12 - 17 working days

During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It is a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting---the first comprehensive treatment of this topic in any book.

This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression & path algorithms for the lasso, non-negative matrix factorization, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.

Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie co-developed much of the statistical modeling software and environment in R/S-PLUS and invented principal curves and surfaces. Tibshirani proposed the lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, projection pursuit and gradient boosting.
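Gradient boosting, which the book treats in depth and which Friedman co-invented, is available in standard libraries; as a small hedged example (scikit-learn on a toy dataset, not material from the book):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy binary classification problem (illustrative only).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# An ensemble of shallow trees fit stagewise to the gradient of the loss,
# in the spirit of Friedman's gradient boosting.
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))  # held-out accuracy
```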

Statistical Learning with Sparsity - The Lasso and Generalizations (Hardcover)
Trevor Hastie, Robert Tibshirani, Martin Wainwright
R3,057 Discovery Miles 30 570 Ships in 12 - 17 working days

Discover New Methods for Dealing with High-Dimensional Data
A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data. Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso. In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
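For the elastic net mentioned above, which mixes the lasso's ℓ1 penalty with a ridge-style ℓ2 penalty, a hedged scikit-learn sketch on synthetic data (not an example from the book) looks like this:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Synthetic sparse regression problem (illustrative only).
rng = np.random.default_rng(2)
X = rng.standard_normal((150, 30))
coef = np.zeros(30)
coef[:3] = [4.0, -3.0, 2.0]
y = X @ coef + rng.standard_normal(150)

# alpha scales the overall penalty; l1_ratio mixes lasso (1.0) and ridge (0.0).
model = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
print(np.flatnonzero(model.coef_))  # indices of features kept in the model
```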

The Science of Bradley Efron - Selected Papers (Paperback, Softcover reprint of hardcover 1st ed. 2008)
Carl N. Morris, Robert Tibshirani
R3,135 Discovery Miles 31 350 Ships in 10 - 15 working days

Nature didn't design human beings to be statisticians, and in fact our minds are more naturally attuned to spotting the saber-toothed tiger than seeing the jungle he springs from. Yet scientific discovery in practice is often more jungle than tiger. Those of us who devote our scientific lives to the deep and satisfying subject of statistical inference usually do so in the face of a certain under-appreciation from the public, and also (though less so these days) from the wider scientific world. With this in mind, it feels very nice to be over-appreciated for a while, even at the expense of weathering a 70th birthday. (Are we certain that some terrible chronological error hasn't been made?) Carl Morris and Rob Tibshirani, the two colleagues I've worked most closely with, both fit my ideal profile of the statistician as a mathematical scientist working seamlessly across wide areas of theory and application. They seem to have chosen the papers here in the same catholic spirit, and then cajoled an all-star cast of statistical savants to comment on them.

An Introduction to Statistical Learning - with Applications in Python (1st ed. 2023)
Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani, Jonathan Taylor
R2,776 Discovery Miles 27 760 Ships in 12 - 17 working days

An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance, marketing, and astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, deep learning, survival analysis, multiple testing, and more. Color graphics and real-world examples are used to illustrate the methods presented. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. Four of the authors co-wrote An Introduction to Statistical Learning, With Applications in R (ISLR), which has become a mainstay of undergraduate and graduate classrooms worldwide, as well as an important reference book for data scientists. One of the keys to its success was that each chapter contains a tutorial on implementing the analyses and methods presented in the R scientific computing environment. However, in recent years Python has become a popular language for data science, and there has been increasing demand for a Python-based alternative to ISLR. Hence, this book (ISLP) covers the same materials as ISLR but with labs implemented in Python. These labs will be useful both for Python novices and for experienced users.
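The book's labs are built around its own datasets and helpers; as a hedged, self-contained stand-in using only scikit-learn and synthetic data, a lab-style classification step might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic two-class data (illustrative only, not a dataset from the book).
rng = np.random.default_rng(3)
X = rng.standard_normal((300, 4))
y = (X[:, 0] - 0.5 * X[:, 1] + 0.3 * rng.standard_normal(300) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Logistic regression, one of the classification methods the book covers.
clf = LogisticRegression().fit(X_tr, y_tr)
print(clf.score(X_te, y_te))      # test accuracy
print(clf.coef_, clf.intercept_)  # fitted coefficients
```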

An Introduction to Statistical Learning - with Applications in R (Hardcover, 2nd ed. 2021)
Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani
R2,354 R2,096 Discovery Miles 20 960 Save R258 (11%) Ships in 12 - 17 working days

An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, deep learning, survival analysis, multiple testing, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra. This Second Edition features new chapters on deep learning, survival analysis, and multiple testing, as well as expanded treatments of naive Bayes, generalized linear models, Bayesian additive regression trees, and matrix completion. R code has been updated throughout to ensure compatibility.
