Showing 1 - 9 of 9 matches in All Departments

Topics in Nonparametric Statistics - Proceedings of the First Conference of the International Society for Nonparametric Statistics (Hardcover, 2014 ed.)
Michael G. Akritas, S. N. Lahiri, Dimitris N. Politis
R4,481 R2,032 Discovery Miles 20 320 Save R2,449 (55%) Ships in 12 - 17 working days

This volume is composed of peer-reviewed papers that have developed from the First Conference of the International Society for Nonparametric Statistics (ISNPS). This inaugural conference took place in Chalkidiki, Greece, June 15-19, 2012. It was organized with the co-sponsorship of the IMS, the ISI, and other organizations. M.G. Akritas, S.N. Lahiri, and D.N. Politis are the first executive committee members of ISNPS, and the editors of this volume. ISNPS has a distinguished Advisory Committee that includes Professors R. Beran, P. Bickel, R. Carroll, D. Cook, P. Hall, R. Johnson, B. Lindsay, E. Parzen, P. Robinson, M. Rosenblatt, G. Roussas, T. Subba Rao, and G. Wahba. The Charting Committee of ISNPS consists of more than 50 prominent researchers from all over the world. The chapters in this volume bring forth recent advances and trends in several areas of nonparametric statistics. In this way, the volume facilitates the exchange of research ideas, promotes collaboration among researchers from all over the world, and contributes to the further development of the field. The conference program included over 250 talks, including special invited talks, plenary talks, and contributed talks on all areas of nonparametric statistics. Out of these talks, some of the most pertinent ones have been refereed and developed into chapters that share both research and developments in the field.

Model-Free Prediction and Regression - A Transformation-Based Approach to Inference (Hardcover, 1st ed. 2015)
Dimitris N. Politis
R3,018 Discovery Miles 30 180 Ships in 12 - 17 working days

The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, computer-intensive methods such as the bootstrap and cross-validation freed practitioners from the limitations of parametric models, and paved the way towards the `big data' era of the 21st century. Nonetheless, there is a further step one may take, i.e., going beyond even nonparametric models; this is where the Model-Free Prediction Principle is useful. Interestingly, being able to predict a response variable Y associated with a regressor variable X taking on any possible value seems to inadvertently also achieve the main goal of modeling, i.e., trying to describe how Y depends on X. Hence, as prediction can be treated as a by-product of model-fitting, key estimation problems can be addressed as a by-product of being able to perform prediction. In other words, a practitioner can use Model-Free Prediction ideas in order to additionally obtain point estimates and confidence intervals for relevant parameters leading to an alternative, transformation-based approach to statistical inference.
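As a rough illustration of the two-step model-based paradigm described above, namely (a) fit a model and (b) use it to predict, the Python sketch below fits a toy linear regression and builds a prediction interval by resampling residuals. The data, the linear model, and the residual-resampling scheme are assumptions made purely for illustration; this is a generic bootstrap sketch, not the Model-Free Bootstrap developed in the monograph.

    # Hedged illustration: generic model-based prediction with a
    # residual-resampling interval; NOT the book's Model-Free Bootstrap.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=100)
    y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=100)   # simulated data

    # (a) fit a simple linear model by least squares
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta

    # (b) point prediction at a new regressor value
    x_new = 7.0
    point_pred = beta[0] + beta[1] * x_new

    # resampling-based prediction interval: add resampled residuals to the
    # point prediction and take empirical quantiles
    boot = point_pred + rng.choice(resid, size=5000, replace=True)
    lo, hi = np.quantile(boot, [0.025, 0.975])
    print(f"point prediction {point_pred:.2f}, 95% interval ({lo:.2f}, {hi:.2f})")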

Selected Works of Murray Rosenblatt (Hardcover, 2011 ed.)
Richard A. Davis, Keh-Shin Lii, Dimitris N. Politis
R4,882 R4,314 Discovery Miles 43 140 Save R568 (12%) Ships in 12 - 17 working days

During the second half of the 20th century, Murray Rosenblatt was one of the most celebrated and leading figures in probability and statistics. Among his many contributions, Rosenblatt conducted seminal work on density estimation, central limit theorems under strong mixing conditions, spectral domain methodology, long memory processes and Markov processes. He has published over 130 papers and 5 books, many as relevant today as when they first appeared decades ago. Murray Rosenblatt was one of the founding members of the Department of Mathematics at the University of California at San Diego (UCSD) and served as advisor to over twenty PhD students. He maintains a close association with UCSD in his role as Professor Emeritus.

This volume is a celebration of Murray Rosenblatt's stellar research career that spans over six decades, and includes some of his most interesting and influential papers. Several leading experts provide commentary and reflections on various directions of Murray's research portfolio.

Subsampling (Hardcover, 1999 ed.)
Dimitris N. Politis, Joseph P. Romano, Michael Wolf
R4,339 Discovery Miles 43 390 Ships in 12 - 17 working days

Since Efron's profound paper on the bootstrap, an enormous amount of effort has been spent on the development of bootstrap, jackknife, and other resampling methods. The primary goal of these computer-intensive methods has been to provide statistical tools that work in complex situations without imposing unrealistic or unverifiable assumptions about the data generating mechanism. This book sets out to lay some of the foundations for subsampling methodology and related methods.
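To make the idea concrete, the following Python sketch builds a subsampling confidence interval for a mean using overlapping blocks, as one would for weakly dependent data. The simulated series, the block size, and the use of the root-n rate are illustrative assumptions, not recommendations taken from the book.

    # Hedged sketch: subsampling confidence interval for a mean using
    # overlapping blocks; block size and data are illustrative choices.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(loc=5.0, scale=2.0, size=500)   # simulated series
    n, b = len(x), 50                              # sample size and block size

    theta_hat = x.mean()                           # full-sample statistic

    # statistic recomputed on every overlapping block of length b,
    # centered at the full-sample estimate and rescaled by sqrt(b)
    blocks = np.lib.stride_tricks.sliding_window_view(x, b)
    roots = np.sqrt(b) * (blocks.mean(axis=1) - theta_hat)

    # quantiles of the subsampling distribution yield an approximate
    # 95% confidence interval at the sqrt(n) rate
    q_lo, q_hi = np.quantile(roots, [0.025, 0.975])
    ci = (theta_hat - q_hi / np.sqrt(n), theta_hat - q_lo / np.sqrt(n))
    print(f"mean {theta_hat:.3f}, subsampling 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")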

Time Series - A First Course with Bootstrap Starter (Paperback)
Tucker S. McElroy, Dimitris N. Politis
R1,244 Discovery Miles 12 440 Ships in 12 - 17 working days

Time Series: A First Course with Bootstrap Starter provides an introductory course on time series analysis that satisfies the triptych of (i) mathematical completeness, (ii) computational illustration and implementation, and (iii) conciseness and accessibility to upper-level undergraduate and M.S. students. Basic theoretical results are presented in a mathematically convincing way, and the methods of data analysis are developed through examples and exercises parsed in R. A student with a basic course in mathematical statistics will learn both how to analyze time series and how to interpret the results. The book provides the foundation of time series methods, including linear filters and a geometric approach to prediction. The important paradigm of ARMA models is studied in-depth, as well as frequency domain methods. Entropy and other information theoretic notions are introduced, with applications to time series modeling. The second half of the book focuses on statistical inference, the fitting of time series models, as well as computational facets of forecasting. Many time series of interest are nonlinear in which case classical inference methods can fail, but bootstrap methods may come to the rescue. Distinctive features of the book are the emphasis on geometric notions and the frequency domain, the discussion of entropy maximization, and a thorough treatment of recent computer-intensive methods for time series such as subsampling and the bootstrap. There are more than 600 exercises, half of which involve R coding and/or data analysis. Supplements include a website with 12 key data sets and all R code for the book's examples, as well as the solutions to exercises.
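The book's own examples and exercises are written in R; the short Python sketch below is only a stand-in to show the flavour of fitting a simple time-series model and producing a forecast. The AR(1) specification, the simulated data, and the least-squares fit are assumptions for illustration and are not taken from the book.

    # Hedged illustration: fit an AR(1) model by least squares and produce
    # a one-step-ahead forecast; data and lag order are assumed for the demo.
    import numpy as np

    rng = np.random.default_rng(2)

    # simulate an AR(1) series: x_t = 0.6 * x_{t-1} + e_t
    n, phi_true = 400, 0.6
    e = rng.normal(size=n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi_true * x[t - 1] + e[t]

    # fit: regress x_t on x_{t-1} (mean-zero series, no intercept)
    phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

    # one-step-ahead point forecast from the last observation
    forecast = phi_hat * x[-1]
    print(f"estimated phi = {phi_hat:.3f}, one-step forecast = {forecast:.3f}")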

Time Series - A First Course with Bootstrap Starter (Hardcover)
Tucker S. McElroy, Dimitris N. Politis
R2,574 Discovery Miles 25 740 Ships in 12 - 17 working days

Selected Works of Murray Rosenblatt (Paperback, Softcover reprint of the original 1st ed. 2011)
Richard A. Davis, Keh-Shin Lii, Dimitris N. Politis
R4,335 Discovery Miles 43 350 Ships in 10 - 15 working days

Model-Free Prediction and Regression - A Transformation-Based Approach to Inference (Paperback, Softcover reprint of the original 1st ed. 2015)
Dimitris N. Politis
R3,471 Discovery Miles 34 710 Ships in 10 - 15 working days

Subsampling (Paperback, Softcover reprint of the original 1st ed. 1999)
Dimitris N. Politis, Joseph P. Romano, Michael Wolf
R3,290 Discovery Miles 32 900 Ships in 10 - 15 working days
