Showing 1 - 9 of 9 matches in All Departments

Topics in Nonparametric Statistics - Proceedings of the First Conference of the International Society for Nonparametric Statistics (Hardcover, 2014 ed.)
Michael G. Akritas, S. N. Lahiri, Dimitris N. Politis
R4,664 R2,150 Discovery Miles 21 500 Save R2,514 (54%) Ships in 12 - 17 working days

This volume is composed of peer-reviewed papers that have developed from the First Conference of the International Society for Nonparametric Statistics (ISNPS). This inaugural conference took place in Chalkidiki, Greece, June 15-19, 2012. It was organized with the co-sponsorship of the IMS, the ISI, and other organizations. M.G. Akritas, S.N. Lahiri, and D.N. Politis are the first executive committee members of ISNPS, and the editors of this volume. ISNPS has a distinguished Advisory Committee that includes Professors R. Beran, P. Bickel, R. Carroll, D. Cook, P. Hall, R. Johnson, B. Lindsay, E. Parzen, P. Robinson, M. Rosenblatt, G. Roussas, T. Subba Rao, and G. Wahba. The Charting Committee of ISNPS consists of more than 50 prominent researchers from all over the world.

The chapters in this volume bring forth recent advances and trends in several areas of nonparametric statistics. In this way, the volume facilitates the exchange of research ideas, promotes collaboration among researchers from all over the world, and contributes to the further development of the field. The conference program included over 250 talks, including special invited talks, plenary talks, and contributed talks on all areas of nonparametric statistics. Out of these talks, some of the most pertinent ones have been refereed and developed into chapters that share both research and developments in the field.

Model-Free Prediction and Regression - A Transformation-Based Approach to Inference (Hardcover, 1st ed. 2015)
Dimitris N. Politis
R3,207 Discovery Miles 32 070 Ships in 12 - 17 working days

The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality.

Prediction has traditionally been approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, computer-intensive methods such as the bootstrap and cross-validation freed practitioners from the limitations of parametric models, and paved the way towards the 'big data' era of the 21st century. Nonetheless, there is a further step one may take, i.e., going beyond even nonparametric models; this is where the Model-Free Prediction Principle is useful.

Interestingly, being able to predict a response variable Y associated with a regressor variable X taking on any possible value seems to inadvertently also achieve the main goal of modeling, i.e., trying to describe how Y depends on X. Hence, as prediction can be treated as a by-product of model-fitting, key estimation problems can be addressed as a by-product of being able to perform prediction. In other words, a practitioner can use Model-Free Prediction ideas in order to additionally obtain point estimates and confidence intervals for relevant parameters, leading to an alternative, transformation-based approach to statistical inference.
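The transformation idea at the heart of the principle can be illustrated in a few lines of code. The sketch below is a deliberately simplified, hypothetical illustration rather than the book's algorithm: a kernel regression estimate plays the role of the transformation, its residuals are treated as approximately i.i.d., and resampling them produces a bootstrap prediction interval for Y at a new point x0. The simulated data, bandwidth, and function names are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_fit(x, y, x0, h=0.3):
    """Nadaraya-Watson estimate of E[Y | X = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

# Simulated regression data: Y depends on X nonlinearly, with additive noise.
n = 200
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)

# Step 1: transform the responses into (approximately) i.i.d. residuals.
m_hat = np.array([kernel_fit(x, y, xi) for xi in x])
resid = y - m_hat

# Step 2: resample the residuals to get a bootstrap prediction interval at x0.
x0 = 0.5
B = 2000
point_pred = kernel_fit(x, y, x0)
boot_preds = point_pred + rng.choice(resid, size=B, replace=True)
lo, hi = np.percentile(boot_preds, [2.5, 97.5])
print(f"prediction at x0={x0}: {point_pred:.3f}, 95% interval ({lo:.3f}, {hi:.3f})")
```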

Selected Works of Murray Rosenblatt (Hardcover, 2011 ed.)
Richard A. Davis, Keh-Shin Lii, Dimitris N. Politis
R5,081 R4,599 Discovery Miles 45 990 Save R482 (9%) Ships in 12 - 17 working days

During the second half of the 20th century, Murray Rosenblatt was one of the most celebrated and prominent figures in probability and statistics. Among his many contributions, Rosenblatt conducted seminal work on density estimation, central limit theorems under strong mixing conditions, spectral domain methodology, long memory processes and Markov processes. He has published over 130 papers and 5 books, many as relevant today as when they first appeared decades ago. Murray Rosenblatt was one of the founding members of the Department of Mathematics at the University of California at San Diego (UCSD) and served as advisor to over twenty PhD students. He maintains a close association with UCSD in his role as Professor Emeritus.

This volume is a celebration of Murray Rosenblatt's stellar research career, which spans over six decades, and includes some of his most interesting and influential papers. Several leading experts provide commentary and reflections on various directions of Murray's research portfolio.

Subsampling (Hardcover, 1999 ed.)
Dimitris N. Politis, Joseph P. Romano, Michael Wolf
R4,625 Discovery Miles 46 250 Ships in 12 - 17 working days

Since Efron's profound paper on the bootstrap, an enormous amount of effort has been spent on the development of bootstrap, jackknife, and other resampling methods. The primary goal of these computer-intensive methods has been to provide statistical tools that work in complex situations without imposing unrealistic or unverifiable assumptions about the data-generating mechanism. This book sets out to lay some of the foundations for subsampling methodology and related methods.
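As a rough illustration of what subsampling does, the following sketch (an assumption for this page, not code from the book) recomputes a statistic on every overlapping block of length b and uses the subsampling distribution of sqrt(b)*(theta_b - theta_n) as a stand-in for that of sqrt(n)*(theta_n - theta), yielding a confidence interval without specifying a model for the dependence. The MA(1) data and block size are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, b = 500, 25                                   # sample size and block (subsample) size
eps = rng.standard_normal(n + 1)
x = eps[1:] + 0.6 * eps[:-1]                     # weakly dependent MA(1) series with mean 0

theta_n = x.mean()                               # full-sample statistic
blocks = np.lib.stride_tricks.sliding_window_view(x, b)
theta_b = blocks.mean(axis=1)                    # statistic recomputed on each overlapping block

# Subsampling distribution of sqrt(b)*(theta_b - theta_n) approximates
# the sampling distribution of sqrt(n)*(theta_n - theta).
dist = np.sqrt(b) * (theta_b - theta_n)
q_lo, q_hi = np.quantile(dist, [0.025, 0.975])
ci = (theta_n - q_hi / np.sqrt(n), theta_n - q_lo / np.sqrt(n))
print(f"mean {theta_n:.3f}, 95% subsampling CI ({ci[0]:.3f}, {ci[1]:.3f})")
```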

Time Series - A First Course with Bootstrap Starter (Paperback)
Tucker S. McElroy, Dimitris N. Politis
R1,322 Discovery Miles 13 220 Ships in 12 - 17 working days

Time Series: A First Course with Bootstrap Starter provides an introductory course on time series analysis that satisfies the triptych of (i) mathematical completeness, (ii) computational illustration and implementation, and (iii) conciseness and accessibility to upper-level undergraduate and M.S. students. Basic theoretical results are presented in a mathematically convincing way, and the methods of data analysis are developed through examples and exercises worked in R. A student with a basic course in mathematical statistics will learn both how to analyze time series and how to interpret the results. The book provides the foundation of time series methods, including linear filters and a geometric approach to prediction. The important paradigm of ARMA models is studied in depth, as well as frequency domain methods. Entropy and other information-theoretic notions are introduced, with applications to time series modeling.

The second half of the book focuses on statistical inference, the fitting of time series models, and computational facets of forecasting. Many time series of interest are nonlinear, in which case classical inference methods can fail, but bootstrap methods may come to the rescue. Distinctive features of the book are the emphasis on geometric notions and the frequency domain, the discussion of entropy maximization, and a thorough treatment of recent computer-intensive methods for time series such as subsampling and the bootstrap. There are more than 600 exercises, half of which involve R coding and/or data analysis. Supplements include a website with 12 key data sets and all R code for the book's examples, as well as the solutions to exercises.
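The book's own examples and exercises are in R; the snippet below is only a hypothetical Python sketch of the kind of computer-intensive step the "bootstrap starter" refers to: fit an AR(1) model by least squares and form a one-step-ahead forecast interval by resampling the fitted residuals. The simulated series and all parameter values are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an AR(1) series x_t = 0.7 * x_{t-1} + e_t.
n, phi_true = 300, 0.7
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.standard_normal()

# Least-squares estimate of the AR(1) coefficient and the fitted residuals.
phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
resid = x[1:] - phi_hat * x[:-1]

# Residual bootstrap for the one-step-ahead forecast of x_{n+1}.
B = 2000
forecasts = phi_hat * x[-1] + rng.choice(resid - resid.mean(), size=B, replace=True)
lo, hi = np.percentile(forecasts, [2.5, 97.5])
print(f"phi_hat = {phi_hat:.3f}, one-step forecast {phi_hat * x[-1]:.3f}, "
      f"95% interval ({lo:.3f}, {hi:.3f})")
```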

Time Series - A First Course with Bootstrap Starter (Hardcover)
Tucker S. McElroy, Dimitris N. Politis
R2,741 Discovery Miles 27 410 Ships in 12 - 17 working days

Selected Works of Murray Rosenblatt (Paperback, Softcover reprint of the original 1st ed. 2011)
Richard A. Davis, Keh-Shin Lii, Dimitris N. Politis
R4,409 Discovery Miles 44 090 Out of stock

Model-Free Prediction and Regression - A Transformation-Based Approach to Inference (Paperback, Softcover reprint of the original 1st ed. 2015)
Dimitris N. Politis
R2,611 R2,472 Discovery Miles 24 720 Save R139 (5%) Out of stock

Subsampling (Paperback, Softcover reprint of the original 1st ed. 1999)
Dimitris N. Politis, Joseph P. Romano, Michael Wolf
R3,215 R3,030 Discovery Miles 30 300 Save R185 (6%) Out of stock
