Showing 1 - 9 of 9 matches in All Departments

Topics in Nonparametric Statistics - Proceedings of the First Conference of the International Society for Nonparametric Statistics (Hardcover, 2014 ed.)
Michael G. Akritas, S. N. Lahiri, Dimitris N. Politis
R4,572 R2,166 Discovery Miles 21 660 Save R2,406 (53%) Ships in 12 - 19 working days

This volume is composed of peer-reviewed papers that have developed from the First Conference of the International Society for Nonparametric Statistics (ISNPS). This inaugural conference took place in Chalkidiki, Greece, June 15-19, 2012. It was organized with the co-sponsorship of the IMS, the ISI, and other organizations. M.G. Akritas, S.N. Lahiri, and D.N. Politis are the first executive committee members of ISNPS, and the editors of this volume. ISNPS has a distinguished Advisory Committee that includes Professors R. Beran, P. Bickel, R. Carroll, D. Cook, P. Hall, R. Johnson, B. Lindsay, E. Parzen, P. Robinson, M. Rosenblatt, G. Roussas, T. SubbaRao, and G. Wahba. The Charting Committee of ISNPS consists of more than 50 prominent researchers from all over the world. The chapters in this volume bring forth recent advances and trends in several areas of nonparametric statistics. In this way, the volume facilitates the exchange of research ideas, promotes collaboration among researchers from all over the world, and contributes to the further development of the field. The conference program included over 250 talks, including special invited talks, plenary talks, and contributed talks on all areas of nonparametric statistics. Out of these talks, some of the most pertinent ones have been refereed and developed into chapters that share both research and developments in the field.

Model-Free Prediction and Regression - A Transformation-Based Approach to Inference (Hardcover, 1st ed. 2015)
Dimitris N. Politis
R3,229 Discovery Miles 32 290 Ships in 12 - 19 working days

The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, computer-intensive methods such as the bootstrap and cross-validation freed practitioners from the limitations of parametric models, and paved the way towards the `big data' era of the 21st century. Nonetheless, there is a further step one may take, i.e., going beyond even nonparametric models; this is where the Model-Free Prediction Principle is useful. Interestingly, being able to predict a response variable Y associated with a regressor variable X taking on any possible value seems to inadvertently also achieve the main goal of modeling, i.e., trying to describe how Y depends on X. Hence, as prediction can be treated as a by-product of model-fitting, key estimation problems can be addressed as a by-product of being able to perform prediction. In other words, a practitioner can use Model-Free Prediction ideas in order to additionally obtain point estimates and confidence intervals for relevant parameters leading to an alternative, transformation-based approach to statistical inference.
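
To make the contrast concrete, here is a minimal Python sketch of the classical model-based paradigm described above: (a) fit a model, then (b) predict from it, here with a residual-bootstrap prediction interval attached. The simulated data and parameter values are assumptions chosen purely for illustration; this is not the monograph's Model-Free procedure, only the two-step paradigm it seeks to move beyond.

```python
# Sketch of the model-based paradigm: (a) fit a linear model, (b) predict at a new
# point and attach a residual-bootstrap prediction interval.
# Illustrative only; not the book's Model-Free procedure.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)        # simulated data (assumption)

# (a) fit a simple linear model by least squares
beta1, beta0 = np.polyfit(x, y, 1)
resid = y - (beta0 + beta1 * x)

# (b) predict at a new point and bootstrap the residuals for an interval
x_new = 7.0
point_pred = beta0 + beta1 * x_new
boot_preds = point_pred + rng.choice(resid, size=2000, replace=True)
lo, hi = np.percentile(boot_preds, [2.5, 97.5])
print(f"point prediction {point_pred:.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
```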

Selected Works of Murray Rosenblatt (Hardcover, 2011 ed.)
Richard A. Davis, Keh-Shin Lii, Dimitris N. Politis
R4,496 Discovery Miles 44 960 Ships in 10 - 15 working days

During the second half of the 20th century, Murray Rosenblatt was one of the most celebrated and leading figures in probability and statistics. Among his many contributions, Rosenblatt conducted seminal work on density estimation, central limit theorems under strong mixing conditions, spectral domain methodology, long memory processes and Markov processes. He has published over 130 papers and 5 books, many as relevant today as when they first appeared decades ago. Murray Rosenblatt was one of the founding members of the Department of Mathematics at the University of California at San Diego (UCSD) and served as advisor to over twenty PhD students. He maintains a close association with UCSD in his role as Professor Emeritus.

This volume is a celebration of Murray Rosenblatt's stellar research career that spans over six decades, and includes some of his most interesting and influential papers. Several leading experts provide commentary and reflections on various directions of Murray's research portfolio.

Subsampling (Hardcover, 1999 ed.)
Dimitris N. Politis, Joseph P. Romano, Michael Wolf
R4,660 Discovery Miles 46 600 Ships in 12 - 19 working days

Since Efron's profound paper on the bootstrap, an enormous amount of effort has been spent on the development of bootstrap, jackknife, and other resampling methods. The primary goal of these computer-intensive methods has been to provide statistical tools that work in complex situations without imposing unrealistic or unverifiable assumptions about the data generating mechanism. This book sets out to lay some of the foundations for subsampling methodology and related methods.
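
For a concrete flavour of the subsampling idea, the Python sketch below approximates the sampling distribution of the sample mean from subsamples of size b much smaller than n. The data, the subsample size, and the i.i.d. setting are assumptions for illustration; the book's general treatment (dependent data, general convergence rates) goes much further.

```python
# Minimal subsampling sketch for the sample mean of i.i.d. data: evaluate the
# statistic on subsamples of size b << n, recentre at the full-sample estimate,
# rescale by sqrt(b), and read off quantiles. Simplified illustration only.
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=500)     # simulated sample (assumption)
n, b = data.size, 50                            # subsample size b << n
theta_n = data.mean()

# draw many subsamples without replacement; collect recentred, rescaled statistics
reps = 2000
stats = np.empty(reps)
for i in range(reps):
    sub = rng.choice(data, size=b, replace=False)
    stats[i] = np.sqrt(b) * (sub.mean() - theta_n)

# approximate 95% confidence interval for the mean
q_lo, q_hi = np.percentile(stats, [2.5, 97.5])
ci = (theta_n - q_hi / np.sqrt(n), theta_n - q_lo / np.sqrt(n))
print("subsampling 95% CI for the mean:", ci)
```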

Time Series - A First Course with Bootstrap Starter (Paperback)
Tucker S McElroy, Dimitris N. Politis
R1,436 Discovery Miles 14 360 Ships in 12 - 19 working days

Time Series: A First Course with Bootstrap Starter provides an introductory course on time series analysis that satisfies the triptych of (i) mathematical completeness, (ii) computational illustration and implementation, and (iii) conciseness and accessibility to upper-level undergraduate and M.S. students. Basic theoretical results are presented in a mathematically convincing way, and the methods of data analysis are developed through examples and exercises parsed in R. A student with a basic course in mathematical statistics will learn both how to analyze time series and how to interpret the results. The book provides the foundation of time series methods, including linear filters and a geometric approach to prediction. The important paradigm of ARMA models is studied in depth, as well as frequency domain methods. Entropy and other information theoretic notions are introduced, with applications to time series modeling. The second half of the book focuses on statistical inference, the fitting of time series models, as well as computational facets of forecasting. Many time series of interest are nonlinear, in which case classical inference methods can fail, but bootstrap methods may come to the rescue. Distinctive features of the book are the emphasis on geometric notions and the frequency domain, the discussion of entropy maximization, and a thorough treatment of recent computer-intensive methods for time series such as subsampling and the bootstrap. There are more than 600 exercises, half of which involve R coding and/or data analysis. Supplements include a website with 12 key data sets and all R code for the book's examples, as well as the solutions to exercises.
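
For a flavour of the kind of exercise the blurb mentions (translated here into a Python sketch rather than the book's R), the snippet below simulates an AR(1) series, recovers its coefficient from the lag-1 autocorrelation, and forms a one-step-ahead forecast. The parameter values are assumptions and the code is not taken from the book.

```python
# Simulate an AR(1) process and estimate its coefficient from the lag-1 sample
# autocorrelation (the Yule-Walker estimate for this model). Hypothetical example.
import numpy as np

rng = np.random.default_rng(2)
phi, n = 0.6, 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()        # AR(1): X_t = phi * X_{t-1} + e_t

xc = x - x.mean()
phi_hat = np.dot(xc[1:], xc[:-1]) / np.dot(xc, xc)   # lag-1 sample autocorrelation
print(f"true phi = {phi}, estimated phi = {phi_hat:.3f}")

# one-step-ahead forecast from the fitted (mean-adjusted) model
forecast = x.mean() + phi_hat * (x[-1] - x.mean())
print(f"one-step-ahead forecast: {forecast:.3f}")
```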

Time Series - A First Course with Bootstrap Starter (Hardcover)
Tucker S McElroy, Dimitris N. Politis
R2,920 Discovery Miles 29 200 Ships in 12 - 19 working days

Time Series: A First Course with Bootstrap Starter provides an introductory course on time series analysis that satisfies the triptych of (i) mathematical completeness, (ii) computational illustration and implementation, and (iii) conciseness and accessibility to upper-level undergraduate and M.S. students. Basic theoretical results are presented in a mathematically convincing way, and the methods of data analysis are developed through examples and exercises parsed in R. A student with a basic course in mathematical statistics will learn both how to analyze time series and how to interpret the results. The book provides the foundation of time series methods, including linear filters and a geometric approach to prediction. The important paradigm of ARMA models is studied in depth, as well as frequency domain methods. Entropy and other information theoretic notions are introduced, with applications to time series modeling. The second half of the book focuses on statistical inference, the fitting of time series models, as well as computational facets of forecasting. Many time series of interest are nonlinear, in which case classical inference methods can fail, but bootstrap methods may come to the rescue. Distinctive features of the book are the emphasis on geometric notions and the frequency domain, the discussion of entropy maximization, and a thorough treatment of recent computer-intensive methods for time series such as subsampling and the bootstrap. There are more than 600 exercises, half of which involve R coding and/or data analysis. Supplements include a website with 12 key data sets and all R code for the book's examples, as well as the solutions to exercises.
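
To illustrate the "bootstrap methods may come to the rescue" remark for dependent data, here is a brief Python sketch of a moving-block bootstrap standard error for the mean of a simulated AR(1) series. The block length and simulation settings are assumptions, and this is not code from the book.

```python
# Moving-block bootstrap for the mean of a dependent series: resample whole blocks
# to preserve short-range dependence, then recompute the statistic. Sketch only.
import numpy as np

rng = np.random.default_rng(3)
n, phi, block_len = 400, 0.5, 20                # settings chosen for illustration
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()        # simulated AR(1) data (assumption)

# all overlapping blocks of length block_len
blocks = np.array([x[i:i + block_len] for i in range(n - block_len + 1)])

# resample whole blocks to rebuild series of (roughly) the original length
reps, k = 1000, n // block_len
boot_means = np.empty(reps)
for r in range(reps):
    idx = rng.integers(0, len(blocks), size=k)
    boot_means[r] = blocks[idx].mean()

se = boot_means.std(ddof=1)
print(f"sample mean {x.mean():.3f}, block-bootstrap standard error {se:.3f}")
```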

Selected Works of Murray Rosenblatt (Paperback, Softcover reprint of the original 1st ed. 2011)
Richard A. Davis, Keh-Shin Lii, Dimitris N. Politis
R4,456 Discovery Miles 44 560 Ships in 10 - 15 working days

During the second half of the 20th century, Murray Rosenblatt was one of the most celebrated and leading figures in probability and statistics. Among his many contributions, Rosenblatt conducted seminal work on density estimation, central limit theorems under strong mixing conditions, spectral domain methodology, long memory processes and Markov processes. He has published over 130 papers and 5 books, many as relevant today as when they first appeared decades ago. Murray Rosenblatt was one of the founding members of the Department of Mathematics at the University of California at San Diego (UCSD) and served as advisor to over twenty PhD students. He maintains a close association with UCSD in his role as Professor Emeritus. This volume is a celebration of Murray Rosenblatt's stellar research career that spans over six decades, and includes some of his most interesting and influential papers. Several leading experts provide commentary and reflections on various directions of Murray's research portfolio.

Model-Free Prediction and Regression - A Transformation-Based Approach to Inference (Paperback, Softcover reprint of the original 1st ed. 2015)
Dimitris N. Politis
R3,579 Discovery Miles 35 790 Ships in 10 - 15 working days

The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, computer-intensive methods such as the bootstrap and cross-validation freed practitioners from the limitations of parametric models, and paved the way towards the `big data' era of the 21st century. Nonetheless, there is a further step one may take, i.e., going beyond even nonparametric models; this is where the Model-Free Prediction Principle is useful. Interestingly, being able to predict a response variable Y associated with a regressor variable X taking on any possible value seems to inadvertently also achieve the main goal of modeling, i.e., trying to describe how Y depends on X. Hence, as prediction can be treated as a by-product of model-fitting, key estimation problems can be addressed as a by-product of being able to perform prediction. In other words, a practitioner can use Model-Free Prediction ideas in order to additionally obtain point estimates and confidence intervals for relevant parameters leading to an alternative, transformation-based approach to statistical inference.
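
As a toy illustration of the transformation idea described above (mapping data to something easier to work with), the Python sketch below pushes a skewed sample through its empirical CDF and then through the normal quantile function. It assumes SciPy is available, and it is only a cartoon of the notion, not the monograph's Model-Free procedure.

```python
# Probability-integral-style transformation: ranks -> approximately Uniform(0,1)
# -> approximately Gaussian via the normal quantile function. Cartoon sketch only.
import numpy as np
from scipy.stats import norm                    # assumption: SciPy is available

rng = np.random.default_rng(4)
x = rng.lognormal(mean=0.0, sigma=1.0, size=1000)   # skewed, non-Gaussian data

# empirical CDF values in (0, 1), then Gaussian quantiles
ranks = np.argsort(np.argsort(x)) + 1
u = ranks / (x.size + 1)
z = norm.ppf(u)

print("skewness before:", float(((x - x.mean())**3).mean() / x.std()**3))
print("skewness after :", float(((z - z.mean())**3).mean() / z.std()**3))
```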

Subsampling (Paperback, Softcover reprint of the original 1st ed. 1999)
Dimitris N. Politis, Joseph P. Romano, Michael Wolf
R3,608 Discovery Miles 36 080 Ships in 10 - 15 working days

Since Efron's profound paper on the bootstrap, an enormous amount of effort has been spent on the development of bootstrap, jackknife, and other resampling methods. The primary goal of these computer-intensive methods has been to provide statistical tools that work in complex situations without imposing unrealistic or unverifiable assumptions about the data generating mechanism. This book sets out to lay some of the foundations for subsampling methodology and related methods.
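
To illustrate the resampling family (bootstrap, jackknife, subsampling) named in the blurb, here is a minimal delete-one jackknife sketch in Python for the standard error of a sample mean. The simulated data are an assumption and the code is not taken from the book.

```python
# Delete-one jackknife standard error for the sample mean. For the mean this
# reproduces the classical s/sqrt(n), which makes it a convenient sanity check.
import numpy as np

rng = np.random.default_rng(5)
data = rng.normal(loc=10.0, scale=3.0, size=100)    # simulated data (assumption)
n = data.size

# leave-one-out estimates of the mean
total = data.sum()
loo_means = (total - data) / (n - 1)

# jackknife standard error: sqrt((n-1)/n * sum of squared deviations)
jack_se = np.sqrt((n - 1) / n * ((loo_means - loo_means.mean())**2).sum())
print(f"jackknife SE {jack_se:.4f} vs classical SE {data.std(ddof=1)/np.sqrt(n):.4f}")
```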
