This volume presents the latest advances and trends in
nonparametric statistics, and gathers selected and peer-reviewed
contributions from the 3rd Conference of the International Society
for Nonparametric Statistics (ISNPS), held in Avignon, France on
June 11-16, 2016. It covers a broad range of nonparametric
statistical methods, from density estimation, survey sampling,
resampling methods, kernel methods and extreme values, to
statistical learning and classification, both in the standard
i.i.d. case and for dependent data, including big data. The
International Society for Nonparametric Statistics is uniquely
global, and its international conferences are intended to foster
the exchange of ideas and the latest advances among researchers
from around the world, in cooperation with established statistical
societies such as the Institute of Mathematical Statistics, the
Bernoulli Society and the International Statistical Institute. The
3rd ISNPS conference in Avignon attracted more than 400 researchers
from around the globe, and contributed to the further development
and dissemination of nonparametric statistics knowledge.
This book gives an account of recent developments in the field of
probability and statistics for dependent data. It covers a wide
range of topics, from Markov chain theory and weak dependence, with
an emphasis on recent developments concerning dynamical systems, to
strong dependence in time series and random fields. There is also a
section on statistical estimation problems and specific
applications. The book is written as a succession of papers by
specialists in the field, alternating general surveys, mostly at a
level accessible to graduate students in probability and statistics,
with research papers mainly suitable for researchers in the field.
INTRODUCTION

1) Introduction

In 1979, Efron introduced the bootstrap method as a kind of
universal tool to obtain an approximation of the distribution of
statistics. The now well-known underlying idea is the following:
consider a sample X_1, ..., X_n of independent and identically
distributed (i.i.d.) random variables (r.v.'s) with unknown
probability measure (p.m.) P. Assume we are interested in
approximating the distribution of a statistical functional T(P_n),
the empirical counterpart of the functional T(P), where
P_n := n^{-1} \sum_{i=1}^{n} \delta_{X_i} is the empirical p.m.
Since in some sense P_n is close to P when n is large, if one
samples X_1^*, ..., X_{m_n}^* i.i.d. from P_n and builds the
empirical p.m. P*_{n,m_n} := m_n^{-1} \sum_{i=1}^{m_n} \delta_{X_i^*},
then the behaviour of T(P*_{n,m_n}) conditionally on P_n should
imitate that of T(P_n) when n and m_n get large. This idea has led
to considerable investigation into when it is correct and when it is
not. When it is not, one looks for a way to adapt it.
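The resampling scheme described above can be sketched in a few lines of code. This is a minimal illustration, not the text's own implementation: the function names, the choice of the mean as the statistic, and the example data are all assumptions made for the sketch. Each bootstrap sample of size m_n is drawn i.i.d. from the empirical measure P_n, and the statistic is recomputed on each resample.

```python
import random

def bootstrap_distribution(sample, statistic, n_resamples=1000, m=None, seed=0):
    """Approximate the distribution of statistic(P_n) by resampling.

    Draws n_resamples bootstrap samples of size m (default: len(sample))
    i.i.d. from the empirical measure P_n, i.e. uniformly with
    replacement from the observed data, and evaluates the statistic
    on each resample.
    """
    rng = random.Random(seed)
    m = m or len(sample)
    return [statistic([rng.choice(sample) for _ in range(m)])
            for _ in range(n_resamples)]

def mean(xs):
    return sum(xs) / len(xs)

# Illustrative data; in practice X_1, ..., X_n come from the unknown P.
data = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9]
boot_means = bootstrap_distribution(data, mean)
```

The spread of `boot_means` then serves as a proxy for the sampling variability of the mean under P, which is exactly the imitation property the introduction describes and whose validity the subsequent literature investigates.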