Statistical learning theory is aimed at analyzing complex data
with necessarily approximate models. This book is intended for an
audience with a graduate background in probability theory and
statistics. It will be useful to any reader wondering why it may be
a good idea, as is often done in practice, to use a notoriously
"wrong" (i.e. over-simplified) model to predict, estimate or
classify. This point of view takes its roots in three fields:
information theory, statistical mechanics, and PAC-Bayesian
theorems. Results on the large deviations of trajectories of Markov
chains with rare transitions are also included. They are meant to
provide a better understanding of the stochastic optimization
algorithms commonly used to compute estimators. The author
focuses on non-asymptotic bounds on the statistical risk, allowing
one to choose adaptively between rich and structured families of
models and their corresponding estimators. Two mathematical objects
pervade the book: entropy and Gibbs measures. The goal is to show
how to turn them into versatile and efficient technical tools that
will stimulate further studies and results.
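To give a flavour of how these two objects interact, here is a standard PAC-Bayesian identity, sketched for illustration rather than quoted from the book: given a prior distribution \(\pi\) on a parameter set, an empirical risk \(r(\theta)\), and an inverse temperature \(\beta > 0\), the Gibbs measure

\[
\pi_\beta(d\theta) \;\propto\; \exp\{-\beta\, r(\theta)\}\, \pi(d\theta)
\qquad\text{satisfies}\qquad
\pi_\beta \;=\; \arg\min_{\rho}\Big\{\beta \int r\, d\rho \;+\; \mathcal{K}(\rho,\pi)\Big\},
\]

where \(\mathcal{K}(\rho,\pi)\) is the Kullback-Leibler divergence (relative entropy) of the posterior \(\rho\) with respect to \(\pi\). This variational formula is the basic mechanism by which entropy and Gibbs measures combine to yield non-asymptotic risk bounds of the kind the book studies.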
General
Imprint: Springer-Verlag
Country of origin: Germany
Series: École d'Été de Probabilités de Saint-Flour, 1851
Release date: August 2004
First published: 2004
Editors: Jean Picard
Authors: Olivier Catoni
Dimensions: 235 x 155 x 15mm (L x W x T)
Format: Paperback
Pages: 284
Edition: 2004 ed.
ISBN-13: 978-3-540-22572-0
Categories: Books > Science & Mathematics > Mathematics > Probability & statistics
LSN: 3-540-22572-2
Barcode: 9783540225720