"Every mathematical discipline goes through three periods of development: the naive, the formal, and the critical."
David Hilbert

The goal of this book is to explain the principles that made support vector machines (SVMs) a successful modeling and prediction tool for a variety of applications. We try to achieve this by presenting the basic ideas of SVMs together with the latest developments and current research questions in a unified style. In a nutshell, we identify at least three reasons for the success of SVMs: their ability to learn well with only a very small number of free parameters, their robustness against several types of model violations and outliers, and last but not least their computational efficiency compared with several other methods. Although there are several roots and precursors of SVMs, these methods gained particular momentum during the last 15 years since Vapnik (1995, 1998) published his well-known textbooks on statistical learning theory with a special emphasis on support vector machines. Since then, the field of machine learning has witnessed intense activity in the study of SVMs, which has spread more and more to other disciplines such as statistics and mathematics. Thus it seems fair to say that several communities are currently working on support vector machines and on related kernel-based methods. Although there are many interactions between these communities, we think that there is still room for additional fruitful interaction and would be glad if this textbook were found helpful in stimulating further research. Many of the results presented in this book have previously been scattered in the journal literature or are still under review. As a consequence, these results have been accessible only to a relatively small number of specialists, sometimes probably only to people from one community but not the others.
|