This book presents Statistical Learning Theory in a detailed and easy-to-understand way, using practical examples, algorithms and source code. It can be used as a textbook in graduate or undergraduate courses, for self-learners, or as a reference on the main theoretical concepts of Machine Learning. Fundamental concepts of Linear Algebra and Optimization applied to Machine Learning are provided, along with source code in R, making the book as self-contained as possible. It starts with an introduction to Machine Learning concepts and algorithms such as the Perceptron, the Multilayer Perceptron and Distance-Weighted Nearest Neighbors, with examples, giving the reader the foundation needed to understand the Bias-Variance Dilemma, the central point of Statistical Learning Theory. Afterwards, the assumptions of Statistical Learning Theory are introduced and formalized, allowing the practical study of different classification algorithms. The book then proceeds through concentration inequalities until arriving at the Generalization and Large-Margin bounds, which provide the main motivation for Support Vector Machines. From there, it introduces the optimization concepts needed to implement Support Vector Machines. As a next stage of development, the book closes with a discussion of SVM kernels as a way to study data spaces and improve classification results.
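To give a flavor of the introductory material the description mentions, here is a minimal sketch of the classic Perceptron learning rule. The book's own examples are in R; this sketch uses Python/NumPy instead, and the data, learning rate, and epoch count are illustrative assumptions, not taken from the book.

```python
import numpy as np

def perceptron(X, y, lr=0.1, epochs=100):
    """Train a simple perceptron on labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])  # weight vector
    b = 0.0                   # bias term
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only when the current sample is misclassified.
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy linearly separable data (hypothetical example).
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
preds = np.sign(X @ w + b)
```

On linearly separable data such as this, the Perceptron convergence theorem guarantees the rule finds a separating hyperplane; the margin-based bounds the book later develops for SVMs sharpen this idea.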