Kernel methods are among the most popular techniques in machine learning. From a regularization theory perspective, they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a probabilistic perspective, they are central to Gaussian processes, where the kernel function is known as the covariance function. The theory of kernel methods for single-valued functions is by now well established, and a considerable amount of work has been devoted to designing and learning kernels. More recently, there has been increasing interest in methods that deal with multiple outputs, motivated in part by frameworks such as multitask learning. Applications of kernels for vector-valued functions include sensor networks, geostatistics, computer graphics, and several other areas. Kernels for Vector-Valued Functions: A Review examines different methods for designing or learning valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and regularization methods. It is aimed at researchers with an interest in the theory and application of kernels for vector-valued functions in areas such as statistics, computer science, and engineering. One of its goals is to provide a unified framework and a common terminology for researchers working in machine learning and statistics.
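To make the notion of a kernel for vector-valued functions concrete, a standard construction in this literature is the separable kernel, where a scalar kernel k(x, x') over inputs is combined with a positive semidefinite matrix B coupling the outputs, giving K(x, x') = k(x, x') B. The following NumPy sketch illustrates this; the RBF base kernel and the specific coupling matrix B are illustrative choices, not taken from the review itself:

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    """Scalar RBF (Gaussian) kernel matrix between two sets of inputs."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-0.5 * sq / lengthscale**2)

def separable_kernel(X1, X2, B, lengthscale=1.0):
    """Matrix-valued kernel K(x, x') = k(x, x') * B via a Kronecker product.

    B (D x D, positive semidefinite) couples the D outputs; the result is
    an (N1*D) x (N2*D) covariance matrix over all input/output pairs, usable
    as the covariance function of a multi-output Gaussian process.
    """
    return np.kron(rbf(X1, X2, lengthscale), B)

# Two coupled outputs evaluated at three inputs.
X = np.random.default_rng(0).normal(size=(3, 1))
B = np.array([[1.0, 0.5],
              [0.5, 1.0]])  # illustrative output-coupling matrix
K = separable_kernel(X, X, B)
print(K.shape)  # (6, 6)
```

Because both factors are positive semidefinite, the Kronecker product K is a valid covariance matrix over all input/output pairs, which is what makes separable kernels a convenient starting point for multi-output models.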