This monograph demonstrates a new approach to the classical mode
decomposition problem through nonlinear regression models, which
achieve near-machine precision in the recovery of the modes. The
presentation includes a review of generalized additive models,
additive kernels/Gaussian processes, generalized Tikhonov
regularization, empirical mode decomposition, and Synchrosqueezing,
which are all related to and generalizable under the proposed
framework. Although kernel methods have strong theoretical
foundations, they require the prior selection of a good kernel.
While the usual approach to kernel selection is hyperparameter
tuning, the objective of this monograph is to present an
alternative, programming-based approach, using mode decomposition
as a prototypical pattern recognition problem. In this approach,
kernels are programmed for the task at hand through interpretable
regression networks built in the context of additive Gaussian
processes. The book is suitable for engineers, computer
scientists, mathematicians, and students in these fields working on
kernel methods, pattern recognition, and mode decomposition
problems.
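
To give a flavor of the approach, the following Python sketch
(not the book's code; the kernel choices, frequencies, and names
are illustrative assumptions) decomposes a two-mode signal with an
additive Gaussian process: each mode gets its own periodic kernel,
and each is recovered as its conditional mean given the observed
sum. It is a toy illustration and will not reach the
near-machine-precision recovery the monograph demonstrates.

import numpy as np

def periodic_kernel(t, s, freq, length=1.0):
    # Standard periodic kernel; spectral mass at harmonics of `freq`.
    d = np.subtract.outer(t, s)
    return np.exp(-2.0 * np.sin(np.pi * freq * d) ** 2 / length ** 2)

t = np.linspace(0.0, 1.0, 400)
mode1 = np.cos(2.0 * np.pi * 3.0 * t)         # low-frequency mode
mode2 = 0.5 * np.cos(2.0 * np.pi * 11.0 * t)  # high-frequency mode
signal = mode1 + mode2

K1 = periodic_kernel(t, t, freq=3.0)    # kernel programmed for mode 1
K2 = periodic_kernel(t, t, freq=11.0)   # kernel programmed for mode 2
jitter = 1e-8 * np.eye(t.size)          # numerical regularization

# Additive GP: signal ~ GP(0, K1 + K2); each mode's conditional mean
# is E[mode_i | signal] = K_i (K1 + K2 + jitter)^{-1} signal.
w = np.linalg.solve(K1 + K2 + jitter, signal)
rec1, rec2 = K1 @ w, K2 @ w

print("max error, mode 1:", np.max(np.abs(rec1 - mode1)))
print("max error, mode 2:", np.max(np.abs(rec2 - mode2)))
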
The topic of Uncertainty Quantification (UQ) has witnessed massive
developments in response to the promise of achieving risk
mitigation through scientific prediction. This has led to the
integration of ideas from mathematics, statistics, and engineering,
used not only to lend credence to predictive assessments of risk
but also to design actions (by engineers, scientists, and
investors) that are consistent with risk aversion. The objective of
this Handbook is to facilitate the dissemination of forefront UQ
ideas to its audiences. We recognize that these audiences are
varied, with interests ranging from theory to application, and from
research to development and even execution.
Although numerical approximation and statistical inference are
traditionally covered as entirely separate subjects, they are
intimately connected through the common purpose of making
estimates from partial information. This book explores these
connections from a game and decision theoretic perspective, showing
how they constitute a pathway to developing simple and general
methods for solving fundamental problems in both areas. It
illustrates these interplays by addressing problems related to
numerical homogenization, operator adapted wavelets, fast solvers,
and Gaussian processes. This perspective reveals much of the
essential anatomy of these areas and greatly facilitates advances
in them, suggesting a general principle for guiding the process of
scientific discovery. The book is designed for graduate
students, researchers, and engineers in mathematics, applied
mathematics, and computer science, and particularly researchers
interested in drawing on and developing this interface between
approximation, inference, and learning.
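
As a taste of that interplay, the short Python sketch below (an
illustration under assumed choices: the Matern-3/2 kernel, the test
function, and all names are mine, not the book's worked examples)
interpolates pointwise data with a Gaussian process. The posterior
mean it computes is, at the same time, the minimum-RKHS-norm
interpolant, i.e. the minimax-optimal numerical recovery of the
function from that data.

import numpy as np

def matern32(x, y, rho=0.3):
    # Matern-3/2 kernel; its RKHS hosts the optimal-recovery problem.
    r = np.abs(np.subtract.outer(x, y)) / rho
    return (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

def f(x):
    # "Unknown" function to be recovered from a few point values.
    return np.sin(2.0 * np.pi * x) + 0.3 * x

x_obs = np.linspace(0.0, 1.0, 9)     # pointwise measurements
x_new = np.linspace(0.0, 1.0, 200)   # evaluation grid

# Statistical view: posterior mean of a GP conditioned on f(x_obs).
# Numerical view: the identical formula is the minimum-norm
# interpolant of the data in the kernel's RKHS.
K = matern32(x_obs, x_obs) + 1e-10 * np.eye(x_obs.size)
recovery = matern32(x_new, x_obs) @ np.linalg.solve(K, f(x_obs))

print("worst-case error on the grid:", np.max(np.abs(recovery - f(x_new))))
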