The implications for philosophy and cognitive science of developments in statistical learning theory. In Reliable Reasoning, Gilbert Harman and Sanjeev Kulkarni, a philosopher and an engineer, argue that philosophy and cognitive science can benefit from statistical learning theory (SLT), the theory that lies behind recent advances in machine learning. The philosophical problem of induction, for example, is in part about the reliability of inductive reasoning, where the reliability of a method is measured by its statistically expected percentage of errors, a central topic in SLT. After discussing philosophical attempts to evade the problem of induction, Harman and Kulkarni provide an admirably clear account of the basic framework of SLT and its implications for inductive reasoning. They explain the Vapnik-Chervonenkis (VC) dimension of a set of hypotheses and distinguish two kinds of inductive reasoning. The authors discuss various topics in machine learning, including nearest-neighbor methods, neural networks, and support vector machines. Finally, they describe transductive reasoning and propose possible new models of human reasoning inspired by developments in SLT.
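To give a concrete feel for one of the machine-learning methods the book surveys, here is a minimal sketch of a k-nearest-neighbor classifier in Python. The function name, the toy data, and the choice of k are illustrative assumptions, not material from the book itself.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training example
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest training examples
    nearest = np.argsort(dists)[:k]
    # Majority vote over their labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy data: two small clusters in the plane (illustrative only)
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.2, 0.1])))  # -> 0
```

The method makes no parametric assumption about the data; its error behavior as the sample grows is exactly the kind of reliability question SLT addresses.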
Entropy, mutual information, and divergence measure the randomness, dependence, and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures. Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions, as well as their speed of convergence. Universal Estimation of Information Measures for Analog Sources provides a thorough review of an increasingly important topic in information theory. It will be of interest to students, practitioners, and researchers working in information theory.
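For a flavor of the kind of nonparametric algorithm the book surveys, below is a minimal sketch of the Kozachenko-Leonenko k-nearest-neighbor estimator of differential entropy, one well-known universal estimator for analog sources. The implementation follows the standard formulation of that estimator, not necessarily the book's presentation, and the sanity check at the end is an illustrative assumption.

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def kl_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats."""
    x = np.asarray(samples, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    tree = cKDTree(x)
    # Distance from each point to its k-th nearest neighbor
    # (the query returns the point itself at distance 0, hence k + 1)
    eps = tree.query(x, k=k + 1)[0][:, k]
    # Log volume of the unit ball in d dimensions
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # H_hat = psi(n) - psi(k) + log V_d + (d/n) * sum(log eps_i)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

# Sanity check: a standard normal has entropy 0.5 * log(2*pi*e) = 1.4189 nats
rng = np.random.default_rng(0)
print(kl_entropy(rng.standard_normal(5000)))
```

The estimator assumes no parametric model of the source, which is exactly the "universal" property at issue; its consistency and convergence rate under various conditions are the questions the book reviews.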