Showing 1 - 2 of 2 matches in All Departments

Reliable Reasoning - Induction and Statistical Learning Theory (Paperback)
Gilbert Harman, Sanjeev Kulkarni
R799 (Discovery Miles 7 990) · Ships in 10 - 15 working days

The implications for philosophy and cognitive science of developments in statistical learning theory. In Reliable Reasoning, Gilbert Harman and Sanjeev Kulkarni, a philosopher and an engineer, argue that philosophy and cognitive science can benefit from statistical learning theory (SLT), the theory that lies behind recent advances in machine learning. The philosophical problem of induction, for example, is in part about the reliability of inductive reasoning, where the reliability of a method is measured by its statistically expected percentage of errors, a central topic in SLT. After discussing philosophical attempts to evade the problem of induction, Harman and Kulkarni provide an admirably clear account of the basic framework of SLT and its implications for inductive reasoning. They explain the Vapnik-Chervonenkis (VC) dimension of a set of hypotheses and distinguish two kinds of inductive reasoning. The authors discuss various topics in machine learning, including nearest-neighbor methods, neural networks, and support vector machines. Finally, they describe transductive reasoning and propose new models of human reasoning inspired by developments in SLT.
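
To give a concrete sense of the VC dimension the blurb mentions, here is a minimal sketch (our illustration, not code or notation from the book) showing that one-sided threshold classifiers on the real line shatter any single point but no pair of points, so their VC dimension is 1. The helper names are hypothetical:

```python
# Sketch: VC dimension of one-sided threshold classifiers on the line,
# h_t(x) = 1 if x >= t, else 0.  Illustrative only; not the authors' code.

def threshold_labelings(points):
    """All label patterns that thresholds h_t can realize on `points`."""
    patterns = set()
    # It suffices to try one threshold below every point, one just above
    # each point, and one above every point.
    candidates = ([min(points) - 1.0]
                  + [p + 1e-9 for p in points]
                  + [max(points) + 1.0])
    for t in candidates:
        patterns.add(tuple(1 if x >= t else 0 for x in points))
    return patterns

def shatters(points):
    """True if thresholds realize all 2^n labelings of `points`."""
    return len(threshold_labelings(points)) == 2 ** len(points)

# Any single point is shattered, but no pair is: the labeling that marks
# the smaller point 1 and the larger point 0 is unrealizable.  Hence the
# VC dimension of this hypothesis class is 1.
print(shatters([0.0]))        # True
print(shatters([0.0, 1.0]))   # False
```

In SLT, a finite VC dimension like this is what guarantees uniform convergence of empirical error rates, the sense of "reliability" the book builds on.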

Universal Estimation of Information Measures for Analog Sources (Paperback)
Qing Wang, Sanjeev Kulkarni, Sergio Verdu
R1,969 (Discovery Miles 19 690) · Ships in 10 - 15 working days

Entropy, mutual information, and divergence measure the randomness, dependence, and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures. Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions, as well as their speed of convergence. This review of an increasingly important topic will be of interest to students, practitioners, and researchers working in information theory.
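
As a toy illustration of the kind of nonparametric estimation the book surveys, here is a histogram plug-in estimator of differential entropy for a memoryless real-valued source. This is our simplified sketch, not an algorithm taken from the book; the function name and bin count are illustrative choices:

```python
import math
import random

# Plug-in (histogram) estimator of differential entropy, in nats, for
# i.i.d. real-valued samples.  Illustrative sketch, not the book's method.

def histogram_entropy(samples, num_bins):
    """Partition the sample range into equal-width bins, estimate the
    density in each bin as count / (n * width), and return the plug-in
    estimate -sum_i p_i * log(density_i)."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / num_bins or 1.0  # guard against a zero range
    counts = [0] * num_bins
    for x in samples:
        i = min(int((x - lo) / width), num_bins - 1)
        counts[i] += 1
    n = len(samples)
    return -sum((c / n) * math.log(c / (n * width))
                for c in counts if c > 0)

random.seed(0)
uniform = [random.uniform(0.0, 2.0) for _ in range(20000)]
# True differential entropy of Uniform(0, 2) is log 2, about 0.693 nats;
# the estimate should land close to that value.
print(histogram_entropy(uniform, 50))
```

Consistency results of the sort the book reviews say when such estimates converge to the true value as the sample size grows, and how the bin width (or other smoothing parameter) must shrink with n for that to happen.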
