New Foundations for Information Theory - Logical Entropy and Shannon Entropy (Paperback, 1st ed. 2021)
Series: SpringerBriefs in Philosophy
This monograph offers a new foundation for information theory based on the notion of information-as-distinctions, measured directly by logical entropy, and on its re-quantification as Shannon entropy, the fundamental concept for the theory of coding and communications. Information rests on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image partition of a random variable, so they represent the pre-probability notion of information. Logical entropy is then a probability measure on the information sets: the probability that, on two independent trials, a distinction or "dit" of the partition will be obtained. The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts.
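As a concrete illustration (a minimal sketch, not taken from the book), the logical entropy of a finite distribution p = (p_1, ..., p_n) can be computed either as 1 - sum(p_i^2) or, equivalently, as the probability that two independent draws land in different blocks; the function names below are illustrative.

def logical_entropy(p):
    # h(p) = 1 - sum(p_i^2): one minus the probability that two
    # independent draws give the same outcome.
    return 1.0 - sum(pi * pi for pi in p)

def dit_probability(p):
    # Equivalent form: the probability that two independent draws
    # give different outcomes, i.e. yield a distinction ("dit").
    return sum(pi * pj for i, pi in enumerate(p)
               for j, pj in enumerate(p) if i != j)

p = [0.5, 0.25, 0.25]
print(logical_entropy(p))   # 0.625
print(dit_probability(p))   # 0.625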
As a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits, so the Shannon entropy is the average number of binary distinctions or bits necessary to make all the distinctions of the random variable.
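A hedged sketch of one reading of that dit-to-bit transform: both entropies are averages over the same distribution, with logical entropy averaging (1 - p_i) and Shannon entropy averaging log2(1/p_i), the number of binary distinctions needed to single out outcome i. The exact form of the transform here is an assumption based on the standard formulas, not quoted from the book.

import math

def logical_entropy(p):
    # Average of (1 - p_i): probability of a dit on two independent draws.
    return sum(pi * (1.0 - pi) for pi in p)

def shannon_entropy(p):
    # Average of log2(1/p_i): average number of binary distinctions (bits).
    return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(logical_entropy(p))   # 0.625
print(shannon_entropy(p))   # 1.5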
Using a linearization method, all the set concepts in this logical information theory extend naturally to vector spaces in general, and to Hilbert spaces in particular, yielding a quantum logical information theory that provides the natural measure of the distinctions made in quantum measurement. Relatively short but dense in content, this work can serve as a reference for researchers and graduate students working in information theory, maximum entropy methods in physics, engineering, and statistics, and for all those with a special interest in a new approach to quantum information theory.
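For the quantum extension, the logical entropy of a density matrix rho is commonly taken to be 1 - tr(rho^2), the complement of the purity, which reduces to 1 - sum(p_i^2) when rho is diagonal; the sketch below assumes that definition, and the two-level example states are purely illustrative.

import numpy as np

def quantum_logical_entropy(rho):
    # Assumed definition: h(rho) = 1 - tr(rho^2), the complement of purity.
    return 1.0 - float(np.trace(rho @ rho).real)

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # a pure state: no distinctions
mixed = np.eye(2) / 2.0                     # maximally mixed qubit
print(quantum_logical_entropy(pure))    # 0.0
print(quantum_logical_entropy(mixed))   # 0.5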