This book is about the definition of the Shannon Measure of
Information (SMI) and some derived quantities, such as conditional
information and mutual information. Unlike many books, which refer
to the SMI as 'Entropy,' this book makes a clear distinction between
the SMI and entropy. In the last chapter, entropy is derived as a
special case of the SMI. Ample examples are provided to help the
reader understand the different concepts discussed in this book.

As with previous books by the author, this book aims at a clear and
mystery-free presentation of the central concept of information
theory: the Shannon Measure of Information. It presents the
fundamental concepts of information theory in simple, friendly
language, free of the fancy and pompous statements common in
popular-science books on this subject. It is unique in its
presentation of Shannon's measure of information and in the clear
distinction it draws between this concept and the thermodynamic
entropy. Although some mathematical knowledge is required of the
reader, the emphasis is on the concepts and their meaning rather
than on the mathematical details of the theory.
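For readers who want a concrete point of reference (this example is
not taken from the book), the SMI of a discrete distribution
p_1, ..., p_n is -sum_i p_i log2 p_i, measured in bits. The sketch
below computes it for a small made-up joint distribution, along with
the conditional and mutual information mentioned above.

```python
# Illustrative sketch (not from the book): the Shannon Measure of
# Information (SMI) of a discrete distribution, plus the derived
# quantities mentioned in the blurb, for a hypothetical example.
import math

def smi(probs):
    """Shannon Measure of Information: -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

h_x = smi(px)                 # SMI of X alone, H(X)
h_xy = smi(joint.values())    # SMI of the joint distribution, H(X,Y)
h_x_given_y = h_xy - smi(py)  # conditional information H(X|Y)
mutual = h_x - h_x_given_y    # mutual information I(X;Y)

print(f"H(X)   = {h_x:.4f} bits")        # 1.0000
print(f"H(X|Y) = {h_x_given_y:.4f} bits")  # 0.7219
print(f"I(X;Y) = {mutual:.4f} bits")       # 0.2781
```

Note that I(X;Y) is positive here because the chosen joint
distribution makes X and Y correlated; for independent variables it
would be zero.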