Concentration inequalities have been the subject of exciting
developments during the last two decades, and have been intensively
studied and used as a powerful tool in various areas. These include
convex geometry, functional analysis, statistical physics,
mathematical statistics, pure and applied probability theory,
information theory, theoretical computer science, learning theory,
and dynamical systems. This book focuses on some of the key modern
mathematical tools that are used for the derivation of
concentration inequalities, on their links to information theory,
and on their various applications to communications and coding. In
addition to being a survey, it also includes various recent
results derived by the authors. This third edition of the
bestselling book introduces the reader to the martingale method and
the Efron-Stein-Steele inequalities in completely new sections. A
new application to lossless source coding with side information is
described in detail. Finally, the references have been updated, and
works published since the original edition have been added.
Concentration of Measure Inequalities in Information
Theory, Communications, and Coding is essential reading for all
researchers and scientists in information theory and coding.
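To give a concrete flavor of the concentration bounds the book surveys, the following is a minimal illustrative sketch, not taken from the book: it empirically checks Hoeffding's inequality, P(|S_n/n - p| >= t) <= 2*exp(-2*n*t^2), for a Bernoulli sample mean. The parameter values (n, t, number of trials) are arbitrary choices for the example.

```python
import random
import math

# Illustrative sketch (not from the book): empirically checking
# Hoeffding's inequality for sums of n i.i.d. Bernoulli(p) variables.
random.seed(0)
n, p, t, trials = 200, 0.5, 0.1, 5000

deviations = 0
for _ in range(trials):
    s = sum(random.random() < p for _ in range(n))  # Bernoulli sum
    if abs(s / n - p) >= t:
        deviations += 1

empirical = deviations / trials               # observed deviation rate
hoeffding_bound = 2 * math.exp(-2 * n * t * t)  # 2*exp(-2*n*t^2) ~ 0.037
print(empirical <= hoeffding_bound)
```

The empirical deviation rate typically falls far below the exponential bound, which is the qualitative behavior these inequalities capture.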
This second edition includes several new sections and provides a
full update of all sections. When first published, this book was
welcomed as an important, comprehensive treatment of the subject,
which is now brought fully up to date. Concentration
inequalities have been the subject of exciting developments during
the last two decades, and have been intensively studied and used as
a powerful tool in various areas. These include convex geometry,
functional analysis, statistical physics, mathematical statistics,
pure and applied probability theory (e.g., concentration of measure
phenomena in random graphs, random matrices, and percolation),
information theory, theoretical computer science, learning theory,
and dynamical systems. Concentration of Measure Inequalities in
Information Theory, Communications, and Coding focuses on some of
the key modern mathematical tools that are used for the derivation
of concentration inequalities, on their links to information
theory, and on their various applications to communications and
coding. In addition to being a survey, this monograph also includes
various recent results derived by the authors. It is essential
reading for all researchers and scientists in information theory
and coding.
This book focuses on the performance evaluation of linear codes
under optimal maximum-likelihood (ML) decoding. Though the ML
decoding algorithm is prohibitively complex for most practical
codes, analyzing their performance under ML decoding makes it
possible to predict their performance without resorting to computer
simulations. It also provides a benchmark for assessing the
sub-optimality of iterative (or other practical) decoding
algorithms. This analysis also establishes the goodness of linear
codes (or ensembles), as measured by the gap between their
achievable rates under optimal ML decoding and the
information-theoretic limits. In this book, upper and lower bounds on the
error probability of linear codes under ML decoding are surveyed
and applied to codes and ensembles of codes on graphs. For upper
bounds, the authors discuss various bounds with a focus on
Gallager bounding techniques and their relation to a variety of
other reported bounds. Within the class of lower bounds, they
address bounds based on de Caen's inequality and their
improvements, and also consider sphere-packing bounds with their
recent improvements targeting codes of moderate block lengths.
Performance Analysis of
Linear Codes under Maximum-Likelihood Decoding is a comprehensive
introduction to this important topic for students, practitioners
and researchers working in communications and information theory.
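To make the complexity point above concrete: for a binary linear code with k information bits, brute-force ML decoding over a binary symmetric channel reduces to a nearest-codeword search in Hamming distance over all 2^k codewords. The following minimal sketch (the (7,4) Hamming code and the function names are illustrative choices, not the book's notation) shows why this enumeration becomes infeasible as k grows.

```python
import itertools

# Illustrative sketch: brute-force ML decoding of a binary linear code
# over a BSC (crossover probability < 1/2), where ML decoding is
# equivalent to minimum-Hamming-distance decoding.

G = [  # generator matrix of the (7,4) Hamming code, systematic form
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(msg):
    # Codeword = msg . G over GF(2)
    return [sum(m * g for m, g in zip(msg, col)) % 2
            for col in zip(*G)]

def ml_decode(received):
    # Enumerate all 2^k codewords -- the step that makes ML decoding
    # prohibitively complex for practical code dimensions.
    return min((encode(m) for m in itertools.product([0, 1], repeat=4)),
               key=lambda c: sum(a != b for a, b in zip(c, received)))

r = [1, 1, 1, 1, 0, 1, 0]  # codeword [1,0,1,1,0,1,0] with one bit flipped
print(ml_decode(r))        # recovers the transmitted codeword
```

Since the (7,4) Hamming code has minimum distance 3, the single flipped bit is corrected; for a code with k = 100, the same search would visit 2^100 codewords, which is exactly why the bounding techniques surveyed in the book are needed.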