When the 50th anniversary of the birth of Information Theory was celebrated at the 1998 IEEE International Symposium on Information Theory in Boston, there was a great deal of reflection on the year 1993 as a critical year. As the years pass and more perspective is gained, it is a fairly safe bet that we will view 1993 as the year when the "early years" of error control coding came to an end. This was the year in which Berrou, Glavieux, and Thitimajshima presented "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes" at the International Conference on Communications in Geneva. In their presentation, Berrou et al. claimed that a combination of parallel concatenation and iterative decoding could provide reliable communications at a signal-to-noise ratio within a few tenths of a dB of the Shannon limit. Nearly fifty years of striving to achieve the promise of Shannon's noisy channel coding theorem had come to an end. The implications of this result were immediately apparent to all: coding gains on the order of 10 dB could be used to dramatically extend the range of communication receivers, increase data rates and services, or substantially reduce transmitter power levels. The 1993 ICC paper set in motion several research efforts that have permanently changed the way we look at error control coding.
Fundamentals of Codes, Graphs, and Iterative Decoding is an explanation of how to introduce local connectivity and how to exploit simple structural descriptions. Chapter 1 provides an overview of Shannon theory and the basic tools of complexity theory, communication theory, and bounds on code construction. Chapters 2-4 provide an overview of "classical" error control coding, with an introduction to abstract algebra, and block and convolutional codes. Chapters 5-9 then proceed to systematically develop the key research results of the 1990s and early 2000s, with an introduction to graph theory followed by chapters on algorithms on graphs, turbo error control, low-density parity-check codes, and low-density generator codes.
Cellular technology has always been a surveillance technology, but
"cellular convergence" - the growing trend for all forms of
communication to consolidate onto the cellular handset - has
dramatically increased the impact of that surveillance. In Cellular
Convergence and the Death of Privacy, Stephen Wicker explores this
unprecedented threat to privacy from three distinct but overlapping
perspectives: the technical, the legal, and the social. Professor
Wicker first describes cellular technology and cellular
surveillance using language accessible to non-specialists. He then
examines current legislation and Supreme Court jurisprudence that
form the framework for discussions about rights in the context of
cellular surveillance. Lastly, he addresses the social impact of
surveillance on individual users. The story he tells is one of a
technology that is changing the face of politics and economics, but
in ways that remain highly uncertain.
This volume comprises a collection of papers presented at the
Workshop on Information Protection, held in Moscow, Russia, in
December 1993. The 16 thoroughly refereed papers by internationally
known scientists selected for this volume offer an exciting
perspective on error control coding, cryptology, and speech
compression. In the former Soviet Union, research related to
information protection was often shielded from the international
scientific community. Therefore, the results presented by Russian
researchers and engineers at this first international workshop on
this topic are of particular interest; their work defines the
cutting edge of research in many areas of error control,
cryptology, and speech compression.