This book introduces readers to essential tools for the measurement
and analysis of information loss in signal processing systems.
Employing a new information-theoretic systems theory, the book
analyzes various systems in the signal processing engineer's
toolbox: polynomials, quantizers, rectifiers, linear filters with
and without quantization effects, principal components analysis,
multirate systems, etc. The user benefit of signal processing is
further highlighted with the concept of relevant information loss.

Signal or data processing operates on the physical representation
of information so that users can easily access and extract that
information. However, a fundamental theorem in information theory,
the data processing inequality, states that deterministic
processing always involves information loss.
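
For reference, the data processing inequality invoked above has a
standard textbook formulation (not specific to this book): if the
random variables X, Y, and Z form a Markov chain, then processing
cannot increase information, i.e.,

  X \to Y \to Z \quad \Longrightarrow \quad I(X;Z) \le I(X;Y).

In particular, for a deterministic system Y = g(X), the output Y
can never tell us more about a quantity of interest than the input
X does.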

The information-loss measures developed in the book form the basis
of a new information-theoretic systems theory, which complements
the currently prevailing approaches based on second-order
statistics, such as the mean-squared error or error energy. This
theory not only provides a deeper understanding but also extends
the design space for the applied engineer with a wide range of
methods rooted in information theory, adding to existing methods
based on energy or quadratic representations.
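
As a concrete and deliberately simple illustration of the kind of
quantity the book studies, the sketch below computes the absolute
information loss of a quantizer acting on a discrete source. It is
a minimal example, not code from the book: the 8-level source
distribution and the 2:1 quantizer are made up for the demo. It
relies on the fact that for a deterministic map Y = g(X) of a
discrete X, the loss H(X|Y) equals H(X) - H(Y).

import numpy as np

def entropy(pmf):
    """Shannon entropy in bits of a probability mass function."""
    pmf = np.asarray(pmf, dtype=float)
    pmf = pmf[pmf > 0]
    return float(-np.sum(pmf * np.log2(pmf)))

# Hypothetical discrete source: X takes values 0..7, non-uniformly.
x_values = np.arange(8)
p_x = np.array([0.25, 0.20, 0.15, 0.12, 0.10, 0.08, 0.06, 0.04])

# Deterministic system: a coarse quantizer mapping 0..7 onto 4 levels.
def quantize(x):
    return x // 2

# Push the distribution of X through the quantizer to get the pmf of Y.
y_values = np.unique(quantize(x_values))
p_y = np.array([p_x[quantize(x_values) == y].sum() for y in y_values])

h_x = entropy(p_x)
h_y = entropy(p_y)
print(f"H(X) = {h_x:.3f} bit, H(Y) = {h_y:.3f} bit, "
      f"information loss H(X|Y) = {h_x - h_y:.3f} bit")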