This book introduces readers to essential tools for the measurement and analysis of information loss in signal processing systems. Employing a new information-theoretic systems theory, it analyzes various systems in the signal processing engineer's toolbox: polynomials, quantizers, rectifiers, linear filters with and without quantization effects, principal component analysis, multirate systems, and more. The user benefit of signal processing is further highlighted with the concept of relevant information loss. Signal or data processing operates on the physical representation of information so that users can easily access and extract that information. However, a fundamental theorem of information theory, the data-processing inequality, states that deterministic processing always involves information loss. Measures of this loss form the basis of a new information-theoretic systems theory, which complements the currently prevailing approaches based on second-order statistics, such as the mean-squared error or error energy. This theory not only provides a deeper understanding but also extends the design space for the applied engineer with a wide range of methods rooted in information theory, adding to existing methods based on energy or quadratic representations.
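The data-processing inequality mentioned above can be illustrated with a minimal sketch (not taken from the book): a deterministic, non-invertible map such as a 1-bit quantizer can only reduce the entropy of a discrete source, and the drop in entropy quantifies the information lost. The `entropy` helper below is a hypothetical name, computing the Shannon entropy of an empirical distribution.

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniformly distributed 3-bit source: H(X) = 3 bits.
x = list(range(8)) * 100

# Deterministic 1-bit quantizer: keep only the most significant bit.
# The map is many-to-one, so information is irreversibly lost.
y = [v >> 2 for v in x]

print(entropy(x))  # 3.0 bits
print(entropy(y))  # 1.0 bit, i.e. 2 bits of information lost
```

Here the information loss is H(X) - H(Y) = 2 bits, whereas a second-order measure such as quantization-error energy would describe the same operation in entirely different terms, which is the complementary perspective the book develops.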