Showing 1 - 16 of 16 matches in All Departments

Probability, Random Processes, and Ergodic Properties (Hardcover, 2nd ed. 2009)
Robert M. Gray
R4,383 Discovery Miles 43 830 Ships in 10 - 15 working days

Probability, Random Processes, and Ergodic Properties is for mathematically inclined information/communication theorists and people working in signal processing. It will also interest those working with random or stochastic processes, including mathematicians, statisticians, and economists.

Highlights:

A complete tour of the book and guidelines for its use are given in the Introduction, so readers can see at a glance the topics of interest.

Structures the mathematics for an engineering audience, with emphasis on engineering applications.

New in the Second Edition:

Much of the material has been rearranged and revised for pedagogical reasons.

The original first chapter has been split in order to allow a more thorough treatment of basic probability before tackling random processes and dynamical systems.

The final chapter has been broken into two pieces to provide separate emphasis on process metrics and the ergodic decomposition of affine functionals.

Many classic inequalities are now incorporated into the text, along with proofs; and many citations have been added.

Stochastic Image Processing (Hardcover, 2004 ed.)
Chee Sun Won, Robert M. Gray
R2,655 Discovery Miles 26 550 Ships in 18 - 22 working days

Stochastic Image Processing provides the first thorough treatment of Markov and hidden Markov random fields and their application to image processing. Although promoted as a promising approach for over thirty years, it has only been in the past few years that the theory and algorithms have developed to the point of providing useful solutions to old and new problems in image processing. Markov random fields are a multidimensional extension of Markov chains, but the generalization is complicated by the lack of a natural ordering of pixels in multidimensional spaces. Hidden Markov fields are a natural generalization of the hidden Markov models that have proved essential to the development of modern speech recognition, but again the multidimensional nature of the signals makes them inherently more complicated to handle. This added complexity contributed to the long time required for the development of successful methods and applications. This book collects together a variety of successful approaches to a complete and useful characterization of multidimensional Markov and hidden Markov models along with applications to image analysis. The book provides a survey and comparative development of an exciting and rapidly evolving field of multidimensional Markov and hidden Markov random fields with extensive references to the literature.
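The contrast drawn above between Markov chains and Markov random fields can be made concrete with a toy sketch (not from the book): a binary Ising-style field in which each pixel's conditional distribution depends only on its four neighbours. Because there is no natural pixel ordering, the field is simulated by Gibbs sampling sweeps rather than a chain-style forward pass. The model, the parameter `beta`, and the raster visiting order are illustrative assumptions.

```python
import math
import random

def gibbs_sweep(field, beta, rng):
    """One Gibbs sampling sweep over a binary (+1/-1) Markov random field.

    Each site's conditional distribution depends only on its 4-neighbours
    (an Ising-style model); lacking a natural ordering, we simply visit
    sites in raster order and resample each from its local conditional.
    """
    h, w = len(field), len(field[0])
    for i in range(h):
        for j in range(w):
            # Sum over the 4-neighbourhood (free boundary conditions).
            s = 0
            if i > 0: s += field[i - 1][j]
            if i < h - 1: s += field[i + 1][j]
            if j > 0: s += field[i][j - 1]
            if j < w - 1: s += field[i][j + 1]
            # P(x_ij = +1 | neighbours) under the Ising model.
            p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
            field[i][j] = 1 if rng.random() < p_plus else -1
    return field

rng = random.Random(0)
field = [[rng.choice([-1, 1]) for _ in range(8)] for _ in range(8)]
for _ in range(20):
    gibbs_sweep(field, beta=0.8, rng=rng)
```

With `beta > 0` neighbouring sites tend to agree, which is the smoothing prior that makes such fields useful in image models.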

Fourier Transforms - An Introduction for Engineers (Hardcover, 1995 ed.)
Robert M. Gray, Joseph W Goodman
R3,312 Discovery Miles 33 120 Ships in 18 - 22 working days

The Fourier transform is one of the most important mathematical tools in a wide variety of science and engineering fields. Its application - as Fourier analysis or harmonic analysis - provides useful decompositions of signals into fundamental ('primitive') components, giving shortcuts in the computation of complicated sums and integrals, and often revealing hidden structure in the data. Fourier Transforms: An Introduction for Engineers develops the basic definitions, properties and applications of Fourier analysis, the emphasis being on techniques for its application to linear systems, although other applications are also considered. The book will serve as both a reference text and a teaching text for a one-quarter or one-semester course covering the application of Fourier analysis to a wide variety of signals, including discrete time (or parameter), continuous time (or parameter), finite duration, and infinite duration. It highlights the common aspects in all cases considered, thereby building an intuition from simple examples that will be useful in the more complicated examples where careful proofs are not included. Fourier Transforms: An Introduction for Engineers is written by two scholars who are recognized throughout the world as leaders in this area, and provides a fresh look at one of the most important mathematical and directly applicable concepts in nearly all fields of science and engineering. Audience: Engineers, especially electrical engineers. The careful treatment of the fundamental mathematical ideas makes the book suitable in all areas where Fourier analysis finds applications.
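The decomposition described above can be sketched directly from the defining sum (a naive DFT, not material from the book; the signal and sizes are made up). A pure sinusoid at bin 3 concentrates its energy in the matching transform coefficient:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform: X[k] = sum_n x[n] e^{-2*pi*j*k*n/N}."""
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n) for m in range(n))
            for k in range(n)]

n = 16
# A real sinusoid completing exactly 3 cycles over the 16-sample window.
x = [cmath.cos(2 * cmath.pi * 3 * m / n).real for m in range(n)]
X = dft(x)
# |X[3]| = n/2 = 8 for this signal; coefficients at other bins (except the
# mirror bin 13) are essentially zero.
```

In practice one uses an FFT, which computes the same coefficients in O(n log n), but the sum above is the definition the fast algorithms implement.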

Vector Quantization and Signal Compression (Hardcover, 1992 ed.)
Allen Gersho, Robert M. Gray
R3,182 Discovery Miles 31 820 Ships in 10 - 15 working days

Herb Caen, a popular columnist for the San Francisco Chronicle, recently quoted a Voice of America press release as saying that it was reorganizing in order to "eliminate duplication and redundancy." This quote both states a goal of data compression and illustrates its common need: the removal of duplication (or redundancy) can provide a more efficient representation of data, and the quoted phrase is itself a candidate for such surgery. Not only can the number of words in the quote be reduced without losing information, but the statement would actually be enhanced by such compression, since it will no longer exemplify the wrong that the policy is supposed to correct. Here compression can streamline the phrase and minimize the embarrassment while improving the English style. Compression in general is intended to provide efficient representations of data while preserving the essential information contained in the data. This book is devoted to the theory and practice of signal compression, i.e., data compression applied to signals such as speech, audio, images, and video signals (excluding other data types such as financial data or general-purpose computer data). The emphasis is on the conversion of analog waveforms into efficient digital representations and on the compression of digital information into the fewest possible bits. Both operations should yield the highest possible reconstruction fidelity subject to constraints on the bit rate and implementation complexity.
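The core encode/decode operations of vector quantization can be sketched in a few lines (this toy codebook and signal are made up; real systems train the codebook, e.g. with the generalized Lloyd algorithm). Each input vector is replaced by the index of its nearest codeword, so a 4-word codebook costs only 2 bits per vector:

```python
def nearest(codebook, v):
    """Index of the codeword closest to v in squared Euclidean distance."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(codebook[i], v)))

def vq_encode(codebook, vectors):
    """Encode each input vector as the index of its nearest codeword."""
    return [nearest(codebook, v) for v in vectors]

def vq_decode(codebook, indices):
    """Reconstruct by looking the indices back up in the codebook."""
    return [codebook[i] for i in indices]

codebook = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0), (1.0, 0.0)]  # 4 words -> 2 bits/vector
signal = [(0.1, -0.2), (0.9, 1.1), (0.2, 0.8)]
indices = vq_encode(codebook, signal)          # -> [0, 1, 2]
approx = vq_decode(codebook, indices)          # lossy reconstruction
```

The reconstruction is lossy: fidelity depends on how well the codebook covers the source, which is exactly the design problem the book addresses.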

Source Coding Theory (Hardcover, 1990 ed.)
Robert M. Gray
R1,514 Discovery Miles 15 140 Ships in 18 - 22 working days

Source coding theory has as its goal the characterization of the optimal performance achievable in idealized communication systems which must code an information source for transmission over a digital communication or storage channel for transmission to a user. The user must decode the information into a form that is a good approximation to the original. A code is optimal within some class if it achieves the best possible fidelity given whatever constraints are imposed on the code by the available channel. In theory, the primary constraint imposed on a code by the channel is its rate or resolution, the number of bits per second or per input symbol that it can transmit from sender to receiver. In the real world, complexity may be as important as rate. The origins and the basic form of much of the theory date from Shannon's classical development of noiseless source coding and source coding subject to a fidelity criterion (also called rate-distortion theory) [73] [74]. Shannon combined a probabilistic notion of information with limit theorems from ergodic theory and a random coding technique to describe the optimal performance of systems with a constrained rate but with unconstrained complexity and delay. An alternative approach called asymptotic or high rate quantization theory based on different techniques and approximations was introduced by Bennett at approximately the same time [4]. This approach constrained the delay but allowed the rate to grow large.
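The rate-versus-fidelity trade-off at the heart of the theory can be seen with the simplest possible code (a uniform scalar quantizer on a made-up test waveform; an illustrative sketch, not the book's development): spending more bits per sample, i.e. log2(levels), buys lower distortion.

```python
import math

def uniform_quantize(x, levels, lo=-1.0, hi=1.0):
    """Map x to the nearest of `levels` uniformly spaced reproduction points."""
    step = (hi - lo) / levels
    idx = min(levels - 1, max(0, int((x - lo) / step)))
    return lo + (idx + 0.5) * step

def mse(xs, levels):
    """Mean squared error (distortion) of the quantizer on samples xs."""
    return sum((x - uniform_quantize(x, levels)) ** 2 for x in xs) / len(xs)

xs = [math.sin(0.1 * t) for t in range(200)]   # a test waveform in [-1, 1]
# Rate = log2(levels) bits per sample: 2 bits vs 6 bits.
d4, d64 = mse(xs, 4), mse(xs, 64)              # distortion falls as rate grows
```

High-rate quantization theory predicts roughly a 6 dB drop in distortion per added bit for such quantizers, which this sketch reproduces qualitatively.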

Entropy and Information Theory (Hardcover, 2nd ed. 2011)
Robert M. Gray
R4,723 Discovery Miles 47 230 Ships in 10 - 15 working days

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, codes, and information and distortion measures and their properties.

New in this edition:

Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes

Expanded discussion of results from ergodic theory relevant to information theory

Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources

New material on trading off information and distortion, including the Marton inequality

New material on the properties of optimal and asymptotically optimal source codes

New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotic mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
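The quantity the title refers to has a short definition worth recalling alongside the blurb (a standard formula, not an excerpt from the book): the Shannon entropy of a distribution, in bits.

```python
import math

def entropy_bits(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits (0 * log 0 := 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

h_fair = entropy_bits([0.5, 0.5])      # a fair coin: exactly 1 bit/symbol
h_biased = entropy_bits([0.9, 0.1])    # a biased coin carries less
```

Entropy lower-bounds the average rate of any lossless code for the source, which is why it anchors the source coding theorems discussed above.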

Image Segmentation and Compression Using Hidden Markov Models (Hardcover, 2000 ed.)
Jia Li, Robert M. Gray
R4,092 Discovery Miles 40 920 Ships in 18 - 22 working days

In the current age of information technology, the issues of distributing and utilizing images efficiently and effectively are of substantial concern. Solutions to many of the problems arising from these issues are provided by techniques of image processing, among which segmentation and compression are topics of this book. Image segmentation is a process for dividing an image into its constituent parts. For block-based segmentation using statistical classification, an image is divided into blocks and a feature vector is formed for each block by grouping statistics of its pixel intensities. Conventional block-based segmentation algorithms classify each block separately, assuming independence of feature vectors. Image Segmentation and Compression Using Hidden Markov Models presents a new algorithm that models the statistical dependence among image blocks by two dimensional hidden Markov models (HMMs). Formulas for estimating the model according to the maximum likelihood criterion are derived from the EM algorithm. To segment an image, optimal classes are searched jointly for all the blocks by the maximum a posteriori (MAP) rule. The 2-D HMM is extended to multiresolution so that more context information is exploited in classification and fast progressive segmentation schemes can be formed naturally. The second issue addressed in the book is the design of joint compression and classification systems using the 2-D HMM and vector quantization. A classifier designed with the side goal of good compression often outperforms one aimed solely at classification because overfitting to training data is suppressed by vector quantization. Image Segmentation and Compression Using Hidden Markov Models is an essential reference source for researchers and engineers working in statistical signal processing or image processing, especially those who are interested in hidden Markov models. It is also of value to those working on statistical modeling.
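The first step the blurb describes - dividing an image into blocks and forming a feature vector per block from pixel-intensity statistics - can be sketched as follows (the block size, the choice of mean/variance features, and the tiny test image are illustrative assumptions, not the book's exact formulation):

```python
def block_features(image, bs):
    """Split a 2-D intensity image into bs x bs blocks (raster order) and
    return one (mean, variance) feature vector per block."""
    h, w = len(image), len(image[0])
    feats = []
    for bi in range(0, h, bs):
        for bj in range(0, w, bs):
            pix = [image[i][j]
                   for i in range(bi, min(bi + bs, h))
                   for j in range(bj, min(bj + bs, w))]
            mean = sum(pix) / len(pix)
            var = sum((p - mean) ** 2 for p in pix) / len(pix)
            feats.append((mean, var))
    return feats

# A 4x4 image with a flat left half and a textured right half.
img = [[0, 0, 5, 9],
       [0, 0, 9, 5],
       [0, 0, 5, 9],
       [0, 0, 9, 5]]
feats = block_features(img, 2)   # 4 feature vectors, one per 2x2 block
```

A conventional classifier would label each feature vector independently; the book's 2-D HMM instead models the dependence between neighbouring blocks' classes before applying the MAP rule.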

Probability, Random Processes, and Ergodic Properties (Paperback, 2nd ed. 2009)
Robert M. Gray
R4,255 Discovery Miles 42 550 Ships in 18 - 22 working days

Image Segmentation and Compression Using Hidden Markov Models (Paperback, Softcover reprint of the original 1st ed. 2000)
Jia Li, Robert M. Gray
R3,975 Discovery Miles 39 750 Ships in 18 - 22 working days

Stochastic Image Processing (Paperback, Softcover reprint of the original 1st ed. 2004)
Chee Sun Won, Robert M. Gray
R2,623 Discovery Miles 26 230 Ships in 18 - 22 working days

Fourier Transforms - An Introduction for Engineers (Paperback, Softcover reprint of the original 1st ed. 1995)
Robert M. Gray, Joseph W Goodman
R3,130 Discovery Miles 31 300 Ships in 18 - 22 working days

Source Coding Theory (Paperback, Softcover reprint of the original 1st ed. 1990)
Robert M. Gray
R1,384 Discovery Miles 13 840 Ships in 18 - 22 working days

Vector Quantization and Signal Compression (Paperback, Softcover reprint of the original 1st ed. 1992)
Allen Gersho, Robert M. Gray
R3,004 Discovery Miles 30 040 Ships in 18 - 22 working days

An Introduction to Statistical Signal Processing (Paperback)
Robert M. Gray, Lee D. Davisson
R1,527 Discovery Miles 15 270 Ships in 10 - 15 working days

This book describes the essential tools and techniques of statistical signal processing. At every stage theoretical ideas are linked to specific applications in communications and signal processing using a range of carefully chosen examples. The book begins with a development of basic probability, random objects, expectation, and second order moment theory followed by a wide variety of examples of the most popular random process models and their basic uses and properties. Specific applications to the analysis of random signals and systems for communicating, estimating, detecting, modulating, and other processing of signals are interspersed throughout the book. Hundreds of homework problems are included and the book is ideal for graduate students of electrical engineering and applied mathematics. It is also a useful reference for researchers in signal processing and communications.
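The second-order moment theory the description mentions has a workhorse estimator that is easy to sketch (an illustrative example on synthetic white noise, not taken from the book): the biased sample autocorrelation of a single process realisation.

```python
import random

def autocorrelation(x, max_lag):
    """Biased sample autocorrelation R(k) = (1/N) sum_n x[n] x[n+k],
    a basic second-order-moment estimate for a random-process realisation."""
    n = len(x)
    return [sum(x[m] * x[m + k] for m in range(n - k)) / n
            for k in range(max_lag + 1)]

rng = random.Random(1)
x = [rng.gauss(0.0, 1.0) for _ in range(5000)]   # white-noise realisation
r = autocorrelation(x, 3)
# R(0) estimates the variance (near 1.0 here); for white noise R(k), k > 0,
# is near 0, reflecting the absence of correlation between samples.
```

For a stationary ergodic process, such time averages converge to the corresponding ensemble moments, which is the bridge between the probability theory and the signal-processing applications in the book.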

Entropy and Information Theory (Paperback, 2nd ed. 2011)
Robert M. Gray
R4,730 Discovery Miles 47 300 Ships in 18 - 22 working days

An Introduction to Statistical Signal Processing (Hardcover, New)
Robert M. Gray, Lee D. Davisson
R2,471 Discovery Miles 24 710 Ships in 10 - 15 working days
