
Probability, Random Processes, and Ergodic Properties (Paperback, 2nd ed. 2009)
Robert M. Gray
R4,760 Discovery Miles 47 600 Ships in 10 - 15 working days

Probability, Random Processes, and Ergodic Properties is for mathematically inclined information/communication theorists and people working in signal processing. It will also interest those working with random or stochastic processes, including mathematicians, statisticians, and economists.

Highlights:

Complete tour of book and guidelines for use given in Introduction, so readers can see at a glance the topics of interest.

Structures mathematics for an engineering audience, with emphasis on engineering applications.

New in the Second Edition:

Much of the material has been rearranged and revised for pedagogical reasons.

The original first chapter has been split in order to allow a more thorough treatment of basic probability before tackling random processes and dynamical systems.

The final chapter has been broken into two pieces to provide separate emphasis on process metrics and the ergodic decomposition of affine functionals.

Many classic inequalities are now incorporated into the text, along with proofs; and many citations have been added.

Entropy and Information Theory (Paperback, 2nd ed. 2011)
Robert M. Gray
R5,295 Discovery Miles 52 950 Ships in 10 - 15 working days

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, codes, and information and distortion measures and their properties.

New in this edition:

Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes

Expanded discussion of results from ergodic theory relevant to information theory

Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources

New material on trading off information and distortion, including the Marton inequality

New material on the properties of optimal and asymptotically optimal source codes

New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotic mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

Stochastic Image Processing (Paperback, Softcover reprint of the original 1st ed. 2004)
Chee Sun Won, Robert M. Gray
R2,922 Discovery Miles 29 220 Ships in 10 - 15 working days

Stochastic Image Processing provides the first thorough treatment of Markov and hidden Markov random fields and their application to image processing. Although promoted as a promising approach for over thirty years, it has only been in the past few years that the theory and algorithms have developed to the point of providing useful solutions to old and new problems in image processing. Markov random fields are a multidimensional extension of Markov chains, but the generalization is complicated by the lack of a natural ordering of pixels in multidimensional spaces. Hidden Markov fields are a natural generalization of the hidden Markov models that have proved essential to the development of modern speech recognition, but again the multidimensional nature of the signals makes them inherently more complicated to handle. This added complexity contributed to the long time required for the development of successful methods and applications. This book collects together a variety of successful approaches to a complete and useful characterization of multidimensional Markov and hidden Markov models along with applications to image analysis. The book provides a survey and comparative development of an exciting and rapidly evolving field of multidimensional Markov and hidden Markov random fields with extensive references to the literature.

Fourier Transforms - An Introduction for Engineers (Paperback, Softcover reprint of the original 1st ed. 1995)
Robert M. Gray, Joseph W. Goodman
R3,495 Discovery Miles 34 950 Ships in 10 - 15 working days

The Fourier transform is one of the most important mathematical tools in a wide variety of science and engineering fields. Its application - as Fourier analysis or harmonic analysis - provides useful decompositions of signals into fundamental ('primitive') components, giving shortcuts in the computation of complicated sums and integrals, and often revealing hidden structure in the data. Fourier Transforms: An Introduction for Engineers develops the basic definitions, properties and applications of Fourier analysis, the emphasis being on techniques for its application to linear systems, although other applications are also considered. The book will serve as both a reference text and a teaching text for a one-quarter or one-semester course covering the application of Fourier analysis to a wide variety of signals, including discrete time (or parameter), continuous time (or parameter), finite duration, and infinite duration. It highlights the common aspects in all cases considered, thereby building an intuition from simple examples that will be useful in the more complicated examples where careful proofs are not included. Fourier Transforms: An Introduction for Engineers is written by two scholars who are recognized throughout the world as leaders in this area, and provides a fresh look at one of the most important mathematical and directly applicable concepts in nearly all fields of science and engineering. Audience: Engineers, especially electrical engineers. The careful treatment of the fundamental mathematical ideas makes the book suitable in all areas where Fourier analysis finds applications.
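The decomposition the blurb describes, breaking a signal into its 'primitive' sinusoidal components, can be illustrated in a few lines of NumPy. This is a sketch and not an example from the book; the sampling rate, frequencies, and amplitudes are arbitrary choices for illustration:

```python
import numpy as np

# One second of a signal built from two 'primitive' sinusoidal components;
# the frequencies (3 Hz, 7 Hz) and amplitudes (2.0, 0.5) are illustrative.
fs = 64                           # sampling rate in Hz
t = np.arange(fs) / fs
x = 2.0 * np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

# The DFT decomposes x into those components: bin k measures the
# contribution of the sinusoid at k Hz.
X = np.fft.rfft(x)
magnitudes = np.abs(X) / (fs / 2)  # scaled so a unit sinusoid reads as 1.0

# The two dominant bins recover the original frequencies and amplitudes.
peaks = np.argsort(magnitudes)[-2:]
```

Because both frequencies fall exactly on DFT bins, there is no spectral leakage and the recovered amplitudes match the originals to floating-point precision.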

Vector Quantization and Signal Compression (Paperback, Softcover reprint of the original 1st ed. 1992)
Allen Gersho, Robert M. Gray
R3,358 Discovery Miles 33 580 Ships in 10 - 15 working days

Herb Caen, a popular columnist for the San Francisco Chronicle, recently quoted a Voice of America press release as saying that it was reorganizing in order to "eliminate duplication and redundancy." This quote both states a goal of data compression and illustrates its common need: the removal of duplication (or redundancy) can provide a more efficient representation of data, and the quoted phrase is itself a candidate for such surgery. Not only can the number of words in the quote be reduced without losing information, but the statement would actually be enhanced by such compression, since it will no longer exemplify the wrong that the policy is supposed to correct. Here compression can streamline the phrase and minimize the embarrassment while improving the English style. Compression in general is intended to provide efficient representations of data while preserving the essential information contained in the data. This book is devoted to the theory and practice of signal compression, i.e., data compression applied to signals such as speech, audio, images, and video signals (excluding other data types such as financial data or general-purpose computer data). The emphasis is on the conversion of analog waveforms into efficient digital representations and on the compression of digital information into the fewest possible bits. Both operations should yield the highest possible reconstruction fidelity subject to constraints on the bit rate and implementation complexity.
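As a concrete miniature of the codebook design this book develops at length, here is the generalized Lloyd (k-means) iteration on synthetic 2-D data. This is an illustrative sketch, not code from the text, and it omits refinements such as codebook splitting, empty-cell handling, and convergence tests:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training vectors: 2-D samples, a stand-in for e.g. pairs of speech samples.
train = rng.normal(size=(1000, 2))

# Design a 4-codeword codebook (2 bits per vector) with Lloyd iterations.
codebook = train[rng.choice(len(train), size=4, replace=False)]
for _ in range(50):
    # Nearest-neighbor partition: assign each vector to its closest codeword.
    d = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    cells = d.argmin(axis=1)
    # Centroid update: each codeword becomes the mean of its cell.
    codebook = np.array([train[cells == k].mean(axis=0) for k in range(4)])

def quantize(x):
    """Encode a vector as the index of its nearest codeword."""
    return ((x[None, :] - codebook) ** 2).sum(axis=1).argmin()
```

Each input vector is then transmitted as a 2-bit index, and the decoder reconstructs it as the corresponding codeword; the Lloyd iterations monotonically reduce the average squared-error distortion of that reconstruction.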

Image Segmentation and Compression Using Hidden Markov Models (Paperback, Softcover reprint of the original 1st ed. 2000)
Jia Li, Robert M. Gray
R4,442 Discovery Miles 44 420 Ships in 10 - 15 working days

In the current age of information technology, the issues of distributing and utilizing images efficiently and effectively are of substantial concern. Solutions to many of the problems arising from these issues are provided by techniques of image processing, among which segmentation and compression are topics of this book. Image segmentation is a process for dividing an image into its constituent parts. For block-based segmentation using statistical classification, an image is divided into blocks and a feature vector is formed for each block by grouping statistics of its pixel intensities. Conventional block-based segmentation algorithms classify each block separately, assuming independence of feature vectors. Image Segmentation and Compression Using Hidden Markov Models presents a new algorithm that models the statistical dependence among image blocks by two dimensional hidden Markov models (HMMs). Formulas for estimating the model according to the maximum likelihood criterion are derived from the EM algorithm. To segment an image, optimal classes are searched jointly for all the blocks by the maximum a posteriori (MAP) rule. The 2-D HMM is extended to multiresolution so that more context information is exploited in classification and fast progressive segmentation schemes can be formed naturally. The second issue addressed in the book is the design of joint compression and classification systems using the 2-D HMM and vector quantization. A classifier designed with the side goal of good compression often outperforms one aimed solely at classification because overfitting to training data is suppressed by vector quantization. Image Segmentation and Compression Using Hidden Markov Models is an essential reference source for researchers and engineers working in statistical signal processing or image processing, especially those who are interested in hidden Markov models. It is also of value to those working on statistical modeling.
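The block and feature-vector step described above can be sketched in a few lines. The block size and the statistics chosen (mean and standard deviation of pixel intensities) are illustrative assumptions, and the 2-D HMM classification itself is well beyond a snippet:

```python
import numpy as np

def block_features(image, block=4):
    """Form a feature vector (mean, standard deviation of pixel
    intensities) for each block of a grayscale image, as in
    block-based segmentation; the statistics here are illustrative."""
    h, w = image.shape
    feats = []
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            tile = image[i:i + block, j:j + block]
            feats.append([tile.mean(), tile.std()])
    return np.array(feats)

# Toy image: a dark flat region on the left, a bright noisy region on the right.
rng = np.random.default_rng(1)
img = np.hstack([np.full((8, 8), 10.0),
                 200.0 + 20.0 * rng.standard_normal((8, 8))])
f = block_features(img)  # 8 blocks, 2 features each
```

A conventional classifier would now label each feature vector independently; the contribution of the book is to model the dependence between neighboring blocks with a 2-D HMM instead.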

Source Coding Theory (Paperback, Softcover reprint of the original 1st ed. 1990)
Robert M. Gray
R1,530 Discovery Miles 15 300 Ships in 10 - 15 working days

Source coding theory has as its goal the characterization of the optimal performance achievable in idealized communication systems which must code an information source for transmission over a digital communication or storage channel for delivery to a user. The user must decode the information into a form that is a good approximation to the original. A code is optimal within some class if it achieves the best possible fidelity given whatever constraints are imposed on the code by the available channel. In theory, the primary constraint imposed on a code by the channel is its rate or resolution, the number of bits per second or per input symbol that it can transmit from sender to receiver. In the real world, complexity may be as important as rate. The origins and the basic form of much of the theory date from Shannon's classical development of noiseless source coding and source coding subject to a fidelity criterion (also called rate-distortion theory) [73] [74]. Shannon combined a probabilistic notion of information with limit theorems from ergodic theory and a random coding technique to describe the optimal performance of systems with a constrained rate but with unconstrained complexity and delay. An alternative approach called asymptotic or high-rate quantization theory, based on different techniques and approximations, was introduced by Bennett at approximately the same time [4]. This approach constrained the delay but allowed the rate to grow large.

Entropy and Information Theory (Hardcover, 2nd ed. 2011)
Robert M. Gray
R5,524 Discovery Miles 55 240 Ships in 10 - 15 working days

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, codes, and information and distortion measures and their properties.

New in this edition:

Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes

Expanded discussion of results from ergodic theory relevant to information theory

Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources

New material on trading off information and distortion, including the Marton inequality

New material on the properties of optimal and asymptotically optimal source codes

New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotic mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

Probability, Random Processes, and Ergodic Properties (Hardcover, 2nd ed. 2009)
Robert M. Gray
R4,963 Discovery Miles 49 630 Ships in 10 - 15 working days

Probability, Random Processes, and Ergodic Properties is for mathematically inclined information/communication theorists and people working in signal processing. It will also interest those working with random or stochastic processes, including mathematicians, statisticians, and economists.

Highlights:

Complete tour of book and guidelines for use given in Introduction, so readers can see at a glance the topics of interest.

Structures mathematics for an engineering audience, with emphasis on engineering applications.

New in the Second Edition:

Much of the material has been rearranged and revised for pedagogical reasons.

The original first chapter has been split in order to allow a more thorough treatment of basic probability before tackling random processes and dynamical systems.

The final chapter has been broken into two pieces to provide separate emphasis on process metrics and the ergodic decomposition of affine functionals.

Many classic inequalities are now incorporated into the text, along with proofs; and many citations have been added.

Stochastic Image Processing (Hardcover, 2004 ed.)
Chee Sun Won, Robert M. Gray
R2,960 Discovery Miles 29 600 Ships in 10 - 15 working days

Stochastic Image Processing provides the first thorough treatment of Markov and hidden Markov random fields and their application to image processing. Although promoted as a promising approach for over thirty years, it has only been in the past few years that the theory and algorithms have developed to the point of providing useful solutions to old and new problems in image processing. Markov random fields are a multidimensional extension of Markov chains, but the generalization is complicated by the lack of a natural ordering of pixels in multidimensional spaces. Hidden Markov fields are a natural generalization of the hidden Markov models that have proved essential to the development of modern speech recognition, but again the multidimensional nature of the signals makes them inherently more complicated to handle. This added complexity contributed to the long time required for the development of successful methods and applications. This book collects together a variety of successful approaches to a complete and useful characterization of multidimensional Markov and hidden Markov models along with applications to image analysis. The book provides a survey and comparative development of an exciting and rapidly evolving field of multidimensional Markov and hidden Markov random fields with extensive references to the literature.

Image Segmentation and Compression Using Hidden Markov Models (Hardcover, 2000 ed.)
Jia Li, Robert M. Gray
R4,580 Discovery Miles 45 800 Ships in 10 - 15 working days

In the current age of information technology, the issues of distributing and utilizing images efficiently and effectively are of substantial concern. Solutions to many of the problems arising from these issues are provided by techniques of image processing, among which segmentation and compression are topics of this book. Image segmentation is a process for dividing an image into its constituent parts. For block-based segmentation using statistical classification, an image is divided into blocks and a feature vector is formed for each block by grouping statistics of its pixel intensities. Conventional block-based segmentation algorithms classify each block separately, assuming independence of feature vectors. Image Segmentation and Compression Using Hidden Markov Models presents a new algorithm that models the statistical dependence among image blocks by two dimensional hidden Markov models (HMMs). Formulas for estimating the model according to the maximum likelihood criterion are derived from the EM algorithm. To segment an image, optimal classes are searched jointly for all the blocks by the maximum a posteriori (MAP) rule. The 2-D HMM is extended to multiresolution so that more context information is exploited in classification and fast progressive segmentation schemes can be formed naturally. The second issue addressed in the book is the design of joint compression and classification systems using the 2-D HMM and vector quantization. A classifier designed with the side goal of good compression often outperforms one aimed solely at classification because overfitting to training data is suppressed by vector quantization. Image Segmentation and Compression Using Hidden Markov Models is an essential reference source for researchers and engineers working in statistical signal processing or image processing, especially those who are interested in hidden Markov models. It is also of value to those working on statistical modeling.

Fourier Transforms - An Introduction for Engineers (Hardcover, 1995 ed.)
Robert M. Gray, Joseph W. Goodman
R3,707 Discovery Miles 37 070 Ships in 10 - 15 working days

The Fourier transform is one of the most important mathematical tools in a wide variety of science and engineering fields. Its application - as Fourier analysis or harmonic analysis - provides useful decompositions of signals into fundamental ('primitive') components, giving shortcuts in the computation of complicated sums and integrals, and often revealing hidden structure in the data. Fourier Transforms: An Introduction for Engineers develops the basic definitions, properties and applications of Fourier analysis, the emphasis being on techniques for its application to linear systems, although other applications are also considered. The book will serve as both a reference text and a teaching text for a one-quarter or one-semester course covering the application of Fourier analysis to a wide variety of signals, including discrete time (or parameter), continuous time (or parameter), finite duration, and infinite duration. It highlights the common aspects in all cases considered, thereby building an intuition from simple examples that will be useful in the more complicated examples where careful proofs are not included. Fourier Transforms: An Introduction for Engineers is written by two scholars who are recognized throughout the world as leaders in this area, and provides a fresh look at one of the most important mathematical and directly applicable concepts in nearly all fields of science and engineering. Audience: Engineers, especially electrical engineers. The careful treatment of the fundamental mathematical ideas makes the book suitable in all areas where Fourier analysis finds applications.

Vector Quantization and Signal Compression (Hardcover, 1992 ed.)
Allen Gersho, Robert M. Gray
R3,692 Discovery Miles 36 920 Ships in 10 - 15 working days

Herb Caen, a popular columnist for the San Francisco Chronicle, recently quoted a Voice of America press release as saying that it was reorganizing in order to "eliminate duplication and redundancy." This quote both states a goal of data compression and illustrates its common need: the removal of duplication (or redundancy) can provide a more efficient representation of data, and the quoted phrase is itself a candidate for such surgery. Not only can the number of words in the quote be reduced without losing information, but the statement would actually be enhanced by such compression, since it will no longer exemplify the wrong that the policy is supposed to correct. Here compression can streamline the phrase and minimize the embarrassment while improving the English style. Compression in general is intended to provide efficient representations of data while preserving the essential information contained in the data. This book is devoted to the theory and practice of signal compression, i.e., data compression applied to signals such as speech, audio, images, and video signals (excluding other data types such as financial data or general-purpose computer data). The emphasis is on the conversion of analog waveforms into efficient digital representations and on the compression of digital information into the fewest possible bits. Both operations should yield the highest possible reconstruction fidelity subject to constraints on the bit rate and implementation complexity.

Source Coding Theory (Hardcover, 1990 ed.)
Robert M. Gray
R1,393 Discovery Miles 13 930 Ships in 12 - 17 working days

Source coding theory has as its goal the characterization of the optimal performance achievable in idealized communication systems which must code an information source for transmission over a digital communication or storage channel for delivery to a user. The user must decode the information into a form that is a good approximation to the original. A code is optimal within some class if it achieves the best possible fidelity given whatever constraints are imposed on the code by the available channel. In theory, the primary constraint imposed on a code by the channel is its rate or resolution, the number of bits per second or per input symbol that it can transmit from sender to receiver. In the real world, complexity may be as important as rate. The origins and the basic form of much of the theory date from Shannon's classical development of noiseless source coding and source coding subject to a fidelity criterion (also called rate-distortion theory) [73] [74]. Shannon combined a probabilistic notion of information with limit theorems from ergodic theory and a random coding technique to describe the optimal performance of systems with a constrained rate but with unconstrained complexity and delay. An alternative approach called asymptotic or high-rate quantization theory, based on different techniques and approximations, was introduced by Bennett at approximately the same time [4]. This approach constrained the delay but allowed the rate to grow large.

Toeplitz and Circulant Matrices - A Review (Paperback, New)
Robert M. Gray
R999 Discovery Miles 9 990 Ships in 10 - 15 working days

Toeplitz and Circulant Matrices: A Review derives in a tutorial manner the fundamental theorems on the asymptotic behavior of eigenvalues, inverses, and products of banded Toeplitz matrices and Toeplitz matrices with absolutely summable elements. Mathematical elegance and generality are sacrificed for conceptual simplicity and insight in the hope of making these results available to engineers lacking either the background or endurance to attack the mathematical literature on the subject. By limiting the generality of the matrices considered, the essential ideas and results can be conveyed in a more intuitive manner without the mathematical machinery required for the most general cases. As an application the results are applied to the study of the covariance matrices and their factors of linear models of discrete time random processes. Toeplitz and Circulant Matrices: A Review is written for students and practicing engineers in an accessible manner, bringing this important topic to a wider audience.
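One classic result in this area, that the eigenvalues of a circulant matrix are exactly the DFT of its first column, is easy to check numerically. A sketch with arbitrary illustrative entries (not code from the review):

```python
import numpy as np

# First column of a 5x5 circulant matrix (illustrative values; symmetric,
# so the eigenvalues come out real).
c = np.array([4.0, 1.0, 0.0, 0.0, 1.0])
n = len(c)

# Build the circulant matrix: column j is c cyclically shifted down by j,
# so C[i, j] = c[(i - j) mod n].
C = np.column_stack([np.roll(c, j) for j in range(n)])

# Classic result: the eigenvalues of a circulant matrix are the DFT of
# its first column, because the DFT vectors are its eigenvectors.
eig_fft = np.fft.fft(c)
eig_direct = np.linalg.eigvals(C)
```

This diagonalization by the DFT is what makes circulant matrices so useful as asymptotic stand-ins for Toeplitz matrices.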

Linear Predictive Coding and the Internet Protocol (Hardcover, New)
Robert M. Gray
R1,056 Discovery Miles 10 560 Ships in 10 - 15 working days

In December 1974 the first realtime conversation on the ARPAnet took place between Culler-Harrison Incorporated in Goleta, California, and MIT Lincoln Laboratory in Lexington, Massachusetts. This was the first successful application of realtime digital speech communication over a packet network and an early milestone in the explosion of realtime signal processing of speech, audio, images, and video that we all take for granted today. It could be considered as the first voice over Internet Protocol (VoIP), except that the Internet Protocol (IP) had not yet been established. In fact, the interest in realtime signal processing had an indirect, but major, impact on the development of IP. This is the story of the development of linear predictive coded (LPC) speech and how it came to be used in the first successful packet speech experiments. Several related stories are recounted as well. The history is preceded by a tutorial on linear prediction methods which incorporates a variety of views to provide context for the stories. This part is a technical survey of the fundamental ideas of linear prediction that are important for speech processing, but the development departs from traditional treatments and takes advantage of several shortcuts, simplifications, and unifications that come with years of hindsight. In particular, some of the key results are proved using short and simple techniques that are not as well known as they should be, and it also addresses some of the common assumptions made when modeling random signals. Linear Predictive Coding and the Internet Protocol is an insightful and comprehensive review of an underpinning technology of the internet and other packet switched networks. It will be enjoyed by everyone with an interest in past and present real time signal processing on the internet.
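The linear-prediction tutorial rests on the autocorrelation normal equations: choose coefficients a so that x[n] is predicted from its past p samples with minimum mean squared error. A minimal sketch (not the book's code; practical coders use the Levinson-Durbin recursion and windowing) that recovers the coefficients of a synthetic second-order autoregressive process:

```python
import numpy as np

def lpc(x, order):
    """Solve the autocorrelation normal equations R a = r for the
    linear-prediction coefficients (minimal sketch; no windowing)."""
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])

# Synthetic AR(2) source: x[n] = 0.9 x[n-1] - 0.5 x[n-2] + e[n].
rng = np.random.default_rng(0)
x = np.zeros(5000)
e = rng.standard_normal(5000)
for n in range(2, 5000):
    x[n] = 0.9 * x[n - 1] - 0.5 * x[n - 2] + e[n]

a = lpc(x, 2)  # estimated predictor coefficients, close to (0.9, -0.5)
```

In a speech coder the same idea runs on short frames of the waveform, and only the few predictor coefficients plus a model of the prediction residual are transmitted.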

An Introduction to Statistical Signal Processing (Paperback)
Robert M. Gray, Lee D. Davisson
R2,241 Discovery Miles 22 410 Ships in 10 - 15 working days

This book describes the essential tools and techniques of statistical signal processing. At every stage theoretical ideas are linked to specific applications in communications and signal processing using a range of carefully chosen examples. The book begins with a development of basic probability, random objects, expectation, and second order moment theory followed by a wide variety of examples of the most popular random process models and their basic uses and properties. Specific applications to the analysis of random signals and systems for communicating, estimating, detecting, modulating, and other processing of signals are interspersed throughout the book. Hundreds of homework problems are included and the book is ideal for graduate students of electrical engineering and applied mathematics. It is also a useful reference for researchers in signal processing and communications.

An Introduction to Statistical Signal Processing (Hardcover, New)
Robert M. Gray, Lee D. Davisson
R3,833 Discovery Miles 38 330 Ships in 10 - 15 working days

This book describes the essential tools and techniques of statistical signal processing. At every stage theoretical ideas are linked to specific applications in communications and signal processing using a range of carefully chosen examples. The book begins with a development of basic probability, random objects, expectation, and second order moment theory followed by a wide variety of examples of the most popular random process models and their basic uses and properties. Specific applications to the analysis of random signals and systems for communicating, estimating, detecting, modulating, and other processing of signals are interspersed throughout the book. Hundreds of homework problems are included and the book is ideal for graduate students of electrical engineering and applied mathematics. It is also a useful reference for researchers in signal processing and communications.
