
Showing 1 - 18 of 18 matches in All Departments

Probability, Random Processes, and Ergodic Properties (Hardcover, 2nd ed. 2009)
Robert M. Gray
R4,444 Discovery Miles 44 440 Ships in 12 - 17 working days

Probability, Random Processes, and Ergodic Properties is for mathematically inclined information/communication theorists and people working in signal processing. It will also interest those working with random or stochastic processes, including mathematicians, statisticians, and economists.

Highlights:

  • Complete tour of the book and guidelines for use given in the Introduction, so readers can see at a glance the topics of interest.
  • Structures mathematics for an engineering audience, with emphasis on engineering applications.

New in the Second Edition:

  • Much of the material has been rearranged and revised for pedagogical reasons.
  • The original first chapter has been split in order to allow a more thorough treatment of basic probability before tackling random processes and dynamical systems.
  • The final chapter has been broken into two pieces to provide separate emphasis on process metrics and the ergodic decomposition of affine functionals.
  • Many classic inequalities are now incorporated into the text, along with proofs; and many citations have been added.

Entropy and Information Theory (Hardcover, 2nd ed. 2011)
Robert M. Gray
R4,791 Discovery Miles 47 910 Ships in 12 - 17 working days

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

New in this edition:

  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

Stochastic Image Processing (Hardcover, 2004 ed.)
Chee Sun Won, Robert M. Gray
R2,791 Discovery Miles 27 910 Ships in 10 - 15 working days

Stochastic Image Processing provides the first thorough treatment of Markov and hidden Markov random fields and their application to image processing. Although promoted as a promising approach for over thirty years, it has only been in the past few years that the theory and algorithms have developed to the point of providing useful solutions to old and new problems in image processing. Markov random fields are a multidimensional extension of Markov chains, but the generalization is complicated by the lack of a natural ordering of pixels in multidimensional spaces. Hidden Markov fields are a natural generalization of the hidden Markov models that have proved essential to the development of modern speech recognition, but again the multidimensional nature of the signals makes them inherently more complicated to handle. This added complexity contributed to the long time required for the development of successful methods and applications. This book collects together a variety of successful approaches to a complete and useful characterization of multidimensional Markov and hidden Markov models along with applications to image analysis. The book provides a survey and comparative development of an exciting and rapidly evolving field of multidimensional Markov and hidden Markov random fields with extensive references to the literature.
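The blurb's key point, that a Markov random field replaces a chain's "previous sample" with a spatial neighbourhood, can be illustrated with a toy local conditional. This is a minimal sketch, not code from the book: it assumes an Ising-style model on ±1 pixels with 4-neighbourhoods and a hypothetical coupling parameter `beta`.

```python
import math

def ising_conditional(grid, i, j, beta=1.0):
    """P(pixel (i, j) = +1 | its 4-neighbourhood) for an Ising-style MRF.

    The Markov property in 2-D: the conditional depends only on the
    neighbouring pixels, not on the rest of the image.
    """
    rows, cols = len(grid), len(grid[0])
    # Sum of the (at most four) neighbouring spins, clipped at the borders.
    s = sum(grid[x][y]
            for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
            if 0 <= x < rows and 0 <= y < cols)
    return 1.0 / (1.0 + math.exp(-2.0 * beta * s))

uniform = [[1] * 5 for _ in range(5)]
p = ising_conditional(uniform, 2, 2)  # neighbours all +1: probability close to 1
```

With an all-agreeing neighbourhood the conditional probability is nearly 1; with a balanced neighbourhood it is exactly 1/2, which is the "no natural ordering" point the description makes: every neighbour counts symmetrically.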

Fourier Transforms - An Introduction for Engineers (Hardcover, 1995 ed.)
Robert M. Gray, Joseph W Goodman
R3,465 Discovery Miles 34 650 Ships in 12 - 17 working days

The Fourier transform is one of the most important mathematical tools in a wide variety of science and engineering fields. Its application - as Fourier analysis or harmonic analysis - provides useful decompositions of signals into fundamental ('primitive') components, giving shortcuts in the computation of complicated sums and integrals, and often revealing hidden structure in the data. Fourier Transforms: An Introduction for Engineers develops the basic definitions, properties and applications of Fourier analysis, the emphasis being on techniques for its application to linear systems, although other applications are also considered. The book will serve as both a reference text and a teaching text for a one-quarter or one-semester course covering the application of Fourier analysis to a wide variety of signals, including discrete time (or parameter), continuous time (or parameter), finite duration, and infinite duration. It highlights the common aspects in all cases considered, thereby building an intuition from simple examples that will be useful in the more complicated examples where careful proofs are not included. Fourier Transforms: An Introduction for Engineers is written by two scholars who are recognized throughout the world as leaders in this area, and provides a fresh look at one of the most important mathematical and directly applicable concepts in nearly all fields of science and engineering. Audience: Engineers, especially electrical engineers. The careful treatment of the fundamental mathematical ideas makes the book suitable in all areas where Fourier analysis finds applications.
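The decomposition into 'primitive' components that the description mentions can be seen in a few lines with a naive discrete Fourier transform. This is a sketch for intuition only; practical work would use an FFT routine rather than this O(N^2) sum.

```python
import cmath
import math

def dft(x):
    """Naive DFT: project x onto the complex sinusoids exp(-2*pi*i*k*n/N)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# A pure cosine at one cycle per N samples decomposes into exactly two
# nonzero coefficients, at bins k = 1 and k = N - 1 (its two complex
# exponential components).
N = 8
x = [math.cos(2 * math.pi * n / N) for n in range(N)]
X = dft(x)
```

Here `abs(X[1])` and `abs(X[7])` both come out to N/2 = 4 while every other bin is (numerically) zero: the "hidden structure" of the signal is laid bare by the decomposition.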

Vector Quantization and Signal Compression (Hardcover, 1992 ed.)
Allen Gersho, Robert M. Gray
R3,222 Discovery Miles 32 220 Ships in 12 - 17 working days

Herb Caen, a popular columnist for the San Francisco Chronicle, recently quoted a Voice of America press release as saying that it was reorganizing in order to "eliminate duplication and redundancy." This quote both states a goal of data compression and illustrates its common need: the removal of duplication (or redundancy) can provide a more efficient representation of data, and the quoted phrase is itself a candidate for such surgery. Not only can the number of words in the quote be reduced without losing information, but the statement would actually be enhanced by such compression, since it will no longer exemplify the wrong that the policy is supposed to correct. Here compression can streamline the phrase and minimize the embarrassment while improving the English style. Compression in general is intended to provide efficient representations of data while preserving the essential information contained in the data. This book is devoted to the theory and practice of signal compression, i.e., data compression applied to signals such as speech, audio, images, and video signals (excluding other data types such as financial data or general purpose computer data). The emphasis is on the conversion of analog waveforms into efficient digital representations and on the compression of digital information into the fewest possible bits. Both operations should yield the highest possible reconstruction fidelity subject to constraints on the bit rate and implementation complexity.
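The core operation the book builds on, mapping each input vector to the nearest codeword in a codebook, fits in a few lines. A minimal sketch; the codebook values below are made up for illustration, and real designs would train the codebook (e.g., with the generalized Lloyd algorithm) rather than fix it by hand.

```python
def quantize(vec, codebook):
    """Nearest-neighbour encoding: return the codeword minimizing squared
    Euclidean distance, the basic step of vector quantization."""
    return min(codebook,
               key=lambda c: sum((v - w) ** 2 for v, w in zip(vec, c)))

# Hypothetical 2-D codebook with three codewords.
codebook = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
nearest = quantize((0.9, 1.2), codebook)  # -> (1.0, 1.0)
```

The compression comes from transmitting only the index of `nearest` (here 2 bits suffice for 3 codewords) instead of the full real-valued vector, trading reconstruction fidelity against rate exactly as the blurb describes.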

Source Coding Theory (Hardcover, 1990 ed.)
Robert M. Gray
R1,584 Discovery Miles 15 840 Ships in 10 - 15 working days

Source coding theory has as its goal the characterization of the optimal performance achievable in idealized communication systems which must code an information source for transmission over a digital communication or storage channel for transmission to a user. The user must decode the information into a form that is a good approximation to the original. A code is optimal within some class if it achieves the best possible fidelity given whatever constraints are imposed on the code by the available channel. In theory, the primary constraint imposed on a code by the channel is its rate or resolution, the number of bits per second or per input symbol that it can transmit from sender to receiver. In the real world, complexity may be as important as rate. The origins and the basic form of much of the theory date from Shannon's classical development of noiseless source coding and source coding subject to a fidelity criterion (also called rate-distortion theory) [73] [74]. Shannon combined a probabilistic notion of information with limit theorems from ergodic theory and a random coding technique to describe the optimal performance of systems with a constrained rate but with unconstrained complexity and delay. An alternative approach called asymptotic or high rate quantization theory, based on different techniques and approximations, was introduced by Bennett at approximately the same time [4]. This approach constrained the delay but allowed the rate to grow large.

Image Segmentation and Compression Using Hidden Markov Models (Hardcover, 2000 ed.)
Jia Li, Robert M. Gray
R4,316 Discovery Miles 43 160 Ships in 10 - 15 working days

In the current age of information technology, the issues of distributing and utilizing images efficiently and effectively are of substantial concern. Solutions to many of the problems arising from these issues are provided by techniques of image processing, among which segmentation and compression are topics of this book. Image segmentation is a process for dividing an image into its constituent parts. For block-based segmentation using statistical classification, an image is divided into blocks and a feature vector is formed for each block by grouping statistics of its pixel intensities. Conventional block-based segmentation algorithms classify each block separately, assuming independence of feature vectors. Image Segmentation and Compression Using Hidden Markov Models presents a new algorithm that models the statistical dependence among image blocks by two dimensional hidden Markov models (HMMs). Formulas for estimating the model according to the maximum likelihood criterion are derived from the EM algorithm. To segment an image, optimal classes are searched jointly for all the blocks by the maximum a posteriori (MAP) rule. The 2-D HMM is extended to multiresolution so that more context information is exploited in classification and fast progressive segmentation schemes can be formed naturally. The second issue addressed in the book is the design of joint compression and classification systems using the 2-D HMM and vector quantization. A classifier designed with the side goal of good compression often outperforms one aimed solely at classification because overfitting to training data is suppressed by vector quantization. Image Segmentation and Compression Using Hidden Markov Models is an essential reference source for researchers and engineers working in statistical signal processing or image processing, especially those who are interested in hidden Markov models. It is also of value to those working on statistical modeling.
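The MAP rule mentioned in the description can be shown in its simplest per-block form. This is an illustrative sketch only: the book's algorithm searches classes jointly over all blocks under a 2-D HMM, whereas this toy version classifies a single feature vector, and all the numbers below are hypothetical.

```python
def map_classify(likelihoods, priors):
    """MAP rule for one feature vector: pick the class c maximizing
    P(feature | c) * P(c), which is proportional to the posterior
    P(c | feature)."""
    return max(priors, key=lambda c: likelihoods[c] * priors[c])

# Hypothetical two-class example: 'texture' is a priori more common,
# but the observed feature is much better explained by 'edge'.
likelihoods = {"edge": 0.60, "texture": 0.10}
priors = {"edge": 0.30, "texture": 0.70}
label = map_classify(likelihoods, priors)  # -> "edge"
```

The point of the 2-D HMM in the book is precisely that the likelihood term is not computed independently per block, as here, but couples neighbouring blocks through hidden states.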

Linear Predictive Coding and the Internet Protocol (Hardcover, New)
Robert M. Gray
R997 Discovery Miles 9 970 Ships in 10 - 15 working days

In December 1974 the first realtime conversation on the ARPAnet took place between Culler-Harrison Incorporated in Goleta, California, and MIT Lincoln Laboratory in Lexington, Massachusetts. This was the first successful application of realtime digital speech communication over a packet network and an early milestone in the explosion of realtime signal processing of speech, audio, images, and video that we all take for granted today. It could be considered the first voice over Internet Protocol (VoIP) call, except that the Internet Protocol (IP) had not yet been established. In fact, the interest in realtime signal processing had an indirect, but major, impact on the development of IP. This is the story of the development of linear predictive coded (LPC) speech and how it came to be used in the first successful packet speech experiments. Several related stories are recounted as well. The history is preceded by a tutorial on linear prediction methods which incorporates a variety of views to provide context for the stories. This part is a technical survey of the fundamental ideas of linear prediction that are important for speech processing, but the development departs from traditional treatments and takes advantage of several shortcuts, simplifications, and unifications that come with years of hindsight. In particular, some of the key results are proved using short and simple techniques that are not as well known as they should be, and it also addresses some of the common assumptions made when modeling random signals. Linear Predictive Coding and the Internet Protocol is an insightful and comprehensive review of an underpinning technology of the internet and other packet switched networks. It will be enjoyed by everyone with an interest in past and present realtime signal processing on the internet.
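The linear prediction idea the tutorial surveys can be sketched at first order: predict each sample from the previous one and choose the coefficient from sample autocorrelations. This is a toy illustration, not the book's development; speech coders use predictors of order 10 or more, solved with the Levinson-Durbin recursion (not shown).

```python
def lpc_order1(x):
    """First-order linear predictor x[n] ~ a * x[n-1].

    The mean-squared-error optimal coefficient is a = R(1) / R(0),
    the ratio of sample autocorrelations at lags 1 and 0.
    """
    r0 = sum(v * v for v in x)                              # R(0)
    r1 = sum(x[n] * x[n - 1] for n in range(1, len(x)))     # R(1)
    return r1 / r0

# A geometric sequence x[n] = 0.9**n is perfectly predictable with a = 0.9;
# the sample estimate from a short record lands close to that value.
a = lpc_order1([0.9 ** n for n in range(20)])
```

In an LPC speech coder it is these predictor coefficients (plus a coarse description of the prediction residual), rather than the waveform samples, that are transmitted, which is what made the low bit rates of the 1974 experiments possible.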

Probability, Random Processes, and Ergodic Properties (Paperback, 2nd ed. 2009)
Robert M. Gray
R4,487 Discovery Miles 44 870 Ships in 10 - 15 working days

Probability, Random Processes, and Ergodic Properties is for mathematically inclined information/communication theorists and people working in signal processing. It will also interest those working with random or stochastic processes, including mathematicians, statisticians, and economists.

Highlights:

  • Complete tour of the book and guidelines for use given in the Introduction, so readers can see at a glance the topics of interest.
  • Structures mathematics for an engineering audience, with emphasis on engineering applications.

New in the Second Edition:

  • Much of the material has been rearranged and revised for pedagogical reasons.
  • The original first chapter has been split in order to allow a more thorough treatment of basic probability before tackling random processes and dynamical systems.
  • The final chapter has been broken into two pieces to provide separate emphasis on process metrics and the ergodic decomposition of affine functionals.
  • Many classic inequalities are now incorporated into the text, along with proofs; and many citations have been added.

Entropy and Information Theory (Paperback, 2nd ed. 2011)
Robert M. Gray
R4,990 Discovery Miles 49 900 Ships in 10 - 15 working days

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

New in this edition:

  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

Stochastic Image Processing (Paperback, Softcover reprint of the original 1st ed. 2004)
Chee Sun Won, Robert M. Gray
R2,756 Discovery Miles 27 560 Ships in 10 - 15 working days

Stochastic Image Processing provides the first thorough treatment of Markov and hidden Markov random fields and their application to image processing. Although promoted as a promising approach for over thirty years, it has only been in the past few years that the theory and algorithms have developed to the point of providing useful solutions to old and new problems in image processing. Markov random fields are a multidimensional extension of Markov chains, but the generalization is complicated by the lack of a natural ordering of pixels in multidimensional spaces. Hidden Markov fields are a natural generalization of the hidden Markov models that have proved essential to the development of modern speech recognition, but again the multidimensional nature of the signals makes them inherently more complicated to handle. This added complexity contributed to the long time required for the development of successful methods and applications. This book collects together a variety of successful approaches to a complete and useful characterization of multidimensional Markov and hidden Markov models along with applications to image analysis. The book provides a survey and comparative development of an exciting and rapidly evolving field of multidimensional Markov and hidden Markov random fields with extensive references to the literature.

An Introduction to Statistical Signal Processing (Paperback)
Robert M. Gray, Lee D. Davisson
R1,477 Discovery Miles 14 770 Ships in 12 - 17 working days

This book describes the essential tools and techniques of statistical signal processing. At every stage theoretical ideas are linked to specific applications in communications and signal processing using a range of carefully chosen examples. The book begins with a development of basic probability, random objects, expectation, and second order moment theory followed by a wide variety of examples of the most popular random process models and their basic uses and properties. Specific applications to the analysis of random signals and systems for communicating, estimating, detecting, modulating, and other processing of signals are interspersed throughout the book. Hundreds of homework problems are included and the book is ideal for graduate students of electrical engineering and applied mathematics. It is also a useful reference for researchers in signal processing and communications.

Vector Quantization and Signal Compression (Paperback, Softcover reprint of the original 1st ed. 1992)
Allen Gersho, Robert M. Gray
R3,164 Discovery Miles 31 640 Ships in 10 - 15 working days

Herb Caen, a popular columnist for the San Francisco Chronicle, recently quoted a Voice of America press release as saying that it was reorganizing in order to "eliminate duplication and redundancy." This quote both states a goal of data compression and illustrates its common need: the removal of duplication (or redundancy) can provide a more efficient representation of data, and the quoted phrase is itself a candidate for such surgery. Not only can the number of words in the quote be reduced without losing information, but the statement would actually be enhanced by such compression, since it will no longer exemplify the wrong that the policy is supposed to correct. Here compression can streamline the phrase and minimize the embarrassment while improving the English style. Compression in general is intended to provide efficient representations of data while preserving the essential information contained in the data. This book is devoted to the theory and practice of signal compression, i.e., data compression applied to signals such as speech, audio, images, and video signals (excluding other data types such as financial data or general purpose computer data). The emphasis is on the conversion of analog waveforms into efficient digital representations and on the compression of digital information into the fewest possible bits. Both operations should yield the highest possible reconstruction fidelity subject to constraints on the bit rate and implementation complexity.

Image Segmentation and Compression Using Hidden Markov Models (Paperback, Softcover reprint of the original 1st ed. 2000)
Jia Li, Robert M. Gray
R4,189 Discovery Miles 41 890 Ships in 10 - 15 working days

In the current age of information technology, the issues of distributing and utilizing images efficiently and effectively are of substantial concern. Solutions to many of the problems arising from these issues are provided by techniques of image processing, among which segmentation and compression are topics of this book. Image segmentation is a process for dividing an image into its constituent parts. For block-based segmentation using statistical classification, an image is divided into blocks and a feature vector is formed for each block by grouping statistics of its pixel intensities. Conventional block-based segmentation algorithms classify each block separately, assuming independence of feature vectors. Image Segmentation and Compression Using Hidden Markov Models presents a new algorithm that models the statistical dependence among image blocks by two dimensional hidden Markov models (HMMs). Formulas for estimating the model according to the maximum likelihood criterion are derived from the EM algorithm. To segment an image, optimal classes are searched jointly for all the blocks by the maximum a posteriori (MAP) rule. The 2-D HMM is extended to multiresolution so that more context information is exploited in classification and fast progressive segmentation schemes can be formed naturally. The second issue addressed in the book is the design of joint compression and classification systems using the 2-D HMM and vector quantization. A classifier designed with the side goal of good compression often outperforms one aimed solely at classification because overfitting to training data is suppressed by vector quantization. Image Segmentation and Compression Using Hidden Markov Models is an essential reference source for researchers and engineers working in statistical signal processing or image processing, especially those who are interested in hidden Markov models. It is also of value to those working on statistical modeling.

Fourier Transforms - An Introduction for Engineers (Paperback, Softcover reprint of the original 1st ed. 1995)
Robert M. Gray, Joseph W Goodman
R3,294 Discovery Miles 32 940 Ships in 10 - 15 working days

The Fourier transform is one of the most important mathematical tools in a wide variety of science and engineering fields. Its application - as Fourier analysis or harmonic analysis - provides useful decompositions of signals into fundamental ('primitive') components, giving shortcuts in the computation of complicated sums and integrals, and often revealing hidden structure in the data. Fourier Transforms: An Introduction for Engineers develops the basic definitions, properties and applications of Fourier analysis, the emphasis being on techniques for its application to linear systems, although other applications are also considered. The book will serve as both a reference text and a teaching text for a one-quarter or one-semester course covering the application of Fourier analysis to a wide variety of signals, including discrete time (or parameter), continuous time (or parameter), finite duration, and infinite duration. It highlights the common aspects in all cases considered, thereby building an intuition from simple examples that will be useful in the more complicated examples where careful proofs are not included. Fourier Transforms: An Introduction for Engineers is written by two scholars who are recognized throughout the world as leaders in this area, and provides a fresh look at one of the most important mathematical and directly applicable concepts in nearly all fields of science and engineering. Audience: Engineers, especially electrical engineers. The careful treatment of the fundamental mathematical ideas makes the book suitable in all areas where Fourier analysis finds applications.

Source Coding Theory (Paperback, Softcover reprint of the original 1st ed. 1990)
Robert M. Gray
R1,444 Discovery Miles 14 440 Ships in 10 - 15 working days

Source coding theory has as its goal the characterization of the optimal performance achievable in idealized communication systems which must code an information source for transmission over a digital communication or storage channel for transmission to a user. The user must decode the information into a form that is a good approximation to the original. A code is optimal within some class if it achieves the best possible fidelity given whatever constraints are imposed on the code by the available channel. In theory, the primary constraint imposed on a code by the channel is its rate or resolution, the number of bits per second or per input symbol that it can transmit from sender to receiver. In the real world, complexity may be as important as rate. The origins and the basic form of much of the theory date from Shannon's classical development of noiseless source coding and source coding subject to a fidelity criterion (also called rate-distortion theory) [73] [74]. Shannon combined a probabilistic notion of information with limit theorems from ergodic theory and a random coding technique to describe the optimal performance of systems with a constrained rate but with unconstrained complexity and delay. An alternative approach called asymptotic or high rate quantization theory, based on different techniques and approximations, was introduced by Bennett at approximately the same time [4]. This approach constrained the delay but allowed the rate to grow large.

An Introduction to Statistical Signal Processing (Hardcover, New)
Robert M. Gray, Lee D. Davisson
R2,575 R2,411 Discovery Miles 24 110 Save R164 (6%) Ships in 12 - 17 working days

This book describes the essential tools and techniques of statistical signal processing. At every stage theoretical ideas are linked to specific applications in communications and signal processing using a range of carefully chosen examples. The book begins with a development of basic probability, random objects, expectation, and second order moment theory followed by a wide variety of examples of the most popular random process models and their basic uses and properties. Specific applications to the analysis of random signals and systems for communicating, estimating, detecting, modulating, and other processing of signals are interspersed throughout the book. Hundreds of homework problems are included and the book is ideal for graduate students of electrical engineering and applied mathematics. It is also a useful reference for researchers in signal processing and communications.

Toeplitz and Circulant Matrices - A Review (Paperback, New)
Robert M. Gray
R945 Discovery Miles 9 450 Ships in 10 - 15 working days

Toeplitz and Circulant Matrices: A Review derives in a tutorial manner the fundamental theorems on the asymptotic behavior of eigenvalues, inverses, and products of banded Toeplitz matrices and Toeplitz matrices with absolutely summable elements. Mathematical elegance and generality are sacrificed for conceptual simplicity and insight in the hope of making these results available to engineers lacking either the background or endurance to attack the mathematical literature on the subject. By limiting the generality of the matrices considered, the essential ideas and results can be conveyed in a more intuitive manner without the mathematical machinery required for the most general cases. As an application, the results are applied to the study of the covariance matrices, and their factors, of linear models of discrete time random processes. Toeplitz and Circulant Matrices: A Review is written for students and practicing engineers in an accessible manner, bringing this important topic to a wider audience.
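The central fact the review turns on, that circulant matrices are diagonalized by the DFT, can be checked in a few lines. A sketch under one fixed convention (not necessarily the review's): the matrix is built from its first column c, with entries C[i][j] = c[(i - j) mod N], so its eigenvalues are the DFT of c and the DFT vectors are the eigenvectors.

```python
import cmath

def circulant_eigenvalues(c):
    """Eigenvalues of the N x N circulant matrix with first column c:
    simply the DFT of c, since each DFT vector is an eigenvector."""
    N = len(c)
    return [sum(c[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# A symmetric first column gives a symmetric circulant matrix,
# so the eigenvalues come out real.
eigs = circulant_eigenvalues([2.0, 1.0, 0.0, 1.0])
```

For the 4 x 4 example above the matrix has rows (2,1,0,1), (1,2,1,0), (0,1,2,1), (1,0,1,2), and its eigenvalues are 4, 2, 0, 2; this DFT shortcut is what lets circulants approximate the spectra of the banded Toeplitz covariance matrices studied in the review.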
