Amazon.com's Top-Selling DSP Book for Seven Straight Years--Now Fully Updated. Understanding Digital Signal Processing, Third Edition, is quite simply the best resource for engineers and other technical professionals who want to master and apply today's latest DSP techniques. Richard G. Lyons has updated and expanded his best-selling second edition to reflect the newest technologies, building on the exceptionally readable coverage that made it the favorite of DSP professionals worldwide. He has also added hands-on problems to every chapter, giving students even more of the practical experience they need to succeed. Comprehensive in scope and clear in approach, this book achieves the perfect balance between theory and practice, keeps math at a tolerable level, and makes DSP exceptionally accessible to beginners without ever oversimplifying it. Readers can thoroughly grasp the basics and quickly move on to more sophisticated techniques. This edition adds extensive new coverage of FIR and IIR filter analysis techniques, digital differentiators, integrators, and matched filters. Lyons has significantly updated and expanded his discussions of multirate processing techniques, which are crucial to modern wireless and satellite communications. He also presents nearly twice as many DSP Tricks as in the second edition--including techniques even seasoned DSP professionals may have overlooked. Coverage includes:
- New homework problems that deepen your understanding and help you apply what you've learned
- Practical, day-to-day DSP implementations and problem-solving throughout
- Useful new guidance on generalized digital networks, including discrete differentiators, integrators, and matched filters
- Clear descriptions of statistical measures of signals, variance reduction by averaging, and real-world signal-to-noise ratio (SNR) computation
- A significantly expanded chapter on sample rate conversion (multirate systems) and associated filtering techniques
- New guidance on implementing fast convolution, IIR filter scaling, and more
- Enhanced coverage of analyzing digital filter behavior and performance for diverse communications and biomedical applications
- Discrete sequences/systems, periodic sampling, DFT, FFT, finite/infinite impulse response filters, quadrature (I/Q) processing, discrete Hilbert transforms, binary number formats, and much more
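Variance reduction by averaging, one of the topics listed above, is easy to demonstrate numerically. The following minimal sketch (synthetic data and parameters chosen for illustration, not an example from the book) shows the roughly 10*log10(N) dB SNR gain obtained by coherently averaging N noisy records of the same signal.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_samples = 64, 1000
t = np.arange(n_samples)
signal = np.sin(2 * np.pi * 0.01 * t)                 # deterministic signal
noise = rng.standard_normal((n_trials, n_samples))    # unit-variance white noise

def snr_db(x, s):
    """SNR of an estimate x of the true signal s, in dB."""
    err = x - s
    return 10 * np.log10(np.mean(s**2) / np.mean(err**2))

single = signal + noise[0]
averaged = (signal + noise).mean(axis=0)              # coherent average of 64 noisy records

print(f"single record SNR : {snr_db(single, signal):5.1f} dB")
print(f"64-record average : {snr_db(averaged, signal):5.1f} dB")  # ~ 10*log10(64) = 18 dB better
```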
In Object Recognition through Invariant Indexing, Charles Rothwell provides a practical and accessible introduction to two-dimensional shape description using projective invariants while contrasting the various interpretations of the descriptors currently in use. He also surveys a number of new invariant descriptors for three-dimensional shapes that can be recovered from single images, showing how such measures can be used to ease the recognition of real objects by a computer. Rothwell then proceeds to describe a promising new architecture for a real recognition system. In reviewing a broad field of recognition theory, the book is unique in its deft synthesis of research and application. It will be welcomed by students and researchers in computer vision, robotics, pattern recognition, and image and signal processing.
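The projective invariants used for invariant indexing can be illustrated with the cross ratio, the classic invariant of four collinear points. The sketch below is not taken from Rothwell's book; it simply checks, with an arbitrary hypothetical Möbius map, that the cross ratio of four scalar coordinates is unchanged by a 1-D projective transformation.

```python
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross ratio (a, b; c, d) of four collinear points given by scalar coordinates."""
    return ((a - c) * (b - d)) / ((b - c) * (a - d))

def mobius(x, M):
    """1-D projective (Mobius) map x -> (m00*x + m01) / (m10*x + m11)."""
    return (M[0, 0] * x + M[0, 1]) / (M[1, 0] * x + M[1, 1])

pts = np.array([0.0, 1.0, 2.5, 4.0])               # four distinct collinear points
M = np.array([[2.0, -1.0], [0.3, 1.5]])            # arbitrary non-singular 2x2 matrix

before = cross_ratio(*pts)
after = cross_ratio(*mobius(pts, M))
print(before, after)                                # identical up to floating-point error
```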
In nonparametric and high-dimensional statistical models, the classical Gauss-Fisher-Le Cam theory of the optimality of maximum likelihood estimators and Bayesian posterior inference does not apply, and new foundations and ideas have been developed in the past several decades. This book gives a coherent account of the statistical theory in infinite-dimensional parameter spaces. The mathematical foundations include self-contained 'mini-courses' on the theory of Gaussian and empirical processes, approximation and wavelet theory, and the basic theory of function spaces. The theory of statistical inference in such models - hypothesis testing, estimation and confidence sets - is presented within the minimax paradigm of decision theory. This includes the basic theory of convolution kernel and projection estimation, but also Bayesian nonparametrics and nonparametric maximum likelihood estimation. In a final chapter the theory of adaptive inference in nonparametric models is developed, including Lepski's method, wavelet thresholding, and adaptive inference for self-similar functions. Winner of the 2017 PROSE Award for Mathematics.
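As a small illustration of the convolution-kernel estimation mentioned above, here is a minimal Gaussian kernel density estimator; the data, bandwidth, and evaluation grid are synthetic choices for demonstration, not examples from the book.

```python
import numpy as np

def kde(x_eval, data, h):
    """Gaussian convolution-kernel density estimate:
       f_hat(x) = (1/(n*h)) * sum_i K((x - X_i)/h), with K the standard normal pdf."""
    u = (x_eval[:, None] - data[None, :]) / h
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return k.mean(axis=1) / h

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=500)
grid = np.linspace(-4, 4, 9)
print(np.round(kde(grid, data, h=0.4), 3))   # should roughly follow the N(0, 1) density
```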
Written in the intuitive yet rigorous style that readers of A Foundation in Digital Communication have come to expect, this second edition includes entirely new chapters on the radar problem (with Lyapunov's theorem) and intersymbol interference channels, new discussion of the baseband representation of passband noise, and a simpler, more geometric derivation of the optimal receiver for the additive white Gaussian noise channel. Other key topics covered include the definition of the power spectral density of nonstationary stochastic processes, the geometry of the space of energy-limited signals, the isometry properties of the Fourier transform, and complex sampling. Including over 500 homework problems and all the necessary mathematical background, this is the ideal text for one- or two-semester graduate courses on digital communications and courses on stochastic processes and detection theory. Solutions to problems and video lectures are available online.
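The optimal receiver for the additive white Gaussian noise channel mentioned above can be illustrated with a tiny BPSK simulation: a per-symbol minimum-distance (matched-filter) decision, compared against the standard bit-error-rate formula Q(sqrt(2 Eb/N0)). This is a generic sketch with arbitrary parameters, not code from the book.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(2)
n_bits = 200_000
ebn0_db = 6.0
ebn0 = 10 ** (ebn0_db / 10)

bits = rng.integers(0, 2, n_bits)
symbols = 1.0 - 2.0 * bits                      # BPSK mapping: 0 -> +1, 1 -> -1 (unit energy)
noise = rng.standard_normal(n_bits) * np.sqrt(1 / (2 * ebn0))   # sigma^2 = N0/2
received = symbols + noise

decisions = (received < 0).astype(int)          # minimum-distance (matched-filter) decision
ber_sim = np.mean(decisions != bits)
ber_theory = 0.5 * erfc(sqrt(ebn0))             # Q(sqrt(2 Eb/N0))
print(ber_sim, ber_theory)                      # the two numbers should be close
```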
"Once again, Harry Van Trees has written the definitive textbook and research reference." A comprehensive treatment of optimum array processing Array processing plays an important role in many diverse application areas, including radar, sonar, communications, seismology, radio astronomy, tomography, and cellular communications. Optimum Array Processing gives an integrated presentation of classical and statistical array processing. Classical analysis and synthesis techniques for linear and planar arrays are developed. A statistical characterization of space-time random processes is provided. Many different aspects of optimum array processing are covered, including waveform estimation, adaptive beamforming, parameter estimation, and signal detection. Both plane-wave signals and spatially spread signals are studied, and all results are developed in a pedagogically sound manner. This book provides a fundamental understanding of array processing that is ample preparation for research or implementation of actual array processing systems. It provides a comprehensive synthesis of the array processing literature and includes more than 2,000 references. Readers will find an extensive variety of models and criteria for study and comparison, realistic examples and practical applications of optimum algorithms, challenging problems that expand the book’s material, and detailed derivations of important results. A supplemental Web site is available that contains MATLAB scripts for most of the figures used in the book so readers can explore diverse scenarios. The book uses results from Parts I and III of Detection, Estimation, and Modulation Theory. These two books have been reprinted in paperback for availability. For students in signal processing or professionals looking for thorough understanding of array processing theory, Optimum Array Processing provides authoritative, comprehensive coverage in the same clear manner as the earlier parts of Detection, Estimation, and Modulation Theory.
Originally published in 1968, Harry Van Trees's Detection, Estimation, and Modulation Theory, Part I is one of the great time-tested classics in the field of signal processing. Highly readable and practically organized, it is as imperative today for professionals, researchers, and students in optimum signal processing as it was over thirty years ago. The second edition is a thorough revision and expansion, almost doubling the size of the first edition and accounting for new developments, making it once again the most comprehensive and up-to-date treatment of the subject. With a wide range of applications such as radar, sonar, communications, seismology, biomedical engineering, and radar astronomy, among others, the important field of detection and estimation has rarely been given such expert treatment as it is here. Each chapter includes section summaries, realistic examples, and a large number of challenging problems that provide excellent study material. This volume, Part I of a four-volume set, is the most important and widely used textbook and professional reference in the field.
This book conveys fundamental knowledge of the synthesis of combinational circuits (switching networks) and sequential circuits (state machines/automata), and is aimed primarily at engineering students.
Here's a thorough overview of the state-of-the-art in design and implementation of advanced tracking for single and multiple sensor systems. This practical resource provides modern system designers and analysts with in-depth evaluations of sensor management, kinematic and attribute data processing, data association, situation assessment, and modern tracking and data fusion methods as applied in both military and non-military arenas. Whether you want background information to get you up to speed or access only to the most recently developed advanced methods, the book's modular chapter structure makes it easy for you to find the specific information you're looking for quickly. You get full coverage of key tracking topics.
Probability and Random Processes, Second Edition presents pertinent applications to signal processing and communications, two areas of key interest to students and professionals in today's booming communications industry. The book includes unique chapters on narrowband random processes and simulation techniques. It also describes applications in digital communications, information theory, coding theory, image processing, speech analysis, synthesis and recognition, and others. Exceptional exposition and numerous worked out problems make this book extremely readable and accessible. The authors connect the applications discussed in class to the textbook. The new edition contains more real world signal processing and communications applications. It introduces the reader to the basics of probability theory and explores topics ranging from random variables, distributions and density functions to operations on a single random variable. There are also discussions on pairs of random variables; multiple random variables; random sequences and series; random processes in linear systems; Markov processes; and power spectral density. This book is intended for practicing engineers and students in graduate-level courses in the topic.
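Power spectral density, one of the topics listed above, can be estimated in a few lines; the following sketch uses SciPy's Welch estimator on a synthetic noisy sinusoid (the sampling rate, tone frequency, and segment length are arbitrary choices, not examples from the book).

```python
import numpy as np
from scipy import signal

# Welch estimate of the power spectral density of a 50 Hz tone buried in white noise.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(3)
x = np.sin(2 * np.pi * 50 * t) + rng.standard_normal(t.size)

f, pxx = signal.welch(x, fs=fs, nperseg=1024)
print(f[np.argmax(pxx)])   # the spectral peak appears near 50 Hz
```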
This work addresses the problem of noise reduction in the short-time Fourier transform (STFT) domain. We divide the general problem into five basic categories depending on the number of microphones being used and whether the interframe or interband correlation is considered. The first category deals with the single-channel problem where STFT coefficients at different frames and frequency bands are assumed to be independent. In this case, the noise reduction filter in each frequency band is basically a real gain. Since a gain does not improve the signal-to-noise ratio (SNR) for any given subband and frame, the noise reduction is basically achieved by lifting up the subbands and frames that are less noisy while weighing down those that are more noisy. The second category also concerns the single-channel problem. The difference is that now the interframe correlation is taken into account and a filter is applied in each subband instead of just a gain. The advantage of using the interframe correlation is that we can improve not only the long-time fullband SNR, but the frame-wise subband SNR as well. The third and fourth classes discuss the problem of multichannel noise reduction in the STFT domain with and without interframe correlation, respectively. In the last category, we consider the interband correlation in the design of the noise reduction filters. We illustrate the basic principle for the single-channel case as an example, while this concept can be generalized to other scenarios. In all categories, we propose different optimization cost functions from which we derive the optimal filters, and we also define the performance measures that help analyze them.
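A minimal sketch of the first category described above (a real gain per STFT coefficient, with no interframe or interband correlation) is given below. It applies a Wiener-like gain with the noise PSD taken from a noise-only reference, which is an idealization for illustration only; the specific cost functions and optimal filters derived in the book are not reproduced here.

```python
import numpy as np
from scipy import signal

fs = 8000
rng = np.random.default_rng(4)
t = np.arange(0, 2.0, 1 / fs)
clean = 0.5 * np.sin(2 * np.pi * 440 * t)
noise = 0.3 * rng.standard_normal(t.size)
noisy = clean + noise

f, tt, X = signal.stft(noisy, fs=fs, nperseg=256)
_, _, N = signal.stft(noise, fs=fs, nperseg=256)        # noise-only reference (illustrative)
noise_psd = np.mean(np.abs(N) ** 2, axis=1, keepdims=True)

# real-valued gain per time-frequency bin, floored to limit distortion
gain = np.maximum(1.0 - noise_psd / (np.abs(X) ** 2 + 1e-12), 0.1)
_, enhanced = signal.istft(gain * X, fs=fs, nperseg=256)

def snr_db(ref, est):
    est = est[: ref.size]
    return 10 * np.log10(np.sum(ref ** 2) / np.sum((ref - est) ** 2))

print(snr_db(clean, noisy), snr_db(clean, enhanced))     # fullband SNR before and after
```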
Digital Signal Processing: Fundamentals and Applications, Third Edition, not only introduces students to the fundamental principles of DSP, it also provides a working knowledge that they take with them into their engineering careers. Many instructive, worked examples are used to illustrate the material, and the use of mathematics is minimized for an easier grasp of concepts. As such, this title is also useful as a reference for non-engineering students and practicing engineers. The book goes beyond DSP theory, showing the implementation of algorithms in hardware and software. Additional topics covered include adaptive filtering with noise reduction and echo cancellations, speech compression, signal sampling, digital filter realizations, filter design, multimedia applications, over-sampling, etc. More advanced topics are also covered, such as adaptive filters, speech compression such as PCM, μ-law, ADPCM, and multi-rate DSP, over-sampling ADC, subband coding, and wavelet transform.
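One of the speech-compression topics listed above, μ-law companding, reduces to a pair of simple formulas. The sketch below implements the standard compression/expansion pair with μ = 255 and checks that the round trip is numerically lossless before any quantization; it is a generic illustration, not code from the book.

```python
import numpy as np

# mu-law companding as used in 8-bit telephony speech coding (mu = 255):
# compress: y = sign(x) * ln(1 + mu*|x|) / ln(1 + mu), for x in [-1, 1]
# expand  : x = sign(y) * ((1 + mu)**|y| - 1) / mu
MU = 255.0

def mu_compress(x):
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def mu_expand(y):
    return np.sign(y) * (np.power(1.0 + MU, np.abs(y)) - 1.0) / MU

x = np.linspace(-1, 1, 5)
print(np.round(mu_expand(mu_compress(x)) - x, 12))   # round-trip error is ~ 0
```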
Scaling is a mathematical transformation that enlarges or diminishes objects. The technique is used in a variety of areas, including finance and image processing. This book is organized around the notions of scaling phenomena and scale invariance. The various stochastic models commonly used to describe scaling (self-similarity, long-range dependence and multi-fractals) are introduced. These models are compared and related to one another. Next, fractional integration, a mathematical tool closely related to the notion of scale invariance, is discussed, and stochastic processes with prescribed scaling properties (self-similar processes, locally self-similar processes, fractionally filtered processes, iterated function systems) are defined. A number of applications where the scaling paradigm proved fruitful are detailed: image processing, financial and stock market fluctuations, geophysics, scale relativity, and fractal time-space.
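Self-similarity and long-range dependence are usually summarized by a scaling exponent H. As a hedged illustration (not taken from the book), the sketch below estimates H with the aggregated-variance method, which exploits the fact that the variance of block averages of a self-similar series scales as m^(2H-2) with the block size m.

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes):
    """Estimate the self-similarity (Hurst) parameter H via the aggregated-variance
    method: Var(X^(m)) ~ m^(2H-2), where X^(m) averages non-overlapping blocks of size m."""
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        blocks = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(blocks.var()))
    slope, _ = np.polyfit(log_m, log_var, 1)   # slope = 2H - 2
    return 1 + slope / 2

rng = np.random.default_rng(0)
white = rng.standard_normal(100_000)           # uncorrelated noise: H should be near 0.5
print(hurst_aggregated_variance(white, [10, 20, 50, 100, 200, 500]))
```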
The aim of this book is the study of signals and of deterministic systems that are linear, time-invariant, causal, and of finite dimension. A set of tools useful for automatic control and signal processing is selected, methods for representing dynamic linear systems are presented, and their behavior is analyzed. Finally, estimation, identification, and the synthesis of control laws for stabilization and regulation are discussed. The study of signal characteristics and system properties, together with knowledge of the relevant mathematical tools and of processing and analysis methods, has gained ever more importance and continues to evolve, because the current state of technology, particularly in electronics and computing, makes it possible to build very advanced processing systems that are effective and inexpensive despite their complexity.
This book is devoted to the application of advanced signal processing to event-related potentials (ERPs) in the context of electroencephalography (EEG) for cognitive neuroscience. ERPs are usually produced by averaging preprocessed single-trial EEG, and the interpretation of the underlying brain activities is then based on this ordinary averaged EEG. We find that randomly fluctuating activities and artifacts can still be present in the averaged EEG data, and that consistent brain activities across single trials can overlap with each other in the time, frequency and spatial domains. Therefore, before interpretation, it is beneficial to further separate the averaged EEG into individual brain activities. The book proposes systematic approaches that apply wavelet transform (WT), independent component analysis (ICA), and nonnegative tensor factorization (NTF) to filter the averaged EEG in the time, frequency and space domains, sequentially and simultaneously, so as to obtain the pure ERP of interest. Software implementing the proposed approaches will be openly available.
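A minimal sketch of the averaging-then-decomposition idea is shown below on synthetic data: single trials are averaged to form an ERP, and the multichannel average is then unmixed with FastICA from scikit-learn. The data shapes, the component count, and the use of FastICA alone (rather than the WT/ICA/NTF pipeline proposed in the book) are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
n_trials, n_channels, n_samples = 100, 8, 300

# fixed ERP topography times a Gaussian-shaped time course (synthetic)
erp = np.outer(rng.standard_normal(n_channels),
               np.exp(-((np.arange(n_samples) - 150) ** 2) / 500.0))
trials = erp + 2.0 * rng.standard_normal((n_trials, n_channels, n_samples))

average = trials.mean(axis=0)               # ordinary ERP averaging over trials

ica = FastICA(n_components=4, random_state=0)
sources = ica.fit_transform(average.T)      # unmix the averaged channels: (samples, components)
print(average.shape, sources.shape)
```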
Computer vision seeks a process that starts with a noisy, ambiguous signal from a TV camera and ends with a high-level description of discrete objects located in 3-dimensional space and identified in a human classification. This book addresses the process at several levels. First to be treated are the low-level image-processing issues of noise removal and smoothing while preserving important lines and singularities in an image. At a slightly higher level, a robust contour tracing algorithm is described that produces a cartoon of the important lines in the image. Third is the high-level task of reconstructing the geometry of objects in the scene. The book has two aims: to give the computer vision community a new approach to early visual processing, in the form of image segmentation that incorporates occlusion at a low level, and to introduce real computer algorithms that do a better job than what most vision programmers use currently. The algorithms are: - a nonlinear filter that reduces noise and enhances edges, - an edge detector that also finds corners and produces smoothed contours rather than bitmaps, - an algorithm for filling gaps in contours.
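The book describes its own nonlinear filter and edge detector; as a rough stand-in to show the kind of low-level processing involved, here is a generic median filter (a simple nonlinear, edge-preserving smoother) followed by a Sobel gradient edge map on a synthetic step-edge image. This is not the authors' algorithm, only an illustrative baseline.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(6)
img = np.zeros((64, 64))
img[:, 32:] = 1.0                                  # vertical step edge
salt = rng.random(img.shape) < 0.05
img_noisy = np.where(salt, 1.0 - img, img)         # impulsive "salt" noise

img_filtered = ndimage.median_filter(img_noisy, size=3)     # nonlinear, edge-preserving smoothing
gx = ndimage.sobel(img_filtered, axis=1)
gy = ndimage.sobel(img_filtered, axis=0)
edges = np.hypot(gx, gy) > 1.0                     # threshold the gradient magnitude

print(edges[:, 30:34].sum(), edges[:, :28].sum())  # edge responses cluster near column 32
```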
Over the last 50 years there have been an increasing number of applications of algebraic tools to solve problems in communications, in particular in the fields of error-control codes and cryptography. More recently, broader applications have emerged, requiring quite sophisticated algebra - for example, the Alamouti scheme in MIMO communications is just Hamilton's quaternions in disguise and has spawned the use of PhD-level algebra to produce generalizations. Likewise, in the absence of credible alternatives, the industry has in many cases been forced to adopt elliptic curve cryptography. In addition, algebra has been successfully applied to problems in signal processing such as face recognition, biometrics, control design, and signal design for radar. This book introduces the reader to the algebra they need to appreciate these developments and to various problems solved by these techniques.
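The Alamouti scheme mentioned above is easy to state concretely: two symbols are transmitted from two antennas over two time slots, and simple linear combining at the receiver recovers both. The sketch below verifies the noiseless combining identities; the channel gains and symbols are randomly generated assumptions, and the example is illustrative rather than drawn from the book.

```python
import numpy as np

# Alamouti space-time block code for two transmit antennas: symbols (s1, s2) are sent
# over two time slots as [[s1, s2], [-conj(s2), conj(s1)]], one column per antenna.
rng = np.random.default_rng(7)
s = (rng.integers(0, 2, 2) * 2 - 1) + 1j * (rng.integers(0, 2, 2) * 2 - 1)   # two QPSK symbols
h = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) / np.sqrt(2)      # flat-fading gains

# received samples in time slots 1 and 2 (noise omitted for clarity)
r1 = h[0] * s[0] + h[1] * s[1]
r2 = -h[0] * np.conj(s[1]) + h[1] * np.conj(s[0])

# Alamouti combining: estimates are scaled by the channel energy |h1|^2 + |h2|^2
s1_hat = np.conj(h[0]) * r1 + h[1] * np.conj(r2)
s2_hat = np.conj(h[1]) * r1 - h[0] * np.conj(r2)
energy = np.abs(h[0]) ** 2 + np.abs(h[1]) ** 2
print(np.allclose(np.array([s1_hat, s2_hat]) / energy, s))   # True
```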
Compressed Sensing in Li-Fi and Wi-Fi Networks features coverage of the first applications of compressed sensing in optical telecommunications and wireless. After a thorough review of the basic theory of compressed sensing, many mathematical techniques are presented, including advanced signal modeling, Nyquist sub-sampling of analog signals, the non-asymptotic analysis of random matrices, adaptive detection, greedy algorithms, and the use of graphical models. The book can be used as a comprehensive manual for teaching and research in courses covering advanced signal processing, efficient data processing algorithms, and telecommunications.
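Greedy algorithms for compressed sensing, mentioned above, can be illustrated with orthogonal matching pursuit. The sketch below recovers a synthetic sparse vector from random Gaussian measurements; the dimensions and sparsity level are arbitrary choices, not examples from the book.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column of A most correlated with
    the residual, re-fit by least squares on the selected support, repeat k times."""
    residual, support = y.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(8)
n, m, k = 50, 200, 4                          # measurements, dictionary size, sparsity
A = rng.standard_normal((n, m)) / np.sqrt(n)  # random sensing matrix
x_true = np.zeros(m)
x_true[rng.choice(m, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

x_hat = omp(A, y, k)
print(np.allclose(x_hat, x_true, atol=1e-8))  # exact recovery with high probability
```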
The considerable growth of RFID is currently accompanied by the development of numerous identification technologies that complement those already available while seeking to answer new problems. Chipless RFID is one example. The goal is to both significantly reduce the price of the tag and increase the amount of information it contains, in order to compete with the barcode while retaining the benefits of a flexible reading approach based on radio communication. To solve the problem of the number of bits, this book describes the possibility of coding the information in the overall shape of the radar cross section (RCS) of the tag, which makes very large amounts of information feasible. Designing the tags then amounts to solving the inverse problem of the electromagnetic signature. The proposed design methodology regularizes the problem by decomposing the signature on a basis of elementary patterns whose signature is chosen in advance.
Optimal filtering applied to stationary and non-stationary signals provides the most efficient means of dealing with problems arising in the extraction of signals from noise. Moreover, it is a fundamental feature in a range of applications, such as navigation in aerospace and aeronautics, filter processing in the telecommunications industry, etc. This book provides a comprehensive overview of this area, discussing random and Gaussian vectors, outlining the results necessary for the creation of Wiener and adaptive filters used for stationary signals, as well as examining Kalman filters which are used in relation to non-stationary signals. Exercises with solutions feature in each chapter to demonstrate the practical application of these ideas using MATLAB.
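As a minimal illustration of Kalman filtering for a non-stationary signal (not one of the book's MATLAB exercises), the sketch below tracks a random-walk level observed in white noise using the scalar predict/update recursion; the noise variances are arbitrary assumptions.

```python
import numpy as np

# Scalar Kalman filter for a drifting (non-stationary) level observed in white noise:
#   state:       x_k = x_{k-1} + w_k,   Var(w_k) = q
#   measurement: y_k = x_k + v_k,       Var(v_k) = r
rng = np.random.default_rng(9)
n, q, r = 200, 1e-3, 0.5 ** 2

x = np.cumsum(np.sqrt(q) * rng.standard_normal(n))          # true random-walk state
y = x + np.sqrt(r) * rng.standard_normal(n)                 # noisy observations

x_hat, p = 0.0, 1.0
estimates = []
for yk in y:
    p = p + q                               # predict with the random-walk model
    k_gain = p / (p + r)                    # Kalman gain
    x_hat = x_hat + k_gain * (yk - x_hat)   # update with the innovation
    p = (1 - k_gain) * p
    estimates.append(x_hat)

estimates = np.array(estimates)
print(np.mean((y - x) ** 2), np.mean((estimates - x) ** 2))  # filtered error < raw error
```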
In this volume of the series Fachwissen Technische Akustik, the method of experimental modal analysis is presented. With this method, the dynamic properties of systems governed by the propagation of airborne and structure-borne sound can be investigated. Examples of such systems are structures in mechanical and automotive engineering, or smaller interior spaces whose acoustic behavior is of interest. An introduction first addresses the relationship between the physical model and the system-theoretic model, and explains the usefulness of the modal model for describing system properties. The theory underlying the modal model is then presented, together with the relationship between the modal parameters and the frequency response functions used in the system model. Various methods of experimental modal analysis are discussed, including those that determine individual modal parameters separately as well as those in which a large number of modal parameters are determined simultaneously from the measured frequency response functions. In addition, the practical procedure for acquiring the necessary measurement data and the possibilities for checking the results are covered. To demonstrate the various options and methods, a simple practical example is treated in detail. This covers the measurement procedure as well as the application of extraction methods of varying complexity for the modal parameters. Numerous results are shown, so that the possibilities and limits of experimental modal analysis become clear.
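The frequency response functions at the heart of experimental modal analysis are typically estimated from measured input/output records. As a hedged illustration (not taken from this volume), the sketch below computes the standard H1 estimate, H1(f) = S_xy(f) / S_xx(f), for a synthetic lightly damped resonance; the excitation, resonance frequency, and noise level are arbitrary assumptions.

```python
import numpy as np
from scipy import signal

fs = 1024.0
rng = np.random.default_rng(10)
x = rng.standard_normal(int(60 * fs))                   # broadband excitation

# lightly damped resonance around 100 Hz standing in for the structure under test
b, a = signal.iirpeak(w0=100.0, Q=20.0, fs=fs)
y = signal.lfilter(b, a, x) + 0.01 * rng.standard_normal(x.size)   # response + sensor noise

f, s_xx = signal.welch(x, fs=fs, nperseg=4096)          # input auto-spectrum
_, s_xy = signal.csd(x, y, fs=fs, nperseg=4096)         # input/output cross-spectrum
h1 = s_xy / s_xx                                        # H1 frequency-response estimate
print(f[np.argmax(np.abs(h1))])                         # peak near the 100 Hz mode
```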
Covering a period of about 25 years, during which time-frequency has undergone significant developments, this book is principally addressed to researchers and engineers interested in non-stationary signal analysis and processing. It is written by recognized experts in the field.
This easy-to-follow textbook presents an engaging introduction to the fascinating world of medical image analysis. Avoiding an overly mathematical treatment, the text focuses on intuitive explanations, illustrating the key algorithms and concepts in a way which will make sense to students from a broad range of different backgrounds. Topics and features:
- explains what light is, and how it can be captured by a camera and converted into an image, as well as how images can be compressed and stored
- describes basic image manipulation methods for understanding and improving image quality, and a useful segmentation algorithm
- reviews the basic image processing methods for segmenting or enhancing certain features in an image, with a focus on morphology methods for binary images
- examines how to detect, describe, and recognize objects in an image, and how the nature of color can be used for segmenting objects
- introduces a statistical method to determine what class of object the pixels in an image represent
- describes how to change the geometry within an image, how to align two images so that they are as similar as possible, and how to detect lines and paths in images
- provides further exercises and other supplementary material at an associated website
This concise and accessible textbook will be invaluable to undergraduate students of computer science, engineering, medicine, and any multi-disciplinary courses that combine topics on health with data science. Medical practitioners working with medical imaging devices will also appreciate this easy-to-understand explanation of the technology.
Power System Small Signal Stability Analysis and Control, Second Edition analyzes severe outages due to the sustained growth of small signal oscillations in modern interconnected power systems. This fully revised edition addresses the continued expansion of power systems and the rapid upgrade to smart grid technologies that call for the implementation of robust and optimal controls. With a new chapter on MATLAB programs, this book describes how the application of power system damping controllers such as Power System Stabilizers and Flexible Alternating Current Transmission System controllers, namely the Static Var Compensator and the Thyristor Controlled Series Compensator, can guard against system disruptions. Detailed mathematical derivations, illustrated case studies, the application of soft computation techniques, designs of robust controllers, and end-of-chapter exercises make it a useful resource for researchers, practicing engineers, and post-graduates in electrical engineering.
You may like...
- Advanced Signal Processing for Industry… by Irshad Ahmad Ansari, Varun Bajaj (Hardcover, R3,230)
- Structural Health Monitoring from… by Magd Abdel Wahab, Yun Lai Zhou, … (Hardcover)
- Signals and Systems - Pearson New… by Rodger Ziemer, William Tranter, … (Paperback, R2,180)
- Digital Signal Processing - Pearson New… by John Proakis, Dimitris Manolakis (Paperback, R2,604)
- Advances in Communication Systems and… by J. Jayakumari, George K. Karagiannidis, … (Hardcover, R5,628)
- The Handbook of Multimodal-Multisensor… by Sharon Oviatt, Bjoern Schuller, … (Hardcover, R3,063)
- Signals and Systems - Pearson New… by Alan Oppenheim, Alan Willsky, … (Paperback, R2,563)