For upper-level undergraduate courses in deterministic and stochastic signals and systems engineering. Signals, Systems and Inference is a comprehensive text that builds on introductory courses in time- and frequency-domain analysis of signals and systems, and in probability. Directed primarily to upper-level undergraduates and beginning graduate students in engineering and applied science, this textbook pioneers a novel course of study. Instead of the usual leap from broad introductory subjects to highly specialized advanced subjects, this engaging and inclusive text creates a study track for a transitional course. Properties and representations of deterministic signals and systems are reviewed and elaborated on, including group delay and the structure and behavior of state-space models. The text also introduces and interprets correlation functions and power spectral densities for describing and processing random signals. Application contexts include pulse amplitude modulation, observer-based feedback control, optimum linear filters for minimum mean-square-error estimation, and matched filtering for signal detection. Model-based approaches to inference are emphasized, in particular for state estimation, signal estimation, and signal detection. The text explores ideas, methods and tools common to numerous fields involving signals, systems and inference: signal processing, control, communication, time-series analysis, financial engineering, biomedicine, and many others. Signals, Systems and Inference is a long-awaited and flexible text that can be used for a rigorous course in a broad range of engineering and applied science curricula.
Discrete-time signal processing has had a momentous impact on advances in engineering and science over recent decades. The rapid progress of digital and mixed-signal integrated circuits in processing speed, functionality and cost-effectiveness has led to their ubiquitous employment in signal processing and transmission in diverse milieux. The absence of training or pilot signals from many kinds of transmission (in, for example, speech analysis, seismic exploration and texture image analysis) necessitates the widespread use of blind equalization and system identification. A great many algorithms have been developed for these purposes, working with one- or two-dimensional (2-D) signals and with single-input single-output (SISO) or multiple-input multiple-output (MIMO), real or complex systems. It is now time for a unified treatment of this subject, pointing out the common characteristics and the sometimes close relations of these algorithms, as well as learning from their different perspectives. Blind Equalization and System Identification provides such a unified treatment, presenting theory, performance analysis, simulation, implementation and applications. Topics covered include: SISO, MIMO and 2-D non-blind equalization (deconvolution) algorithms; SISO, MIMO and 2-D blind equalization (deconvolution) algorithms; SISO, MIMO and 2-D blind system identification algorithms; algorithm analyses and improvements; and applications of SISO, MIMO and 2-D blind equalization/identification algorithms. Each chapter is completed by exercises and computer assignments designed to further understanding and to give practical experience with the algorithms discussed. This is a textbook for graduate-level courses in discrete-time random processes, statistical signal processing, and blind equalization and system identification. It contains material which will also interest researchers and practicing engineers working in digital communications, source separation, speech processing, image processing, seismic exploration, sonar, radar and other similar applications.
Dealing with digital filtering methods for 1-D and 2-D signals, this book provides the theoretical background in signal processing, covering topics such as the z-transform, the Shannon sampling theorem and the fast Fourier transform. An entire chapter is devoted to the design of continuous-time filters, which provides a useful preliminary step for analog-to-digital filter conversion.
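As a brief illustration of the sampling and FFT material this blurb mentions (this sketch is not from the book): a sinusoid sampled above its Nyquist rate can be recovered exactly in frequency by the FFT.

```python
import numpy as np

# Illustrative sketch: sample a 50 Hz sinusoid at fs = 400 Hz, well above
# the 100 Hz Nyquist rate required by the Shannon sampling theorem, and
# locate its frequency with the FFT.
fs = 400.0
n = np.arange(512)
x = np.sin(2 * np.pi * 50.0 * n / fs)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
peak_hz = freqs[np.argmax(spectrum)]  # 50 Hz falls exactly on bin 64
print(peak_hz)
```

With 512 samples the bin spacing is 400/512 Hz, so 50 Hz lands exactly on a bin and the peak reads out without leakage.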
Semiconductor-based Ultra-Fast All-Optical Signal Processing Devices - a key technology for the next generation of ultrahigh bandwidth optical communication systems! The introduction of ultra-fast communication systems based on all-optical signal processing is considered to be one of the most promising ways to handle the rapidly increasing global communication traffic. Such systems will enable real time super-high definition moving pictures such as high reality TV-conference, remote diagnosis and surgery, cinema entertainment and many other applications with small power consumption. The key issue in realizing such systems is to develop ultra-fast optical devices such as light sources, all-optical gates and wavelength converters. "Ultra-Fast All-Optical Signal Processing Devices" discusses the state-of-the-art development of semiconductor-based ultrafast all-optical devices and their various signal processing applications at bit rates from 100Gb/s to 1Tb/s. Ultra-Fast All-Optical Signal Processing Devices: Provides a thorough and in-depth treatment of the most recent achievements in ultrafast all-optical devices Discusses future networks with applications such as HD-TV and super-high definition moving screens as a motivating background for devices research Covers mode-locked semiconductor lasers, electro-absorption modulator based 160Gb/s signal sources, SOA based symmetric Mach-Zehnder type all-optical gates, intersubband transition gate devices, and more Explains the technical issues behind turning the ultra-fast optical devices into practical working tools Gives examples of above-160Gb/s transmission experiments Discusses future prospects of the ultra-fast signal processing devices This invaluable reference will provide device researchers and engineers in industry, researchers at universities (including graduate students, postdoctoral researchers and professors) and research institutes with a thorough understanding of ultrahigh bandwidth optical communication systems.
Device and communication market watchers will also find this book useful.
Over the past decades, considerable interest has been focused on problems involving signals and systems that depend on more than one variable. 2-D signals and systems have been studied in relation to several modern engineering fields such as process control, multidimensional digital filtering, image enhancement, image deblurring and signal processing. Among the major results developed so far, 2-D digital filters are investigated either through a frequency-domain description or as a convolution of the input and the unit response, which has great potential for practical applications in 2-D image and signal processing. This monograph addresses several problems of control and filtering of 2-D discrete systems. Specifically, the problems of H-infinity filtering, H-infinity control, stabilization, H-infinity model reduction and H-infinity deconvolution filtering of 2-D linear discrete systems are treated.
Advanced Methods in Biomedical Signal Processing and Analysis presents state-of-the-art methods in biosignal processing, including recurrence quantification analysis, heart rate variability, analysis of RRI time-series signals, joint time-frequency analyses, wavelet transforms and wavelet packet decomposition, empirical mode decomposition, modeling of biosignals, and the Gabor transform. The book also gives an understanding of feature extraction, feature ranking, and feature selection methods, while also demonstrating how to apply artificial intelligence and machine learning to biosignal techniques.
The book contains the proceedings of the 8th Eurographics Rendering Workshop, which took place from 16th to 18th June, 1997, in Saint Etienne, France. After a series of seven successful events the workshop is now well established as the major international forum in the field of rendering and illumination techniques. It brought together the experts of this field. Their recent research results are compiled in this proceedings together with many color images that demonstrate new ideas and techniques. This year we received a total of 63 submissions of which 28 were selected for the workshop after a period of careful reviewing and evaluation by the 27 members of the international program committee. The quality of the submissions was again very high and, unfortunately, many interesting papers had to be rejected. In addition to regular papers the program also contains two invited lectures by Shenchang Eric Chen (Live Picture) and Per Christensen (Mental Images). The papers in this proceedings contain new research results in the areas of Finite-Element and Monte-Carlo illumination algorithms, image-based rendering, outdoor and natural illumination, error metrics, perception, texture and color handling, data acquisition for rendering, and efficient use of hardware. While some contributions report results from more efficient or elegant algorithms, others pursue new and experimental approaches to find better solutions to the open problems in rendering.
An introduction to a new design for nonlinear control systems—backstepping—written by its own architects. This innovative book breaks new ground in nonlinear and adaptive control design for systems with uncertainties. Introducing the recursive backstepping methodology, it shows—for the first time—how uncertain systems with severe nonlinearities can be successfully controlled with this new powerful design tool. Communicative and accessible at a level not usually present in research texts, Nonlinear and Adaptive Control Design can be used as either a stand-alone or a supplemental text in courses on nonlinear or adaptive control, as well as in control research and applications. It eases the reader into the subject matter, assuming only standard undergraduate knowledge of control theory, and provides a pedagogical presentation of the material, most of which is completely new and not available in other textbooks.
Nonlinear and Adaptive Control Design is an absolute must for researchers and graduate students with an interest in nonlinear systems, adaptive control, stability and differential equations and for anyone who would like to find out about the new and exciting advances in these areas.
Get to grips with the principles and practice of signal processing used in mobile communications systems. Focusing particularly on speech, video, and modem signal processing, pioneering experts employ a detailed, top-down analytical approach to outline the network architectures and protocol structures of multiple generations of mobile communications systems, identify the logical ranges where media and radio signal processing occur, and analyze the procedures for capturing, compressing, transmitting, and presenting media. Chapters are uniquely structured to show the evolution of network architectures and technical elements between generations up to and including 5G, with an emphasis on maximizing service quality and network capacity through re-using existing infrastructure and technologies. Implementation examples and data taken from commercial networks provide an in-depth insight into the operation of real mobile communications systems, including GSM, cdma2000, W-CDMA, LTE, and LTE-A, making this a practical, hands-on guide for both practicing engineers and graduate students in wireless communications.
Signal processing is the analysis, interpretation, and manipulation of signals. Signals of interest include sound, images, biological signals such as ECG, radar signals, and many others. Processing of such signals includes storage and reconstruction, separation of information from noise (for example, aircraft identification by radar), compression (for example, image compression), and feature extraction (for example, speech-to-text conversion). This book presents the latest research in the field from around the world.
Gain a solid understanding of how information theoretic approaches can inform the design of more secure information systems and networks with this authoritative text. With a particular focus on theoretical models and analytical results, leading researchers show how techniques derived from the principles of source and channel coding can provide new ways of addressing issues of data security, embedded security, privacy, and authentication in modern information systems. A wide range of wireless and cyber-physical systems is considered, including 5G cellular networks, the Tactile Internet, biometric identification systems, online data repositories, and smart electricity grids. This is an invaluable guide for both researchers and graduate students working in communications engineering, and industry practitioners and regulators interested in improving security in the next generation of information systems.
The Second Edition of Quantum Information Processing, Quantum Computing, and Quantum Error Correction: An Engineering Approach presents a self-contained introduction to all aspects of the area, teaching the essentials such as state vectors, operators, density operators, measurements, and dynamics of a quantum system. In addition to the fundamental principles of quantum computation, basic quantum gates, basic quantum algorithms, and quantum information processing, this edition has been brought fully up to date, outlining the latest research trends. Key topics include: Quantum error correction codes (QECCs), including stabilizer codes, Calderbank-Shor-Steane (CSS) codes, quantum low-density parity-check (LDPC) codes, entanglement-assisted QECCs, topological codes, and surface codes Quantum information theory, and quantum key distribution (QKD) Fault-tolerant information processing and fault-tolerant quantum error correction, together with a chapter on quantum machine learning Both quantum circuit- and measurement-based quantum computational models are described The next part of the book investigates physical realizations of quantum computers, encoders and decoders, including photonic quantum realization, cavity quantum electrodynamics, and ion traps In-depth analysis of the design and realization of quantum information processing and quantum error correction circuits This fully up-to-date new edition will be of use to engineers, computer scientists, optical engineers, physicists and mathematicians.
This book is a companion to the original book by Johnson and Graham, High-Speed Digital Design: A Handbook of Black Magic, Prentice-Hall, 1993. The two books may be used separately or together; they cover different material. KEY TOPICS: High-Speed Signal Propagation delves into the issues relevant to signal transmission at the upper limits of speed and distance. This book shows you how to transmit faster and further than ever before, considering today's digital networks and wireless devices. You'll find it packed with practical advice, including material never before published on the subject of high-speed design. Johnson also presents a complete and unified theory of signal propagation for all metallic media, from cables to PCB traces to chips. It includes numerous examples, pictures, tables, and wide-ranging discussion of the high-speed properties of transmission lines. This is not yet another book on the subject of ringing and crosstalk; it's about long, high-speed channels operating at the upper limits of speed and distance. EDN Magazine will feature a 1-1/2 page excerpt from Johnson's book each month for seven months leading up to the book's publication. MARKET: The reader should know what a transmission line is, and have some general understanding of the relationship between the time domain and the frequency domain. Digital logic designers, system architects, EMC specialists, technicians, and printed wiring layout professionals. Anyone who works with long, high-speed signal channels fighting tradeoffs between speed and distance.
This book imparts fundamental knowledge of the synthesis of combinational circuits (logic networks) and sequential circuits (finite-state machines), and is aimed primarily at engineering students.
This work addresses the problem of noise reduction in the short-time Fourier transform (STFT) domain. We divide the general problem into five basic categories, depending on the number of microphones being used and whether the interframe or interband correlation is considered. The first category deals with the single-channel problem, where STFT coefficients at different frames and frequency bands are assumed to be independent. In this case, the noise reduction filter in each frequency band is basically a real gain. Since a gain does not improve the signal-to-noise ratio (SNR) for any given subband and frame, noise reduction is achieved by lifting the subbands and frames that are less noisy while weighing down those that are more noisy. The second category also concerns the single-channel problem. The difference is that now the interframe correlation is taken into account, and a filter is applied in each subband instead of just a gain. The advantage of using the interframe correlation is that we can improve not only the long-time fullband SNR, but the frame-wise subband SNR as well. The third and fourth categories discuss the problem of multichannel noise reduction in the STFT domain with and without interframe correlation, respectively. In the last category, we consider the interband correlation in the design of the noise reduction filters. We illustrate the basic principle for the single-channel case as an example, although this concept can be generalized to other scenarios. In all categories, we propose different optimization cost functions from which we derive the optimal filters, and we also define performance measures that help analyze them.
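The first category above can be made concrete with a small sketch. The gain choice here (the Wiener gain, computed from an oracle SNR) is an illustrative assumption, not necessarily the book's derived optimal filter; it shows how a real per-subband gain lifts clean-dominated bands and weighs down noisy ones.

```python
import numpy as np

# Hedged sketch of single-channel, per-subband noise reduction:
# each STFT subband gets a real-valued gain G = snr / (1 + snr).
rng = np.random.default_rng(0)
n_frames, n_bands = 100, 16
clean = rng.normal(0.0, 2.0, (n_frames, n_bands))  # stand-in "speech" coefficients
noise = rng.normal(0.0, 1.0, (n_frames, n_bands))
noisy = clean + noise

# Per-band SNR estimate (oracle here, purely for illustration).
snr = clean.var(axis=0) / noise.var(axis=0)
gain = snr / (1.0 + snr)      # real gain per subband, between 0 and 1
enhanced = noisy * gain       # lifts less noisy bands, attenuates noisier ones

err_before = np.mean((noisy - clean) ** 2)
err_after = np.mean((enhanced - clean) ** 2)
print(err_after < err_before)  # mean-square error drops
```

Note that the gain does not change any single coefficient's subband SNR, exactly as the text observes; the benefit comes from the relative weighting across bands.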
Over the last 50 years there have been an increasing number of applications of algebraic tools to solve problems in communications, in particular in the fields of error-control codes and cryptography. More recently, broader applications have emerged, requiring quite sophisticated algebra - for example, the Alamouti scheme in MIMO communications is just Hamilton's quaternions in disguise and has spawned the use of PhD-level algebra to produce generalizations. Likewise, in the absence of credible alternatives, the industry has in many cases been forced to adopt elliptic curve cryptography. In addition, algebra has been successfully applied to problems in signal processing such as face recognition, biometrics, control design, and signal design for radar. This book introduces the reader to the algebra they need to appreciate these developments and to various problems solved by these techniques.
Stochastic differential equations are differential equations whose solutions are stochastic processes. They exhibit appealing mathematical properties that are useful in modeling uncertainties and noisy phenomena in many disciplines. This book is motivated by applications of stochastic differential equations in target tracking and medical technology and, in particular, their use in methodologies such as filtering, smoothing, parameter estimation, and machine learning. It builds an intuitive hands-on understanding of what stochastic differential equations are all about, but also covers the essentials of Itô calculus, the central theorems in the field, and such approximation schemes as stochastic Runge-Kutta. Greater emphasis is given to solution methods than to analysis of theoretical properties of the equations. The book's practical approach assumes only prior understanding of ordinary differential equations. The numerous worked examples and end-of-chapter exercises include application-driven derivations and computational assignments. MATLAB/Octave source code is available for download, promoting hands-on work with the methods.
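The simplest approximation scheme in this family is Euler-Maruyama; the following sketch is illustrative (in Python rather than the book's MATLAB/Octave, and not taken from the text), simulating the Ornstein-Uhlenbeck equation dx = -theta * x dt + sigma dW.

```python
import numpy as np

# Euler-Maruyama sketch for the Ornstein-Uhlenbeck SDE:
#   dx = -theta * x dt + sigma dW
rng = np.random.default_rng(1)
theta, sigma = 1.0, 0.3
dt, steps = 0.01, 1000

x = 2.0
path = [x]
for _ in range(steps):
    dw = rng.normal(0.0, np.sqrt(dt))       # Brownian increment ~ N(0, dt)
    x = x + (-theta * x) * dt + sigma * dw  # Euler-Maruyama update
    path.append(x)

# The path decays from 2.0 toward the stationary distribution,
# whose standard deviation is sigma / sqrt(2 * theta) ≈ 0.21.
print(round(path[-1], 3))
```

Higher-order schemes such as the stochastic Runge-Kutta methods the blurb mentions follow the same pattern with refined increment handling.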
Digital Signal Processing: Fundamentals and Applications, Third Edition, not only introduces students to the fundamental principles of DSP, it also provides a working knowledge that they take with them into their engineering careers. Many instructive, worked examples are used to illustrate the material, and the use of mathematics is minimized for an easier grasp of concepts. As such, this title is also useful as a reference for non-engineering students and practicing engineers. The book goes beyond DSP theory, showing the implementation of algorithms in hardware and software. Additional topics covered include adaptive filtering with noise reduction and echo cancellation, speech compression, signal sampling, digital filter realizations, filter design, multimedia applications, and over-sampling. More advanced topics are also covered, such as adaptive filters, speech compression techniques including PCM, μ-law, and ADPCM, multi-rate DSP, over-sampling ADC, subband coding, and the wavelet transform.
Following the successful PCS Auction conducted by the US Federal Communications Commission in 1994, auctions have replaced traditional ways of allocating valuable radio spectrum, a key resource for any mobile telecommunications operator. Spectrum auctions have raised billions of dollars worldwide and have become a role model for market-based approaches in the public and private sectors. The design of spectrum auctions is a central application of game theory and auction theory due to its importance in industry and the theoretical challenges it presents. Several auction formats have been developed with different properties addressing fundamental questions about efficiently selling multiple objects to a group of buyers. This comprehensive handbook features classic papers and new contributions by international experts on all aspects of spectrum auction design, including pros and cons of different auctions and lessons learned from theory, experiments, and the field, providing a valuable resource for regulators, telecommunications professionals, consultants, and researchers.
This book is devoted to the application of advanced signal processing on event-related potentials (ERPs) in the context of electroencephalography (EEG) for cognitive neuroscience. ERPs are usually produced through averaging single trials of preprocessed EEG, and the interpretation of underlying brain activities is then based on the ordinarily averaged EEG. We find that randomly fluctuating activities and artifacts can still be present in the averaged EEG data, and that constant brain activities over single trials can overlap with each other in the time, frequency and spatial domains. Therefore, before interpretation, it is beneficial to further separate the averaged EEG into individual brain activities. The book proposes systematic approaches based on wavelet transform (WT), independent component analysis (ICA), and nonnegative tensor factorization (NTF) to filter averaged EEG in the time, frequency and space domains, sequentially and simultaneously, to obtain the pure ERP of interest. Software for the proposed approaches will be made openly available.
Computer vision seeks a process that starts with a noisy, ambiguous signal from a TV camera and ends with a high-level description of discrete objects located in 3-dimensional space and identified in a human classification. This book addresses the process at several levels. First to be treated are the low-level image-processing issues of noise removal and smoothing while preserving important lines and singularities in an image. At a slightly higher level, a robust contour tracing algorithm is described that produces a cartoon of the important lines in the image. Third is the high-level task of reconstructing the geometry of objects in the scene. The book has two aims: to give the computer vision community a new approach to early visual processing, in the form of image segmentation that incorporates occlusion at a low level, and to introduce real computer algorithms that do a better job than what most vision programmers currently use. The algorithms are: - a nonlinear filter that reduces noise and enhances edges, - an edge detector that also finds corners and produces smoothed contours rather than bitmaps, - an algorithm for filling gaps in contours.
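The book's own nonlinear filter is not reproduced here; as a hedged stand-in, a 1-D median filter illustrates the key property the text points to: noise is suppressed while a sharp edge survives, where a linear averaging filter would blur it.

```python
import numpy as np

# Minimal median-filter sketch (illustrative, not the book's algorithm):
# remove an impulse-noise spike without smearing a step edge.
def median_filter(x, width=3):
    pad = width // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([np.median(xp[i:i + width]) for i in range(len(x))])

signal = np.array([0, 0, 0, 0, 5, 5, 5, 5], dtype=float)
noisy = signal.copy()
noisy[1] = 4.0                     # an impulse-noise spike

out = median_filter(noisy)
print(out.tolist())                # spike removed, step edge intact
```

A 3-tap mean filter on the same input would both leave residue of the spike and soften the 0-to-5 transition, which is why edge-preserving vision pipelines favor nonlinear filters.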
Originally published in 1968, Harry Van Trees's Detection, Estimation, and Modulation Theory, Part I is one of the great time-tested classics in the field of signal processing. Highly readable and practically organized, it is as imperative today for professionals, researchers, and students in optimum signal processing as it was over thirty years ago. The second edition is a thorough revision and expansion, almost doubling the size of the first edition and accounting for new developments, making it again the most comprehensive and up-to-date treatment of the subject. With a wide range of applications such as radar, sonar, communications, seismology, biomedical engineering, and radar astronomy, among others, the important field of detection and estimation has rarely been given such expert treatment as it is here. Each chapter includes section summaries, realistic examples, and a large number of challenging problems that provide excellent study material. This volume, which is Part I of a set of four volumes, is the most important and widely used textbook and professional reference in the field.
Learn about the most recent theoretical and practical advances in radar signal processing using tools and techniques from compressive sensing. Providing a broad perspective that fully demonstrates the impact of these tools, the accessible and tutorial-like chapters cover topics such as clutter rejection, CFAR detection, adaptive beamforming, random arrays for radar, space-time adaptive processing, and MIMO radar. Each chapter includes coverage of theoretical principles, a detailed review of current knowledge, and discussion of key applications, and also highlights the potential benefits of using compressed sensing algorithms. A unified notation and numerous cross-references between chapters make it easy to explore different topics side by side. Written by leading experts from both academia and industry, this is the ideal text for researchers, graduate students and industry professionals working in signal processing and radar.
Combining clear explanations of elementary principles, advanced topics and applications with step-by-step mathematical derivations, this textbook provides a comprehensive yet accessible introduction to digital signal processing. All the key topics are covered, including discrete-time Fourier transform, z-transform, discrete Fourier transform and FFT, A/D conversion, and FIR and IIR filtering algorithms, as well as more advanced topics such as multirate systems, the discrete cosine transform and spectral signal processing. Over 600 full-color illustrations, 200 fully worked examples, hundreds of end-of-chapter homework problems and detailed computational examples of DSP algorithms implemented in MATLAB (R) and C aid understanding, and help put knowledge into practice. A wealth of supplementary material accompanies the book online, including interactive programs for instructors, a full set of solutions and MATLAB (R) laboratory exercises, making this the ideal text for senior undergraduate and graduate courses on digital signal processing.
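In the spirit of the computational examples this text implements in MATLAB and C, here is a small FIR filtering sketch (this Python version is illustrative, not taken from the book): a 5-tap moving-average filter, y[n] = (1/5) * sum of x[n-k] for k = 0..4, smoothing a noisy step.

```python
import numpy as np

# FIR moving-average filter applied by direct convolution.
rng = np.random.default_rng(2)
x = np.concatenate([np.zeros(50), np.ones(50)]) + rng.normal(0, 0.2, 100)

h = np.ones(5) / 5.0                 # FIR impulse response (all-ones kernel)
y = np.convolve(x, h, mode="same")   # direct-form convolution, same length out

# Averaging 5 independent noise samples cuts noise variance by about 1/5.
print(np.var(y[10:40]) < np.var(x[10:40]))
```

The same structure generalizes to any FIR design covered in the book: only the impulse response h changes, while the convolution stays identical.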