This extraordinary three-volume work, written in an engaging and rigorous style by a world authority in the field, provides an accessible, comprehensive introduction to the full spectrum of mathematical and statistical techniques underpinning contemporary methods in data-driven learning and inference. This first volume, Foundations, introduces core topics in inference and learning, such as matrix theory, linear algebra, random variables, convex optimization and stochastic optimization, and prepares students for studying their practical application in later volumes. A consistent structure and pedagogy are employed throughout this volume to reinforce student understanding, with over 600 end-of-chapter problems (including solutions for instructors), 100 figures, 180 solved examples, datasets and downloadable Matlab code. Supported by sister volumes Inference and Learning, and unique in its scale and depth, this textbook sequence is ideal for early-career researchers and graduate students across many courses in signal processing, machine learning, statistical analysis, data science and inference.
Over the past decades, considerable interest has focused on problems involving signals and systems that depend on more than one variable. 2-D signals and systems have been studied in connection with several modern engineering fields, such as process control, multidimensional digital filtering, image enhancement, and image deblurring. Among the major results developed so far, 2-D digital filters are investigated either through a frequency-domain description or as a convolution of the input with the unit response, which has great potential for practical applications in 2-D image and signal processing. This monograph addresses several problems of control and filtering of 2-D discrete systems. Specifically, the problems of H-infinity filtering, H-infinity control, stabilization, H-infinity model reduction, and H-infinity deconvolution filtering of 2-D linear discrete systems are treated.
Bivariate Markov processes play a central role in the theory and applications of estimation, control, queuing, biomedical engineering, and reliability. Bivariate Markov Processes and Their Estimation presents some of the fundamentals of the theory of bivariate Markov processes, and reviews the various parameters and signal estimation approaches that are associated with these Markov processes. It reviews both causal and non-causal estimation of some statistics of the bivariate Markov processes. In addition, it covers off-line as well as on-line recursive parameter estimation approaches. Bivariate Markov Processes and Their Estimation is an ideal springboard for researchers and students who are interested in pursuing the study of this interesting family of processes. While proofs are generally omitted, an interested reader should be able to implement the estimation algorithms for bivariate Markov chains directly from the text. The material should be accessible to the signal processing community, although it requires some familiarity with Markov chains and the intricacies of the theory of hidden Markov models.
Markov Random Fields in Image Segmentation introduces the fundamentals of Markovian modeling in image segmentation as well as providing a brief overview of recent advances in the field. Segmentation is considered in a common framework, called image labelling, where the problem is reduced to assigning labels to pixels. In a probabilistic approach, label dependencies are modeled by Markov random fields (MRF) and an optimal labeling is determined by Bayesian estimation, in particular maximum a posteriori (MAP) estimation. The main advantage of MRF models is that prior information can be imposed locally through clique potentials. The primary goal is to demonstrate the basic steps to construct an easily applicable MRF segmentation model and further develop its multiscale and hierarchical implementations as well as their combination in a multilayer model. MRF models usually yield a non-convex energy function. The minimization of this function is crucial in order to find the most likely segmentation according to the MRF model. Besides classical optimization algorithms like simulated annealing or deterministic relaxation, this book also presents recently introduced graph cut-based algorithms. It discusses the possible parallelization techniques of simulated annealing, which allows efficient implementation on, for example, GPU hardware without compromising convergence properties of the algorithms. While the main focus of this monograph is on generic model construction and related energy minimization methods, many sample applications are also presented to demonstrate the applicability of these models in real life problems such as remote sensing, biomedical imaging, change detection, and color- and motion-based segmentation. In real-life applications, parameter estimation is an important issue when implementing completely data-driven algorithms. Therefore some basic procedures, such as expectation-maximization, are also presented in the context of color image segmentation. 
Markov Random Fields in Image Segmentation is an essential companion for students, researchers and practitioners working on, or about to embark on research in statistical image segmentation.
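The labelling pipeline described above (a data term from pixel intensities plus clique potentials enforcing smoothness, minimized to find a MAP labelling) can be sketched with iterated conditional modes on a Potts model. ICM is a simple deterministic relaxation standing in for the simulated annealing and graph-cut optimizers the book covers; the function name and fixed class means are illustrative, not the book's code.

```python
import numpy as np

def icm_segment(image, n_labels=2, beta=1.0, n_iter=10):
    """Iterated conditional modes for a simple Potts-model MRF.

    Per-pixel energy: (image - mu[label])**2 + beta * (number of
    4-neighbours with a different label). Class means are fixed here;
    a full pipeline would re-estimate them, e.g. via EM.
    """
    # Initialise labels by nearest class mean (a crude intensity clustering).
    mus = np.linspace(image.min(), image.max(), n_labels)
    labels = np.abs(image[..., None] - mus).argmin(axis=-1)
    H, W = image.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                best, best_e = labels[i, j], np.inf
                for k in range(n_labels):
                    e = (image[i, j] - mus[k]) ** 2  # data term
                    # Potts smoothness term over the 4-neighbourhood.
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < H and 0 <= nj < W and labels[ni, nj] != k:
                            e += beta
                    if e < best_e:
                        best, best_e = k, e
                labels[i, j] = best
    return labels
```

Unlike simulated annealing, ICM only ever accepts energy-decreasing moves, so it converges fast but can get stuck in local minima of the non-convex energy, which is exactly why the book also treats annealing and graph cuts.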
The book contains the proceedings of the 8th Eurographics Rendering Workshop, which took place from 16th to 18th June, 1997, in Saint Etienne, France. After a series of seven successful events the workshop is now well established as the major international forum in the field of rendering and illumination techniques. It brought together the experts of this field. Their recent research results are compiled in these proceedings together with many color images that demonstrate new ideas and techniques. This year we received a total of 63 submissions, of which 28 were selected for the workshop after a period of careful reviewing and evaluation by the 27 members of the international program committee. The quality of the submissions was again very high and, unfortunately, many interesting papers had to be rejected. In addition to regular papers the program also contains two invited lectures by Shenchang Eric Chen (Live Picture) and Per Christensen (Mental Images). The papers in these proceedings contain new research results in the areas of Finite-Element and Monte-Carlo illumination algorithms, image-based rendering, outdoor and natural illumination, error metrics, perception, texture and color handling, data acquisition for rendering, and efficient use of hardware. While some contributions report results from more efficient or elegant algorithms, others pursue new and experimental approaches to find better solutions to the open problems in rendering.
Spectral analysis is widely used to interpret time series collected in diverse areas. This book covers the statistical theory behind spectral analysis and provides data analysts with the tools needed to transition theory into practice. Actual time series from oceanography, metrology, atmospheric science and other areas are used in running examples throughout, to allow clear comparison of how the various methods address questions of interest. All major nonparametric and parametric spectral analysis techniques are discussed, with emphasis on the multitaper method, both in its original formulation involving Slepian tapers and in a popular alternative using sinusoidal tapers. The authors take a unified approach to quantifying the bandwidth of different nonparametric spectral estimates. An extensive set of exercises allows readers to test their understanding of theory and practical analysis. The time series used as examples and R language code for recreating the analyses of the series are available from the book's website.
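The sinusoidal-taper variant of the multitaper method mentioned above can be sketched in a few lines: each taper is a sine window, and the final estimate averages the tapered periodograms to trade variance against bandwidth. This is a minimal illustration, not the book's R code, and the function name is my own.

```python
import numpy as np

def sine_multitaper(x, k=5):
    """Multitaper spectral estimate using sinusoidal (sine) tapers.

    The j-th taper is sqrt(2/(N+1)) * sin(pi*j*t/(N+1)), t = 1..N.
    Averaging the k eigenspectra reduces variance at the cost of a
    bandwidth of roughly (k+1)/(N+1) cycles per sample.
    """
    n = len(x)
    t = np.arange(1, n + 1)
    spectra = []
    for j in range(1, k + 1):
        taper = np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * j * t / (n + 1))
        spectra.append(np.abs(np.fft.rfft(taper * x)) ** 2)
    return np.mean(spectra, axis=0)  # averaged eigenspectra
```

Increasing k smooths the estimate further but widens the effective bandwidth, the trade-off the authors quantify in their unified treatment of nonparametric estimators.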
An introduction to a new design for nonlinear control systems—backstepping—written by its own architects. This innovative book breaks new ground in nonlinear and adaptive control design for systems with uncertainties. Introducing the recursive backstepping methodology, it shows—for the first time—how uncertain systems with severe nonlinearities can be successfully controlled with this new powerful design tool. Communicative and accessible at a level not usually present in research texts, Nonlinear and Adaptive Control Design can be used as either a stand-alone or a supplemental text in courses on nonlinear or adaptive control, as well as in control research and applications. It eases the reader into the subject matter, assuming only standard undergraduate knowledge of control theory, and provides a pedagogical presentation of the material, most of which is completely new and not available in other textbooks.
Nonlinear and Adaptive Control Design is an absolute must for researchers and graduate students with an interest in nonlinear systems, adaptive control, stability and differential equations and for anyone who would like to find out about the new and exciting advances in these areas.
Learn about the state-of-the-art at the interface between information theory and data science with this first unified treatment of the subject. Written by leading experts in a clear, tutorial style, and using consistent notation and definitions throughout, it shows how information-theoretic methods are being used in data acquisition, data representation, data analysis, and statistics and machine learning. Coverage is broad, with chapters on signal acquisition, data compression, compressive sensing, data communication, representation learning, emerging topics in statistics, and much more. Each chapter includes a topic overview, definition of the key problems, emerging and open problems, and an extensive reference list, allowing readers to develop in-depth knowledge and understanding. Providing a thorough survey of the current research area and cutting-edge trends, this is essential reading for graduate students and researchers working in information theory, signal processing, machine learning, and statistics.
An Introduction to Frames is an introduction to redundant signal representations called frames. These representations have recently emerged as yet another powerful tool in the signal processing toolbox, spurred by a host of recent applications requiring some level of redundancy. It asks the question: Why and where should one use frames? And answers emphatically: Anywhere where redundancy is a must. It then goes on to discuss a host of applications that richly illustrate that answer. An Introduction to Frames is geared primarily toward engineering students and those without extensive mathematical training. It is also intended to help researchers and practitioners decide whether frames are the right tool for their application.
Written in the intuitive yet rigorous style that readers of A Foundation in Digital Communication have come to expect, this second edition includes entirely new chapters on the radar problem (with Lyapunov's theorem) and intersymbol interference channels, new discussion of the baseband representation of passband noise, and a simpler, more geometric derivation of the optimal receiver for the additive white Gaussian noise channel. Other key topics covered include the definition of the power spectral density of nonstationary stochastic processes, the geometry of the space of energy-limited signals, the isometry properties of the Fourier transform, and complex sampling. Including over 500 homework problems and all the necessary mathematical background, this is the ideal text for one- or two-semester graduate courses on digital communications and courses on stochastic processes and detection theory. Solutions to problems and video lectures are available online.
This book is a companion to the original book by Johnson and Graham, High-Speed Digital Design: A Handbook of Black Magic, Prentice-Hall, 1993. The two books may be used separately or together; they cover different material. KEY TOPICS: High-Speed Signal Propagation delves into the issues relevant to signal transmission at the upper limits of speed and distance. This book shows you how to transmit faster and further than ever before, considering today's digital networks and wireless devices. You'll find it packed with practical advice, including material never before published on the subject of high-speed design. Johnson also presents a complete and unified theory of signal propagation for all metallic media, from cables to pcb traces to chips. It includes numerous examples, pictures, tables, and wide-ranging discussion of the high-speed properties of transmission lines. This is not yet another book on the subject of ringing and crosstalk: it's about long, high-speed channels operating at the upper limits of speed and distance. EDN Magazine will feature a 1-1/2 page excerpt from Johnson's book each month for seven months leading up to the book's publication. MARKET: Digital logic designers, system architects, EMC specialists, technicians, and printed wiring layout professionals; anyone who works with long, high-speed signal channels and fights tradeoffs between speed and distance. The reader should know what a transmission line is and have some general understanding of the relationship between the time domain and the frequency domain.
This book imparts fundamental knowledge of the synthesis of combinational circuits (Schaltnetze) and sequential circuits (Schaltwerke/finite-state machines), and is aimed primarily at engineering students.
Here's a thorough overview of the state-of-the-art in design and implementation of advanced tracking for single and multiple sensor systems. This practical resource provides modern system designers and analysts with in-depth evaluations of sensor management, kinematic and attribute data processing, data association, situation assessment, and modern tracking and data fusion methods as applied in both military and non-military arenas. Whether you want background information to get up to speed or access to only the most recently developed advanced methods, the book's modular chapter structure makes it easy to find the specific information you're looking for quickly, with full coverage of modern tracking topics.
This work addresses the problem of noise reduction in the short-time Fourier transform (STFT) domain. We divide the general problem into five basic categories depending on the number of microphones being used and whether the interframe or interband correlation is considered. The first category deals with the single-channel problem, where STFT coefficients at different frames and frequency bands are assumed to be independent. In this case, the noise reduction filter in each frequency band is basically a real gain. Since a gain does not improve the signal-to-noise ratio (SNR) for any given subband and frame, noise reduction is essentially achieved by weighting the subbands and frames: those that are less noisy are preserved while those that are more noisy are attenuated. The second category also concerns the single-channel problem. The difference is that now the interframe correlation is taken into account and a filter is applied in each subband instead of just a gain. The advantage of using the interframe correlation is that we can improve not only the long-time fullband SNR, but the frame-wise subband SNR as well. The third and fourth categories discuss the problem of multichannel noise reduction in the STFT domain with and without interframe correlation, respectively. In the last category, we consider the interband correlation in the design of the noise reduction filters. We illustrate the basic principle for the single-channel case as an example; the concept can be generalized to other scenarios. In all categories, we propose different optimization cost functions from which we derive the optimal filters, and we define performance measures that help analyze them.
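The first category's per-band real gain can be illustrated with a Wiener-style rule. This is one standard choice; the monograph derives several such gains from different cost functions, and the function and parameter names here are illustrative.

```python
import numpy as np

def wiener_gains(noisy_psd, noise_psd, gmin=0.1):
    """Per-subband Wiener-style gain for single-channel noise reduction.

    With independent STFT coefficients, the filter in each band reduces
    to a real gain H = clip(1 - noise/noisy, gmin, 1): bands dominated
    by noise are attenuated toward the floor gmin, while cleaner bands
    pass almost unchanged. A gain cannot raise the SNR inside a single
    band; the overall improvement comes from weighting bands against
    one another.
    """
    snr_term = 1.0 - noise_psd / np.maximum(noisy_psd, 1e-12)
    return np.clip(snr_term, gmin, 1.0)
```

The floor gmin limits musical-noise artifacts at the cost of leaving some residual noise, one of the trade-offs the performance measures in the text are designed to quantify.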
Over the last 50 years there have been an increasing number of applications of algebraic tools to solve problems in communications, in particular in the fields of error-control codes and cryptography. More recently, broader applications have emerged, requiring quite sophisticated algebra - for example, the Alamouti scheme in MIMO communications is just Hamilton's quaternions in disguise and has spawned the use of PhD-level algebra to produce generalizations. Likewise, in the absence of credible alternatives, the industry has in many cases been forced to adopt elliptic curve cryptography. In addition, algebra has been successfully applied to problems in signal processing such as face recognition, biometrics, control design, and signal design for radar. This book introduces the reader to the algebra they need to appreciate these developments and to various problems solved by these techniques.
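The quaternion connection mentioned above can be made concrete. The Alamouti codeword is exactly the 2x2 complex matrix representation of a quaternion, and its orthogonality (C^H C proportional to the identity) is what makes the simple linear receiver work. A minimal sketch, with an illustrative function name:

```python
import numpy as np

def alamouti_codeword(s1, s2):
    """Alamouti space-time block code for two transmit antennas.

    Rows are antennas, columns are time slots. The matrix has the same
    multiplicative structure as a quaternion written as a 2x2 complex
    matrix, which gives C^H C = (|s1|^2 + |s2|^2) * I.
    """
    return np.array([[s1, -np.conj(s2)],
                     [s2,  np.conj(s1)]])
```

Because C^H C is a scaled identity, the two symbols decouple at the receiver and can be detected independently, the property that the PhD-level generalizations mentioned in the blurb try to preserve for more antennas.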
Gain a solid understanding of how information theoretic approaches can inform the design of more secure information systems and networks with this authoritative text. With a particular focus on theoretical models and analytical results, leading researchers show how techniques derived from the principles of source and channel coding can provide new ways of addressing issues of data security, embedded security, privacy, and authentication in modern information systems. A wide range of wireless and cyber-physical systems is considered, including 5G cellular networks, the Tactile Internet, biometric identification systems, online data repositories, and smart electricity grids. This is an invaluable guide for both researchers and graduate students working in communications engineering, and industry practitioners and regulators interested in improving security in the next generation of information systems.
Master the usage of s-parameters in signal integrity applications and gain full understanding of your simulation and measurement environment with this rigorous and practical guide. Solve specific signal integrity problems including calculation of the s-parameters of a network, linear simulation of circuits, de-embedding, and virtual probing, all with expert guidance. Learn about the interconnectedness of s-parameters, frequency responses, filters, and waveforms. This invaluable resource for signal integrity engineers is supplemented with the open-source software SignalIntegrity, a Python package for scripting solutions to signal integrity problems.
Elucidating fundamental design principles by means of accurate trade-off analysis of relevant design options using suitable mathematical tools, this is the first book to provide a coherent treatment of transmission technologies essential to current and future wireless systems. Develop in-depth knowledge of the capabilities and limitations of wireless transmission technologies in supporting high-quality wireless transmission services, and foster a thorough understanding of various design trade-offs, to help identify an ideal choice for your own application requirements. Key technologies such as advanced diversity combining, multi-user scheduling, multi-user multi-antenna transmission, relay transmission, and cognitive radio are examined, making this an essential resource for senior graduate students, researchers, and engineers working in wireless communications.
Following the successful PCS Auction conducted by the US Federal Communications Commission in 1994, auctions have replaced traditional ways of allocating valuable radio spectrum, a key resource for any mobile telecommunications operator. Spectrum auctions have raised billions of dollars worldwide and have become a role model for market-based approaches in the public and private sectors. The design of spectrum auctions is a central application of game theory and auction theory due to its importance in industry and the theoretical challenges it presents. Several auction formats have been developed with different properties addressing fundamental questions about efficiently selling multiple objects to a group of buyers. This comprehensive handbook features classic papers and new contributions by international experts on all aspects of spectrum auction design, including pros and cons of different auctions and lessons learned from theory, experiments, and the field, providing a valuable resource for regulators, telecommunications professionals, consultants, and researchers.
This book is devoted to the application of advanced signal processing to event-related potentials (ERPs) in the context of electroencephalography (EEG) for cognitive neuroscience. ERPs are usually produced by averaging single trials of preprocessed EEG, and the interpretation of the underlying brain activities is then based on the averaged EEG. We find that randomly fluctuating activities and artifacts can still be present in the averaged EEG data, and that constant brain activities over single trials can overlap with each other in the time, frequency and spatial domains. Therefore, before interpretation, it is beneficial to further separate the averaged EEG into individual brain activities. The book proposes systematic approaches combining wavelet transform (WT), independent component analysis (ICA), and nonnegative tensor factorization (NTF) to filter the averaged EEG in the time, frequency and space domains, sequentially and simultaneously, in order to obtain the pure ERP of interest. Software implementing the proposed approaches will be made openly available.
Computer vision seeks a process that starts with a noisy, ambiguous signal from a TV camera and ends with a high-level description of discrete objects located in 3-dimensional space and identified in a human classification. This book addresses the process at several levels. First to be treated are the low-level image-processing issues of noise removal and smoothing while preserving important lines and singularities in an image. At a slightly higher level, a robust contour tracing algorithm is described that produces a cartoon of the important lines in the image. Third is the high-level task of reconstructing the geometry of objects in the scene. The book has two aims: to give the computer vision community a new approach to early visual processing, in the form of image segmentation that incorporates occlusion at a low level, and to introduce real computer algorithms that do a better job than what most vision programmers use currently. The algorithms are:
- a nonlinear filter that reduces noise and enhances edges,
- an edge detector that also finds corners and produces smoothed contours rather than bitmaps,
- an algorithm for filling gaps in contours.
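As a classical stand-in for the edge-preserving noise reduction described above (this is not the book's own nonlinear filter, which is more sophisticated), a median filter already shows the key nonlinear behaviour: impulse noise vanishes while step edges stay sharp, where a linear blur would smear both.

```python
import numpy as np

def median_filter(image, size=3):
    """Sliding-window median filter, a simple nonlinear smoother.

    Replaces each pixel with the median of its size-by-size
    neighbourhood (edge-replicated padding). Isolated impulses are
    removed, while a step edge is reproduced without blurring.
    """
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image, dtype=float)
    H, W = image.shape
    for i in range(H):
        for j in range(W):
            out[i, j] = np.median(padded[i:i + size, j:j + size])
    return out
```

The contrast with a mean filter is the essential point: the median is a robust statistic, so a single outlier in the window cannot drag the output value, which is why edges and singularities survive.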
This extraordinary three-volume work, written in an engaging and rigorous style by a world authority in the field, provides an accessible, comprehensive introduction to the full spectrum of mathematical and statistical techniques underpinning contemporary methods in data-driven learning and inference. This second volume, Inference, builds on the foundational topics established in volume I to introduce students to techniques for inferring unknown variables and quantities, including Bayesian inference, Markov chain Monte Carlo methods, maximum-likelihood estimation, hidden Markov models, Bayesian networks, and reinforcement learning. A consistent structure and pedagogy are employed throughout this volume to reinforce student understanding, with over 350 end-of-chapter problems (including solutions for instructors), 180 solved examples, almost 200 figures, datasets and downloadable Matlab code. Supported by sister volumes Foundations and Learning, and unique in its scale and depth, this textbook sequence is ideal for early-career researchers and graduate students across many courses in signal processing, machine learning, statistical analysis, data science and inference.
This volume of the series Fachwissen Technische Akustik presents the method of experimental modal analysis. The method can be used to investigate the dynamic properties of systems governed by the propagation of airborne and structure-borne sound. Examples of such systems are structures in mechanical and automotive engineering, or smaller interior spaces whose acoustic behaviour is of interest. An introduction first discusses the relationship between the physical model and the system-theoretic model, and explains the usefulness of the modal model for describing system properties. The theory underlying the modal model is then presented, together with the relationship between the modal parameters and the frequency responses used in the system model. Various methods of experimental modal analysis are discussed, including both those that determine individual modal parameters separately and those in which a large number of modal parameters are determined simultaneously from the measured frequency responses. In addition, the practical procedure for acquiring the necessary measurement data and the possibilities for checking the results are covered. A simple practical example is treated in detail to demonstrate the various possibilities and methods, covering the measurement procedure as well as the application of extraction methods of varying complexity for the modal parameters. Numerous results are shown, making clear both the possibilities and the limits of experimental modal analysis.