This book introduces design methodologies, known as Built-in-Self-Test (BiST) and Built-in-Self-Calibration (BiSC), which enhance the robustness of radio frequency (RF) and millimeter wave (mmWave) integrated circuits (ICs). These circuits are used in current and emerging communication, computing, multimedia and biomedical products and microchips. The design methodologies presented enhance the yield (the percentage of working chips in a high-volume run) of RF and mmWave ICs, enabling successful manufacturing of such microchips in high volume.
The English edition is based upon the second edition of the German version of the book. The author would like to thank Mr. A.H. Armstrong for providing the basic English manuscript of the text, his critical reading, and valuable comments. Thanks are also due to Mrs. A. Demmer, Mr. J. Matern, Mrs. B. Titze and Mrs. S. Pfetsch for preparing the camera-ready manuscript and the figures. Springer Verlag has generously supported the project and cooperating with them has been a great pleasure. Ulm, April 1992 K.J. Ebeling Preface to the First German Edition This book is a comprehensive introduction to waveguide optics and photonics in semiconductor crystals. Interest is centered on integrated optoelectronic devices for the transmission and processing of optical signals. These optical communications engineering devices are becoming increasingly important for optical disk storage systems, for optical chip-chip interconnections and of course for optical fiber transmission and exchange.
According to market analysts, the market for consumer electronics will continue to grow at a rate higher than that of electronic systems in general. The consumer market can be characterized by rapidly growing complexities of applications and a rather short market window. As a result, more and more complex designs have to be completed in shrinking time frames. A key concept for coping with such stringent requirements is re-use. Since the re-use of completely fixed large hardware blocks is limited to subproblems of system-level applications (for example MPEG-2), flexible, programmable processors are being used as building blocks for more and more designs. Processors provide a unique combination of features: they provide flexibility and re-use. The processors used in consumer electronics are, however, in many cases different from those that are used for screen and keyboard-based equipment, such as PCs. For the consumer market in particular, efficiency of the product plays a dominating role. Hence, processor architectures for these applications are usually highly optimized and tailored towards a certain application domain.
An engineer's introduction to concepts, algorithms, and advancements in Digital Signal Processing. This lucidly written resource makes extensive use of real-world examples as it covers all the important design and engineering references.
This book provides an overview of advanced digital image and signal processing techniques that are currently being applied in the realm of measurement systems. The book is a selection of extended versions of the best papers presented at the Sixth IEEE International Workshop on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS 2011) related to this topic, and encompasses applications ranging from multidimensional imaging to evoked potential detection in brain-computer interfaces. The objective was to provide a broad spectrum of measurement applications so that the different techniques and approaches could be presented. Digital Image and Signal Processing for Measurement Systems concentrates on signal processing for measurement systems, and its objective is to provide a general overview of the area and an appropriate introduction to the topics considered. This is achieved through 10 chapters devoted to current topics of research addressed by different research groups within this area. These 10 chapters reflect advances corresponding to signals of different dimensionality. They range from mostly one-dimensional signals, in the most traditional realm of signal processing, to RGB signals and to signals of very high dimensionality, such as hyperspectral signals with dimensionalities of more than one thousand. The chapters have been designed to provide an easy-to-follow introduction to the topics addressed, including the most relevant references, so that anyone interested in this field can get started in the area. They provide an overview of some of the problems in the area of signal and image processing for measurement systems and the approaches and techniques that relevant research groups are employing to try to solve them, which in many instances represent the state of the art in these topics.
This book focuses on linear estimation theory, which is essential for effective signal processing. The first section offers a comprehensive overview of key methods such as reduced-rank signal processing and the Krylov subspace methods of numerical mathematics, and presents the relationship between statistical signal processing and numerical mathematics. In the second part, the theory is applied to iterative multiuser detection receivers (turbo equalization), which are typically employed in wireless communications systems.
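The Krylov subspace machinery mentioned above can be illustrated with its canonical member, the conjugate gradient method. The sketch below is purely illustrative and not taken from the book; the matrix R and vector p_vec are made-up stand-ins for the kind of symmetric positive-definite normal equations (e.g. Wiener filter equations R w = p) that arise in linear estimation:

```python
def conjugate_gradient(A, b, iters=50, tol=1e-12):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients,
    the canonical Krylov-subspace method (pure-Python, dense, for illustration)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]          # residual r = b - A x (x starts at zero)
    p = r[:]          # first search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(iters):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

# Hypothetical 3x3 "correlation" system R w = p; exact solution (2/9, 1/9, 13/9).
R = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
p_vec = [1.0, 2.0, 3.0]
w = conjugate_gradient(R, p_vec)
```

In exact arithmetic CG converges in at most n iterations for an n-by-n system, which is why truncated Krylov iterations are attractive as reduced-rank estimators.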
This book briefly introduces the diverse literature in the field of fractional order signal processing, an emerging topic among an interdisciplinary community of researchers. The book is aimed at postgraduate and beginning-level research scholars who would like to work in the field of Fractional Order Signal Processing (FOSP). Readers should have preliminary knowledge of basic signal processing techniques. Prerequisite knowledge of fractional calculus is not essential; it is introduced at the relevant places in connection with the appropriate signal processing topics. Basic signal processing techniques like filtering, estimation, system identification, etc., in the light of fractional order calculus, are presented along with relevant application areas. Readers can easily extend these concepts to varied disciplines like image or speech processing, pattern recognition, time series forecasting, financial data analysis and modeling, traffic modeling in communication channels, optics, biomedical signal processing, electrochemical applications and many more. Adequate references are provided in each category so that researchers can delve deeper into each area and broaden their horizon of understanding. Available MATLAB tools to simulate FOSP theories are also introduced so that readers can apply the theoretical concepts right away and gain practical insight in the specific domain.
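As an illustrative taste of fractional-order processing (a sketch, not code from the book or its MATLAB tools), the Grünwald-Letnikov construction approximates a derivative of arbitrary order alpha from uniformly spaced samples; for alpha = 1 the weights collapse to an ordinary backward difference:

```python
def gl_weights(alpha, n):
    """Grünwald-Letnikov weights w_k = (-1)^k * C(alpha, k), via the recurrence
    w_0 = 1, w_k = w_{k-1} * (k - 1 - alpha) / k."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

def gl_fractional_derivative(signal, alpha, h):
    """Order-`alpha` derivative of a uniformly sampled signal (sample step h),
    using the full available history at each point."""
    n = len(signal)
    w = gl_weights(alpha, n)
    out = []
    for i in range(n):
        acc = sum(w[k] * signal[i - k] for k in range(i + 1))
        out.append(acc / h ** alpha)
    return out

h = 0.01
xs = [i * h for i in range(200)]
f = xs                                          # f(x) = x, so f'(x) = 1
d1 = gl_fractional_derivative(f, 1.0, h)        # reduces to backward difference
d_half = gl_fractional_derivative(f, 0.5, h)    # half-derivative of x: 2*sqrt(x/pi)
```

The half-derivative values can be checked against the closed form 2*sqrt(x/pi) for f(x) = x, a standard sanity check for fractional-calculus code.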
The statistical bootstrap is one of the methods that can be used to calculate estimates of a certain number of unknown parameters of a random process or a signal observed in noise, based on a random sample. Such situations are common in signal processing, and the bootstrap is especially useful when only a small sample is available or an analytical treatment is too cumbersome or even impossible. This book covers the foundations of the bootstrap, its properties, its strengths and its limitations. The authors focus on bootstrap signal detection in Gaussian and non-Gaussian interference as well as bootstrap model selection. The theory developed in the book is supported by a number of useful practical examples written in MATLAB. The book is aimed at graduate students and engineers, and includes applications to real-world problems in areas such as radar and sonar, biomedical engineering and automotive engineering.
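The basic bootstrap idea described above fits in a few lines: resample the observed data with replacement many times, re-evaluate the estimator on each resample, and take the spread of those estimates as a standard-error estimate. The helper below is an illustrative sketch with made-up data, not one of the book's MATLAB examples:

```python
import random
import statistics

def bootstrap_std_error(sample, estimator, n_resamples=2000, seed=0):
    """Estimate the standard error of `estimator` by resampling with replacement."""
    rng = random.Random(seed)
    n = len(sample)
    estimates = []
    for _ in range(n_resamples):
        resample = [sample[rng.randrange(n)] for _ in range(n)]
        estimates.append(estimator(resample))
    return statistics.stdev(estimates)

# A small noisy sample -- the regime where the bootstrap is most useful.
data = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.0, 2.1, 1.9]
se_mean = bootstrap_std_error(data, statistics.mean)

# For the sample mean there is a closed form to compare against: s / sqrt(n).
analytic = statistics.stdev(data) / len(data) ** 0.5
```

For the sample mean the bootstrap merely reproduces the textbook formula; its value is that the same loop works unchanged for estimators (medians, detector statistics) with no tractable analytic standard error.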
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
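As a minimal illustration of the ANOVA idea applied to feature detection (a sketch with assumed pixel values, not an algorithm from the book): compare the intensities in two image windows with a one-way F statistic. A large F indicates the windows straddle an intensity jump such as an edge; F near 1 suggests a flat region:

```python
import statistics

def one_way_anova_F(groups):
    """One-way ANOVA F statistic: between-group variance over within-group variance."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical pixel windows straddling an edge: the means differ strongly.
left = [10, 11, 9, 10, 12, 10]
right = [30, 29, 31, 30, 28, 31]
F_edge = one_way_anova_F([left, right])

# Two windows from a flat region: only noise separates the means.
flat_a = [10, 11, 9, 10, 12, 10]
flat_b = [11, 10, 10, 9, 11, 12]
F_flat = one_way_anova_F([flat_a, flat_b])
```

Thresholding F against the appropriate F-distribution quantile turns this into a formal hypothesis test for the presence of a feature.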
The 9th ISMM conference covered a very diverse collection of papers, bound together by the central themes of mathematical morphology, namely, the treatment of images in terms of set and lattice theory. Notwithstanding this central theme, this ISMM showed increasing interaction with other fields of image and signal processing, and several hybrid methods were presented, which combine the strengths of traditional morphological methods with those of, for example, linear filtering. This trend is particularly strong in the emerging field of adaptive morphological filtering, where the local shape of structuring elements is determined by non-morphological techniques. This builds on previous developments of PDE-based methods in morphology and amoebas. In segmentation we see similar advancements, in the development of morphological active contours. Even within morphology itself, diversification is great, and many new areas of research are being opened up. In particular, morphology of graph-based and complex-based image representations is being explored. Likewise, in the well-established area of connected filtering we find new theory and new algorithms, but also expansion into the direction of hyperconnected filters. New advances in morphological machine learning, multi-valued and fuzzy morphology are also presented. Notwithstanding the often highly theoretical reputation of mathematical morphology, practitioners in this field have always had an eye for the practical.
Applied Signal Processing: A MATLAB-Based Proof of Concept benefits readers by including the teaching background of experts in various applied signal processing fields and presenting them in a project-oriented framework. Unlike many other MATLAB-based textbooks which only use MATLAB to illustrate theoretical aspects, this book provides fully commented MATLAB code for working proofs-of-concept. The MATLAB code provided on the accompanying online files is the very heart of the material. In addition each chapter offers a functional introduction to the theory required to understand the code as well as a formatted presentation of the contents and outputs of the MATLAB code. Each chapter exposes how digital signal processing is applied for solving a real engineering problem used in a consumer product. The chapters are organized with a description of the problem in its applicative context and a functional review of the theory related to its solution appearing first. Equations are only used for a precise description of the problem and its final solutions. Then a step-by-step MATLAB-based proof of concept, with full code, graphs, and comments follows. The solutions are simple enough for readers with general signal processing background to understand and they use state-of-the-art signal processing principles. Applied Signal Processing: A MATLAB-Based Proof of Concept is an ideal companion for most signal processing course books. It can be used for preparing student labs and projects.
Now available in a three-volume set, this updated and expanded edition of the bestselling The Digital Signal Processing Handbook continues to provide the engineering community with authoritative coverage of the fundamental and specialized aspects of information-bearing signals in digital form. Encompassing essential background material, technical details, standards, and software, the second edition reflects cutting-edge information on signal processing algorithms and protocols related to speech, audio, multimedia, and video processing technology associated with standards ranging from WiMax to MP3 audio, low-power/high-performance DSPs, color image processing, and chips on video. Drawing on the experience of leading engineers, researchers, and scholars, the three-volume set contains 29 new chapters that address multimedia and Internet technologies, tomography, radar systems, architecture, standards, and future applications in speech, acoustics, video, radar, and telecommunications. This volume, Wireless, Networking, Radar, Sensor Array Processing, and Nonlinear Signal Processing, provides complete coverage of the foundations of signal processing related to wireless, radar, space-time coding, and mobile communications, together with associated applications to networking, storage, and communications.
Bayesian optimization is a methodology for optimizing expensive objective functions that has proven success in the sciences, engineering, and beyond. This timely text provides a self-contained and comprehensive introduction to the subject, starting from scratch and carefully developing all the key ideas along the way. This bottom-up approach illuminates unifying themes in the design of Bayesian optimization algorithms and builds a solid theoretical foundation for approaching novel situations. The core of the book is divided into three main parts, covering theoretical and practical aspects of Gaussian process modeling, the Bayesian approach to sequential decision making, and the realization and computation of practical and effective optimization policies. Following this foundational material, the book provides an overview of theoretical convergence results, a survey of notable extensions, a comprehensive history of Bayesian optimization, and an extensive annotated bibliography of applications.
This book deals with the problem of detecting and localizing multiple simultaneously active wideband acoustic sources by applying the notion of wavefield decomposition using circular and spherical microphone arrays. A rigorous derivation of modal array signal processing algorithms for unambiguous source detection and localization, as well as performance evaluations by means of measurements using an actual real-time capable implementation, are discussed.
This self-contained introduction to machine learning, designed from the start with engineers in mind, will equip students with everything they need to start applying machine learning principles and algorithms to real-world engineering problems. With a consistent emphasis on the connections between estimation, detection, information theory, and optimization, it includes: an accessible overview of the relationships between machine learning and signal processing, providing a solid foundation for further study; clear explanations of the differences between state-of-the-art techniques and more classical methods, equipping students with all the understanding they need to make informed technique choices; demonstration of the links between information-theoretical concepts and their practical engineering relevance; reproducible examples using Matlab, enabling hands-on student experimentation. Assuming only a basic understanding of probability and linear algebra, and accompanied by lecture slides and solutions for instructors, this is the ideal introduction to machine learning for engineering students of all disciplines.
With its intuitive yet rigorous approach to machine learning, this text provides students with the fundamental knowledge and practical tools needed to conduct research and build data-driven products. The authors prioritize geometric intuition and algorithmic thinking, and include detail on all the essential mathematical prerequisites, to offer a fresh and accessible way to learn. Practical applications are emphasized, with examples from disciplines including computer vision, natural language processing, economics, neuroscience, recommender systems, physics, and biology. Over 300 color illustrations are included and have been meticulously designed to enable an intuitive grasp of technical concepts, and over 100 in-depth coding exercises (in Python) provide a real understanding of crucial machine learning algorithms. A suite of online resources including sample code, data sets, interactive lecture slides, and a solutions manual are provided online, making this an ideal text both for graduate courses on machine learning and for individual reference and self-study.
Practical emphasis to teach students to use the powerful ideas of adaptive control in real applications. Custom-made Matlab(r) functionality to facilitate the design and construction of self-tuning controllers for different processes and systems. Examples, tutorial exercises and clearly laid-out flowcharts and formulae to make the subject simple to follow for students and to help tutors with class preparation.
Provides easy learning and understanding of DWT from a signal processing point of view * Presents DWT from a digital signal processing point of view, in contrast to the usual mathematical approach, making it highly accessible * Offers a comprehensive coverage of related topics, including convolution and correlation, Fourier transform, FIR filter, orthogonal and biorthogonal filters * Organized systematically, starting from the fundamentals of signal processing to the more advanced topics of DWT and Discrete Wavelet Packet Transform. * Written in a clear and concise manner with abundant examples, figures and detailed explanations * Features a companion website that has several MATLAB programs for the implementation of the DWT with commonly used filters This well-written textbook is an introduction to the theory of discrete wavelet transform (DWT) and its applications in digital signal and image processing. -- Prof. Dr. Manfred Tasche - Institut fur Mathematik, Uni Rostock Full review at https://zbmath.org/?q=an:06492561
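To make the DWT idea concrete, here is an illustrative one-level Haar transform (the simplest orthogonal filter pair; this sketch is not tied to the book's companion MATLAB programs): the signal is split into a lowpass "approximation" and a highpass "detail" subband, each downsampled by two, and is reconstructed exactly by the inverse transform:

```python
import math

def haar_dwt(x):
    """One level of the Haar DWT: scaled pairwise sums (lowpass) and
    differences (highpass), each downsampled by two. len(x) must be even."""
    s = math.sqrt(2.0)
    approx = [(x[2 * k] + x[2 * k + 1]) / s for k in range(len(x) // 2)]
    detail = [(x[2 * k] - x[2 * k + 1]) / s for k in range(len(x) // 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse Haar transform: perfect reconstruction from the two subbands."""
    s = math.sqrt(2.0)
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / s)
        x.append((a - d) / s)
    return x

signal = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d = haar_dwt(signal)       # constant pairs produce zero detail coefficients
rec = haar_idwt(a, d)         # reconstructs the original signal exactly
```

Recursing the same split on the approximation subband yields the multi-level DWT, and swapping the Haar pair for longer orthogonal or biorthogonal filters (with proper boundary handling) gives the general filter-bank form covered in the book.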
Teaches students about classical and nonclassical adaptive systems within one pair of covers. Helps tutors with time-saving course plans, ready-made practical assignments and examination guidance. The recently developed "practical sub-space adaptive filter" allows the reader to combine any set of classical and/or non-classical adaptive systems to form a powerful technology for solving complex nonlinear problems.
This book deals with various theoretical and practical methods for real-time automatic signal processing in local (and regional) seismic networks and associated software developments, including extraction of small seismic signals from noisy observations by piecewise modeling and self-organizing state space modeling, determination of the arrival time of S waves by locally multivariate stationary AR modeling, automatic interpretation of seismic signals by combining cumulative sum and simulated annealing (CUSUM-SA), AR filtering for local and teleseismic events, the high-sensitivity seismic network currently running in Japan (Hi-net), a PC-based computer package for automatic detection and location of earthquakes, real-time automatic seismic data processing in the seismic network running in eastern Sicily (Italy), and the SIL (South Iceland Lowland) seismological data acquisition system and routine analysis in Iceland and Sweden.
Within the healthcare domain, big data is defined as any "high volume, high diversity biological, clinical, environmental, and lifestyle information collected from single individuals to large cohorts, in relation to their health and wellness status, at one or several time points." Such data is crucial because within it lies vast amounts of invaluable information that could potentially change a patient's life, opening doors to alternate therapies, drugs, and diagnostic tools. Signal Processing and Machine Learning for Biomedical Big Data thus discusses modalities; the numerous ways in which this data is captured via sensors; and various sample rates and dimensionalities. Capturing, analyzing, storing, and visualizing such massive data has required new shifts in signal processing paradigms and new ways of combining signal processing with machine learning tools. This book covers several of these aspects in two ways: firstly, through theoretical signal processing chapters where tools aimed at big data (be it biomedical or otherwise) are described; and, secondly, through application-driven chapters focusing on existing applications of signal processing and machine learning for big biomedical data. This text is aimed at the curious researcher working in the field, as well as undergraduate and graduate students eager to learn how signal processing can help with big data analysis. It is the hope of Drs. Sejdic and Falk that this book will bring together signal processing and machine learning researchers to unlock existing bottlenecks within the healthcare field, thereby improving patient quality-of-life. Provides an overview of recent state-of-the-art signal processing and machine learning algorithms for biomedical big data, including applications in the neuroimaging, cardiac, retinal, genomic, sleep, patient outcome prediction, critical care, and rehabilitation domains.
Provides contributed chapters from world leaders in the fields of big data and signal processing, covering topics such as data quality, data compression, statistical and graph signal processing techniques, and deep learning and their applications within the biomedical sphere. This book's material covers how expert domain knowledge can be used to advance signal processing and machine learning for biomedical big data applications.
Multimodal Interfaces represents an emerging interdisciplinary research direction and has become one of the frontiers in Computer Science. Multimodal interfaces aim at efficient, convenient and natural interaction and communication between computers (in their broadest sense) and human users. They will ultimately enable users to interact with computers using their everyday skills. These proceedings include the papers accepted for presentation at the Third International Conference on Multimodal Interfaces (ICMI 2000) held in Beijing, China on 14-16 October 2000. The papers were selected from 172 contributions submitted worldwide. Each paper was allocated for review to three members of the Program Committee, which consisted of more than 40 leading researchers in the field. Final decisions on 38 oral papers and 48 poster papers were made based on the reviewers' comments and the desire for a balance of topics. The decision to have a single-track conference led to a competitive selection process and it is very likely that some good submissions are not included in this volume. The papers collected here cover a wide range of topics such as affective and perceptual computing, interfaces for wearable and mobile computing, gestures and sign languages, face and facial expression analysis, multilingual interfaces, virtual and augmented reality, speech and handwriting, multimodal integration and application systems. They represent some of the latest progress in multimodal interfaces research.
This volume and the accompanying software describe and demonstrate all the basics and fundamentals of modern computer graphics. After an overview of computer graphics, the following chapters--complete with discussions and exercises--are devoted to modeling of 3D objects with polygons and wireframes; animation of modeled objects; and rendering of photorealistic images from the modeled objects, including lighting, shading, and texture mapping. After modeling, animating, and rendering, coverage details how to add special effects such as warping, bending, or morphing, as described in the chapter on image manipulation and postproduction. The book concludes with a look into the future of computer graphics and an overview of computer graphics in various fields. The CD-ROM software includes a complete 3D graphics application with a user-friendly graphical interface, which can be used to perform all the exercises in the book.
This book concerns modern methods in scientific computing and linear algebra, relevant to image and signal processing. For these applications, it is important to consider ingredients such as: (1) sophisticated mathematical models of the problems, including a priori knowledge; (2) rigorous mathematical theories to understand the difficulties of solving problems which are ill-posed; and (3) fast algorithms for either real-time or data-massive computations. Such are the topics brought into focus by these proceedings of the Workshop on Scientific Computing (held in Hong Kong on March 10-12, 1997, the sixth in a series of such Workshops held in Hong Kong since 1990), where the major themes were numerical linear algebra, signal processing, and image processing.