The computational modelling of deformations has been actively studied for the last thirty years. This is mainly due to its large range of applications, which include computer animation, medical imaging, shape estimation, deformation of the face and other parts of the human body, and object tracking. In addition, these advances have been supported by the evolution of computer processing capabilities, enabling increasingly sophisticated realism. This book encompasses relevant works of expert researchers in the field of deformation models and their applications. The book is divided into two main parts. The first part presents recent object deformation techniques from the point of view of computer graphics and computer animation. The second part presents six works that study deformations from a computer vision point of view with a common characteristic: the deformations arise in real-world applications. The primary audience for this work is researchers from multidisciplinary fields such as computer graphics, computer vision, computer imaging, biomedicine, bioengineering, mathematics, physics, medical imaging and medicine.
This book describes techniques for realizing wide-bandwidth (125 MHz) oversampled analog-to-digital converters (ADCs) in nanometer CMOS processes. The authors offer a clear and complete picture of system-level challenges and practical design solutions in high-speed delta-sigma modulators. Readers will be enabled to implement ADCs as continuous-time delta-sigma (CT ΔΣ) modulators, which offer simple resistive inputs that do not require power-hungry input buffers, as well as inherent anti-aliasing, which simplifies system integration. The authors focus on the design of high-speed, wide-bandwidth ΔΣ modulators that make a step in bandwidth range which was previously only possible with Nyquist converters. More specifically, this book describes the stability, power efficiency and linearity limits of ΔΣ modulators, aiming at a GHz sampling frequency.
Mathematical methods play a significant role in the rapidly growing field of nonlinear optical materials. This volume discusses a number of successful or promising contributions. The overall theme of this volume is twofold: (1) the challenges faced in computing and optimizing nonlinear optical material properties; and (2) the exploitation of these properties in important areas of application. These include the design of optical amplifiers and lasers, as well as novel optical switches. Research topics in this volume include how to exploit the magneto-optic effect, how to work with the nonlinear optical response of materials, how to predict laser-induced breakdown in efficient optical devices, and how to handle electron cloud distortion in femtosecond processes.
This accessible and exhaustive book will help to improve modeling of attention and to inspire innovations in industry. It introduces the study of attention and focuses on attention modeling, addressing such themes as saliency models, signal detection and different types of signals, as well as real-life applications. The book is truly multi-disciplinary, collating work from psychology, neuroscience, engineering and computer science, amongst other disciplines. What is attention? We all pay attention every single moment of our lives. Attention is how the brain selects and prioritizes information. The study of attention has become incredibly complex and divided: this timely volume assists the reader by drawing together work on the computational aspects of attention from across the disciplines. Those working in the field as engineers will benefit from this book's introduction to the psychological and biological approaches to attention, and neuroscientists can learn about engineering work on attention. The work features practical reviews and chapters that are quick and easy to read, as well as chapters which present deeper, more complex knowledge. Everyone whose work relates to human perception, or to image, audio and video processing, will find something of value in this book, from students to researchers and those in industry.
With a preface by Ton Kalker. Informed Watermarking is an essential tool for both academic and professional researchers working in the areas of multimedia security, information embedding, and communication. Theory and practice are linked, particularly in the area of multi-user communication. From the Preface: Watermarking has become a more mature discipline with proper foundation in both signal processing and information theory. We can truly say that we are in the era of "second generation" watermarking. This book is the first to address watermarking problems in terms of second-generation insights. It provides a complete overview of the most important results on capacity and security. The Costa scheme, and in particular a simpler version of it, the Scalar Costa scheme, is studied in great detail. An important result of this book is that it is possible to approach the Shannon limit within a few decibels in a practical system. These results are verified on real-world data, not only the classical category of images, but also on chemical structure sets. Inspired by the work of Moulin and O'Sullivan, this book also addresses security aspects by studying AWGN attacks in terms of game theory. "The authors of Informed Watermarking give a well-written exposé of how watermarking came of age, where we are now, and what to expect in the future. It is my expectation that this book will be a standard reference on second-generation watermarking for the years to come." Ton Kalker, Technische Universiteit Eindhoven
The fourth volume in a series, this work details recent developments in the field of optical fibre sensors. It describes the impact which fibre sensors are having in such areas as chemical and environmental monitoring, structural instrumentation and "smart" structures, process control and engineering, and specialist industrial measurements.
Of all the recent discoveries in biotechnology, that of biosensors is one which has seen an exponential expansion over the last few years. This evolution corresponds with the increasing need for measuring devices that can follow continuously changing biological processes. Biosensors can meet this need provided that their signals include all the information necessary for an understanding of the process, especially concerning the nature and concentration of the species present in the sample medium. It is well known that sensors form the basis of all instrumental analysis systems, but they also represent the limiting factors of such systems. In this book, we restrict ourselves to the description and study of sensors, leaving aside the different aspects of signal and data treatment. We believe, however, that it is important to stress the multifaceted character of biosensors, and the applications and economic factors which follow. Biosensor construction is essentially based on the immobilization of a bioreceptor on the corresponding transducer. The reader will find that there are a large variety of techniques for immobilizing enzymes, cofactors and mediators, and even microorganisms, immunoagents, tissues, and organelles. A large part of this book is devoted to enzyme sensors, which is hardly surprising considering that they are now commercially available. Other types of biosensors are discussed, with regard to both the principles of their operation and their construction.
This book brings together experts in the field of astronomical photometry to discuss how their subfields provide the precision and accuracy in astronomical energy flux measurements that are needed to permit tests of astrophysical theories. Differential photometers and photometry, improvements in infrared precision, the improvements in precision and accuracy of CCD photometry, the absolute calibration of flux, the development of the Johnson UBVRI photometric system and other passband systems to measure and precisely classify specific types of stars and astrophysical quantities, and the current capabilities of spectrophotometry and polarimetry to provide precise and accurate data, are all discussed in this volume. The discussion of differential or two-star photometers includes those developed for planetary as well as stellar photometry and ranges from the Princeton polarizing photometer through the pioneering work of Walraven to the differential photometers designed to measure the ashen light of Venus and to counter the effects of aurorae at high-latitude sites; the last to be discussed is the Rapid Alternate Detection System (RADS) developed at the University of Calgary in the 1980s.
When the 50th anniversary of the birth of Information Theory was celebrated at the 1998 IEEE International Symposium on Information Theory in Boston, there was a great deal of reflection on the year 1993 as a critical year. As the years pass and more perspective is gained, it is a fairly safe bet that we will view 1993 as the year when the "early years" of error control coding came to an end. This was the year in which Berrou, Glavieux and Thitimajshima presented "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes" at the International Conference on Communications in Geneva. In their presentation, Berrou et al. claimed that a combination of parallel concatenation and iterative decoding can provide reliable communications at a signal-to-noise ratio that is within a few tenths of a dB of the Shannon limit. Nearly fifty years of striving to achieve the promise of Shannon's noisy channel coding theorem had come to an end. The implications of this result were immediately apparent to all: coding gains on the order of 10 dB could be used to dramatically extend the range of communication receivers, increase data rates and services, or substantially reduce transmitter power levels. The 1993 ICC paper set in motion several research efforts that have permanently changed the way we look at error control coding.
The book describes a system for visual surveillance using intelligent cameras. The cameras use robust techniques for detecting and tracking moving objects. The objects captured in real time are then stored in a database, and the stored tracking data is analysed to study the camera view, detect and track objects, and study object behavior. These models provide a robust framework for coordinating the tracking of objects between overlapping and non-overlapping cameras, and for recording the activity of objects detected by the system.
This book, unique in the literature, provides readers with the mathematical background needed to design many of the optical combinations that are used in astronomical telescopes and cameras. The results presented in the work were obtained by using a different approach to third-order aberration theory as well as the extensive use of the software package Mathematica (R). Replete with worked-out examples and exercises, Geometric Optics is an excellent reference for advanced graduate students, researchers, and practitioners in applied mathematics, engineering, astronomy, and astronomical optics. The work may be used as a supplementary textbook for graduate-level courses in astronomical optics, optical design, optical engineering, programming with Mathematica, or geometric optics.
This volume focuses on Time-Correlated Single Photon Counting (TCSPC), a powerful tool allowing luminescence lifetime measurements to be made with high temporal resolution, even on single molecules. Combining spectrum and lifetime provides a "fingerprint" for identifying such molecules in the presence of a background. Used together with confocal detection, this permits single-molecule spectroscopy and microscopy in addition to ensemble measurements, opening up an enormous range of hot life science applications such as fluorescence lifetime imaging (FLIM) and measurement of Foerster Resonant Energy Transfer (FRET) for the investigation of protein folding and interaction. Several technology-related chapters present both the basics and current state-of-the-art, in particular of TCSPC electronics, photon detectors and lasers. The remaining chapters cover a broad range of applications and methodologies for experiments and data analysis, including the life sciences, defect centers in diamonds, super-resolution microscopy, and optical tomography. The chapters detailing new options arising from the combination of classic TCSPC and fluorescence lifetime with methods based on intensity fluctuation represent a particularly unique highlight.
This book is about the interaction of laser radiation with various surfaces at variable parameters of radiation. As a basic principle of classification, we chose the energy or intensity level of the interaction of laser radiation with the surfaces. These two characteristics of laser radiation are the most important parameters defining the entire spectrum of processes occurring on surfaces during interaction with electromagnetic waves. This is the first book covering the whole spectrum of laser-surface interactions distinguished by the ranges of laser intensity used. It covers the surface response from extremely weak laser intensities (~1 W cm⁻²) up to relativistic intensities (~10²⁰ W cm⁻² and higher). The book provides basic information about lasers and acquaints the reader with both common applications of laser-surface interactions (laser printers, scanners, barcode readers, discs, material processing, military, holography, medicine, etc.) and unusual uses of the processes on surfaces under the action of lasers (art conservation, rangefinders and velocimeters, space and earth exploration, surface engineering and ablation, and others). The scientific applications of laser-surface interactions (surface optical nonlinearities, surface-enhanced Raman spectroscopy, surface nanostructuring, nanoripple and cluster formation, X-ray lasers and harmonic generation from surfaces) are discussed from the point of view of the close relations between the properties of surface and matter, which is a cornerstone of most studies of materials. The novelty of the approach developed in Laser - Surface Interactions lies in the interconnection of scientific studies with numerous applications of laser-surface interactions, separated into different chapters by the ranges of laser intensities. We present the most recent achievements in this field.
The book provides valuable information for readers at different levels of preparation in laser-related topics, from newcomers to students, engineers, researchers, professionals and academics.
Super-Resolution Imaging serves as an essential reference for both academicians and practicing engineers. It can be used both as a text for advanced courses in imaging and as a desk reference for those working in multimedia, electrical engineering, computer science, and mathematics. The first book to cover the new research area of super-resolution imaging, this text includes work on the following groundbreaking topics: Image zooming based on wavelets and generalized interpolation; Super-resolution from sub-pixel shifts; Use of blur as a cue; Use of warping in super-resolution; Resolution enhancement using multiple apertures; Super-resolution from motion data; Super-resolution from compressed video; Limits in super-resolution imaging. Written by the leading experts in the field, Super-Resolution Imaging presents a comprehensive analysis of current technology, along with new research findings and directions for future work.
The work introduces the fundamentals concerning the measure of discrete information, the modeling of discrete sources with and without memory, as well as of channels and coding. The understanding of the theoretical matter is supported by many examples. One particular emphasis is put on the explanation of genomic coding. Many examples throughout the book are chosen from this particular area, and several parts of the book are devoted to this exciting application of coding.
This book provides an introduction to Swarm Robotics, which is the application of methods from swarm intelligence to robotics. It goes on to present methods that allow readers to understand how to design large-scale robot systems by going through many example scenarios on topics such as aggregation, coordinated motion (flocking), task allocation, self-assembly, collective construction, and environmental monitoring. The author explains the methodology behind building multiple, simple robots and how the complexity emerges from the multiple interactions between these robots such that they are able to solve difficult tasks. The book can be used as a short textbook for specialized courses or as an introduction to Swarm Robotics for graduate students, researchers, and professionals who want a concise introduction to the field.
This book presents all aspects of situational awareness using acoustic signals. It starts by presenting the science behind the understanding and interpretation of sound signals. The book then goes on to provide various signal processing techniques used in acoustics to find the direction of a sound source, localize gunfire, track vehicles and detect people. The necessary mathematical background and various classification and fusion techniques are presented. The book gathers in one place the majority of what one needs to process acoustic signals for all aspects of situational awareness. The book also presents array theory, which is pivotal in finding the direction of arrival of acoustic signals. In addition, the book presents techniques to fuse the information from multiple homogeneous/heterogeneous sensors for better detection. MATLAB code is provided for the majority of the real applications; this is a valuable resource not only for understanding the theory, as readers can also use the code as a springboard to develop their own application software.
"Blind Signal Processing: Theory and Practice" not only introduces related fundamental mathematics, but also reflects the numerous advances in the field, such as probability density estimation-based processing algorithms, underdetermined models, complex value methods, uncertainty of order in the separation of convolutive mixtures in frequency domains, and feature extraction using Independent Component Analysis (ICA). At the end of the book, results from a study conducted at Shanghai Jiao Tong University in the areas of speech signal processing, underwater signals, image feature extraction, data compression, and the like are discussed. This book will be of particular interest to advanced undergraduate students, graduate students, university instructors and research scientists in related disciplines. Xizhi Shi is a Professor at Shanghai Jiao Tong University.
This book provides comprehensive, state-of-the-art coverage of photorefractive organic compounds, a class of materials with the ability to change their index of refraction upon illumination. The change is both dynamic and reversible: dynamic because no external processing is required for the index modulation to be revealed, and reversible because the index change can be modified or suppressed by altering the illumination pattern. These properties make photorefractive materials very attractive candidates for many applications such as image restoration, correlation, beam conjugation, non-destructive testing, data storage, imaging through scattering media, holographic imaging and display. The field of photorefractive organic materials is also closely related to organic photovoltaics and organic light-emitting diodes (OLEDs), which makes new discoveries in one field applicable to the others.
This book gives a practical overview of Fractional Calculus as it relates to Signal Processing.
Over the last decade, significant progress has been made in 3D imaging research. As a result, 3D imaging methods and techniques are being employed for various applications, including 3D television, intelligent robotics, medical imaging, and stereovision. Depth Map and 3D Imaging Applications: Algorithms and Technologies presents various 3D algorithms developed in recent years and investigates the application of 3D methods in various domains. Containing five sections, this book offers perspectives on 3D imaging algorithms, 3D shape recovery, stereoscopic vision and autostereoscopic vision, 3D vision for robotic applications, and 3D imaging applications. This book is an important resource for professionals, scientists, researchers, academics, and software engineers in image/video processing and computer vision.
Compression and Coding Algorithms describes in detail the coding mechanisms that are available for use in data compression systems. The well-known Huffman coding technique is one mechanism, but many others have been developed over the past few decades, and this book describes, explains and assesses them. People undertaking research or software development in the areas of compression and coding algorithms will find this book an indispensable reference. In particular, the careful and detailed description of algorithms and their implementation, plus accompanying pseudo-code that can be readily implemented on a computer, make this book a definitive reference in an area currently without one.
This edited monograph presents the collected interdisciplinary research results of the priority program "Information- and Communication Theory in Molecular Biology (InKoMBio, SPP 1395)", funded by the German Research Foundation (DFG) from 2010 until 2016. The topical spectrum is very broad and comprises, but is not limited to, aspects such as microRNA as part of cell communication, information flow in mammalian signal transduction pathways, cell-cell communication, semiotic structures in biological systems, as well as the application of methods from information theory in protein interaction analysis. The target audience primarily comprises research experts in the field of biological signal processing, but the book is also beneficial for graduate students.
While books on the medical applications of x-ray imaging exist, there is not one currently available that focuses on industrial applications. Full of color images that show clear spectrometry and rich with applications, X-Ray Imaging fills the need for a comprehensive work on modern industrial x-ray imaging. It reviews the fundamental science of x-ray imaging and addresses equipment and system configuration. Useful to a broad range of radiation imaging practitioners, the book looks at the rapid development and deployment of digital x-ray imaging systems.
This book provides a compilation of important optical techniques applied to experiments in heat and mass transfer, multiphase flow and combustion. The emphasis of this book is on the application of these techniques to various engineering problems. The contributions aim to provide practicing engineers, both in industry and research, with the current state of the science in the application of advanced optical measurements. The book is written by selected specialists, leading experts in this field, who present new information on the possibilities of these techniques and stimulate new ideas for their application.