Welcome to Loot.co.za!
An exciting new development has taken place in the digital era that has captured the imagination and talent of researchers around the globe - wavelet image compression. This technology has deep roots in theories of vision and promises performance improvements over all other compression methods, such as those based on Fourier transforms, vector quantizers, fractals, neural nets, and many others. It is this revolutionary new technology that is presented in Wavelet Image and Video Compression, in a form that is accessible to the largest possible audience. The book is divided into four parts. Part I, Background Material, introduces the basic mathematical structures that underlie image compression algorithms, providing an easy introduction to the mathematical concepts that are prerequisites for the remainder of the book. It explains such topics as change of bases, scalar and vector quantization, bit allocation and rate-distortion theory, entropy coding, the discrete cosine transform, wavelet filters, and other related topics. Part II, Still Image Coding, presents a spectrum of wavelet still image coding techniques. Part III, Special Topics in Still Image Coding, provides a variety of example coding schemes with a special flavor in either approach or application domain. Part IV, Video Coding, examines wavelet and pyramidal coding techniques for video data. Wavelet Image and Video Compression serves as an excellent reference and may be used as a text for advanced courses covering the subject.
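The compression idea sketched in the blurb above can be illustrated with a toy example (ours, not taken from the book): a one-level 1-D Haar transform, the simplest wavelet change of basis, splits a signal into coarse averages and fine details. Smooth signals concentrate their energy in the averages, so the small detail coefficients can be quantized aggressively or discarded, which is the heart of wavelet compression.

```python
# Minimal sketch of a one-level orthonormal Haar wavelet transform.
# Not from the book: a toy illustration of the change-of-basis idea.

def haar_forward(signal):
    """Split a signal of even length into coarse averages and fine details."""
    s = 2 ** 0.5
    averages = [(a + b) / s for a, b in zip(signal[0::2], signal[1::2])]
    details = [(a - b) / s for a, b in zip(signal[0::2], signal[1::2])]
    return averages, details

def haar_inverse(averages, details):
    """Invert one level of the Haar transform (perfect reconstruction)."""
    s = 2 ** 0.5
    out = []
    for avg, det in zip(averages, details):
        out.append((avg + det) / s)
        out.append((avg - det) / s)
    return out

x = [4.0, 4.0, 5.0, 5.0, 6.0, 6.0, 7.0, 7.0]  # a smooth toy signal
avg, det = haar_forward(x)
# Smooth input: the detail coefficients are (near) zero, hence compressible.
assert all(abs(d) < 1e-9 for d in det)
# The transform is invertible, so nothing is lost before quantization.
assert all(abs(a - b) < 1e-9 for a, b in zip(haar_inverse(avg, det), x))
```

A real codec would apply this recursively in two dimensions and then quantize and entropy-code the coefficients, as the book's Parts I and II describe.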
With a preface by Ton Kalker. Informed Watermarking is an essential tool for both academic and professional researchers working in the areas of multimedia security, information embedding, and communication. Theory and practice are linked, particularly in the area of multi-user communication. From the Preface: Watermarking has become a more mature discipline with a proper foundation in both signal processing and information theory. We can truly say that we are in the era of "second generation" watermarking. This book is the first to address watermarking problems in terms of second-generation insights. It provides a complete overview of the most important results on capacity and security. The Costa scheme, and in particular a simpler version of it, the Scalar Costa scheme, is studied in great detail. An important result of this book is that it is possible to approach the Shannon limit within a few decibels in a practical system. These results are verified on real-world data, not only the classical category of images but also chemical structure sets. Inspired by the work of Moulin and O'Sullivan, this book also addresses security aspects by studying AWGN attacks in terms of game theory. "The authors of Informed Watermarking give a well-written exposé of how watermarking came of age, where we are now, and what to expect in the future. It is my expectation that this book will be a standard reference on second-generation watermarking for the years to come." Ton Kalker, Technische Universiteit Eindhoven
This book will bring together experts in the field of astronomical photometry to discuss how their subfields provide the precision and accuracy in astronomical energy flux measurements that are needed to permit tests of astrophysical theories. Differential photometers and photometry, improvements in infrared precision, the improvements in precision and accuracy of CCD photometry, the absolute calibration of flux, the development of the Johnson UBVRI photometric system and other passband systems to measure and precisely classify specific types of stars and astrophysical quantities, and the current capabilities of spectrophotometry and polarimetry to provide precise and accurate data will all be discussed in this volume. The discussion of differential or two-star photometers will include those developed for planetary as well as stellar photometry and will range from the Princeton polarizing photometer through the pioneering work of Walraven to the differential photometers designed to measure the ashen light of Venus and to counter the effects of aurorae at high latitude sites; the last to be discussed will be the Rapid Alternate Detection System (RADS) developed at the University of Calgary in the 1980s.
The computational modelling of deformations has been actively studied for the last thirty years. This is mainly due to its large range of applications, which include computer animation, medical imaging, shape estimation, deformation of the face and other parts of the human body, and object tracking. In addition, these advances have been supported by the evolution of computer processing capabilities, enabling realism in a more sophisticated way. This book encompasses relevant works of expert researchers in the field of deformation models and their applications. The book is divided into two main parts. The first part presents recent object deformation techniques from the point of view of computer graphics and computer animation. The second part presents six works that study deformations from a computer vision point of view with a common characteristic: the deformations are applied in real-world applications. The primary audience for this work is researchers from multidisciplinary fields such as those related to Computer Graphics, Computer Vision, Computer Imaging, Biomedicine, Bioengineering, Mathematics, Physics, Medical Imaging and Medicine.
This book describes a unique approach to smart receiver system design. It starts with the analysis of a very basic, single-path receiver structure, then, using similar methods, extends the analysis to a more complicated multi-path receiver. Within the multi-path structure, two different types of phased-array architectures are discussed: analog beam-forming and digital beam-forming. The pros and cons are studied, and the gaps are identified. Whereas previous books in this area focus mainly on phased-array circuit implementations, this book fills a gap by providing a system-level approach and introduces new methods for developing smart systems.
The book describes a system for visual surveillance using intelligent cameras. The cameras use robust techniques for detecting and tracking moving objects, and the real-time captures of the objects are stored in a database. The tracking data stored in the database is analysed to study the camera view, detect and track objects, and study object behavior. This set of models provides a robust framework for coordinating the tracking of objects between overlapping and non-overlapping cameras and for recording the activity of objects detected by the system.
This book, unique in the literature, provides readers with the mathematical background needed to design many of the optical combinations that are used in astronomical telescopes and cameras. The results presented in the work were obtained by using a different approach to third-order aberration theory as well as the extensive use of the software package Mathematica(R). Replete with worked-out examples and exercises, Geometric Optics is an excellent reference for advanced graduate students, researchers, and practitioners in applied mathematics, engineering, astronomy, and astronomical optics. The work may be used as a supplementary textbook for graduate-level courses in astronomical optics, optical design, optical engineering, programming with Mathematica, or geometric optics.
This volume focuses on Time-Correlated Single Photon Counting (TCSPC), a powerful tool allowing luminescence lifetime measurements to be made with high temporal resolution, even on single molecules. Combining spectrum and lifetime provides a "fingerprint" for identifying such molecules in the presence of a background. Used together with confocal detection, this permits single-molecule spectroscopy and microscopy in addition to ensemble measurements, opening up an enormous range of hot life science applications such as fluorescence lifetime imaging (FLIM) and measurement of Foerster Resonant Energy Transfer (FRET) for the investigation of protein folding and interaction. Several technology-related chapters present both the basics and current state-of-the-art, in particular of TCSPC electronics, photon detectors and lasers. The remaining chapters cover a broad range of applications and methodologies for experiments and data analysis, including the life sciences, defect centers in diamonds, super-resolution microscopy, and optical tomography. The chapters detailing new options arising from the combination of classic TCSPC and fluorescence lifetime with methods based on intensity fluctuation represent a particularly unique highlight.
This book is about the interaction of laser radiation with various surfaces at variable parameters of radiation. As a basic principle of classification we chose the energy or intensity level of the interaction of laser radiation with the surfaces. These two characteristics of laser radiation are the most important parameters defining the entire spectrum of the processes occurring on surfaces during interaction with electromagnetic waves. This is the first book containing the whole spectrum of laser-surface interactions distinguished by the ranges of laser intensity used. It combines the surface response starting from extremely weak laser intensities (~1 W cm⁻²) up to relativistic intensities (~10²⁰ W cm⁻² and higher). The book provides basic information about lasers and acquaints the reader with both common applications of laser-surface interactions (laser-related printers, scanners, barcode readers, discs, material processing, military, holography, medicine, etc.) and unusual uses of the processes on surfaces under the action of lasers (art conservation, rangefinders and velocimeters, space and earth exploration, surface engineering and ablation, and others). The scientific applications of laser-surface interactions (surface optical nonlinearities, surface-enhanced Raman spectroscopy, surface nanostructuring, nanoripple and cluster formation, X-ray lasers and harmonic generation from surfaces) are discussed from the point of view of the close relations between the properties of surface and matter, which is a cornerstone of most studies of materials. The novelty of the approach developed in Laser-Surface Interactions lies in the interconnection of scientific studies with numerous applications of laser-surface interactions, separated into different chapters by the ranges of laser intensities. The most recent achievements in this field are presented.
The book provides valuable information for readers at different levels of preparation in laser-related topics, from complete newcomers to students, engineers, researchers, professionals and academics.
The fourth volume in a series, this work details coverage of recent developments in the field of optical fibre sensors. It describes the impact that fibre sensors are having in areas such as chemical and environmental monitoring, structural instrumentation and "smart" structures, process control and engineering, and specialist industrial measurements.
When the 50th anniversary of the birth of Information Theory was celebrated at the 1998 IEEE International Symposium on Information Theory in Boston, there was a great deal of reflection on the year 1993 as a critical year. As the years pass and more perspective is gained, it is a fairly safe bet that we will view 1993 as the year when the "early years" of error control coding came to an end. This was the year in which Berrou, Glavieux and Thitimajshima presented "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes" at the International Conference on Communications in Geneva. In their presentation, Berrou et al. claimed that a combination of parallel concatenation and iterative decoding can provide reliable communications at a signal to noise ratio that is within a few tenths of a dB of the Shannon limit. Nearly fifty years of striving to achieve the promise of Shannon's noisy channel coding theorem had come to an end. The implications of this result were immediately apparent to all: coding gains on the order of 10 dB could be used to dramatically extend the range of communication receivers, increase data rates and services, or substantially reduce transmitter power levels. The 1993 ICC paper set in motion several research efforts that have permanently changed the way we look at error control coding.
Super-Resolution Imaging serves as an essential reference for both academicians and practicing engineers. It can be used both as a text for advanced courses in imaging and as a desk reference for those working in multimedia, electrical engineering, computer science, and mathematics. The first book to cover the new research area of super-resolution imaging, this text includes work on the following groundbreaking topics: Image zooming based on wavelets and generalized interpolation; Super-resolution from sub-pixel shifts; Use of blur as a cue; Use of warping in super-resolution; Resolution enhancement using multiple apertures; Super-resolution from motion data; Super-resolution from compressed video; Limits in super-resolution imaging. Written by the leading experts in the field, Super-Resolution Imaging presents a comprehensive analysis of current technology, along with new research findings and directions for future work.
This book provides a comprehensive overview of digital signal processing for a multidisciplinary audience. It posits that although the theory involved in digital signal processing stems from electrical, electronics, communication, and control engineering, the topic has uses in other disciplines such as chemical, mechanical, and civil engineering, computer science, and management. The book is written so as to suit a wide-ranging audience: readers should be able to get a grasp of the field, understand the concepts easily, and apply them as needed in their own fields. It covers sampling and reconstruction of signals; infinite impulse response filters; finite impulse response filters; multirate signal processing; statistical signal processing; and applications in multidisciplinary domains. The book takes a functional approach, and all techniques are illustrated using Matlab.
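One of the topics listed above, the finite impulse response (FIR) filter, can be sketched in a few lines. The following is our own minimal illustration (the book's examples are in Matlab; this sketch is in Python): each output sample is a weighted sum of the current and past input samples, here a 4-tap moving average that smooths the input.

```python
# Minimal sketch of a direct-form FIR filter (illustrative, not the
# book's code): y[n] = sum over k of taps[k] * x[n - k].

def fir_filter(x, taps):
    """Apply an FIR filter with the given tap weights to sequence x."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0:  # samples before the signal start are zero
                acc += h * x[n - k]
        y.append(acc)
    return y

taps = [0.25, 0.25, 0.25, 0.25]        # 4-tap moving average
x = [0.0, 0.0, 4.0, 4.0, 4.0, 4.0]     # a step input
y = fir_filter(x, taps)
# Once the filter's memory is full of the constant input, the moving
# average passes the DC level unchanged: 0.25 * (4 + 4 + 4 + 4) = 4.
assert y[-1] == 4.0
```

Because an FIR filter depends only on a finite window of past inputs, it is always stable, which is one reason the book treats FIR and IIR filters as separate chapters.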
This book gives a practical overview of Fractional Calculus as it relates to Signal Processing.
Over the last decade, significant progress has been made in 3D imaging research. As a result, 3D imaging methods and techniques are being employed for various applications, including 3D television, intelligent robotics, medical imaging, and stereovision. Depth Map and 3D Imaging Applications: Algorithms and Technologies presents various 3D algorithms developed in recent years and investigates the application of 3D methods in various domains. Containing five sections, this book offers perspectives on 3D imaging algorithms, 3D shape recovery, stereoscopic vision and autostereoscopic vision, 3D vision for robotic applications, and 3D imaging applications. This book is an important resource for professionals, scientists, researchers, academics, and software engineers in image/video processing and computer vision.
Compression and Coding Algorithms describes in detail the coding mechanisms that are available for use in data compression systems. The well-known Huffman coding technique is one such mechanism, but many others have been developed over the past few decades, and this book describes, explains and assesses them. People undertaking research or software development in the areas of compression and coding algorithms will find this book an indispensable reference. In particular, the careful and detailed description of algorithms and their implementation, plus accompanying pseudo-code that can be readily implemented on a computer, make this book a definitive reference in an area currently without one.
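The Huffman technique mentioned above can be sketched briefly (our own illustration, not the book's pseudo-code): repeatedly merge the two least-frequent symbol groups, prepending a bit to each group's codewords, so that frequent symbols end up with short codes and the result is prefix-free.

```python
# Minimal Huffman-coding sketch (illustrative, not the book's code).
import heapq

def huffman_codes(freqs):
    """Return a prefix-free binary code {symbol: bitstring} from frequencies."""
    # Heap entries: (frequency, tiebreak counter, symbols in this subtree).
    heap = [(f, i, [sym]) for i, (sym, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    codes = {sym: "" for sym in freqs}
    counter = len(heap)
    while len(heap) > 1:
        f0, _, syms0 = heapq.heappop(heap)   # two least-frequent subtrees
        f1, _, syms1 = heapq.heappop(heap)
        for s in syms0:                      # left branch gets a leading 0
            codes[s] = "0" + codes[s]
        for s in syms1:                      # right branch gets a leading 1
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (f0 + f1, counter, syms0 + syms1))
        counter += 1
    return codes

codes = huffman_codes({"a": 5, "b": 2, "c": 1, "d": 1})
# More frequent symbols receive shorter codewords.
assert len(codes["a"]) < len(codes["c"])
# Prefix-free: no codeword is a prefix of another, so decoding is unambiguous.
words = list(codes.values())
assert not any(u != v and v.startswith(u) for u in words for v in words)
```

This sketch only builds the code table; a full coder, as the book details, would also serialize the table and pack the bitstrings into bytes.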
The work introduces the fundamentals concerning the measurement of discrete information, the modeling of discrete sources with and without memory, as well as of channels and coding. The understanding of the theoretical matter is supported by many examples. Particular emphasis is put on the explanation of Genomic Coding: many examples throughout the book are chosen from this area, and several parts of the book are devoted to this exciting application of coding.
This edited monograph presents the collected interdisciplinary research results of the priority program "Information- and Communication Theory in Molecular Biology (InKoMBio, SPP 1395)", funded by the German Research Foundation (DFG) from 2010 until 2016. The topical spectrum is very broad and comprises, but is not limited to, aspects such as microRNA as part of cell communication, information flow in mammalian signal transduction pathways, cell-cell communication, semiotic structures in biological systems, as well as the application of methods from information theory in protein interaction analysis. The target audience primarily comprises research experts in the field of biological signal processing, but the book is also beneficial for graduate students.
This book provides a compilation of important optical techniques applied to experiments in heat and mass transfer, multiphase flow and combustion. The emphasis of the book is on the application of these techniques to various engineering problems. The contributions aim to provide practicing engineers, both in industry and research, with the current state of science in the application of advanced optical measurements. The book is written by selected specialists, leading experts in this field, who present new information on the possibilities of these techniques and stimulate new ideas for their application.
Focusing on the rapidly increasing interaction between biotechnology and advanced fiberoptics/electronics, Biosensors with Fiberoptics emphasizes the three major phases of the developmental process from concept to marketplace: research, development, and applications.
This book presents the many different techniques and methods of fabricating materials on the nanometer scale and, specifically, the utilization of these resources with regard to sensors. The techniques described are studied from an application-oriented perspective, giving the reader a broader view of the types of nanostructured sensors available than books that concentrate on theoretical situations related to specific fabrication techniques.
A uniquely practical book, this monograph is the first to describe basic and applied spectroscopic techniques for the study of physical processes in high-frequency electrodeless discharge lamps. Special attention is given to the construction and optimization of these lamps, a popular source of line spectra and an important tool in ultraprecise optical engineering. Highlights include discussions of high-precision measurements of gas pressures, spectral source lifespan, and more.
This thesis presents an experimental study of the ultrafast molecular dynamics of CO₂⁺ that are induced by a strong, near-infrared, femtosecond laser pulse. In particular, typical strong-field phenomena such as tunneling ionisation, non-sequential double ionisation and photo-induced dissociation are investigated and controlled by employing an experimental technique called impulsive molecular alignment. Here, a first laser pulse fixes the molecule in space, such that the molecular dynamics can be studied as a function of the molecular geometry with a second laser pulse. The experiments are placed within the context of the study and control of ultrafast molecular dynamics, where sub-femtosecond (10⁻¹⁵ seconds) resolution in ever larger molecular systems represents the current frontier of research. The thesis presents the required background in strong-field and molecular physics, femtosecond laser architecture and experimental techniques in a clear and accessible language that does not require any previous knowledge in the field.
This book presents all aspects of situational awareness using acoustic signals. It starts by presenting the science behind the understanding and interpretation of sound signals, then goes on to describe the various signal processing techniques used in acoustics to find the direction of a sound source, localize gunfire, track vehicles, and detect people. The necessary mathematical background and various classification and fusion techniques are presented. The book gathers in one place most of what is needed to process acoustic signals for all aspects of situational awareness. It also presents array theory, which is pivotal in finding the direction of arrival of acoustic signals, as well as techniques to fuse information from multiple homogeneous or heterogeneous sensors for better detection. MATLAB code is provided for the majority of the real applications, a valuable resource for understanding the theory; readers can also use the code as a springboard to develop their own application software.