This book brings together experts in the field of astronomical photometry to discuss how their subfields provide the precision and accuracy in astronomical energy flux measurements that are needed to permit tests of astrophysical theories. Differential photometers and photometry, improvements in infrared precision, the improvements in precision and accuracy of CCD photometry, the absolute calibration of flux, the development of the Johnson UBVRI photometric system and other passband systems to measure and precisely classify specific types of stars and astrophysical quantities, and the current capabilities of spectrophotometry and polarimetry to provide precise and accurate data are all discussed in this volume. The discussion of differential or two-star photometers includes those developed for planetary as well as stellar photometry and ranges from the Princeton polarizing photometer through the pioneering work of Walraven to the differential photometers designed to measure the ashen light of Venus and to counter the effects of aurorae at high-latitude sites; the last to be discussed is the Rapid Alternate Detection System (RADS) developed at the University of Calgary in the 1980s.
The computational modelling of deformations has been actively studied for the last thirty years, mainly due to its large range of applications, which include computer animation, medical imaging, shape estimation, deformation of the face and other parts of the human body, and object tracking. In addition, these advances have been supported by the evolution of computer processing capabilities, enabling realism in a more sophisticated way. This book encompasses relevant works of expert researchers in the field of deformation models and their applications. The book is divided into two main parts. The first part presents recent object deformation techniques from the point of view of computer graphics and computer animation. The second part presents six works that study deformations from a computer vision point of view with a common characteristic: deformations are applied in real-world applications. The primary audience for this work is researchers from multidisciplinary fields such as Computer Graphics, Computer Vision, Computer Imaging, Biomedicine, Bioengineering, Mathematics, Physics, Medical Imaging and Medicine.
This book describes a unique approach to smart receiver system design. It starts with the analysis of a very basic, single-path receiver structure, then, using similar methods, extends the analysis to a more complicated multi-path receiver. Within the multi-path structure, two different types of phased-array architectures are discussed: analog beam-forming and digital beam-forming. The pros and cons are studied, and the gaps are identified. Whereas previous books in this area focus mainly on phased-array circuit implementations, this book fills a gap by providing a system-level approach and introduces new methods for developing smart systems.
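The digital beam-forming mentioned above can be sketched numerically: each array element's output is multiplied by a complex steering weight and summed. The element count, spacing and angles below are invented placeholders for illustration, not values from the book.

```python
import numpy as np

# Array factor of an N-element uniform linear array with digital
# beam-forming: each element output is weighted by a complex phase so the
# main lobe points toward a chosen angle.  All parameters are illustrative.
N = 8                                   # number of elements
d = 0.5                                 # element spacing in wavelengths
steer = np.deg2rad(30)                  # steering direction

n = np.arange(N)
w = np.exp(-2j * np.pi * d * n * np.sin(steer))   # steering weights

def array_gain(theta):
    """Normalized response to a plane wave arriving from angle theta."""
    a = np.exp(2j * np.pi * d * n * np.sin(theta))
    return abs(w @ a) / N
```

Evaluating `array_gain` over a sweep of angles traces the beam pattern; "digital" beam-forming simply means these weights are applied after digitization, one ADC per element, whereas analog beam-forming applies the phase shifts in hardware before a single ADC.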
A guide to the theory and application of methods of projections. With the rise of powerful personal computers, methods of vector space projections have moved rapidly from the realm of theory into widespread use. This book reflects the growing interest in the application of these methods to problem solving in science and engineering. It brings together material previously scattered in disparate papers, book chapters, and articles, and offers a systematic treatment of vector space projections. Written by two leading authorities in the field, this self-contained volume provides a tutorial on projection methods and how to apply them in science and engineering. It details effective problem-solving strategies, and explores key applications in communication and signal processing, neural networks and pattern recognition, and optics and image processing.
This extremely useful reference for practicing engineers, scientists, and educators can also be used for graduate-level study in science, mathematics, and engineering. Portions of the book have been used as material in short courses on applications of vector space projections.
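As a flavor of the projection methods the book treats, here is a minimal sketch of alternating projections onto two convex sets; the sets and starting point are invented for illustration and are not examples from the book.

```python
import numpy as np

# Alternating projections onto two convex sets, the basic engine of
# vector-space projection methods.  Illustrative sets:
# C1 = {x : x[0] + x[1] = 1} (a hyperplane), C2 = {x : x >= 0} (orthant).
a = np.array([1.0, 1.0])

def project_hyperplane(x):
    # Orthogonal projection onto {x : a.x = 1}
    return x - (a @ x - 1.0) / (a @ a) * a

def project_orthant(x):
    # Projection onto the nonnegative orthant
    return np.maximum(x, 0.0)

x = np.array([3.0, -2.0])               # arbitrary starting point
for _ in range(100):
    x = project_orthant(project_hyperplane(x))

# x now lies (numerically) in the intersection of the two sets.
```

When the sets intersect, this iteration converges to a point satisfying both constraints, which is how such methods recover signals or images from partial information.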
When the 50th anniversary of the birth of Information Theory was celebrated at the 1998 IEEE International Symposium on Information Theory in Boston, there was a great deal of reflection on the year 1993 as a critical year. As the years pass and more perspective is gained, it is a fairly safe bet that we will view 1993 as the year when the "early years" of error control coding came to an end. This was the year in which Berrou, Glavieux and Thitimajshima presented "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes" at the International Conference on Communications in Geneva. In their presentation, Berrou et al. claimed that a combination of parallel concatenation and iterative decoding can provide reliable communications at a signal-to-noise ratio that is within a few tenths of a dB of the Shannon limit. Nearly fifty years of striving to achieve the promise of Shannon's noisy channel coding theorem had come to an end. The implications of this result were immediately apparent to all: coding gains on the order of 10 dB could be used to dramatically extend the range of communication receivers, increase data rates and services, or substantially reduce transmitter power levels. The 1993 ICC paper set in motion several research efforts that have permanently changed the way we look at error control coding.
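The Shannon limit referenced above can be made concrete. Under the common convention that a rate-R code yields a spectral efficiency of 2R bit/s/Hz on the band-limited AWGN channel, the minimum Eb/N0 follows directly from Shannon's capacity formula; the sketch below is a generic illustration, not material from the book.

```python
import math

def shannon_limit_db(rate):
    """Minimum Eb/N0 (dB) for reliable communication with a rate-`rate`
    code, assuming a spectral efficiency of eta = 2*rate bit/s/Hz on the
    band-limited AWGN channel (from C = W log2(1 + S/N))."""
    eta = 2 * rate
    return 10 * math.log10((2 ** eta - 1) / eta)

# A rate-1/2 code, as used by the original turbo codes, has its limit at
# 0 dB under this convention; the 1993 result came within a few tenths
# of a dB of the limit.
print(shannon_limit_db(0.5))  # → 0.0
```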
The book describes a system for visual surveillance using intelligent cameras. The cameras use robust techniques for detecting and tracking moving objects, and the real-time captures of the objects are stored in a database. The tracking data stored in the database is analysed to study the camera view, detect and track objects, and study object behavior. This set of models provides a robust framework for coordinating the tracking of objects between overlapping and non-overlapping cameras, and for recording the activity of objects detected by the system.
This book presents the identification, characterization, and modeling of electromagnetic interference in substations for the deployment of wireless sensor networks. In Chapter 3 the authors present the measurement setup used to record sequences of impulsive noise samples in the ISM band of interest. The setup can measure substation impulsive noise, in wide band, with enough samples per time window and enough precision to allow a statistical study of the noise. During the measurement campaign, the authors recorded around 120 noise sequences in different substations and for four ranges of equipment voltage: 25 kV, 230 kV, 315 kV and 735 kV. A characterization process is proposed, by which physical characteristics of partial discharge can be measured in terms of first- and second-order statistics. From the measurement campaign, the authors infer the characteristics of substation impulsive noise as a function of the substation equipment voltage, and provide representative parameters for the four voltage ranges and for several existing impulsive noise models. In Chapters 4 and 5 the authors investigate the modeling of electromagnetic interference caused by partial discharge sources. First, they propose a complete and coherent model that links the physical characteristics of high-voltage installations to the induced radio-interference spectra of partial discharge sources. The goodness-of-fit of the proposed physical model is measured with statistical metrics, which allows one to assess the effectiveness of the authors' approach in terms of first- and second-order statistics. Chapter 6 proposes a model based on a statistical approach: substation impulsive noise is composed of correlated impulses, which require models with memory in order to replicate a similar correlation.
Among different models, the authors configure a Partitioned Markov Chain (PMC) with 19 states (one state for the background noise and 18 states for the impulse); this Markov-Gaussian model is able to generate impulsive noise with correlated impulse samples. The correlation is observable in the impulse duration and the power spectrum of the impulses. The PMC model provides characteristics that are closer to those of substation impulsive noise than other models, in terms of time and frequency response as well as probability density functions (PDFs). Although the PMC reliably represents substation impulsive noise, the model remains complex in terms of parameter estimation due to its large number of Markov states, which can be an obstacle for future wireless system design. In order to simplify the model, the authors decrease the number of states to 7 by assigning one state to the background noise and 6 states to the impulse, and call this model PMC-6. PMC-6 can generate realistic impulses and can be easily implemented in a receiver in order to mitigate substation impulsive noise. Representative parameters are provided in order to replicate substation impulsive noise for different voltage ranges (25-735 kV). In Chapter 7, a generalized radio-noise model for substations is proposed, in which many discharge sources are randomly distributed over space and time according to the Poisson field of interferers approach. This allows for the identification of some interesting statistical properties of moments, cumulants and probability distributions, which can, in turn, be utilized in signal processing algorithms for rapid partial discharge identification, localization, and impulsive noise mitigation in wireless communications in substations.
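A Markov-Gaussian generator of the PMC-6 kind can be sketched as follows. Only the structure (one background state plus a chain of impulse states) follows the description above; the transition probabilities and per-state standard deviations are invented placeholders, not the fitted parameters reported in the book.

```python
import numpy as np

# Sketch of a partitioned Markov chain (PMC-6-style) impulsive noise
# generator: state 0 is background noise, states 1-6 form the impulse.
# All numerical values below are illustrative placeholders.
P = np.zeros((7, 7))
P[0, 0], P[0, 1] = 0.98, 0.02            # background occasionally fires an impulse
for s in range(1, 6):
    P[s, s], P[s, s + 1] = 0.3, 0.7      # impulse progresses through its states
P[6, 6], P[6, 0] = 0.3, 0.7              # last impulse state returns to background
sigma = np.array([0.1, 3.0, 2.5, 2.0, 1.5, 1.0, 0.5])  # per-state std dev

def generate(n, rng):
    """Draw n noise samples; each sample is Gaussian with a variance
    chosen by the hidden Markov state, so impulse samples are correlated
    in time through the state sequence."""
    state, out = 0, np.empty(n)
    for i in range(n):
        out[i] = rng.normal(0.0, sigma[state])
        state = rng.choice(7, p=P[state])
    return out

noise = generate(5000, np.random.default_rng(0))
```

The memory of the chain is what produces impulses of realistic duration rather than isolated outliers, which is the point of using a model with memory here.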
The primary audience for this book is the electrical and power engineering industry, electricity providers and companies who are interested in substation automation systems using wireless communication technologies for smart grid applications. Researchers, engineers and students studying and working in wireless communication will also want to buy this book as a reference.
This volume focuses on Time-Correlated Single Photon Counting (TCSPC), a powerful tool allowing luminescence lifetime measurements to be made with high temporal resolution, even on single molecules. Combining spectrum and lifetime provides a "fingerprint" for identifying such molecules in the presence of a background. Used together with confocal detection, this permits single-molecule spectroscopy and microscopy in addition to ensemble measurements, opening up an enormous range of hot life science applications such as fluorescence lifetime imaging (FLIM) and measurement of Foerster Resonant Energy Transfer (FRET) for the investigation of protein folding and interaction. Several technology-related chapters present both the basics and current state-of-the-art, in particular of TCSPC electronics, photon detectors and lasers. The remaining chapters cover a broad range of applications and methodologies for experiments and data analysis, including the life sciences, defect centers in diamonds, super-resolution microscopy, and optical tomography. The chapters detailing new options arising from the combination of classic TCSPC and fluorescence lifetime with methods based on intensity fluctuation represent a particular highlight.
This book provides a comprehensive overview of digital signal processing for a multi-disciplinary audience. It posits that though the theory involved in digital signal processing stems from electrical, electronics, communication, and control engineering, the topic has uses in other disciplines such as chemical, mechanical and civil engineering, computer science, and management. The book is written in such a way that it is suitable for this wide-ranging audience: readers should be able to get a grasp of the field, understand the concepts easily, and apply them as needed in their own fields. It covers sampling and reconstruction of signals; infinite impulse response filters; finite impulse response filters; multirate signal processing; statistical signal processing; and applications in multidisciplinary domains. The book takes a functional approach, and all techniques are illustrated using Matlab.
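As a taste of the filtering material, here is a minimal finite impulse response example. The book's own illustrations use Matlab; this sketch uses Python, and the test signal and tap count are invented for illustration.

```python
import numpy as np

# A 5-tap moving-average FIR filter applied to a noisy 5 Hz tone -- a
# minimal illustration of finite impulse response filtering.
fs = 1000                              # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(1)
clean = np.sin(2 * np.pi * 5 * t)      # 5 Hz tone
x = clean + 0.3 * rng.standard_normal(t.size)

h = np.ones(5) / 5                     # FIR impulse response (moving average)
y = np.convolve(x, h, mode="same")     # filtered signal

# y follows the tone while the broadband noise variance is reduced
# roughly five-fold by the averaging.
```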
Super-Resolution Imaging serves as an essential reference for both academicians and practicing engineers. It can be used both as a text for advanced courses in imaging and as a desk reference for those working in multimedia, electrical engineering, computer science, and mathematics. The first book to cover the new research area of super-resolution imaging, this text includes work on the following groundbreaking topics: Image zooming based on wavelets and generalized interpolation; Super-resolution from sub-pixel shifts; Use of blur as a cue; Use of warping in super-resolution; Resolution enhancement using multiple apertures; Super-resolution from motion data; Super-resolution from compressed video; Limits in super-resolution imaging. Written by the leading experts in the field, Super-Resolution Imaging presents a comprehensive analysis of current technology, along with new research findings and directions for future work.
This book provides an introduction to Swarm Robotics, which is the application of methods from swarm intelligence to robotics. It goes on to present methods that allow readers to understand how to design large-scale robot systems by going through many example scenarios on topics such as aggregation, coordinated motion (flocking), task allocation, self-assembly, collective construction, and environmental monitoring. The author explains the methodology behind building multiple, simple robots and how the complexity emerges from the multiple interactions between these robots such that they are able to solve difficult tasks. The book can be used as a short textbook for specialized courses or as an introduction to Swarm Robotics for graduate students, researchers, and professionals who want a concise introduction to the field.
This book is about the interaction of laser radiation with various surfaces at variable parameters of radiation. As the basic principle of classification we chose the energy or intensity level of the interaction of laser radiation with the surfaces. These two characteristics of laser radiation are the most important parameters defining the entire spectrum of the processes occurring on surfaces during interaction with electromagnetic waves. This is the first book to contain the whole spectrum of laser-surface interactions, distinguished by the ranges of laser intensity used. It combines the surface response starting from extremely weak laser intensities (~1 W cm⁻²) up to relativistic intensities (~10²⁰ W cm⁻² and higher). The book provides basic information about lasers and acquaints the reader with both common applications of laser-surface interactions (laser-related printers, scanners, barcode readers, discs, material processing, military uses, holography, medicine, etc.) and unusual uses of the processes on surfaces under the action of lasers (art conservation, rangefinders and velocimeters, space and earth exploration, surface engineering and ablation, and others). The scientific applications of laser-surface interactions (surface optical nonlinearities, surface-enhanced Raman spectroscopy, surface nanostructuring, nanoripple and cluster formation, X-ray lasers and harmonic generation from surfaces) are discussed from the point of view of the close relations between the properties of surface and matter, which is a cornerstone of most studies of materials. The novelty of the approach developed in Laser-Surface Interactions lies in the interconnection of scientific studies with numerous applications of laser-surface interactions, separated into different chapters by the ranges of laser intensity. The most recent achievements in this field are presented.
The book provides valuable information for readers with different levels of preparation in laser-related topics, from unprepared readers to students, engineers, researchers, professionals and academics.
The work introduces the fundamentals concerning the measure of discrete information, the modeling of discrete sources with and without memory, as well as of channels and coding. The understanding of the theoretical matter is supported by many examples. Particular emphasis is put on the explanation of genomic coding: many examples throughout the book are chosen from this area, and several parts of the book are devoted to this exciting application of coding.
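The measure of discrete information for a memoryless source is its Shannon entropy, which a few lines can estimate from symbol frequencies. The genomic fragment below is a hypothetical illustration, not one of the book's examples.

```python
import math
from collections import Counter

def entropy(seq):
    """Shannon entropy in bits per symbol of a discrete memoryless
    source, estimated from the symbol frequencies of `seq`."""
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

# A hypothetical genomic fragment: four equiprobable bases give the
# maximum possible entropy of log2(4) = 2 bits per symbol.
print(entropy("ACGT" * 10))  # → 2.0
```

Real genomic sequences have unequal base frequencies and memory between neighboring bases, so their entropy per symbol falls below this 2-bit ceiling, which is what makes them compressible.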
This book provides a compilation of important optical techniques applied to experiments in heat and mass transfer, multiphase flow and combustion. The emphasis of the book is on the application of these techniques to various engineering problems. The contributions aim to provide practicing engineers, both in industry and research, with the recent state of the science in the application of advanced optical measurements. The book is written by selected specialists, leading experts in this field, who present new information on the possibilities of these techniques and stimulate new ideas for their application.
Compression and Coding Algorithms describes in detail the coding mechanisms that are available for use in data compression systems. The well-known Huffman coding technique is one mechanism, but there have been many others developed over the past few decades, and this book describes, explains and assesses them. People undertaking research or software development in the areas of compression and coding algorithms will find this book an indispensable reference. In particular, the careful and detailed description of algorithms and their implementation, plus accompanying pseudo-code that can be readily implemented on computer, make this book a definitive reference in an area currently without one.
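The Huffman technique mentioned above can be sketched in a few lines; this is a generic illustration, not the book's pseudo-code.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for the symbols of `text`; returns a dict
    mapping each symbol to its binary codeword."""
    # Min-heap of (weight, tie-breaker, partial code table) entries.
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in
            enumerate(sorted(Counter(text).items()))]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)          # two lightest subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
        i += 1
    return heap[0][2]

code = huffman_code("abracadabra")
# The frequent 'a' (5 of 11 symbols) gets a 1-bit codeword; the rare
# symbols get 3-bit codewords, so the whole string costs 23 bits.
```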
In recent years there has been an increasing interest in second generation image and video coding techniques. These techniques introduce new concepts from image analysis that greatly improve the performance of the coding schemes for very high compression. This interest has been further emphasized by the future MPEG-4 standard. Second generation image and video coding techniques are the ensemble of approaches proposing new and more efficient image representations than the conventional canonical form. As a consequence, the human visual system becomes a fundamental part of the encoding/decoding chain. More insight to distinguish between first and second generation can be gained if it is noticed that image and video coding is basically carried out in two steps. First, image data are converted into a sequence of messages and, second, code words are assigned to the messages. Methods of the first generation put the emphasis on the second step, whereas methods of the second generation put it on the first step and use available results for the second step. As a result of including the human visual system, the second generation can also be seen as an approach that views the image as composed of different entities called objects. This implies that the image or sequence of images first has to be analyzed and/or segmented in order to find the entities. It is in this context that we have selected three main approaches as second generation video coding techniques in this book: segmentation-based schemes, model-based schemes, and fractal-based schemes. Video Coding: The Second Generation Approach is an important introduction to the new coding techniques for video. As such, all researchers, students and practitioners working in image processing will find this book of interest.
Focusing on the rapidly increasing interaction between biotechnology and advanced fiberoptics/electronics, Biosensors with Fiberoptics emphasizes the three major phases of the developmental process from concept to marketplace: research, development, and applications.
This book presents the many different techniques and methods of fabricating materials on the nanometer scale and, specifically, the utilization of these resources with regard to sensors. The techniques described are studied from an application-oriented perspective, giving the reader a broader view of the types of nanostructured sensors available than books that concentrate on theoretical situations related to specific fabrication techniques.
Interference coatings are an essential part of modern optics. This book is designed to give a concise but complete overview of the field, with contributions written by leading experts in the various areas. Topics include design, materials, film growth, deposition including large area, characterization and monitoring, and mechanical stress. The authors also describe applications in astronomy, microcomponents, DUV/VUV, EUV/X, ultrafast optics, displays, and ultrasensitive fluorescence. Furthermore, laser-resistant coatings and coatings for free-electron lasers and plastic optics are covered. The book concludes with chapters on photonic structures as interference devices and on the brilliant world of natural coatings.
A uniquely practical book, this monograph is the first to describe basic and applied spectroscopic techniques for the study of physical processes in high-frequency, electrodeless discharge lamps. Special attention is given to the construction and optimization of these lamps, a popular source of line spectra and an important tool in ultraprecise optical engineering. Highlights include discussions of high-precision measurements of gas pressures, spectral source lifespan, and more.
This thesis presents an experimental study of the ultrafast molecular dynamics of CO₂⁺ that are induced by a strong, near-infrared, femtosecond laser pulse. In particular, typical strong field phenomena such as tunneling ionisation, nonsequential double ionisation and photo-induced dissociation are investigated and controlled by employing an experimental technique called impulsive molecular alignment. Here, a first laser pulse fixes the molecule in space, such that the molecular dynamics can be studied as a function of the molecular geometry with a second laser pulse. The experiments are placed within the context of the study and control of ultrafast molecular dynamics, where sub-femtosecond (10⁻¹⁵ seconds) resolution in ever larger molecular systems represents the current frontier of research. The thesis presents the required background in strong field and molecular physics, femtosecond laser architecture and experimental techniques in a clear and accessible language that does not require any previous knowledge in the field.
This book gives a practical overview of Fractional Calculus as it relates to Signal Processing.
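One standard entry point to fractional calculus in discrete signal processing is the Grünwald-Letnikov definition, which generalizes the finite difference to non-integer order. The sketch below is a generic illustration of that definition, not material taken from the book.

```python
import numpy as np

def gl_fracdiff(x, alpha):
    """Gruenwald-Letnikov fractional difference of order alpha (unit
    step).  alpha = 1 reduces to the ordinary first difference and
    alpha = 0 to the identity; non-integer alpha interpolates."""
    x = np.asarray(x, dtype=float)
    c = np.empty(x.size)
    c[0] = 1.0
    for k in range(1, x.size):
        c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)  # binomial recursion
    # Causal convolution of the coefficient sequence with the signal.
    return np.array([c[: i + 1][::-1] @ x[: i + 1] for i in range(x.size)])

y = gl_fracdiff([1.0, 2.0, 4.0, 7.0], 0.5)  # half-order difference
```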
This book presents all aspects of situational awareness using acoustic signals. It starts by presenting the science behind the understanding and interpretation of sound signals, then goes on to cover various signal processing techniques used in acoustics to find the direction of a sound source, localize gunfire, track vehicles and detect people. The necessary mathematical background and various classification and fusion techniques are presented. The book contains the majority of what one needs to process acoustic signals for all aspects of situational awareness in one place. It also presents array theory, which is pivotal in finding the direction of arrival of acoustic signals, as well as techniques to fuse the information from multiple homogeneous/heterogeneous sensors for better detection. MATLAB code is provided for the majority of the real applications, a valuable resource not only for understanding the theory but also as a springboard from which readers can develop their own application software.
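The direction-finding techniques described above rest on estimating time differences of arrival between sensors. Here is a minimal cross-correlation sketch with synthetic data; the source signal, sampling rate and delay are all invented for illustration, and the book's own code is in MATLAB.

```python
import numpy as np

# Estimate the time-difference-of-arrival (TDOA) of a sound between two
# microphones by cross-correlation, the basic step behind acoustic
# direction finding.
fs = 8000                                   # sampling rate, Hz
rng = np.random.default_rng(2)
src = rng.standard_normal(4000)             # broadband source signal
true_delay = 25                             # samples: mic 2 hears it later

mic1 = src
mic2 = np.concatenate([np.zeros(true_delay), src])[: src.size]

corr = np.correlate(mic2, mic1, mode="full")
lag = int(np.argmax(corr)) - (mic1.size - 1)   # lag of the correlation peak
print(lag)  # → 25
```

Dividing the lag by the sampling rate gives the delay in seconds; with a known microphone spacing, the arrival angle then follows from simple geometry, which is where the book's array theory takes over.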
This edited monograph presents the collected interdisciplinary research results of the priority program "Information- and Communication Theory in Molecular Biology" (InKoMBio, SPP 1395), funded by the German Research Foundation (DFG) from 2010 until 2016. The topical spectrum is very broad and comprises, but is not limited to, aspects such as microRNA as part of cell communication, information flow in the mammalian signal transduction pathway, cell-cell communication, semiotic structures in biological systems, as well as the application of methods from information theory in protein interaction analysis. The target audience primarily comprises research experts in the field of biological signal processing, but the book is also beneficial for graduate students.