This book presents recent research on the hybridization of intelligent methods, which refers to combining methods to solve complex problems. It discusses hybrid approaches covering different areas of intelligent methods and technologies, such as neural networks, swarm intelligence, machine learning, reinforcement learning, deep learning, agent-based approaches, knowledge-based systems and image processing. The book includes extended and revised versions of invited papers presented at the 6th International Workshop on Combinations of Intelligent Methods and Applications (CIMA 2016), held in The Hague, the Netherlands, in August 2016. The book is intended for researchers and practitioners from academia and industry interested in using hybrid methods for solving complex problems.
Advances in Imaging and Electron Physics, Volume 204, merges two long-running serials, Advances in Electronics and Electron Physics and Advances in Optical and Electron Microscopy. The series features extended articles on the physics of electron devices (especially semiconductor devices), particle optics at high and low energies, microlithography, image science, digital image processing, electromagnetic wave propagation, electron microscopy, and the computing methods used in all these domains.
This book provides information regarding spectrum sharing between wireless systems, motivated by emerging new technologies. Readers will benefit from information about how to conduct research on interference mitigation between IMT-Advanced and FSS. The author presents a deterministic analysis for interference-to-noise ratio (I/N), adjacent channel interference ratio (ACIR), field strength, and path loss propagation, in order to determine the separation distances in the co-channel interference (CCI) and adjacent channel interference (ACI) scenarios. An analytical model is discussed for the shielding mitigation technique, based on the deterministic analysis of the propagation model. The shielding technique has been developed based on test bed measurements for evaluating the attenuation of the proposed materials. Matlab (TM) and Transfinite Visualyse Pro (TM) have been used as simulation tools for the verification of the obtained results, whereas the IMT-Advanced parameters have been represented by Worldwide Interoperability for Microwave Access (WiMAX) 802.16e.
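The flavour of such a deterministic separation-distance analysis can be sketched in a few lines. All link-budget numbers below (EIRP, antenna gain, noise floor, protection criterion, frequency) are generic illustrative assumptions of mine, not the author's parameters; the path loss model is the standard free-space formula:

```python
import math

# Free-space path loss in dB (ITU-style convention: d in km, f in MHz).
def fspl_db(d_km, f_mhz):
    return 32.44 + 20 * math.log10(d_km) + 20 * math.log10(f_mhz)

eirp_dbm = 43.0          # interferer EIRP (assumed WiMAX-like base station)
g_rx_dbi = 0.0           # victim antenna gain toward the interferer (assumed)
noise_dbm = -114.0       # victim receiver noise floor (assumed)
i_over_n_limit = -10.0   # protection criterion I/N <= -10 dB (assumed)
f_mhz = 3500.0           # shared band around 3.5 GHz (assumed)

# Required path loss so that I = N + (I/N limit) at the victim receiver.
required_loss = eirp_dbm + g_rx_dbi - (noise_dbm + i_over_n_limit)
# Invert the free-space formula for the co-channel separation distance.
d_km = 10 ** ((required_loss - 32.44 - 20 * math.log10(f_mhz)) / 20)
```

Under free-space (worst-case) propagation the resulting distance is very large, which is exactly why mitigation techniques such as shielding are studied.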
This thesis reports on sparsity-based multipath exploitation methods for through-the-wall radar imaging. Multipath creates ambiguities in the measurements, provoking unwanted ghost targets in the image. This book describes sparse reconstruction methods that not only suppress the ghost targets but also turn multipath to advantage. By adopting the compressive sensing principle, fewer measurements are required for image reconstruction than with conventional techniques. The book describes the development of a comprehensive signal model and associated reconstruction methods that can deal with many relevant scenarios, such as clutter from building structures, secondary reflections from interior walls, and stationary as well as moving targets, in urban radar imaging. The described methods are evaluated using simulated data as well as measured data from semi-controlled laboratory experiments.
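The compressive-sensing idea underlying the thesis, recovering a sparse scene from fewer measurements than conventional sampling requires, can be illustrated with a toy recovery. The dimensions, random sensing matrix, and greedy solver below are my own illustrative choices, not the thesis's reconstruction methods:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 64, 32, 3                    # scene size, measurements (m < n), sparsity
x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)  # sparse scene
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x                                      # compressed measurements

# Orthogonal matching pursuit: greedily add the column most correlated
# with the residual, then least-squares fit on the selected support.
support, residual = [], y.copy()
coef = np.zeros(0)
while np.linalg.norm(residual) > 1e-10 and len(support) < 2 * s:
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef
x_hat = np.zeros(n)
x_hat[support] = coef                  # reconstructed sparse scene
```

With a well-conditioned random sensing matrix, the sparse scene is recovered from half as many measurements as unknowns, which is the principle the radar imaging methods exploit.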
The scope of image processing and recognition has broadened due to the gap in scientific visualization. Thus, new imaging techniques have developed, and it is imperative to study this progression for optimal utilization. Big Data Analytics for Satellite Image Processing and Remote Sensing is a critical scholarly resource that examines the challenges and difficulties of implementing big data in image processing for remote sensing and related areas. Featuring coverage on a broad range of topics, such as distributed computing, parallel processing, and spatial data, this book is geared towards scientists, professionals, researchers, and academicians seeking current research on the use of big data analytics in satellite image processing and remote sensing.
This clearly written thesis discusses the development of a highly innovative single-photon source that uses active optical switching, known as multiplexing, to increase the probability of delivering photons into a single mode. Improving single-photon sources is critical in advancing the state of the art in photonic quantum technologies for information processing and communications.
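The payoff of multiplexing can be seen with a toy calculation (the numbers are illustrative, not from the thesis): if each heralded source fires with probability p per cycle and an ideal lossless switch routes whichever photon was heralded into the output mode, N parallel sources deliver a photon with probability 1 - (1 - p)^N.

```python
# Toy model of source multiplexing: N heralded sources, each firing with
# probability p, with an ideal switch selecting whichever one fired.
p = 0.1                                # single-source emission probability (assumed)

def delivery_probability(p, n_sources):
    """Probability that at least one of n_sources fires in a cycle."""
    return 1 - (1 - p) ** n_sources

for n in (1, 4, 16):
    print(n, round(delivery_probability(p, n), 4))
```

Real devices fall short of this ideal curve because the switching network itself is lossy, which is precisely the engineering trade-off active multiplexing schemes must manage.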
Advances in Imaging and Electron Physics, Volume 200, the latest release in a series that merges two long-running serials, Advances in Electronics and Electron Physics and Advances in Optical and Electron Microscopy, features extended articles on the physics of electron devices (especially semiconductor devices), particle optics at high and low energies, microlithography, image science, digital image processing, electromagnetic wave propagation, electron microscopy, and computing methods. Topics in this latest release include Past and Present Attempts to Attain the Resolution Limit of the Transmission Electron Microscope, Phase Plates for Transmission Electron Microscopy, and X-Ray Lasers in Biology: Structure and Dynamics.
This book presents a technology to help speech-, hearing- and sight-impaired people. It explains how they will benefit from an enhancement in their ability to recognize and produce speech or to detect sounds in their surroundings. Additionally, it considers how sound-based assistive technology might be applied to the areas of speech recognition, speech synthesis, environmental recognition, virtual reality and robots. The primary focus of this book is to provide an understanding of both the methodology and basic concepts of assistive technology rather than listing the variety of assistive devices developed. This book presents a number of different topics which are sufficiently independent from one another that the reader may begin at any chapter without lacking background information. Much of the research quoted in this book was conducted in the author's laboratories at Hokkaido University and the University of Tokyo. This book offers the reader a better understanding of a number of unsolved problems that still persist in the field of sound-based assistive technology.
Technology has brought about the age of convenience, but at a hefty cost. As a result of growing production demand on a global scale, adhesive bonding operations also generate a huge amount of hazardous waste. Adhesive bonding, an integral step in manufacturing across several sectors, is one of many culprits of the unprecedented overproduction and environmental burden of municipal, industrial, and hazardous waste. If a cleaner, greener bonding process is formulated, hazardous waste production can be reined in and the world can be made safer. Using Lasers as Safe Alternatives for Adhesive Bonding: Emerging Research and Opportunities is a pivotal reference source that analyzes the new conditions for laser processing in the context of adhesive bonding. The book includes the results of experimental research, giving grounds to believe that laser technology has a future in the preparation of products for bonding. From this research, the book presents conclusions for eliminating poisonous chemicals, a threat to humans and the environment, and the burden of liquid and solid waste. It further outlines the limitations and requirements imposed on people, such as the need to use personal protective equipment and to establish specific work procedures that ensure the safety of working with lasers, with a view to the future implementation of laser technology in manufacturing facilities. Featuring coverage of a wide range of topics including static strength, surface preparation, and beam impact, this book is ideally designed for engineers, policymakers, researchers, academicians, and students.
This book presents multibiometric watermarking techniques for the security of biometric data. It also covers transform domain multibiometric watermarking techniques and their advantages and limitations. The authors have developed novel watermarking techniques with a combination of Compressive Sensing (CS) theory for the security of biometric data at the system database of the biometric system. The authors show how these techniques offer higher robustness, authenticity, better imperceptibility, increased payload capacity, and secure biometric watermarks. They show how to use CS theory for the security of biometric watermarks before embedding into the host biometric data. The suggested methods may find potential applications in the security of biometric data in various banking applications and in access control of laboratories, nuclear power stations, military bases, and airports.
These proceedings feature papers discussing big data innovation for sustainable cognitive computing. The papers detail cognitive computing and its self-learning systems, which use data mining, pattern recognition and natural language processing (NLP) to mirror the way the human brain works. This international conference focuses on cognitive computing technologies, from knowledge representation techniques and natural language processing algorithms to dynamic learning approaches. Topics covered include Data Science for Cognitive Analysis, Real-Time Ubiquitous Data Science, Platform for Privacy Preserving Data Science, and Internet-Based Cognitive Platform. The EAI International Conference on Big Data Innovation for Sustainable Cognitive Computing (BDCC 2018) took place on 13-15 December 2018 in Coimbatore, India.
This book gathers the main recent results on positive trigonometric polynomials within a unitary framework. The book has two parts: theory and applications. The theory of sum-of-squares trigonometric polynomials is presented in a unified manner, based on the concept of the Gram matrix (extended to the Gram pair or Gram set). The applications part is organized as a collection of related problems that systematically use the theoretical results.
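The Gram-matrix idea at the heart of the theory can be sketched numerically (the spectral factor below is my own toy choice, not an example from the book): coefficients read off the diagonals of a positive semidefinite Gram matrix define a trigonometric polynomial that is guaranteed nonnegative, because it is a sum of squares.

```python
import numpy as np

h = np.array([1.0, -0.5, 0.25])        # spectral factor: R(w) = |H(w)|^2
Q = np.outer(h, h.conj())              # rank-1 PSD Gram matrix Q = h h^H
n = len(h)
# Coefficient r_k is the sum of the k-th diagonal of Q.
r = np.array([np.trace(Q, offset=-k) for k in range(n)])
# Evaluate R(w) = r_0 + 2 Re sum_{k>=1} r_k e^{-jkw} on a dense grid.
w = np.linspace(0, 2 * np.pi, 512, endpoint=False)
R = r[0].real + 2 * sum((r[k] * np.exp(-1j * k * w)).real for k in range(1, n))
assert R.min() > 0                     # nonnegative, as the Gram form guarantees
```

Certifying nonnegativity then reduces to finding any positive semidefinite Gram matrix matching the given coefficients, which is what makes the framework computationally attractive.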
This book presents design methods and considerations for digitally-assisted wideband millimeter-wave transmitters. It addresses comprehensively both RF design and digital implementation simultaneously, in order to design energy- and cost-efficient high-performance transmitters for mm-wave high-speed communications. It covers the complete design flow, from link budget assessment to the transistor-level design of different RF front-end blocks, such as mixers and power amplifiers, presenting different alternatives and discussing the existing trade-offs. The authors also analyze the effect of the imperfections of these blocks on the overall performance, while describing techniques to correct and compensate for them digitally. Well-known techniques are revisited, and some new ones are described, giving examples of their applications and proving them in real integrated circuits.
This book describes the concept and design of the capacitively-coupled chopper technique, which can be used in precision analog amplifiers. Readers will learn to design power-efficient amplifiers employing this technique, which can be powered by a regular low supply voltage, such as 2 V, while handling an input common-mode voltage of up to +/-100 V. The authors provide both basic design concepts and detailed design examples, which cover the area of both operational and instrumentation amplifiers for multiple applications, particularly in power management and biomedical circuit designs.
This book addresses the nature of sound, focusing on the characteristics of sound waves in the context of time structures. This time domain approach provides an informative and intuitively understandable description of various acoustic topics, such as sound waves travelling in an acoustic tube or in other media where spectral or modal analysis can be intensively performed. Starting from the introductory topic of sinusoidal waves, it discusses the formal relationship between the time and frequency domains, summarizing the fundamental notions of Fourier and z-transformations and linear systems theory, along with interesting examples from acoustical research. The book's novel approach is of interest to research engineers and scientists. In particular, the expressions concerning waveforms, including the impulse responses, are important for audio engineers who are familiar with digital signal analysis. Every chapter includes simple exercises designed to be solved without the need for a computer, helping the reader reconfirm the fundamental ideas and notions presented in each chapter. The book is self-contained and concise, and requires only basic knowledge of acoustics and signal processing, making it valuable as a textbook for graduate and undergraduate university courses.
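The formal bridge between the two domains can be checked numerically in a few lines (the signal and impulse response are my own toy choices): circular convolution with an impulse response in the time domain equals pointwise multiplication of DFTs in the frequency domain.

```python
import numpy as np

N = 64
h = np.array([0.5, 0.5])                      # impulse response: 2-tap averager
x = np.cos(2 * np.pi * 4 * np.arange(N) / N)  # sinusoid, 4 cycles over N samples

# Frequency domain: multiply the DFTs, then transform back.
y_freq = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, N)).real

# Time domain: direct circular convolution with the same impulse response.
y_time = np.array([sum(h[k] * x[(n - k) % N] for k in range(len(h)))
                   for n in range(N)])

assert np.allclose(y_freq, y_time)            # the two domains agree
```

The same identity is what lets impulse-response measurements characterize an acoustic system entirely in the time domain.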
This text covers key mathematical principles and algorithms for nonlinear filters used in image processing. Readers will gain an in-depth understanding of the underlying mathematical and filter design methodologies needed to construct and use nonlinear filters in a variety of applications.
This book provides the first critical edition of Ibn al-Haytham's On the Shape of the Eclipse, with English translation and commentary, which records the first scientific analysis of the camera obscura. On the Shape of the Eclipse includes pioneering research on the conditions of formation of the image, in a time deemed to be committed to aniconism. It also provides an early attempt to merge the two branches of Ancient optics: the theory of light and the theory of vision. What perhaps most strongly characterizes this treatise is the close interaction of a geometric analysis of light and experimental reasoning. Ibn al-Haytham conducted his experiments in a systematic way by varying all that could be changed: the shape and size of the aperture, the focal length of the camera obscura, and the distance and shape of the celestial bodies. In this way, he achieved a thorough understanding. This work represents a decisive step in both the history of optics and the application of the experimental method, which was just as efficient in medieval Islam as it is today.
This book introduces new methods to analyze vertex-varying graph signals. In many real-world scenarios, the data sensing domain is not a regular grid but a more complex network that consists of sensing points (vertices) and edges (relating the sensing points). Furthermore, sensing geometry or signal properties define the relation among sensed signal points. Even for data sensed in the well-defined time or space domain, the introduction of new relationships among the sensing points may produce new insights in the analysis and result in more advanced data processing techniques. The data domain, in the cases discussed in this book, is defined by a graph. Graphs exploit the fundamental relations among the data points. Processing of signals whose sensing domains are defined by graphs has resulted in graph data processing as an emerging field in signal processing. Although signal processing techniques for the analysis of time-varying signals are well established, the corresponding graph signal processing equivalents are still in their infancy. This book presents novel approaches to analyze vertex-varying graph signals. The vertex-frequency analysis methods use the Laplacian or adjacency matrix to establish connections between the vertex and spectral (frequency) domains in order to analyze local signal behavior, where edge connections are used for graph signal localization. The book applies combined concepts from the time-frequency and wavelet analyses of classical signal processing to the analysis of graph signals. Covering analytical tools for vertex-varying applications, this book is of interest to researchers and practitioners in engineering, science, neuroscience and genome processing, to name just a few fields. It is also a valuable resource for postgraduate students and researchers looking to expand their knowledge of vertex-frequency analysis theory and its applications. The book consists of 15 chapters contributed by 41 leading researchers in the field.
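The Laplacian-based spectral connection can be sketched on a toy graph (the 4-vertex path graph and the signal are my own illustrative choices): the graph Fourier transform projects a vertex signal onto the eigenvectors of the graph Laplacian L = D - A, whose eigenvalues play the role of frequencies in vertex-frequency analysis.

```python
import numpy as np

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # adjacency of a 4-vertex path graph
L = np.diag(A.sum(axis=1)) - A              # combinatorial Laplacian
lam, U = np.linalg.eigh(L)                  # graph frequencies and spectral basis
x = np.array([1.0, 2.0, 2.0, 1.0])          # a smooth vertex signal
x_hat = U.T @ x                             # GFT: spectral coefficients
# The orthonormal Laplacian basis preserves energy (Parseval), and a
# smooth signal concentrates its energy at low graph frequencies.
assert np.isclose(np.sum(x_hat**2), np.sum(x**2))
```

Vertex-frequency methods build on exactly this decomposition, localizing it with window functions defined through the graph's edge structure.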
As the need for geographical data rapidly expands in the 21st century, so too do applications of small-format aerial photography for a wide range of scientific, commercial and governmental purposes. Small-format Aerial Photography (SFAP) presents basic and advanced principles and techniques with an emphasis on digital cameras. Unmanned platforms are described in considerable detail, including kites, helium and hot-air blimps, model airplanes, and paragliders. Several case studies, primarily drawn from the geosciences, are presented to demonstrate how SFAP is actually used in various applications. Many of these integrate SFAP with ground-based investigations as well as conventional large-format aerial photography, satellite imagery, and other kinds of geographic information.
Taking the Qinghai-Tibet Railway as an example, this book introduces intelligent processing for Global Positioning System (GPS) data. Combining theory with practical applications, it provides essential insights into the Chinese Qinghai-Tibet Railway and novel methods of data processing for GPS satellite positioning, making it a valuable resource for all those working with train control systems, train positioning systems, satellite positioning, and intelligent data processing. As satellite positioning guarantees the safe and efficient operation of train control systems, it focuses on how to best process the GPS data collected, including methods for error detection, reduction and information fusion.
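One elementary building block of such information fusion, combining redundant position measurements weighted by their reliability, is inverse-variance weighting. The measurement values and variances below are invented for illustration and are not taken from the book:

```python
# Two noisy position fixes along the track (values and variances assumed):
z1, var1 = 100.2, 4.0    # fix 1: position in metres, variance in m^2
z2, var2 = 99.6, 1.0     # fix 2: a more precise receiver
# Inverse-variance weighting: trust each fix in proportion to 1/variance.
w1 = (1 / var1) / (1 / var1 + 1 / var2)
fused = w1 * z1 + (1 - w1) * z2
fused_var = 1 / (1 / var1 + 1 / var2)    # fused estimate is tighter than either fix
# Simple error detection: flag a fix sitting more than 3 sigma
# from the fused estimate.
outlier = abs(z1 - fused) > 3 * var1 ** 0.5
```

The fused variance is smaller than either input variance, which is why fusing redundant GPS sources improves both accuracy and fault detection.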
Photographic imagery has come a long way from the pinhole cameras of the nineteenth century. Digital imagery, and its applications, develops in tandem with contemporary society's sophisticated literacy of this subtle medium. This book examines the ways in which digital images have become ever more ubiquitous as legal and medical evidence, just as they have become our primary source of news and have replaced paper-based financial documentation. Crucially, the contributions also analyze the very profound problems which have arisen alongside the digital image, issues of veracity and progeny that demand systematic and detailed response: It looks real, but is it? What camera captured it? Has it been doctored or subtly altered? Attempting to provide answers to these slippery issues, the book covers how digital images are created, processed and stored before moving on to set out the latest techniques for forensically examining images, and finally addressing practical issues such as courtroom admissibility. In an environment where even novice users can alter digital media, this authoritative publication will do much to stabilize public trust in these real, yet vastly flexible, images of the world around us.