This textbook, intended for advanced undergraduate and graduate students, is an introduction to the physical and mathematical principles used in clinical medical imaging. The first two chapters introduce basic concepts and useful terms used in medical imaging and the tools employed in image reconstruction. The following chapters cover an array of topics: the physics of x-rays and their use in planar and computed tomography (CT) imaging; nuclear medicine imaging and the methods of forming functional planar and single photon emission computed tomography (SPECT) images; clinical imaging using positron emitters as radiotracers; the principles of MRI pulse sequencing and signal generation, gradient fields, and the methodologies implemented for image formation, flow imaging and magnetic resonance angiography; and the basic physics of acoustic waves, the different acquisition modes used in medical ultrasound, and the methodologies implemented for image formation and flow imaging using the Doppler effect. By the end of the book, readers will know what is expected from a medical image, will understand the issues involved in producing and assessing the quality of a medical image, will be able to apply this knowledge conceptually in the development of a new imaging modality, and will be able to write basic algorithms for image reconstruction. Knowledge of calculus, linear algebra, ordinary and partial differential equations, and familiarity with the Fourier transform and its applications are expected, along with fluency in computer programming. The book contains exercises, homework problems, and sample exam questions that are representative of the main concepts and formulae students would encounter in a clinical setting.
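As a rough, hypothetical illustration of the kind of basic image-reconstruction algorithm the blurb refers to (not taken from the book), the following Python sketch performs a simple unfiltered back-projection of parallel-beam CT projections; the array names and the helper function are assumptions made for this example.

    import numpy as np

    def back_project(sinogram, angles_deg, size):
        # Minimal unfiltered back-projection for parallel-beam CT.
        # sinogram   : 2-D array, one row of detector readings per projection angle
        # angles_deg : projection angles in degrees
        # size       : side length of the square image to reconstruct
        recon = np.zeros((size, size))
        xs = np.arange(size) - size / 2.0          # pixel coordinates centred on the image
        X, Y = np.meshgrid(xs, xs)
        n_det = sinogram.shape[1]
        for row, theta in zip(sinogram, np.deg2rad(angles_deg)):
            # signed distance of each pixel from the central detector element
            t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
            idx = np.clip(np.round(t).astype(int), 0, n_det - 1)
            recon += row[idx]                      # smear the projection back across the image
        return recon / len(angles_deg)

A practical reconstruction would additionally apply a ramp filter to each projection before back-projecting (filtered back-projection), which a full treatment of CT reconstruction covers in detail.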
This book introduces an efficient resource management approach for future spectrum sharing systems. It focuses on providing an optimal resource allocation framework based on carrier aggregation to allocate multiple carriers' resources efficiently among mobile users. Furthermore, it provides an optimal traffic-dependent pricing mechanism that network providers could use to charge mobile users for the allocated resources. The book provides and compares different resource allocation with carrier aggregation solutions for different spectrum sharing scenarios. The solutions consider the diverse quality-of-experience requirements of the multiple applications running on a user's equipment, since different applications require different levels of performance. In addition, the book addresses the resource allocation problem for spectrum sharing systems that require user discrimination when allocating network resources.
This book introduces a new cyber-physical system that combines clinical and basic neuroscience research with advanced data analysis and medical management tools to develop novel applications for the management of epilepsy. The authors describe the algorithms and architectures needed to provide ambulatory, diagnostic and long-term monitoring services through multi-parametric data collection. Readers will see how to achieve in-hospital quality standards, addressing conventional "routine" clinic-based service purposes, at reduced cost, with enhanced capability and increased geographical availability. The cyber-physical system described in this book is flexible, can be optimized for each patient, and is demonstrated in several case studies.
Human Factors of Stereoscopic Displays provides an overview of the vision-relevant topics and issues that inform stereo display design from a user-centric, or human factors, perspective. Although both the basic vision science literature and the applied literature are reviewed, the strength and originality of this book come from its emphasis on the basic science literature on human stereo vision and its implications for stereo display design. The reader will learn how to design stereo displays from a human vision/human factors perspective. Over the past several years, there has been a growing interest in the development of high-quality displays that present binocular parallax information to the human visual system to induce the perception of three-dimensional depth. The methods for presenting binocular parallax to an observer vary widely and include three broad categories of display: stereoscopic, holographic and volumetric displays. Because the technology for stereoscopic displays is more developed and more widely used than that based on holography or volumetric methods, this book addresses the human factors issues involved in the viewing of stereoscopic displays. Despite the diverse methods for creating stereoscopic displays, which include stereo spatial multiplexing as well as temporal multiplexing (i.e., field-sequential) techniques, there remain common human factors issues that arise when viewing such displays. Human Factors of Stereoscopic Displays provides a detailed review of these important issues so that they can be considered when designing and using 3D displays. In doing so, the following topics are covered: interocular crosstalk; interocular differences in luminance and contrast; accommodation-vergence mismatch; stereoanomaly; spatio-temporal frequency effects; distance scaling of disparity; and high-level cue conflict.
Silicon, the leading material in microelectronics during the last four decades, also promises to be the key material in the future. Despite many claims that silicon technology has reached fundamental limits, the performance of silicon microelectronics continues to improve steadily. The same holds for almost all the applications for which Si was considered to be unsuitable. The main exception to this positive trend is the silicon laser, which has not been demonstrated to date. The main reason for this comes from a fundamental limitation related to the indirect nature of the Si band-gap. In the recent past, many different approaches have been taken to achieve this goal: dislocated silicon, extremely pure silicon, silicon nanocrystals, porous silicon, Er doped Si-Ge, SiGe alloys and multiquantum wells, SiGe quantum dots, SiGe quantum cascade structures, shallow impurity centers in silicon and Er doped silicon. All of these are abundantly illustrated in the present book.
Optical solitons in fibers are a beautiful example of how an abstract mathematical concept has had an impact on new information transmission technologies. The concept of all-optical data transmission with optical soliton systems is now setting the standard for the most advanced transmission systems. The book deals with the motion of light waves in optical fibers, the evolution of light wavepackets, optical information transfer, all-optical soliton transmission systems, the control of optical solitons, polarization effects, dispersion-managed solitons, WDM transmission, soliton lasers, all-optical switching and other applications. This book is a must for all researchers and graduate students active in the field of optical data transmission.
Ultrafast Phenomena XV presents the latest advances in ultrafast science, including both ultrafast optical technology and the study of ultrafast phenomena. It covers picosecond, femtosecond, and attosecond processes relevant to applications in physics, chemistry, biology, and engineering. Ultrafast technology has a profound impact in a wide range of applications, among them biomedical imaging, chemical dynamics, frequency standards, materials processing, and ultrahigh-speed communications. This book summarizes the results presented at the 15th International Conference on Ultrafast Phenomena and provides an up-to-date view of this important and rapidly advancing field.
Modern telecommunication systems are highly complex from an algorithmic point of view. The complexity continues to increase due to advanced modulation schemes, multiple protocols and standards, as well as additional functionality such as personal organizers or navigation aids. To achieve short and reliable design cycles, efficient verification methods and tools are necessary. Modeling and simulation need to accompany the design steps from the specification to the overall system verification in order to bridge the gaps between system specification, system simulation, and circuit-level simulation. Very high carrier frequencies together with long observation periods result in extremely large computation times and therefore require specialized modeling methods and simulation tools on all design levels. The focus of Modeling and Simulation for RF System Design lies on RF-specific modeling and simulation methods and the consideration of system- and circuit-level descriptions. It contains application-oriented training material for RF designers which combines the presentation of a mixed-signal design flow, an introduction to the powerful standardized hardware description languages VHDL-AMS and Verilog-A, and the application of commercially available simulators. Models are provided on a CD-ROM included with the book, because models are necessary to reproduce, understand and explore real-world behavior on a simulation platform. Modeling and Simulation for RF System Design is addressed to graduate students and industrial professionals who are engaged in communication system design and want to gain insight into the system structure through their own simulation experience. The authors are experts in design, modeling and simulation of communication systems, engaged at the Nokia Research Center (Bochum, Germany) and the Fraunhofer Institute for Integrated Circuits, Branch Lab Design Automation (Dresden, Germany).
The fascinating pages of this book detail many of the key issues associated with the scaling of silicon-on-insulator structures to nano-dimensions. Some papers offer new insight, particularly at the device/circuit interface, as appropriate for SOI, which is fast becoming a mainstream technology. One of the key issues concerns mobility degradation in SOI films thinner than about 5 nm. The advantages of combining scaled SOI devices with high-permittivity (high-k) dielectrics indicate that potential solutions are indeed available down to the 22 nm node. A further key issue and potential 'show stopper' for SOI CMOS is highlighted in a number of invited and contributed papers addressing atomistic-level effects.
Multi-Frame Motion-Compensated Prediction for Video Transmission presents a comprehensive description of a new technique in video coding and transmission. The work presented in the book has had a very strong impact on video coding standards and will be of interest to practicing engineers and researchers as well as academics. The multi-frame technique and the Lagrangian coder control have been adopted by the ITU-T as an integral part of the well-known H.263 standard and were adopted in the ongoing H.26L project of the ITU-T Video Coding Experts Group. This work will interest researchers and students in the field of video coding and transmission. Moreover, engineers in the field will also be interested, since an integral part of the well-known H.263 standard is based on the presented material.
This book presents the latest developments in biometrics technologies and reports on new approaches, methods, findings, and technologies developed or being developed by the research community and the industry. The book focuses on introducing fundamental principles and concepts of key enabling technologies for biometric systems applied to both physical and cyber security. The authors disseminate recent research and development efforts in this area, investigate related trends and challenges, and present case studies and examples such as fingerprint, face, iris, retina, keystroke dynamics, and voice applications. The authors also investigate advances and future outcomes in research and development in biometric security systems. The book is applicable to students, instructors, researchers, industry practitioners, and related government agency staff. Each chapter is accompanied by a set of PowerPoint slides for use by instructors.
This book traces the quest to use nanostructured media for novel and improved optoelectronic devices. Starting with the invention of the heterostructure laser, the progression via thin films to quasi zero-dimensional quantum dots has led to novel device concepts and tremendous improvements in device performance. Along the way sophisticated methods of material preparation and characterization have been developed. Novel physical phenomena have emerged and are now used in devices such as lasers and optical amplifiers. Leading experts - among them Nobel laureate Zhores Alferov - write here about the fundamental concepts behind nano-optoelectronics, the material basis, physical phenomena, device physics and systems.
In order to adapt to ever-increasing telecommunication demands, today's network operators are implementing 100 Gb/s per dense wavelength division multiplexing (DWDM) channel transmission. At those data rates, the performance of fiber-optic communication systems is degraded significantly due to intra- and inter-channel fiber nonlinearities, polarization-mode dispersion (PMD), and chromatic dispersion. In order to deal with those channel impairments, novel advanced techniques in modulation and detection, coding and signal processing are needed. This unique book represents a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory with the fundamentals of optical communication.
This book provides a detailed overview of the use of global optimization and parallel computing in microwave tomography techniques. The book focuses on techniques that are based on global optimization and electromagnetic numerical methods. The authors provide parallelization techniques for homogeneous and heterogeneous computing architectures on high-performance and general-purpose future computers. The book also discusses the multi-level optimization technique, the hybrid genetic algorithm and its application in breast cancer imaging.
The deployment of high-order modulation formats in optical fiber transmission systems is presently seen as a promising way of increasing spectral efficiency and of making better use of the capacity of currently existing fiber infrastructure. Catering to this interest, this book presents possible ways of generating and detecting optical signals with high-order phase and quadrature amplitude modulation and characterizes their system and transmission properties. Several implementation options for high-order modulation optical transmitters are possible. Their optical and electrical parts are described and their individual signal properties are discussed. Receiver concepts with direct detection, homodyne differential detection and homodyne synchronous detection are illustrated, starting with optical frontends and ending with concrete data recovery. The description of transmitters and receivers provided in the first part of the book not only helps to demonstrate their functioning, but also allows their complexity and practicability to be estimated and compared. To advance understanding of the system and transmission behavior of high-order modulation formats for optical fiber transmission, various system parameters such as noise performances, optimal receiver filter bandwidths, required laser linewidths and the chromatic dispersion and self phase modulation tolerances of a wide range of modulation formats are highlighted in the second part of the book, considering different line codes and many transmitter and receiver configurations. Currently, the determination of attainable transmission distances for multi-span long-haul transmission using high-order modulation formats represents an exciting field of research. Recent results in this area are also covered by this book. This monograph is intended for students and researchers in the field of optical communications, as well as for system designers who want to learn about the properties and complexity of optical systems employing high-order modulation.
Brain Inspired Cognitive Systems - BICS 2010 aims to bring together leading scientists and engineers who use analytic and synthetic methods both to understand the astonishing processing properties of biological systems, and specifically of the brain, and to exploit such knowledge to advance engineering methods for building artificial systems with higher levels of cognitive competence. BICS is a meeting point of brain scientists and cognitive systems engineers where cross-domain ideas are fostered in the hope of gaining emerging insights into the nature, operation and extractable capabilities of brains. This multiple approach is necessary because the progressively more accurate data about the brain is producing a growing need for a quantitative understanding and an associated capacity to manipulate this data and translate it into engineering applications rooted in sound theories. BICS 2010 is intended both for researchers who aim to build brain-inspired systems with higher cognitive competences, and for life scientists who use and develop mathematical and engineering approaches for a better understanding of complex biological systems like the brain. Four major interlaced focal symposia are planned for this conference, organized into patterns that encourage cross-fertilization across the symposia topics. This emphasizes the role of BICS as a major meeting point for researchers and practitioners in the areas of biological and artificial cognitive systems. Debates across disciplines will enrich researchers with complementary perspectives from diverse scientific fields. BICS 2010 will take place July 14-16, 2010, in Madrid, Spain.
This book presents a scientific discussion of state-of-the-art techniques and designs for the modeling, testing and performance analysis of data converters. The focus is on sustainable data conversion. Sustainability has become a public issue that industries and users cannot ignore. Devising environmentally friendly solutions for data converter design, modeling and testing is nowadays a requirement that researchers and practitioners must consider in their activities. This book presents the outcome of the IWADC 2011 workshop, held in Orvieto, Italy.
This book presents lecture materials from the Third LOFAR Data School, transformed into a coherent and complete reference book describing the LOFAR design, along with descriptions of primary science cases, data processing techniques, and recipes for data handling. Together with hands-on exercises, the chapters, based on the lecture notes, teach fundamentals and practical knowledge. LOFAR is a new and innovative radio telescope operating at low radio frequencies (10-250 MHz) and is the first of a new generation of radio interferometers that are leading the way to the ambitious Square Kilometre Array (SKA) to be built in the next decade. This unique reference guide serves as a primary information source for research groups around the world that seek to make the most of LOFAR data, as well as for those who will push these topics forward to the next level with the design, construction, and realization of the SKA. This book will also be useful as supplementary reading material for any astrophysics overview or astrophysical techniques course, particularly those geared towards radio astronomy (and radio astronomy techniques).
Edward Teller Medalists: Laser Fusion Research in 30 Years (C. Yamanaka). New Basic Physics Derived from Laser Plasma Interaction (H. Hora). Lasers: Demonstration of a Nuclear Flash-Pumped Iodine Laser (G. Miley, W. Williams). Progress in ICF and X-Ray Laser Experiments at CAEP (H.S. Peng et al.). Interaction Mechanisms: Distributed Absorption and Inhibited Heat Transport (J.S. DeGroot et al.). A Survey of Ion Acoustic Decay Instabilities in Laser Produced Plasma (K. Mizuno). Inertial Fusion Energy Strategy: Advancement of Inertial Fusion Research (C. Yamanaka). Inertial Fusion Energy Results: Interaction Physics for Megajoule Laser Fusion Targets (W.L. Kruer). Related Ion Beam Interactions: Focusing and Propagation of the Proton Beam (K. Niu). Basic Phenomena: Acceleration of Electrons by Lasers in Vacuum (T. Hauser et al.). 37 additional articles. Index.
This book summarizes the authors' latest research on narrowband interference (NBI) and impulsive noise (IN) mitigation and cancellation, including (i) mitigating the impacts of NBI on synchronization; (ii) improving time-frequency interleaving performance under NBI and IN; and (iii) accurately recovering and eliminating NBI and IN. Complicated, random and intensive narrowband interference and impulsive noise are a serious bottleneck for next-generation wireless communications and the Internet of Things. This book also proposes effective and novel frameworks and algorithms, which will significantly improve the capability of mitigating and eliminating NBI and IN in next-generation broadband communication systems. This book not only presents thorough theoretical models and algorithm design guidelines, but also provides adequate simulation and experimental engineering methods and results. The book is a valuable reference for those engaged in theoretical study, algorithm design and engineering practice in related fields, such as wireless communications, smart lighting, IoT and smart grid communications.
Solid-state lasers have seen fast and steady development and are a ubiquitous tool for both research and industrial applications. The author's monograph Solid-State Lasers has become the most-used reference book in this area. The present graduate text on solid-state lasers takes advantage of this rich source by focusing on the needs of graduate-level readers and of those who need an introduction. Numerous exercises with hints for solution, new text and updated material where needed make this text very accessible.
This book comprehensively describes high-resolution microwave imaging and super-resolution information processing technologies and discusses new theories, methods and achievements in the high-resolution microwave imaging fields. Its chapters, which include abundant research results and examples, systematically summarize the authors' main research findings in recent years. The book is intended for researchers, engineers and postgraduates in the fields of electronics systems, signal information processing and data analysis, microwave remote sensing and microwave imaging radar, as well as space technology, especially in the microwave remote sensing and airborne or space-borne microwave imaging radar fields.
This book describes a systematic approach to scattering of transient fields which can be introduced in undergraduate or graduate courses. The initial boundary value problems considered describe the transient electromagnetic fields formed by open periodic, compact, and waveguide resonators. The methods developed and the mathematical and physical results obtained provide a basis on which a modern theory for the scattering of resonant non-harmonic waves can be developed.
This book introduces methods for predicting the future behavior of a system's health and its remaining useful life in order to determine an appropriate maintenance schedule. The authors introduce the history, industrial applications, algorithms, and benefits and challenges of PHM (Prognostics and Health Management) to help readers understand this highly interdisciplinary engineering approach that incorporates sensing technologies, physics of failure, machine learning, modern statistics, and reliability engineering. It is ideal for beginners because it introduces various prognostics algorithms and explains their attributes and their pros and cons in terms of model definition, model parameter estimation, and ability to handle noise and bias in data, allowing readers to select the appropriate methods for their fields of application. Among the many topics discussed in depth are:
* Prognostics tutorials using least squares
* Bayesian inference and parameter estimation
* Physics-based prognostics algorithms including nonlinear least squares, the Bayesian method, and the particle filter
* Data-driven prognostics algorithms including Gaussian process regression and neural networks
* Comparison of different prognostics algorithms
The authors also present several applications of prognostics in practical engineering systems, including wear in a revolute joint, fatigue crack growth in a panel, prognostics using accelerated life test data, fatigue damage in bearings, and more. Prognostics tutorials with Matlab code and simple examples are provided, along with a companion website that presents Matlab programs for the different algorithms as well as measurement data. Each chapter contains a comprehensive set of exercise problems, some of which require Matlab programs, making this an ideal book for graduate students in mechanical, civil, aerospace, electrical, and industrial engineering and engineering mechanics, as well as for researchers and maintenance engineers in these fields.
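To give a flavour of the least-squares prognostics tutorials the blurb mentions, here is a minimal Python sketch (the book itself uses Matlab, and the degradation data and failure threshold below are invented for illustration): an exponential degradation model is fitted to crack-length measurements and extrapolated to a critical threshold to estimate the remaining useful life (RUL).

    import numpy as np

    # Hypothetical crack-length measurements (mm) taken every 100 load cycles
    cycles = np.array([0, 100, 200, 300, 400, 500], dtype=float)
    crack  = np.array([1.00, 1.12, 1.27, 1.45, 1.66, 1.91])
    threshold = 5.0                      # assumed critical crack length (mm)

    # Fit an exponential degradation model a(t) = a0 * exp(b*t) by linear
    # least squares on the log-transformed data.
    b, log_a0 = np.polyfit(cycles, np.log(crack), 1)
    a0 = np.exp(log_a0)

    # Extrapolate to the threshold and report the remaining useful life.
    t_fail = np.log(threshold / a0) / b
    rul = t_fail - cycles[-1]
    print(f"Predicted failure at {t_fail:.0f} cycles, RUL = {rul:.0f} cycles")

The physics-based and data-driven methods listed above (Bayesian updating, particle filters, Gaussian process regression) refine this basic idea by also quantifying the uncertainty of the RUL estimate.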