Since the discovery of the giant magnetoresistance (GMR) effect in 1988, spintronics has been presented as a new technology paradigm, recognized with the Nobel Prize in Physics in 2007. Initially used in the read heads of hard disk drives, and while competing with flash memories for a share of that market, GMR devices have broadened their range of use, growing towards magnetic field sensing applications in a huge range of scenarios. Applications that were only potential at the time of the discovery have become real in the last two decades; GMR was clearly born to stay. In this spirit, selected successful approaches to GMR-based sensors in different applications (space, automotive, microelectronics, biotechnology ...) are collected in the present book. While keeping a practical orientation, the fundamentals as well as the current trends and challenges of this technology are also analyzed, with state-of-the-art contributions from academia and industry throughout the contents. This book can be used by starting researchers, postgraduate students and multidisciplinary scientists as a reference text in this topical and fascinating field.
Albert Einstein's General Theory of Relativity, published in 1915, made a remarkable prediction: gravitational radiation. Just like light (electromagnetic radiation), gravity could travel through space as a wave and affect any objects it encounters by alternately compressing and expanding them. However, there was a problem. The force of gravity is around a trillion, trillion, trillion times weaker than electromagnetism so the calculated compressions and expansions were incredibly small, even for gravity waves resulting from a catastrophic astrophysical event such as a supernova explosion in our own galaxy. Discouraged by this result, physicists and astronomers didn't even try to detect these tiny, tiny effects for over 50 years. Then, in the late 1960s and early 1970s, two events occurred which started the hunt for gravity waves in earnest. The first was a report of direct detection of gravity waves thousands of times stronger than even the most optimistic calculation. Though ultimately proved wrong, this result started scientists thinking about what instrumentation might be necessary to detect these waves. The second was an actual, though indirect, detection of gravitational radiation due to the effects it had on the period of rotation of two 'neutron stars' orbiting each other. In this case, the observations were in exact accord with predictions from Einstein's theory, which confirmed that a direct search might ultimately be successful. Nevertheless, it took another 40 years of development of successively more sensitive detectors before the first real direct effects were observed in 2015, 100 years after gravitational waves were first predicted. This is the story of that hunt, and the insight it is producing into an array of topics in modern science, from the creation of the chemical elements to insights into the properties of gravity itself.
This monograph, translated from the Russian, describes and comments in detail on the fundamentals of metrology. The basic concepts of metrology, the principles of the International System of Units (SI), the theory of measurement uncertainty, the new methodology for estimating measurement accuracy on the basis of the uncertainty concept, as well as the methods for processing measurement results and estimating their uncertainty, are discussed from a modern standpoint. It is shown that the uncertainty concept is compatible with the classical theory of accuracy. The theory of random uncertainties is supplemented with their most general description on the basis of the generalized normal distribution; instrumental systematic errors are presented in connection with the methodology of normalizing the metrological characteristics of measuring instruments. Information about modern systems of traceability is given. All discussed theoretical principles and calculation methods are illustrated with examples.
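For orientation (this example is not taken from the book), the most basic step in processing repeated measurement results is the Type A evaluation of uncertainty, in which the standard uncertainty of the mean is estimated from the scatter of the readings; a minimal sketch in Python, with made-up readings:

    from statistics import mean, stdev

    # Hypothetical repeated readings of the same quantity (arbitrary units).
    readings = [10.03, 10.01, 10.04, 9.99, 10.02, 10.00]

    n = len(readings)
    x_bar = mean(readings)   # best estimate of the measurand
    s = stdev(readings)      # experimental standard deviation (n - 1 in the denominator)
    u_a = s / n ** 0.5       # Type A standard uncertainty of the mean

    print(f"estimate = {x_bar:.3f}, u_A = {u_a:.4f}")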
This book explores the microsensing technologies and systems now available to monitor the quality of air and water within the urban environment and examines their role in the creation of sustainable cities against the background of the challenges posed by rapid urbanization. The opening section addresses the theoretical and conceptual background of microsensing networks. The coverage includes detailed description of microsensors, supported by design-specific equations, and clear explanation of the ways in which devices that harvest energy from ambient sources can detect and quantify pollution. The practical application of such systems in addressing environmental impacts within cities and in sustainable urban planning is then discussed with the aid of case studies in developing countries. The book will be of interest to all who wish to understand the benefits of microsensing networks in promoting sustainable cities through better delivery of information on health hazards and improved provision of data to environmental agencies and regulatory bodies in order to assist in monitoring, decision-making, and regulatory enforcement.
This book shows how Bohmian mechanics overcomes the need for a measurement postulate involving wave function collapse. The measuring process plays a very important role in quantum mechanics. It has been widely analyzed within the Copenhagen approach through the Born and von Neumann postulates, with a later extension due to Lüders. In contrast, much less effort has been invested in the measurement theory within the Bohmian mechanics framework. The continuous measurement (sharp and fuzzy, or strong and weak) problem is considered here in this framework. The authors begin by generalizing the so-called Mensky approach, which is based on restricted path integrals through quantum corridors. The measuring system is then considered to be an open quantum system following a stochastic Schrödinger equation. Quantum stochastic trajectories (in the Bohmian sense) and their role in basic quantum processes are discussed in detail. The decoherence process is thereby described in terms of classical trajectories issuing from the violation of the noncrossing rule of quantum trajectories.
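For readers unfamiliar with the term, a quantum trajectory in the Bohmian sense is a path X(t) guided by the wave function through the standard guidance equation (stated here for orientation, not quoted from the book):

    \frac{dX_k}{dt} = \frac{\hbar}{m_k}\,\mathrm{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right)\bigg|_{X(t)}

so the configuration moves along the gradient of the phase of the wave function, and no separate collapse postulate is invoked.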
Spacecraft TT&C and Information Transmission Theory and Technologies introduces the basic theory of spacecraft TT&C (telemetry, tracking and command) and information transmission. Combining TT&C and information transmission, the book presents several technologies for continuous wave radar, including measurements of range, range rate and angle, analog and digital information transmission, telecommand, telemetry, remote sensing and spread spectrum TT&C. For special problems that occur in the channels for TT&C and information transmission, the book presents radio propagation features and their impact on orbit measurement accuracy, the effects caused by rain attenuation, atmospheric attenuation and multipath propagation, and polarization composition technology. This book can benefit researchers and engineers in the field of spacecraft TT&C and communication systems. Liu Jiaxing is a professor at the 10th Institute of China Electronics Technology Group Corporation.
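As general background (these are textbook relations, not excerpts from the book), a TT&C radar obtains range from the two-way delay tau of a ranging signal and range rate from the Doppler shift f_d of a carrier of frequency f_0:

    R = \frac{c\,\tau}{2}, \qquad \dot{R} = -\frac{c\, f_d}{2 f_0}

with c the speed of light; an approaching spacecraft gives a positive Doppler shift and hence a negative range rate.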
This volume presents measurement uncertainty and uncertainty budgets in a form accessible to practicing engineers and engineering students from across a wide range of disciplines. The book gives a detailed explanation of the methods presented by NIST in the "GUM" (Guide to the Expression of Uncertainty in Measurement). Emphasis is placed on explaining the background and meaning of the topics, while keeping the level of mathematics at the minimum necessary. Dr. Colin Ratcliffe, USNA, and Bridget Ratcliffe, Johns Hopkins, develop uncertainty budgets and explain their use. In some examples the budget may show a process is already adequate and where costs can be saved; in others it may show the process is inadequate and needs improvement. The book demonstrates how uncertainty budgets help identify the most cost-effective place to make changes. In addition, an extensive, fully worked case study leads readers through all issues related to an uncertainty analysis, including a variety of different types of uncertainty budgets. The book is ideal for professional engineers and students concerned with a broad range of measurement assurance challenges in applied sciences. This book also: facilitates practicing engineers' understanding of uncertainty budgets, essential to calculating cost-effective savings for a wide variety of processes contingent on measurement; presents uncertainty budgets in an accessible style suitable for all undergraduate STEM courses that include a laboratory component; provides a highly adaptable supplement to graduate textbooks for courses where students' work includes reporting on experimental results; includes an expanded case study developing uncertainty from transducers through measurands and propagating it to the final measurement, which can be used as a template for the analysis of many processes; and stands as a useful pocket reference for all engineers and experimental scientists.
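As an illustration of what an uncertainty budget does (a sketch under assumed, made-up component values, not an example from the book), the GUM combines uncorrelated input uncertainties in quadrature, and the relative variance shares show where improvement pays off most:

    import math

    # Hypothetical budget entries: (source, standard uncertainty of the input,
    # sensitivity coefficient), with products expressed in the measurand's units.
    budget = [
        ("transducer calibration", 0.12, 1.0),
        ("amplifier gain drift",   0.05, 1.0),
        ("ADC quantisation",       0.03, 1.0),
    ]

    # GUM law of propagation for uncorrelated inputs: root sum of squares.
    u_c = math.sqrt(sum((u * c) ** 2 for _, u, c in budget))

    for source, u, c in budget:
        share = 100 * (u * c) ** 2 / u_c ** 2
        print(f"{source:24s} {share:5.1f}% of the combined variance")
    print(f"combined standard uncertainty u_c = {u_c:.3f}")

In this made-up budget the transducer term dominates, so tightening its calibration is the cost-effective change, while the other components can be left alone.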
This book discusses the study of double charm B decays and the first observation of the B0->D0D0Kst0 decay using Run I data from the LHCb experiment. It also describes in detail the Run III upgrade of the LHCb tracking system and the associated trigger and tracking strategy, as well as the development and performance studies of a novel standalone tracking algorithm for the scintillating fibre tracker that will be used in the LHCb upgrade. This algorithm alone allows the LHCb upgrade physics programme to achieve very high sensitivity to decays with long-lived particles in their final states and boosts the physics capabilities for the reconstruction of low-momentum particles.
This book comprehensively and systematically introduces readers to the theories, structures, performance and applications of non-driven mechanical and non-driven micromechanical gyroscopes. The book is divided into three parts, the first of which mainly addresses mathematical models, precision, performance and operating error in non-driven mechanical gyroscopes. The second part focuses on the operating theory, error, phase shift and performance experiments involving non-driven micromechanical gyroscopes in rotating flight carriers, while the third part shares insights into the application of non-driven micromechanical gyroscopes in control systems for rotating flight carriers. The book offers a unique resource for all researchers and engineers who are interested in the use of inertial devices and automatic control systems for rotating flight carriers. It can also serve as a reference book for undergraduates, graduates and instructors in related fields at colleges and universities.
This book is conceived as a reference manual for practicing engineers, instrument designers, service technicians and engineering students. The related fields of physics, mechanics and mathematics are frequently drawn upon to enhance understanding of the subject matter. Historical anecdotes, reaching from Hellenistic times to modern scientists, illustrate in an entertaining manner ideas ranging from impractical inventions in history to those that have changed our lives.
The goal of the project presented in this book is to detect neutrinos created by resonant interactions of ultrahigh energy cosmic rays on the CMB photon field filling the Universe. In this pioneering first analysis, the author puts forward much of the analysis framework, including calibrations of the electronic hardware and antenna geometry, as well as the development of algorithms for event reconstruction and data reduction. While only two of the 37 stations planned for the Askaryan Radio Array were used in this assessment of 10 months of data, the analysis was able to exclude neutrino fluxes above 10 PeV with a limit not far from the best current limit set by the IceCube detector, a result which establishes the radio detection technique as the path forward to achieving the massive volumes needed to detect these ultrahigh energy neutrinos.
This volume contains the proceedings of possibly the last conference ever on integral-field spectroscopy. The contributors, noted authorities in the field, focus on the scientific questions that can be answered with integral-field spectroscopy, ranging from solar system studies all the way to high redshift surveys. Overall readers get a state-of-the-science review of astronomical 3D spectroscopy.
The book is a collection of peer-reviewed scientific papers submitted by active researchers to the 1st International Conference on Advancements of Medical Electronics (ICAME2015). The conference was organized jointly by the Departments of Biomedical Engineering and of Electronics and Communication Engineering, JIS College of Engineering, West Bengal, India. The primary objective of the conference was to strengthen interdisciplinary research and its applications for the welfare of humanity. A galaxy of academicians, professionals, scientists, statesmen and researchers from different parts of the country and abroad came together and shared their knowledge. The book presents research articles on medical image processing & analysis, biomedical instrumentation & measurements, DSP & clinical applications, and embedded systems & their applications in healthcare. The book can serve as a reference tool for further research.
"Neutron Applications in Materials for Energy "collects results and conclusions of recent neutron-based investigations of materials that are important in the development of sustainable energy. Chapters are authored by leading scientists with hands-on experience in the field, providing overviews, recent highlights, and case-studies to illustrate the applicability of one or more neutron-based techniques of analysis. The theme follows energy production, storage, and use, but each chapter, or section, can also be read independently, with basic theory and instrumentation for neutron scattering being outlined in the introductory chapter. Whilst neutron scattering is extensively used to understand properties of condensed matter, neutron techniques are exceptionally-well suited to studying how the transport and binding of energy and charge-carrying molecules and ions are related to their dynamics and the material s crystal structure. These studies extend to "in situ" and "in operando" in some cases. The species of interest in leading energy-technologies include H2, H+, and Li+ which have particularly favourable neutron-scattering properties that render these techniques of analysis ideal for such studies and consequently, neutron-based analysis is common-place for hydrogen storage, fuel-cell, catalysis, and battery materials. Similar research into the functionality of solar cell, nuclear, and CO2 capture/storage materials rely on other unique aspects of neutron scattering and again show how structure and dynamics provide an understanding of the material stability and the binding and mobility of species of interest within these materials. Scientists and students looking for methods to help them understand the atomic-level mechanisms and behaviour underpinning the performance characteristics of energy materials will find "Neutron Applications in Materials for Energy "a valuable resource, whilst the wider audience of sustainable energy scientists, and newcomers to neutron scattering should find this a useful reference. "
The use of scintillating materials in the detection of ionising radiation for medical imaging is the main topic of this book. It starts with an overview of the state of the art in using radiation detectors for medical imaging, followed by an in-depth discussion of all aspects of the use of scintillating materials for this application. Possibilities for improving the performance of existing scintillating materials, as well as completely new ideas on how to use them, are discussed in detail. The first four chapters contain a general overview of the applications of radiation detectors in medicine and take a closer look at the three most important subfields: X-ray imaging, gamma-ray imaging and PET. One chapter is devoted to semiconductor detectors, a promising new area, and two chapters to recent technical advances in PET. The remaining five chapters deal with scintillating materials and their use in medical imaging.
In the present work, the target station of the accelerator-driven neutron source HBS is optimized in comprehensive parameter studies using the Monte Carlo method. The dependence of the most important performance characteristics of such a system on the external parameters is investigated, neglecting technical and mechanical limitations. In this way, qualitative and quantitative statements for all possible configurations and envisaged applications can be derived, which should be considered in the detailed planning of such facilities. For this purpose, different scenarios are considered that place completely different requirements on the design of the target station. The central statements derived in this thesis can be transferred to other framework conditions, such as different accelerator energies, so that the results can be used in the development of other neutron sources, which together with the HBS would form a European network and support a prosperous neutron science community.
This book focuses on the development of wellness protocols for smart home monitoring, aiming to forecast the wellness of individuals living in ambient assisted living (AAL) environments. It describes in detail the design and implementation of the heterogeneous wireless sensors and networks, applied to data mining and machine learning, on which the protocols are based. Further, it shows how these sensor and actuator nodes are deployed in the home environment, generating real-time data on object usage and other movements inside the home, and demonstrates that the protocols offer a reliable, efficient, flexible, and economical solution for smart home systems. Documenting the approach from sensor to decision making and information generation, the book addresses various issues concerning interference mitigation, errors, security and large data handling. As such, it offers a valuable resource for researchers, students and practitioners interested in interdisciplinary studies at the intersection of wireless sensing and processing, radio communication, the Internet of Things and machine learning, and in how they can be applied to smart home monitoring and assisted living environments.
The book is a comprehensive edition which considers the interactions of atoms, ions and molecules with charged particles, photons and laser fields, and reflects the present understanding of atomic processes such as electron capture, target and projectile ionisation, photoabsorption and others occurring in most laboratory and astrophysical plasma sources, including many-photon and many-electron processes. The material consists of selected papers written by leading scientists in various fields.
For the first time, the authors provide a comprehensive and consistent presentation of all techniques available in this field. They rigorously analyze the behavior of different electrochemical single and multipotential step techniques for electrodes of different geometries and sizes under transient and stationary conditions. The effects of these electrode features in studies of various electrochemical systems (solution systems, electroactive monolayers, and liquid-liquid interfaces) are discussed. Explicit analytical expressions for the current-potential responses are given for all available cases. Applications of each technique are outlined for the elucidation of reaction mechanisms. Coverage is comprehensive: normal pulse voltammetry, double differential pulse voltammetry, reverse pulse voltammetry and other triple and multipulse techniques, such as staircase voltammetry, differential staircase voltammetry, differential staircase voltcoulommetry, cyclic voltammetry, square wave voltammetry and square wave voltcoulommetry.
This thesis contains new research in both experimental and theoretical particle physics, making important contributions in each. Two analyses of collision data from the ATLAS experiment at the LHC are presented, as well as two phenomenological studies of heavy coloured resonances that could be produced at the LHC. The first data analysis was the measurement of top quark-antiquark production with a veto on additional jet activity. As the first detector-corrected measurement of jet activity in top-antitop events it played an important role in constraining the theoretical modelling, and ultimately reduced these uncertainties for ATLAS's other top-quark measurements by a factor of two. The second data analysis was the measurement of Z+2jet production and the observation of the electroweak vector boson fusion (VBF) component. As the first observation of VBF at a hadron collider, this measurement demonstrated new techniques to reliably extract VBF processes and paved the way for future VBF Higgs measurements. The first phenomenological study developed a new technique for identifying the colour of heavy resonances produced in proton-proton collisions. As a by-product of this study an unexpected and previously unnoticed correlation was discovered between the probability of correctly identifying a high-energy top and the colour structure of the event it was produced in. The second phenomenological study explored this relationship in more detail, and could have important consequences for the identification of new particles that decay to top quarks.
The work presented in this thesis spans a wide range of experimental particle physics subjects, from level-1 trigger electronics to the final results of the search for Higgs boson decays to tau lepton pairs. The thesis describes an innovative reconstruction algorithm for tau decays and details how it was instrumental in providing a measurement of Z boson decays to tau lepton pairs. The reliability of the analysis is fully established by this measurement before the Higgs boson decay to tau lepton pairs is considered. The work described here continues to serve as a model for analysing Higgs to tau lepton measurements in CMS.
Dimensional metrology is an essential part of modern manufacturing technologies, but the basic theories and measurement methods are no longer sufficient for today's digitized systems. The information exchange between the software components of a dimensional metrology system not only costs a great deal of money, but also causes the entire system to lose data integrity. "Information Modeling for Interoperable Dimensional Metrology" analyzes interoperability issues in dimensional metrology systems and describes information modeling techniques. It discusses new approaches and data models for solving interoperability problems, as well as introducing process activities, existing and emerging data models, and the key technologies of dimensional metrology systems. Written for researchers in industry and academia, as well as advanced undergraduate and postgraduate students, this book gives both an overview and an in-depth understanding of complete dimensional metrology systems. By covering in detail the theory and main content, techniques, and methods used in dimensional metrology systems, "Information Modeling for Interoperable Dimensional Metrology" enables readers to solve real-world dimensional measurement problems in modern dimensional metrology practices.
This book describes the application of a novel technology for beam instrumentation and luminosity measurement, together with first results on a cutting-edge technology that could be used after the upgrade of the Large Hadron Collider to higher luminosity. It presents a unique diamond-based luminometer with a detailed performance study. The online bunch-by-bunch luminosity measurements provide invaluable feedback to the collider for beam optimisation and for the understanding of beam dynamics, and the precision of the luminosity measurement is crucial for all physics analyses. The book highlights the van der Meer method, which is used for the calibration of the luminometers of the CMS (Compact Muon Solenoid) experiment, and describes the estimation of systematic uncertainties, e.g. due to radiation damage of sensors and electronics and uncertainties in the beam parameters. For the future high-luminosity upgrade of the collider, sapphire sensors are investigated in a test beam. It is demonstrated for the first time that sapphire sensors can be used as single-particle detectors. A model for the charge transport in sapphire is developed and successfully applied.
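For context (a standard relation, assuming factorizable Gaussian-like beam profiles, and not a formula quoted from the book), a van der Meer scan determines the effective overlap widths Sigma_x and Sigma_y of the two beams, from which the visible cross section used to calibrate a luminometer follows as

    \sigma_{\mathrm{vis}} = \frac{2\pi\,\Sigma_x \Sigma_y\,\mu_{\mathrm{vis}}^{\mathrm{max}}}{N_1 N_2}

where mu_vis^max is the peak visible interaction rate per bunch crossing at zero beam separation and N_1, N_2 are the bunch populations; the absolute bunch-by-bunch luminosity is then obtained as L_b = mu_vis f_rev / sigma_vis, with f_rev the revolution frequency.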