Atomic Physics provides a concise treatment of atomic physics and a basis to prepare for work in other disciplines that are underpinned by atomic physics, such as chemistry, biology and several aspects of engineering science. The focus is mainly on atomic structure, since this is primarily responsible for the physical properties of atoms. After a brief introduction to some basic concepts, the perturbation-theory approach follows the hierarchy of interactions, starting with the largest. The remaining interactions of the spin and angular momentum of the outermost electrons with each other, the nucleus and external magnetic fields are treated in order of descending strength. A spectroscopic perspective is generally taken, relating observations of atomic radiation emitted or absorbed to the internal energy levels involved. X-ray spectra are then discussed in relation to the energy levels of the innermost electrons. Finally, a brief description is given of some modern, laser-based spectroscopic methods for the high-resolution study of the details of atomic structure.
Measurements and experiments are made every day, in fields as disparate as particle physics, chemistry, economics and medicine, but have you ever wondered why a particular experiment has been designed the way it is? Indeed, how do you design an experiment to measure something whose value is unknown, and what should you consider when deciding whether an experiment has yielded the sought-after, or indeed any useful, result? These are old questions, and they are the reason behind this volume. We will explore the origins of the methods of data analysis that are today routinely applied to all measurements, but which were unknown before the mid-19th century. Anyone who is interested in the relationship between the precision and accuracy of measurements will find this volume useful. Whether you are a physicist, a chemist, a social scientist, or a student studying one of these subjects, you will discover that the basis of measurement is the struggle to identify the needle of useful data hidden in the haystack of obscuring background noise.
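Among the classical data-analysis methods alluded to above is the inverse-variance weighted mean, the standard recipe for combining measurements of unequal precision. A minimal sketch (illustrative only, not drawn from the book itself):

```python
import math

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean of measurements and its standard error."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma_mean = math.sqrt(1.0 / sum(weights))
    return mean, sigma_mean

# Three measurements of the same quantity with different precision;
# the most precise measurement dominates the combined result:
m, s = weighted_mean([10.1, 9.8, 10.4], [0.1, 0.2, 0.4])
```

Note that the combined uncertainty comes out smaller than that of even the best single measurement, which is the whole point of combining data.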
This book discusses the theory of quantum effects used in metrology, and presents the author's research findings in the field of quantum electronics. It also describes the quantum measurement standards used in various branches of metrology, such as those relating to electrical quantities, mass, length, time and frequency. The first comprehensive survey of quantum metrology problems, it introduces a new approach to metrology, placing a greater emphasis on its connection with physics, which is of importance for developing new technologies, nanotechnology in particular. Presenting practical applications of the effects used in quantum metrology for the construction of quantum standards and sensitive electronic components, the book is useful for a broad range of physicists and metrologists. It also promotes a better understanding and acceptance of the new system in both industry and academia. This second edition includes two new chapters focusing on the revised SI system and satellite positioning systems. The practical realization (mise en pratique) of the base units (metre, kilogram, second, ampere, kelvin, candela and mole), newly defined in the revised SI, is presented in detail. Another new chapter describes satellite positioning systems and their possible applications. In satellite positioning systems such as GPS, GLONASS, BeiDou and Galileo, quantum devices, namely atomic clocks, serve a wide population of users.
Becoming Metric-Wise: A Bibliometric Guide for Researchers aims to inform researchers about metrics so that they become aware of the evaluative techniques being applied to their scientific output. Understanding these concepts will help them during their funding initiatives, and in hiring and tenure. The book not only describes what indicators do (or are designed to do, which is not always the same thing), but also gives precise mathematical formulae so that indicators can be properly understood and evaluated. Metrics have become a critical issue in science, with widespread international discussion taking place on the subject across scientific journals and organizations. As researchers should know the publication-citation context, the mathematical formulae of indicators being used by evaluating committees and their consequences, and how such indicators might be misused, this book provides an ideal tome on the topic.
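One widely used indicator of the kind the book formalizes is Hirsch's h-index: the largest h such that at least h of a researcher's papers have h or more citations each. A minimal sketch (the function name and sample data are illustrative, not taken from the book):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# h = 3 here: three papers have >= 3 citations, but only three have >= 4,
# so h = 4 is not reached.
h = h_index([25, 8, 5, 3, 3, 1])
```

The sorted-rank formulation also makes the indicator's known weakness visible: the highly cited first paper (25 citations) contributes no more to h than a paper with exactly 3.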
This book conveys the theoretical and experimental basics of well-founded measurement techniques in the areas of high DC, AC and surge voltages, as well as the corresponding high currents. Additional chapters explain the acquisition of partial discharges and the electrical measured variables. Equipment exposed to very high voltages and currents is used for the transmission and distribution of electrical energy, and it is therefore tested for reliability before commissioning using standardized and future test and measurement procedures. The book also covers procedures for calibrating measurement systems and determining measurement uncertainties, and discusses the current state of measurement technology with electro-optical and magneto-optical sensors.
For many years, evidence suggested that all solid materials either possessed a periodic crystal structure as proposed by the Braggs or were amorphous glasses with no long-range order. In the 1970s, Roger Penrose hypothesized structures (Penrose tilings) with long-range order which were not periodic. The existence of a solid phase, known as a quasicrystal, that possessed the structure of a three-dimensional Penrose tiling was demonstrated experimentally in 1984 by Dan Shechtman and colleagues. Shechtman received the 2011 Nobel Prize in Chemistry for his discovery. The discovery and description of quasicrystalline materials provided the first concrete evidence that traditional crystals could be viewed as a subset of a more general category of ordered materials. This book introduces the diversity of structures that are now known to exist in solids through a consideration of quasicrystals (Part I) and the various structures of elemental carbon (Part II), and through an analysis of their relationship to conventional crystal structures. Both quasicrystals and the various allotropes of carbon are excellent examples of how our understanding of the microstructure of solids has progressed over the years beyond the concepts of traditional crystallography.
This thesis reveals how the feedback trap technique, developed to trap small objects for biophysical measurement, could be adapted for the quantitative study of the thermodynamic properties of small systems. The experiments in this thesis are related to Maxwell's demon, a hypothetical intelligent, "neat-fingered" being that uses information to extract work from heat, apparently creating a perpetual-motion machine. The second law of thermodynamics should make that impossible, but how? That question has stymied physicists and provoked debate for a century and a half. The experiments in this thesis confirm a hypothesis proposed by Rolf Landauer over fifty years ago: that Maxwell's demon would need to erase information, and that erasing information (resetting the measuring device to a standard starting state) requires dissipating as much energy as is gained. For his thesis work, the author used a "feedback trap" to study the motion of colloidal particles in "virtual potentials" that may be manipulated arbitrarily. The feedback trap confines a freely diffusing particle in liquid by periodically measuring its position and applying an electric field to move it back to the origin.
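Landauer's bound, which the experiments described here confirm, has a simple closed form: erasing one bit of information at temperature T dissipates at least k_B T ln 2 of energy. A quick illustrative calculation:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the revised SI)

def landauer_limit(temperature_k):
    """Minimum energy (J) dissipated when erasing one bit at the given temperature."""
    return K_B * temperature_k * math.log(2)

energy = landauer_limit(300.0)  # ~2.87e-21 J near room temperature
```

Tiny as this energy is, it sets a hard floor on the thermodynamic cost of resetting a memory, which is why the demon cannot win.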
This book describes modern focused ion beam microscopes and techniques, how they can be used to aid materials metrology, and how they serve as tools for the fabrication of devices that are in turn used in many other aspects of fundamental metrology. Beginning with a description of the currently available instruments, including the new addition to the field of plasma-based sources, it then gives an overview of ion-solid interactions and how the different types of instrument can be applied. Chapters then describe how these machines can be applied to materials science and device fabrication, giving examples of recent and current activity in both areas.
This book discusses the architecture of modern automated systems for spectrum monitoring, including the automation components: technical means for spectrum monitoring, special software, and engineering infrastructure. The problems of developing automated systems for the search and localization of unauthorized radio emission sources in open localities, mathematical methods and algorithms for modulation parameter measurements in wireless communication, and issues of identification and localization of radio emission sources are considered. Constructive solutions and modern technical means for radio monitoring and their application are given. Numerous examples are described for the implementation of automated systems, digital radio receivers and radio direction-finders, and analyzers of parameters for GSM, CDMA, LTE, DVB-T/T2, Wi-Fi, DMR, P25, TETRA and DECT signals. Practical implementations of the described methods are presented in applied software packages and in radio monitoring equipment.
Gradiometry is a multidisciplinary area that combines theoretical and applied physics, ultra-low-noise electronics, precision engineering, and advanced signal processing. All physical fields have spatial gradients that fall with distance from their sources more rapidly than the field strength itself, which makes gradient measurements more difficult. Nevertheless, there has been considerable investment, in both time and money, in the development of various types of gradiometers, driven by the extremely valuable information contained in gradients. Applications include the search for oil, gas, and mineral resources, GPS-free navigation, defence, space missions, medical research, and other areas. The author describes gravity gradiometers, magnetic gradiometers, and electromagnetic (EM) gradiometers. The first two types do not require any active sources of the primary physical fields whose gradients are measured, such as the gravity field or the ambient magnetic field. EM gradiometers do require a primary EM field, pulsed or sinusoidal, which propagates through media and creates a secondary EM field. The latter contains information about non-uniformities in electromagnetically active media, such as conductivity and magnetic permeability contrasts. These anomalies mark the boundaries of mineral deposits, oil and gas traps, underground water reserves, buried artifacts, unexploded ordnance (UXO), nuclear submarines, and even cancerous human tissue. This book provides readers with a comprehensive introduction to, and the history, potential applications, and current development of, some of the most advanced technologies of the 21st century. Most of these developments are strictly controlled by defence export control rules and regulations, introduced in all developed countries, that typically require permission to transfer relevant information from one country to another.
The book is based on materials that are available in the public domain, such as scientific journals, conferences, extended abstracts, and online presentations. In addition, medical applications of EM gradiometers are exempt from any control, and some new results relevant to breast cancer early detection research are published in this book for the first time.
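The point above, that gradients fall off faster than the field itself, is easy to verify for a point dipole: the on-axis field scales as 1/r^3 while its radial gradient scales as 1/r^4. A small sketch in dimensionless units (not tied to any instrument in the book):

```python
def dipole_field_and_gradient(r, m=1.0):
    """On-axis field magnitude (~ m / r^3) and radial gradient magnitude
    (~ 3 m / r^4) of a point dipole, in dimensionless units."""
    field = m / r**3
    gradient = 3.0 * m / r**4
    return field, gradient

# Doubling the distance cuts the field by a factor of 8,
# but the gradient by a factor of 16:
b1, g1 = dipole_field_and_gradient(1.0)
b2, g2 = dipole_field_and_gradient(2.0)
```

The extra power of r is exactly why gradiometers need lower-noise sensors than field-measuring instruments, and also why gradients localize nearby sources so well.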
This book presents the state-of-the-art methods and procedures necessary for operating a power system, taking into account both theoretical investigations and practical considerations of the modern electrical power system. It systematically highlights the following topics: Power Sector Scenario in India, Distribution Planning and Optimization, Best Practices in Operation & Maintenance of Sub-Transmission & Distribution Lines, Best Practices in Operation and Maintenance of Distribution Substation Equipment and Auxiliaries, Best Practices in Operation & Maintenance of Transformers and Protection Systems, International Best Practices in Operation & Maintenance (Advanced Gadgets), Aerial Bunched Conductor (ABC) based Distribution Systems, and Best Practices in Operation & Maintenance of Energy Meters.
Devised at the beginning of the 20th century by the French physicists Charles Fabry and Alfred Perot, the Fabry-Perot optical cavity is perhaps the most deceptively simple setup in optics, and today a key resource in many areas of science and technology. This thesis delves deeply into the applications of optical cavities in a variety of contexts: from LIGO's 4-km-long interferometer arms that are allowing us to observe the universe in a new way by measuring gravitational waves, to the atomic clocks used to realise time with unprecedented accuracy, which will soon lead to a redefinition of the second, and the matter-wave interferometers that are enabling us to test and measure gravity on a new scale. The work presented accounts for the elegance and versatility of this setup, which today underpins much of the progress at the frontier of atomic and gravitational experimental physics.
This monograph investigates violations of statistical stability of physical events, variables, and processes, and develops a new physical-mathematical theory that takes such violations into account: the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data are limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers and scientists who study the phenomenon of statistical stability and use statistical methods for high-precision measurements, prediction, and signal processing over long observation intervals.
Dealing with the basics, theory and applications of dynamic pulsed-field-gradient NMR (PFG NMR), this book describes the essential theory behind diffusion in heterogeneous media that can be combined with NMR measurements to extract important information about the system being investigated. This information could be the surface-to-volume ratio, droplet size distributions in emulsions, brine profiles, fat content in foodstuffs, permeability/connectivity in porous materials, and medical applications currently being developed. Besides theory and applications, it provides readers with background knowledge on the experimental set-ups and, most importantly, deals with the numerous pitfalls present in work with PFG NMR. How to analyze the NMR data and some important basic knowledge of the hardware are explained, too.
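The standard textbook relation behind PFG NMR diffusion measurements is the Stejskal-Tanner expression, S/S0 = exp(-gamma^2 g^2 delta^2 (Delta - delta/3) D) for free diffusion. A sketch with generic parameter names (not necessarily the book's notation):

```python
import math

def stejskal_tanner(D, gamma, g, delta, big_delta):
    """Echo attenuation S/S0 for free diffusion: D is the diffusion coefficient,
    gamma the gyromagnetic ratio, g the gradient strength, delta the gradient
    pulse length, big_delta the gradient pulse separation (all SI units)."""
    b = (gamma * g * delta) ** 2 * (big_delta - delta / 3.0)
    return math.exp(-b * D)

# Water self-diffusion (D ~ 2.3e-9 m^2/s), proton gamma = 2.675e8 rad s^-1 T^-1,
# a 0.1 T/m gradient, 2 ms pulses, 20 ms separation:
ratio = stejskal_tanner(D=2.3e-9, gamma=2.675e8, g=0.1, delta=2e-3, big_delta=20e-3)
```

Fitting this attenuation against varying gradient strength g is how D is extracted in practice; restricted diffusion in heterogeneous media shows up as deviations from this single-exponential form.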
This thesis covers a diverse set of topics related to space-based gravitational wave detectors such as the Laser Interferometer Space Antenna (LISA). The core of the thesis is devoted to the preprocessing of the interferometric link data for a LISA constellation, specifically developing optimal Kalman filters to reduce arm-length noise due to clock noise. The approach is to apply Kalman filters of increasing complexity to make optimal estimates of relevant quantities such as constellation arm length, relative clock drift, and Doppler frequencies based on the available measurement data. Depending on the complexity of the filter and the simulated data, these Kalman filter estimates can provide up to a few orders of magnitude improvement over simpler estimators. While the basic concept of the LISA measurement (Time Delay Interferometry) was worked out some time ago, this work brings a new level of rigor to the processing of the constellation-level data products. The thesis concludes with some topics related to eLISA, such as a new class of phenomenological waveforms for extreme mass-ratio inspiral sources (EMRIs, one of the main sources for eLISA), an octahedral space-based GW detector that does not require drag-free test masses, and some efficient template-search algorithms for the case of relatively high-SNR signals.
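The Kalman-filter idea at the heart of the thesis can be illustrated in its simplest scalar form: tracking a slowly drifting quantity (say, a clock offset) from noisy measurements. This is a generic sketch, not the thesis's actual LISA preprocessing pipeline:

```python
def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state x_k = x_{k-1} + w (variance q)
    observed through z_k = x_k + v (variance r); returns the running estimates."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                 # predict: uncertainty grows by the process noise
        gain = p / (p + r)     # Kalman gain balances prediction vs. measurement
        x += gain * (z - x)    # update the estimate with the innovation
        p *= (1.0 - gain)      # shrink the posterior variance
        estimates.append(x)
    return estimates

est = kalman_1d([1.2, 0.9, 1.1, 1.0, 1.05], q=1e-4, r=0.04)
```

With a small process variance q the gain falls over time and the filter averages ever more measurements, which is the mechanism behind the large improvements over naive estimators mentioned above.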
This thesis addresses two different topics, both vital for implementing modern high-energy physics experiments: detector development and data analysis. Providing a concise introduction to both the standard model of particle physics and the basic principles of semiconductor tracking detectors, it presents the first measurement of the top quark pole mass from the differential cross-section of tt+J events in the dileptonic tt decay channel. The first part focuses on the development and characterization of silicon pixel detectors. To account for the expected increase in luminosity of the Large Hadron Collider (LHC), the pixel detector of the Compact Muon Solenoid (CMS) experiment is replaced by an upgraded detector with new front-end electronics. The thesis presents comprehensive test beam studies conducted to verify the design and quantify the performance of the new front-end in terms of tracking efficiency and spatial resolution. Furthermore, it proposes a new cluster interpolation method, which utilizes the third central moment of the cluster charge distribution to improve the position resolution. The second part of the thesis introduces an alternative measurement of the top quark mass from the normalized differential production cross-sections of dileptonic top quark pair events with an additional jet, measured at a centre-of-mass energy of 8 TeV. Using theoretical predictions at next-to-leading order in perturbative Quantum Chromodynamics (QCD), the top quark pole mass is determined using a template fit method.
This thesis describes the first measurement of, and constraints on, Higgs boson production in the vector boson fusion mode, where the Higgs decays to b quarks (the most common decay channel), at the LHC. The vector boson fusion mode, in which the Higgs is produced simultaneously with a pair of quark jets, provides an unparalleled opportunity to study the detailed properties of the Higgs, including the possibility of parity and CP violation, as well as its couplings and mass. It thus opens up this field for precision investigation as the LHC increases in energy and intensity, leading the way to a new and exciting arena of precision Higgs research.
A new experimental method, the "Stiffnessometer", is developed to measure elementary properties of a superconductor, including the superconducting stiffness and the critical current. This technique has many advantages over existing methods, such as: the ability to measure these properties while minimally disturbing the system; the ability to measure large penetration depths (comparable to the sample size), as necessary when approaching the critical temperature; and the ability to measure critical currents without attaching contacts and heating the sample. The power of this method is demonstrated in a study of the penetration depth of LSCO, where striking evidence is found for two separate critical temperatures in the in-plane and out-of-plane directions. The results in the thesis are novel and important, and currently have no theoretical explanation. The Stiffnessometer is a tool with great potential to explore new ground in condensed matter physics.
Matter wave interferometry is a promising and successful way to explore truly macroscopic quantum phenomena and probe the validity of quantum theory at the borderline to the classical world. Indeed, we may soon witness quantum superpositions with nano- to micrometer-sized objects. Yet, venturing deeper into the macroscopic domain is not only an experimental but also a theoretical endeavour: new interferometers must be conceived, sources of noise and decoherence identified, size effects understood and possible modifications of the theory taken into account. This thesis provides the theoretical background to recent advances in molecule and nanoparticle interferometry. In addition, it contains a physical and objective method to assess the degree of macroscopicity of such experiments, ranking them among other macroscopic quantum superposition phenomena.
This book discusses a novel and high-rate-capable micro pattern gaseous detector of the Micromegas (MICRO-MEsh GAS detector) type. It provides a detailed characterization of the performance of Micromegas detectors on the basis of measurements and simulations, along with an in-depth examination of analysis and reconstruction methods. The accurate and efficient detection of minimum ionizing particles in high-rate background environments is demonstrated. The excellent performance determined here for these lightweight detectors will make possible the live medical imaging of a patient during ion-beam treatment.
This two-volume work introduces the theory and applications of Schur-convex functions. The second volume mainly focuses on the application of Schur-convex functions in sequence inequalities, integral inequalities, mean value inequalities for two variables, mean value inequalities for multiple variables, and geometric inequalities.
This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however, the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, an in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based measurement systems; familiarity with the hardware necessary to acquire and store signals; an appreciation for the key issue of long-term preservation of signals; and a full grasp of the often neglected issue of uncertainty in acoustical measurements. Pedagogical features include in-text worked-out examples, end-of-chapter problems, a glossary of metrology terms, and extensive appendices covering statistics, proofs, additional examples, file formats, and underlying theory.
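As a flavour of the software-based measurements the book teaches, the equivalent sound pressure level of a calibrated signal reduces to an RMS computation and a logarithm. A minimal sketch (the 20 µPa reference is the standard one for airborne sound; the function and variable names are illustrative):

```python
import math

def spl_db(samples, p_ref=2e-5):
    """Sound pressure level in dB for pressure samples in pascals,
    relative to the standard 20 micropascal reference."""
    rms = math.sqrt(sum(x * x for x in samples) / len(samples))
    return 20.0 * math.log10(rms / p_ref)

# A 1 kHz tone with 1 Pa RMS (amplitude sqrt(2) Pa), sampled at 48 kHz for 1 s,
# corresponds to the common 94 dB calibrator level:
tone = [math.sqrt(2) * math.sin(2 * math.pi * 1000 * n / 48000) for n in range(48000)]
level = spl_db(tone)  # ~94 dB
```

Running such a check against a known acoustic calibrator signal is exactly the kind of traceability exercise the book's uncertainty chapters address.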
This edited book contains invited papers from renowned experts working in the field of wearable electronic sensors. It includes 14 chapters describing recent advancements in the areas of wearable sensors, wireless sensors and sensor networks, protocols, topologies, instrumentation architectures, measurement techniques, energy harvesting and scavenging, signal processing, and design and prototyping. The book will be useful for engineers, scientists and post-graduate students as a reference for their research on wearable sensors, devices and technologies, a field experiencing rapid growth driven by new applications such as heart rate monitors, smart watches, tracking devices and smart glasses.