This textbook offers a unique compendium of measurement procedures for experimental data acquisition. After introducing readers to the basic theory of uncertainty evaluation in measurements, it shows how to apply it in practice to conduct a range of laboratory experiments with instruments and procedures operating both in the time and frequency domains. Offering extensive practical information and hands-on tips on using oscilloscopes, spectrum analyzers and reflectometric instrumentation, the book shows readers how to deal with, e.g., filter characterization, operational amplifiers, digital and analog spectral analysis, and reflectometry-based measurements. For each experiment, it describes the corresponding uncertainty evaluation in detail. Bridging the gap between theory and practice, the book offers a unique, self-contained guide for engineering students and professionals alike. It also provides university teachers and professors with a valuable resource for their laboratory courses on electric and electronic measurements.
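As a taste of the uncertainty evaluation such a course teaches, here is a minimal sketch of a GUM-style combination of Type A and Type B uncertainties; the readings and the instrument specification are hypothetical, and the book's own laboratory procedures are considerably more detailed.

```python
# Minimal GUM-style uncertainty evaluation (illustrative sketch only;
# all numbers below are hypothetical).
import math
import statistics

readings = [5.021, 5.019, 5.024, 5.018, 5.022]  # hypothetical voltmeter readings (V)

mean = statistics.mean(readings)
# Type A: standard uncertainty of the mean from repeated observations
u_a = statistics.stdev(readings) / math.sqrt(len(readings))
# Type B: assumed instrument accuracy spec of +/-0.005 V, uniform distribution
u_b = 0.005 / math.sqrt(3)
# Combined standard uncertainty: quadrature sum of independent contributions
u_c = math.sqrt(u_a**2 + u_b**2)

print(f"result: {mean:.4f} V, combined standard uncertainty: {u_c:.4f} V")
```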
This book offers a genuinely practical introduction to the most commonly encountered optical and non-optical systems used for the metrology and characterization of surfaces, including guidance on best practice, calibration, advantages and disadvantages, and interpretation of results. It enables the user to select the best approach in a given context. Most methods in surface metrology are based upon the interaction of light or electromagnetic radiation (UV, NIR, IR), and different optical effects are utilized to get a certain optical response from the surface; some of them record only the intensity reflected or scattered by the surface, others use interference of EM waves to obtain a characteristic response from the surface. The book covers techniques ranging from microscopy (including confocal, SNOM and digital holographic microscopy) through interferometry (including white light, multi-wavelength, grazing incidence and shearing) to spectral reflectometry and ellipsometry. The non-optical methods comprise tactile methods (stylus tip, AFM) as well as capacitive and inductive methods (capacitive sensors, eddy current sensors). The book provides:
* An overview of the working principles
* A description of advantages and disadvantages
* Currently achievable numbers for resolution, repeatability, and reproducibility
* Examples of real-world applications
A final chapter discusses examples where the combination of different surface metrology techniques in a multi-sensor system can reasonably contribute to a better understanding of surface properties as well as a faster characterization of surfaces in industrial applications. The book is aimed at scientists and engineers who use such methods for the measurement and characterization of surfaces across a wide range of fields and industries, including electronics, energy, automotive and medical engineering.
The search for neutrinoless double beta decay is one of the highest priority areas in particle physics today; it could provide insights into the nature of neutrino masses (currently not explained by the Standard Model) as well as how the universe survived its early stages. One promising experimental approach involves the use of large volumes of isotope-loaded liquid scintillator, but new techniques for background identification and suppression must be developed in order to reach the required sensitivity levels and clearly distinguish the signal. The results from this thesis constitute a significant advance in this area, laying the groundwork for several highly effective and novel approaches based on a detailed evaluation of state-of-the-art detector characteristics. This well-written thesis includes a particularly clear and comprehensive description of the theoretical motivations as well as impressively demonstrating the effective use of diverse statistical techniques. The professionally constructed signal extraction framework contains clever algorithmic solutions to efficient error propagation in multi-dimensional space. In general, the techniques developed in this work will have a notable impact on the field.
This thesis presents two significant results in the field of precision measurements in low-energy nuclear physics. Firstly, it presents a precise half-life determination of ¹¹C, leading to the most precise ft-value for a beta decay transition between mirror nuclides, an important advance in the testing of the electroweak sector of the Standard Model. Secondly, it describes a high-precision mass measurement of ⁵⁶Cu, a critical nucleus for determining the path of the astrophysical rapid-proton capture process, performed by the author using the LEBIT Penning trap at the National Superconducting Cyclotron Laboratory. This new measurement resolves discrepancies in previously reported calculated mass excesses. The thesis also presents the construction and testing of a radio-frequency quadrupole cooler and buncher that will form part of the future N = 126 factory at Argonne National Laboratory, aimed at producing, for the first time, nuclei of interest for the astrophysical rapid-neutron capture process.
This thesis develops next-generation multi-degree-of-freedom gyroscopes and inertial measurement units (IMU) using micro-electromechanical-systems (MEMS) technology. It covers both a comprehensive study of the physics of resonator gyroscopes and novel micro/nano-fabrication solutions to key performance limits in MEMS resonator gyroscopes. Firstly, theoretical and experimental studies of physical phenomena including mode localization, nonlinear behavior, and energy dissipation provide new insights into challenges like quadrature errors and flicker noise in resonator gyroscope systems. Secondly, advanced designs and micro/nano-fabrication methods developed in this work demonstrate valuable applications to a wide range of MEMS/NEMS devices. In particular, the HARPSS+ process platform established in this thesis features a novel slanted nano-gap transducer, which enabled the first wafer-level-packaged single-chip IMU prototype with co-fabricated high-frequency resonant triaxial gyroscopes and high-bandwidth triaxial micro-gravity accelerometers. This prototype demonstrates performance amongst the highest to date, with unmatched robustness and potential for flexible substrate integration and ultra-low-power operation. This thesis shows a path toward future low-power IMU-based applications including wearable inertial sensors, health informatics, and personal inertial navigation.
This second open access volume of the handbook series deals with detectors, large experimental facilities and data handling, both for accelerator and non-accelerator based experiments. It also covers applications in medicine and life sciences. A joint CERN-Springer initiative, the "Particle Physics Reference Library" provides revised and updated contributions based on previously published material in the well-known Landolt-Boernstein series on particle physics, accelerators and detectors (volumes 21A,B1,B2,C), which took stock of the field approximately one decade ago. Central to this new initiative is publication under full open access.
The work described in this PhD thesis is a study of a real implementation of a track-finder system which could provide reconstructed high transverse momentum tracks to the first-level trigger of the High Luminosity LHC upgrade of the CMS experiment. This is vital for the future success of CMS, since otherwise it will be impossible to achieve the trigger selectivity needed to contain the very high event rates. The unique and extremely challenging requirement of the system is to utilise the enormous volume of tracker data within a few microseconds to arrive at a trigger decision. The track-finder demonstrator described proved unequivocally, using existing hardware, that a real-time track-finder could be built using present-generation FPGA-based technology which would meet the latency and performance requirements of the future tracker. This means that more advanced hardware customised for the new CMS tracker should be even more capable, and will deliver very significant gains for the future physics returns from the LHC.
This proceedings book presents dual approaches to examining new theoretical models and their applicability in the search for new scintillation materials and, ultimately, the development of industrial technologies. The ISMART conferences bring together the radiation detector community, from fundamental research scientists to applied physics experts, engineers, and experts on the implementation of advanced solutions. This scientific forum builds a bridge between the different parts of the community and is the basis for multidisciplinary, cooperative research and development efforts. The main goals of the conference series are to review the latest results in scintillator development, from theory to applications, and to arrive at a deeper understanding of fundamental processes, as well as to discover components for the production of new generations of scintillation materials. The book highlights recent findings and hypotheses, key advances, as well as exotic detector designs and solutions, and includes papers on the microtheory of scintillation and the initial phase of luminescence development, applications of the various materials, as well as the development and characterization of ionizing radiation detection equipment. It also touches on the increased demand for cryogenic scintillators, the renaissance of garnet materials for scintillator applications, nano-structuring in scintillator development, trends in and applications for security, and exploration of hydrocarbons and ecological monitoring.
Devised at the beginning of the 20th century by French physicists Charles Fabry and Alfred Perot, the Fabry-Perot optical cavity is perhaps the most deceptively simple setup in optics, and today a key resource in many areas of science and technology. This thesis delves deeply into the applications of optical cavities in a variety of contexts: from LIGO's 4-km-long interferometer arms that are allowing us to observe the universe in a new way by measuring gravitational waves, to the atomic clocks used to realise time with unprecedented accuracy, which will soon lead to a redefinition of the second, and the matter-wave interferometers that are enabling us to test and measure gravity on a new scale. The work presented accounts for the elegance and versatility of this setup, which today underpins much of the progress at the frontier of atomic and gravitational experimental physics.
Combinatorial Kalman filters are a standard tool today for pattern recognition and charged particle reconstruction in high energy physics. In this thesis the implementation of the track finding software for the Belle II experiment and first studies on early Belle II data are presented. The track finding algorithm exploits novel concepts such as multivariate track quality estimates to form charged trajectory hypotheses, combining information from the Belle II central drift chamber with the inner vertex sub-detectors. The eventual track candidates show an improvement in resolution on the parameters describing their spatial and momentum properties by up to a factor of seven over the former legacy implementation. The second part of the thesis documents a novel way to determine the collision event time T0 and the implementation of optimisation steps in the online reconstruction code, which proved crucial in overcoming the high-level trigger limitations.
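To illustrate the core operation that such combinatorial Kalman filter track finding repeats for every candidate hit, here is a minimal one-dimensional predict/update step in Python; the state model, noise values and hit positions are hypothetical, and the Belle II software is of course far more elaborate (it branches over competing hit candidates and propagates full multi-dimensional track states).

```python
# Minimal 1D Kalman filter predict/update step (illustrative sketch;
# all parameter values are hypothetical).
def kalman_step(x, P, z, F=1.0, Q=0.01, H=1.0, R=0.1):
    """x, P: state estimate and its variance; z: new measurement."""
    # Predict: propagate the state to the next detector layer
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update: weigh the prediction against the measurement
    K = P_pred * H / (H * P_pred * H + R)   # Kalman gain
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0
for hit in [0.11, 0.09, 0.12, 0.10]:        # hypothetical hit positions
    x, P = kalman_step(x, P, hit)
print(x, P)                                  # refined estimate, shrunken variance
```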
This book discusses the theory of quantum effects used in metrology, and presents the author's research findings in the field of quantum electronics. It also describes the quantum measurement standards used in various branches of metrology, such as those relating to electrical quantities, mass, length, time and frequency. The first comprehensive survey of quantum metrology problems, it introduces a new approach to metrology, placing a greater emphasis on its connection with physics, which is of importance for developing new technologies, nanotechnology in particular. Presenting practical applications of the effects used in quantum metrology for the construction of quantum standards and sensitive electronic components, the book is useful for a broad range of physicists and metrologists. It also promotes a better understanding and approval of the new system in both industry and academia. This second edition includes two new chapters focusing on the revised SI system and satellite positioning systems. Practical realization (mise en pratique) of the base units (metre, kilogram, second, ampere, kelvin, candela, and mole), as newly defined in the revised SI, is presented in detail. Another new chapter describes satellite positioning systems and their possible applications. In satellite positioning systems such as GPS, GLONASS, BeiDou and Galileo, quantum devices - atomic clocks - serve a wide population of users.
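As an illustration of the kind of quantum effects such electrical standards exploit, the textbook relations for the Josephson voltage standard and the quantum Hall resistance standard can be written as below (standard physics, not quoted from this book).

```latex
% Josephson voltage standard: a junction irradiated at microwave frequency f
% exhibits quantised voltage steps indexed by the integer n.
V_n = \frac{n f}{K_J}, \qquad K_J = \frac{2e}{h} \quad \text{(Josephson constant)}
% Quantum Hall resistance standard: resistance plateaus at integer
% fractions i of the von Klitzing constant.
R_i = \frac{R_K}{i}, \qquad R_K = \frac{h}{e^2} \quad \text{(von Klitzing constant)}
```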
This second edition of Mass Metrology: The Newly Defined Kilogram has been thoroughly revised to reflect the recent redefinition of the kilogram in terms of Planck's constant. The necessity of defining the kilogram in terms of physical constants was already underscored in the first edition. However, the kilogram can also be defined in terms of Avogadro's number, using a collection of ions of heavy elements, by the levitation method, or using voltage and watt balances. The book also addresses the concepts of gravitational, inertial and conventional mass, and describes in detail the variation of acceleration due to gravity. Further topics covered in this second edition include: the effect of gravity variations on the reading of electronic balances, derived with respect to latitude, altitude and Earth topography; the classification of weights by the OIML; and the maximum permissible error in different categories of weights prescribed by national and international organizations. The book also discusses group weighing techniques and the use of nanotechnology for the detection of mass differences as small as 10⁻²⁴ g. Last but not least, readers will find details on the XRCD method for defining the kilogram in terms of Planck's constant.
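To make the link between a mass and Planck's constant concrete, the watt (Kibble) balance principle referred to above can be summarized by the standard relations below; this is a sketch of well-known physics, not a quotation from the book.

```latex
% Weighing phase: the electromagnetic force on a current-carrying coil
% balances the weight of the test mass:
%     m g = B L I
% Moving phase: the same coil, moved at velocity v, induces a voltage:
%     U = B L v
% Eliminating the geometry factor B L yields the watt-balance equation
m g v = U I
% With U and I measured via the Josephson and quantum Hall effects, the
% electrical power U I is expressed in terms of h, which fixes m.
```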
A new experimental method, the "Stiffnessometer", is developed to measure elementary properties of a superconductor, including the superconducting stiffness and the critical current. This technique has many advantages over existing methods, such as: the ability to measure these properties while minimally disturbing the system; the ability to measure large penetration depths (comparable to the sample size), as necessary when approaching the critical temperature; and the ability to measure critical currents without attaching contacts and heating the sample. The power of this method is demonstrated in a study of the penetration depth of LSCO, where striking evidence is found for two separate critical temperatures for the in-plane and out-of-plane directions. The results in the thesis are novel, important and currently have no theoretical explanation. The Stiffnessometer is a tool with great potential to explore new ground in condensed matter physics.
This book is exceptional in offering a thorough but accessible introduction to calorimetry that will meet the needs of both students and researchers in the field of particle physics. It is designed to provide a sound knowledge of the basics of calorimetry and of calorimetric techniques and instrumentation that is mandatory for any physicist involved in the design and construction of large experiments or in data analysis. An important feature is the correction of a number of persistent common misconceptions. Among the topics covered are the physics and development of electromagnetic showers, electromagnetic calorimetry, the physics and development of hadron showers, hadron calorimetry, and calibration of a calorimeter. Two chapters are devoted to the more promising calorimetric techniques for the next collider. Calorimetry for Collider Physics, an Introduction, will be of value for all who are seeking a reliable guide to calorimetry that occupies the middle ground between the brief chapter in a generic book on particle detection and the highly complex and lengthy reference book.
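For flavor, the longitudinal development of an electromagnetic shower, one of the topics listed above, is commonly described by the standard parameterization below (a textbook relation, not quoted from this book).

```latex
% Longitudinal profile of an electromagnetic shower: t is the depth in
% radiation lengths, E_0 the incident energy, and a, b parameters that
% depend on the material and the incident energy.
\frac{dE}{dt} = E_0 \, b \, \frac{(b t)^{a-1} e^{-b t}}{\Gamma(a)}
```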
This book examines an intelligent system for the inspection planning of prismatic parts on coordinate measuring machines (CMMs). The content focuses on four main elements: the engineering ontology, the model of inspection planning for prismatic parts on CMMs, the optimisation model of the measuring path based on an ant-colony approach, and the model of probe configuration and setup planning based on a genetic algorithm. The model of inspection planning for CMMs developed here addresses inspection feature construction, the sampling strategy, probe accessibility analysis, automated collision-free operation, and probe path planning. The proposed model offers a novel approach to intelligent inspection, while also minimizing human involvement (and thus the risk of human error) through intelligent planning of the probe configuration and part setup. The advantages of this approach include: reduced preparation times due to the automatic generation of a measuring protocol; potential optimisation of the measuring probe path, i.e., less time needed for the actual measurement; and increased planning process autonomy through minimal human involvement in the setup analysis and probe configuration.
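As a flavor of the ant-colony approach to measuring-path optimisation described above, here is a minimal, self-contained Python sketch that treats the probe path as a travelling-salesman tour over a handful of hypothetical inspection points; the model in the book additionally handles probe accessibility and collision avoidance, which this sketch omits.

```python
# Hedged sketch of ant-colony optimisation (ACO) of a measuring-probe path,
# reduced to a travelling-salesman tour; all points and tuning constants
# are hypothetical.
import math
import random

points = [(0, 0), (4, 1), (2, 5), (6, 4), (1, 3)]  # hypothetical inspection points

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

n = len(points)
pheromone = [[1.0] * n for _ in range(n)]
ALPHA, BETA, RHO, ANTS, ITERS = 1.0, 2.0, 0.5, 10, 50

best_tour, best_len = None, float("inf")
for _ in range(ITERS):
    tours = []
    for _ in range(ANTS):
        tour = [random.randrange(n)]
        while len(tour) < n:
            i = tour[-1]
            cand = [j for j in range(n) if j not in tour]
            # Transition rule: favour short edges with strong pheromone
            weights = [pheromone[i][j] ** ALPHA * (1.0 / dist(points[i], points[j])) ** BETA
                       for j in cand]
            tour.append(random.choices(cand, weights=weights)[0])
        length = sum(dist(points[tour[k]], points[tour[(k + 1) % n]]) for k in range(n))
        tours.append((tour, length))
        if length < best_len:
            best_tour, best_len = tour, length
    # Evaporate, then deposit pheromone in proportion to tour quality
    for i in range(n):
        for j in range(n):
            pheromone[i][j] *= (1.0 - RHO)
    for tour, length in tours:
        for k in range(n):
            a, b = tour[k], tour[(k + 1) % n]
            pheromone[a][b] += 1.0 / length
            pheromone[b][a] += 1.0 / length

print(best_tour, round(best_len, 2))
```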
This third open access volume of the handbook series deals with accelerator physics, design, technology and operations, as well as with beam optics, dynamics and diagnostics. A joint CERN-Springer initiative, the "Particle Physics Reference Library" provides revised and updated contributions based on previously published material in the well-known Landolt-Boernstein series on particle physics, accelerators and detectors (volumes 21A,B1,B2,C), which took stock of the field approximately one decade ago. Central to this new initiative is publication under full open access.
There was once a time when we could not measure sound, color, blood pressure, or even time. We now find ourselves in the throes of a measurement revolution, from the laboratory to the sports arena, from the classroom to the courtroom, from a strand of DNA to the far reaches of outer space. Measurement controls our lives at work, at school, at home, and even at play. But does all this measurement really measure up? Here, John Henshaw examines the ways in which measurement makes sense or creates nonsense. Henshaw tells the controversial story of intelligence measurement from Plato to Binet to the early days of the SAT to today's super-quantified world of No Child Left Behind. He clears away the fog on issues of measurement in the environment, such as global warming, hurricanes, and tsunamis, and in the world of computers, from digital photos to MRI to the ballot systems used in Florida during the 2000 presidential election. From cycling and car racing to baseball, tennis, and track-and-field, he chronicles the ever-growing role of measurement in sports, raising important questions about performance and the folly of comparing today's athletes to yesterday's records. We can't quite measure everything, at least not yet. What could be more difficult to quantify than reasonable doubt? However, even our justice system is yielding to the measurement revolution with new forensic technologies such as DNA fingerprinting. As we evolve from unquantified ignorance to an imperfect but ever-present state of measured awareness, Henshaw gives us a critical perspective from which we can "measure up" the measurements that have come to affect our lives so greatly.
This book is a translation of a Russian original. In 2007, the authors created a new generation of sensors based on layered composites, whose advantages are manufacturability and thermal stability. The use of gradient heat flux sensors in laboratory and industrial conditions confirmed their reliability, demonstrated their high information content, and allowed a number of priority results to be obtained. All of this is summarized in this book.
This book covers the cross-disciplinary areas between management issues and engineering issues relevant to the implementation of Environmental Management Systems (EMS) to the ISO 14000 series standards. It summarises the requirements set by ISO 14001 and considers the management and engineering policies needed to satisfy these requirements and achieve ISO 14001 certification.
* Unique approach, integrating environmental management and engineering considerations
* Avoids overuse of complicated technical jargon
* Detailed coverage of measurement and calibration standards to meet ISO 14001
* Provides an example of an EMS documentation and records manual
* Detailed coverage of the control of air, water, noise and vibration pollution, and of waste management
This book reviews the HL-LHC experiments and the fourth-generation photon science experiments, discussing the latest radiation-hardening techniques, optimization of device and process parameters using TCAD simulation tools, and the experimental characterization required to develop rad-hard Si detectors against X-ray-induced surface damage and bulk damage from hadronic irradiation. Consisting of eleven chapters, it introduces various types of strip and pixel detector designs for the current upgrade, radiation, and dynamic-range requirements of the experiments, and presents an overview of radiation detectors, especially Si detectors. It also describes the design of pixel detectors, experiments, and the characterization of Si detectors. The book is intended for researchers and master's level students with an understanding of radiation detector physics. It provides a concept that uses TCAD simulation to optimize the electrical performance of the devices used in the harsh radiation environment of the colliders and at XFEL.
This first open access volume of the handbook series contains articles on the standard model of particle physics, both from the theoretical and experimental perspective. It also covers related topics, such as heavy-ion physics, neutrino physics and searches for new physics beyond the standard model. A joint CERN-Springer initiative, the "Particle Physics Reference Library" provides revised and updated contributions based on previously published material in the well-known Landolt-Boernstein series on particle physics, accelerators and detectors (volumes 21A,B1,B2,C), which took stock of the field approximately one decade ago. Central to this new initiative is publication under full open access.
This book conveys the theoretical and experimental basics of well-founded measurement techniques in the areas of high DC, AC and surge voltages, as well as the corresponding high currents. Additional chapters explain the acquisition of partial discharges and the measured electrical variables. Equipment exposed to very high voltages and currents is used for the transmission and distribution of electrical energy; it is therefore tested for reliability before commissioning, using standardized and future test and measurement procedures. The book also covers procedures for calibrating measurement systems and determining measurement uncertainties, and discusses the current state of measurement technology with electro-optical and magneto-optical sensors.
This primer describes general model-independent searches for new physics phenomena beyond the Standard Model of particle physics. First, the motivation for performing general model-independent experimental searches for new physics is presented through an overview of the current theoretical understanding of particle physics in terms of the Standard Model and its shortcomings. Then, the concept and features of general model-independent searches for new physics at collider-based experiments are explained. This is followed by an overview of such searches performed in past high-energy physics experiments and of their current status, particularly in the context of the experiments at the LHC. Finally, the future prospects of such general model-independent searches, with possible improvements from new tools such as machine learning techniques, are discussed.
This book presents lecture materials from the Third LOFAR Data School, transformed into a coherent and complete reference book describing the LOFAR design, along with descriptions of primary science cases, data processing techniques, and recipes for data handling. Together with hands-on exercises, the chapters, based on the lecture notes, teach fundamentals and practical knowledge. LOFAR is a new and innovative radio telescope operating at low radio frequencies (10-250 MHz) and is the first of a new generation of radio interferometers that are leading the way to the ambitious Square Kilometre Array (SKA), to be built in the next decade. This unique reference guide serves as a primary information source for research groups around the world that seek to make the most of LOFAR data, as well as for those who will push these topics forward to the next level with the design, construction, and realization of the SKA. This book will also be useful as supplementary reading material for any astrophysics overview or astrophysical techniques course, particularly those geared towards radio astronomy (and radio astronomy techniques).
This book gathers timely contributions on metrology and measurement systems across different disciplines and fields of application. The chapters, which were presented at the 6th International Scientific-Technical Conference, MANUFACTURING 2019, held on May 19-21, 2019, in Poznan, Poland, cover cutting-edge topics in surface metrology, biology, chemistry, civil engineering, food science, materials science, mechanical engineering, manufacturing, nanotechnology, physics, tribology, quality engineering, and computer science, among others. By bringing together engineering and economic topics, the book is intended as an extensive, timely and practice-oriented reference guide for both researchers and practitioners. It is also expected to foster better communication and closer cooperation between universities and their business and industry partners.