This book presents a general and comprehensive framework for the assurance of quality in measurements. Written by a foremost expert in the field, the text reflects an ongoing international effort to extend traditional quality-assured measurement, rooted in fundamental physics and the SI, to non-physical areas such as person-centred care and the social sciences more generally. Chapter by chapter, the book follows the measurement quality assurance loop, based on Deming's work. The author enhances this quality assurance cycle with insights from recent research, including work on the politics and philosophy of metrology, the new SI, quantitative and qualitative scales and entropy, decision risks and uncertainty when addressing human challenges, Man as a Measurement Instrument, and Psychometry and Person-centred care. Quality Assured Measurement: Unification across Social and Physical Sciences provides students and researchers in physics, chemistry, engineering, medicine and the social sciences with practical guidance on designing, implementing and applying quality-assured measurements, while engaging readers in the most novel and expansive areas of contemporary measurement research.
This textbook offers a unique compendium of measurement procedures for experimental data acquisition. After introducing readers to the basic theory of uncertainty evaluation in measurements, it shows how to apply it in practice to conduct a range of laboratory experiments with instruments and procedures operating in both the time and frequency domains. Offering extensive practical information and hands-on tips on using oscilloscopes, spectrum analyzers and reflectometric instrumentation, the book shows readers how to deal with, for example, filter characterization, operational amplifiers, digital and analog spectral analysis, and reflectometry-based measurements. For each experiment, it describes the corresponding uncertainty evaluation in detail. Bridging the gap between theory and practice, the book offers a unique, self-contained guide for engineering students and professionals alike. It also provides university teachers and professors with a valuable resource for their laboratory courses on electric and electronic measurements.
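To give a flavour of the uncertainty evaluation such a book teaches, here is a minimal sketch of first-order (GUM-style) uncertainty propagation for a simple measurement model R = V/I; the model and all numerical values are illustrative assumptions, not taken from the textbook:

```python
import math

# First-order (GUM-style) propagation of uncertainty for R = V / I,
# assuming uncorrelated inputs:
#   u_R^2 = (dR/dV)^2 * u_V^2 + (dR/dI)^2 * u_I^2
V, u_V = 10.00, 0.02   # volts (illustrative value and standard uncertainty)
I, u_I = 2.000, 0.005  # amperes (illustrative)

R = V / I
dR_dV = 1.0 / I
dR_dI = -V / I**2
u_R = math.sqrt((dR_dV * u_V)**2 + (dR_dI * u_I)**2)

print(f"R = {R:.4f} ohm, u(R) = {u_R:.4f} ohm")  # R = 5.0000 ohm, u(R) = 0.0160 ohm
```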
Infrared thermography is a measurement technique that enables non-intrusive measurement of surface temperatures. One of the interesting features of this technique is its ability to measure a full two-dimensional map of the surface temperature, and for this reason it has been widely used as a flow visualization technique. Since the temperature measurements can be extremely accurate, it is also possible, by using a heat flux sensor, to measure convective heat transfer coefficient distributions on a surface, making the technique de facto quantitative. Starting from the basic theory of infrared thermography and heat flux sensors, this book guides both the experienced researcher and the young student in the correct application of this powerful technique to various practical problems. A significant number of examples and applications are also examined in detail.
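The step from an accurate temperature map to a convective heat transfer coefficient rests on Newton's law of cooling, h = q/(T_wall − T_ref); a minimal sketch of that standard relation, with a mock temperature map and illustrative numbers (not values from the book):

```python
import numpy as np

# Convective heat transfer coefficient per pixel of an IR temperature map:
#   h = q_conv / (T_wall - T_ref)   (Newton's law of cooling)
q_conv = 1500.0                               # W/m^2, from a heat flux sensor (illustrative)
T_ref = 293.15                                # K, reference (e.g. free-stream) temperature
T_wall = 300.0 + 5.0 * np.random.rand(4, 4)   # K, mock IR temperature map

h = q_conv / (T_wall - T_ref)                 # W/(m^2 K), one value per pixel
print(h.round(1))
```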
In this thesis, the author develops new high-power millimeter wave techniques for measuring the hyperfine structure of positronium (Ps-HFS) directly for the first time. Indirect measurements of Ps-HFS in the literature may suffer from systematic uncertainties related to the use of a static magnetic field. The millimeter wave devices developed here support a precise determination of Ps-HFS by directly measuring the Breit-Wigner resonant transition from o-Ps to p-Ps without a magnetic field. At the same time, the width of the measured Breit-Wigner resonance directly provides the lifetime of p-Ps. This is the first precise spectroscopic experiment involving a magnetic dipole transition driven by high-power millimeter waves. The development of a gyrotron and a Fabry-Perot cavity, which together provide an effective power of over 20 kW as required to drive the direct transition from o-Ps to p-Ps, is described. The values measured by the newly developed millimeter wave devices pave the way for examining the discrepancy observed between conventional indirect experiments on Ps-HFS and the theoretical predictions of quantum electrodynamics.
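The stated link between the resonance width and the p-Ps lifetime is the standard relation Γ = ħ/τ; a quick order-of-magnitude check using the textbook p-Ps lifetime of roughly 125 ps (not a value quoted from the thesis):

```python
import math

HBAR_EV_S = 6.582e-16   # reduced Planck constant, eV*s
tau_pPs = 125e-12       # s, approximate p-Ps lifetime (textbook value)

gamma_eV = HBAR_EV_S / tau_pPs            # natural linewidth, Gamma = hbar / tau
gamma_Hz = 1 / (2 * math.pi * tau_pPs)    # the same width in frequency units

print(f"Gamma ~ {gamma_eV:.2e} eV ~ {gamma_Hz / 1e9:.2f} GHz")  # ~5.3e-06 eV ~ 1.27 GHz
```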
Due to steadily improving experimental accuracy, relativistic concepts - based on Einstein's theory of Special and General Relativity - are playing an increasingly important role in modern geodesy. This book offers an introduction to the emerging field of relativistic geodesy, and covers topics ranging from the description of clocks and test bodies, to time and frequency measurements, to current and future observations. Emphasis is placed on geodetically relevant definitions and fundamental methods in the context of Einstein's theory (e.g. the role of observers, use of clocks, definition of reference systems and the geoid, use of relativistic approximation schemes). Further, the applications discussed range from chronometric and gradiometric determinations of the gravitational field, to the latest (satellite) experiments. The impact of choices made at a fundamental theoretical level on the interpretation of measurements and the planning of future experiments is also highlighted. Providing an up-to-the-minute status report on the respective topics discussed, the book will not only benefit experts, but will also serve as a guide for students with a background in either geodesy or gravitational physics who are interested in entering and exploring this emerging field.
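The chronometric idea at the heart of relativistic geodesy is compact: two clocks separated in height show a fractional frequency shift of approximately gΔh/c². A back-of-the-envelope sketch of this standard formula (not code from the book):

```python
g = 9.81      # m/s^2, surface gravity
c = 2.998e8   # m/s, speed of light

def redshift_per_height(delta_h_m: float) -> float:
    """Fractional frequency shift between two clocks separated by delta_h in height."""
    return g * delta_h_m / c**2

# A 1 cm height difference sits at ~1e-18, the level of today's best optical clocks.
for dh in (0.01, 1.0, 100.0):
    print(f"dh = {dh:6.2f} m -> dnu/nu = {redshift_per_height(dh):.2e}")
```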
This is the first book summarizing the theoretical basics of thermal nondestructive testing (TNDT) by combining elements of heat conduction, infrared thermography, and industrial nondestructive testing. The text covers the physical models of TNDT, heat transfer in defective and sound structures, and the thermal properties of materials. Also included are the optimization of TNDT procedures, defect characterization, data processing in TNDT, and active and passive TNDT systems, as well as elements of statistical data treatment and decision making. The book offers in-depth descriptions of applications of infrared/thermal testing in aerospace, power production and building, as well as in the conservation of artistic monuments. It is intended for industrial specialists who are involved in technical diagnostics and nondestructive testing, and may also be useful for academic researchers and for undergraduate, graduate and PhD students.
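A widely used rule of thumb in active TNDT ties the depth of a detectable defect to the required observation time through the material's thermal diffusivity, t ≈ z²/α; a minimal sketch with an illustrative diffusivity (the book's treatment is far more detailed):

```python
# Rule-of-thumb observation time for a defect at depth z in active TNDT:
#   t ~ z**2 / alpha,  where alpha is the thermal diffusivity
alpha = 0.42e-6   # m^2/s, through-thickness diffusivity typical of CFRP (illustrative)

for z_mm in (0.5, 1.0, 2.0):
    z = z_mm / 1000.0
    t = z**2 / alpha
    print(f"defect at {z_mm} mm -> observation time ~ {t:.1f} s")
```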
This book presents a complete review of the unique instruments and the communication technologies utilized in downhole drilling environments. These instruments and communication technologies play a critical role in drilling hydrocarbon wells safely, accurately and efficiently into a target reservoir zone, by acquiring information about the surrounding geological formations and providing directional measurements of the wellbore. Instruments and communication technologies for hydrocarbon drilling have not been explored by researchers to the same extent as those in other fields, such as biomedical, automotive and aerospace applications. The book therefore serves as an opportunity for researchers to truly understand how instruments and communication technologies can be used in a downhole environment, and it provides fertile ground for research and development in this area. A look ahead is also included, discussing technologies such as micro-electromechanical systems (MEMS) and fourth industrial revolution technologies (automation, the industrial internet of things (IIoT), artificial intelligence, and robotics) that can potentially be used in the oil and gas industry, as well as the requirements that still need to be met in order to deploy them in the field.
This manual describes the wide range of electromechanical, electrochemical and electro-optical transducers at the heart of current field-deployable ocean observing instruments. Their modes of operation, precision and accuracy are discussed in detail. Observing platforms ranging from the traditional to the most recently developed are described, as are the challenges of integrating instrument suites on individual platforms. Technical approaches are discussed for addressing environmental constraints on instrument and platform operation, such as power sources, corrosion, biofouling and mechanical abrasion. Particular attention is also given to the data generated by networks of observing platforms, which are typically integrated into value-added data visualization products, including numerical simulations or models. Readers will learn about acceptable data formats and representative model products. The last section of the book is devoted to the challenges of planning, deploying and maintaining coastal ocean observing systems. Readers will discover practical applications of ocean observations in diverse fields, including natural resource conservation, commerce and recreation, safety and security, and climate change resiliency and adaptation. This volume will appeal to ocean engineers, oceanographers, commercial and recreational ocean data users, observing system operators, and advanced undergraduate and graduate students in the field of ocean observing.
This book explains how to improve the validity, reliability, and repeatability of slip resistance assessments across a range of shoes, floors, and environments from an engineering metrology viewpoint, covering theoretical and experimental aspects of slip resistance mechanics and mechanisms. Pedestrian falls resulting from slips or trips are one of the foremost causes of fatal and non-fatal injuries that limit people's functionality. There have been prolonged efforts globally to identify and understand their main causes and to reduce their frequency and severity. This book deals with large volumes of information on tribological characteristics such as the friction and wear behaviours of shoes and floors and their interactive effects on slip resistance performance. Readers are introduced to theoretical concepts and models, together with collected evidence on slip resistance properties for a range of shoe and floor types and materials under various ambulatory settings. These approaches can be used to develop secure design strategies against fall incidents and constitute a major step toward building safer shoes, floors, and walking/working environments for industries and communities around the world. The book includes many case studies.
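The central comparison in slip resistance work is between the friction a step requires (shear over normal force) and the friction the shoe-floor pairing can supply; a minimal sketch of that comparison, with illustrative loads and threshold (not data from the book):

```python
def slip_predicted(shear_force_N: float, normal_force_N: float,
                   available_cof: float) -> bool:
    """A slip is predicted when the required coefficient of friction
    exceeds the available one measured for the shoe-floor pairing."""
    required_cof = shear_force_N / normal_force_N
    return required_cof > available_cof

# Illustrative heel-strike loads against a measured available COF of 0.40:
print(slip_predicted(150.0, 700.0, 0.40))  # False: required COF ~ 0.21
print(slip_predicted(350.0, 700.0, 0.40))  # True:  required COF = 0.50
```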
This thesis presents two significant results in the field of precision measurements in low-energy nuclear physics. First, it presents a precise half-life determination of ¹¹C, leading to the most precise ft-value for a beta decay transition between mirror nuclides, an important advance in the testing of the electroweak sector of the Standard Model. Second, it describes a high-precision mass measurement of ⁵⁶Cu, a critical nucleus for determining the path of the astrophysical rapid proton capture process, performed by the author using the LEBIT Penning trap at the National Superconducting Cyclotron Laboratory. This new measurement resolves discrepancies in previously reported calculated mass excesses. The thesis also presents the construction and testing of a radio-frequency quadrupole cooler and buncher that will be part of the future N = 126 factory at Argonne National Laboratory, aimed at producing, for the first time, nuclei of interest for the astrophysical rapid neutron capture process.
This book provides insights into surface quality control techniques and applications based on high-definition metrology (HDM). Intended as a reference resource for engineers who routinely use a variety of quality control methods and are interested in understanding the data processing, from HDM data to final control actions, it can also be used as a textbook for advanced courses in engineering quality control applications for students who are already familiar with quality control methods and practices. It enables readers to not only assimilate the quality control methods involved, but also to quickly implement the techniques in practical engineering problems. Further, it includes numerous case studies to highlight the implementation of the methods using measured HDM data of surface features. Since MATLAB is extensively employed in these case studies, familiarity with this software is helpful, as is a general understanding of surface quality control methods.
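A typical first step when processing HDM height maps is removing the best-fit reference plane so that form error and local defects stand out; the book's case studies use MATLAB, but the same least-squares step can be sketched in Python on synthetic data (illustrative only, not the book's code):

```python
import numpy as np

# Least-squares plane removal from an HDM-style height map z(x, y).
ny, nx = 50, 80
x, y = np.meshgrid(np.arange(nx), np.arange(ny))
z = 0.002 * x - 0.001 * y + 0.01 * np.random.randn(ny, nx)  # tilt + roughness

# Solve z ~ a*x + b*y + c in the least-squares sense, then subtract the plane.
A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
residual = z - (A @ coeffs).reshape(z.shape)

print("fitted tilt (a, b):", coeffs[:2])
print("peak-to-valley after plane removal:", residual.max() - residual.min())
```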
Written by respected experts, this book highlights the latest findings on the electromagnetic ultrasonic guided wave (UGW) imaging method. It introduces such main topics as the time-of-flight (TOF) extraction method for the guided wave signal, and the tomography and scattering imaging methods that can be used to improve the imaging accuracy of defects. Further, it offers essential insights into how electromagnetic UGW can be used in nondestructive testing (NDT) and defect imaging. As such, the book provides valuable information, useful methods and practical experiments that will benefit researchers, scientists and engineers in the field of NDT.
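Time-of-flight extraction of the kind the book covers is commonly done by cross-correlating the received guided-wave signal with the excitation burst; a minimal sketch on synthetic signals (a generic method, not necessarily the one developed in the book):

```python
import numpy as np

fs = 1e6                         # Hz, sample rate
t = np.arange(0, 1e-3, 1 / fs)   # 1 ms record
# A 100 kHz tone burst with a Gaussian envelope centred at 20 us:
pulse = np.sin(2 * np.pi * 100e3 * t) * np.exp(-((t - 2e-5) / 1e-5) ** 2)

true_delay = 2.0e-4              # s, simulated propagation delay
rx = np.roll(pulse, int(true_delay * fs)) + 0.05 * np.random.randn(t.size)

# Cross-correlate and convert the peak position to a lag in samples.
xcorr = np.correlate(rx, pulse, mode="full")
lag = xcorr.argmax() - (pulse.size - 1)
print(f"estimated TOF = {lag / fs * 1e6:.1f} us (true {true_delay * 1e6:.1f} us)")
```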
This well-illustrated book, by two established historians of school mathematics, documents Thomas Jefferson's quest, after 1775, to introduce a form of decimal currency to the fledgling United States of America. The book describes a remarkable study showing how the United States' decision to adopt a fully decimalized, carefully conceived national currency ultimately had a profound effect on U.S. school mathematics curricula. The book shows, by analyzing a large set of arithmetic textbooks and an even larger set of handwritten cyphering books, that although most eighteenth- and nineteenth-century authors of arithmetic textbooks included sections on vulgar and decimal fractions, most school students who prepared cyphering books did not study either vulgar or decimal fractions. In other words, author-intended school arithmetic curricula were not matched by teacher-implemented school arithmetic curricula. Remarkably, that state of affairs continued even after the U.S. Mint began minting dollars, cents and dimes in the 1790s. In U.S. schools between 1775 and 1810 it was often the case that Federal money was studied but decimal fractions were not. That gradually changed during the first century of the formal existence of the United States of America. By contrast, Chapter 6 reports a comparative analysis of data showing that in Great Britain only a minority of eighteenth- and nineteenth-century school students studied decimal fractions. Clements and Ellerton argue that Jefferson's success in establishing a system of decimalized Federal money had educationally significant effects on implemented school arithmetic curricula in the United States of America. The lens through which Clements and Ellerton have analyzed their large data sets is the lag-time theoretical position they have developed. That theory posits that the time between when an important mathematical "discovery" is made (or a concept is "created") and when that discovery (or concept) becomes an important part of school mathematics depends on mathematical, social, political and economic factors. Thus, lag time varies from region to region, and from nation to nation. Clements and Ellerton are the first to identify the years after 1775 as the dawn of a new day in U.S. school mathematics; traditionally, historians have argued that nothing in U.S. school mathematics was worthy of serious study until the 1820s. This book emphasizes the importance of the acceptance of decimal currency so far as school mathematics is concerned. It also draws attention to the consequences for school mathematics of the conscious decision of the U.S. Congress not to proceed with Thomas Jefferson's grand scheme for a system of decimalized weights and measures.
This thesis develops next-generation multi-degree-of-freedom gyroscopes and inertial measurement units (IMU) using micro-electromechanical-systems (MEMS) technology. It covers both a comprehensive study of the physics of resonator gyroscopes and novel micro/nano-fabrication solutions to key performance limits in MEMS resonator gyroscopes. Firstly, theoretical and experimental studies of physical phenomena including mode localization, nonlinear behavior, and energy dissipation provide new insights into challenges like quadrature errors and flicker noise in resonator gyroscope systems. Secondly, advanced designs and micro/nano-fabrication methods developed in this work demonstrate valuable applications to a wide range of MEMS/NEMS devices. In particular, the HARPSS+ process platform established in this thesis features a novel slanted nano-gap transducer, which enabled the first wafer-level-packaged single-chip IMU prototype with co-fabricated high-frequency resonant triaxial gyroscopes and high-bandwidth triaxial micro-gravity accelerometers. This prototype demonstrates performance amongst the highest to date, with unmatched robustness and potential for flexible substrate integration and ultra-low-power operation. This thesis shows a path toward future low-power IMU-based applications including wearable inertial sensors, health informatics, and personal inertial navigation.
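The physics underlying any vibratory MEMS gyroscope is the Coriolis coupling F = 2mΩv between the drive-mode velocity and the input rotation rate; a back-of-the-envelope sketch with illustrative device parameters (not figures from the thesis):

```python
import math

# Coriolis force on the proof mass of a vibratory rate gyroscope:
#   F_c = 2 * m * Omega * v_drive
m = 1e-9                     # kg, proof mass (illustrative)
v_drive = 0.1                # m/s, drive-mode velocity amplitude (illustrative)
omega = math.radians(100.0)  # rad/s, a 100 deg/s input rotation rate

F_c = 2 * m * omega * v_drive
print(f"Coriolis force ~ {F_c:.2e} N")  # ~3.5e-10 N: why transducer gaps matter
```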
This proceedings book presents dual approaches to examining new theoretical models and their applicability in the search for new scintillation materials and, ultimately, the development of industrial technologies. The ISMART conferences bring together the radiation detector community, from fundamental research scientists to applied physics experts, engineers, and experts on the implementation of advanced solutions. This scientific forum builds a bridge between the different parts of the community and is the basis for multidisciplinary, cooperative research and development efforts. The main goals of the conference series are to review the latest results in scintillator development, from theory to applications, and to arrive at a deeper understanding of fundamental processes, as well as to discover components for the production of new generations of scintillation materials. The book highlights recent findings and hypotheses, key advances, as well as exotic detector designs and solutions, and includes papers on the microtheory of scintillation and the initial phase of luminescence development, applications of the various materials, and the development and characterization of ionizing radiation detection equipment. It also touches on the increased demand for cryogenic scintillators, the renaissance of garnet materials for scintillator applications, nano-structuring in scintillator development, and trends in applications such as security, hydrocarbon exploration and ecological monitoring.
This 2nd edition lays out an updated version of the general theory of light propagation and imaging through Earth's turbulent atmosphere, initially developed in the late '70s and '80s, with additional applications in the areas of laser communications and high-energy laser beam propagation. New material includes: a chapter providing a comprehensive mathematical tool set for precisely characterizing image formation with the anticipated Extremely Large Telescopes (ELTs), covering a staggering range of star image shapes and sizes; existing chapters rewritten or modified to supplement the mathematics with clearer physical insight through written and graphical means; and a history of the development of present-day understanding of light propagation and imaging through the atmosphere, as represented by the general theory described. Beginning with the rudimentary, geometrical-optics based understanding of a century ago, it describes advances made in the 1960s, including the development of the 'Kolmogorov theory,' whose deficiencies undermined its credibility, but not before it had done enormous damage, such as the construction of a generation of underperforming 'light bucket' telescopes. The general theory requires no a priori turbulence assumptions. Instead, it provides the means for calculating turbulence properties directly from readily measurable properties of star images.
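The practical stakes can be put in one comparison: in the conventional picture with coherence length r0 (the Fried parameter), an uncorrected ground telescope resolves roughly λ/r0 rather than its diffraction limit of about 1.22 λ/D. A quick illustration with round numbers (not taken from the book):

```python
import math

wavelength = 500e-9   # m, visible light
r0 = 0.10             # m, a typical coherence length at a good site
D = 39.0              # m, an ELT-class aperture

seeing_limited = wavelength / r0      # rad, turbulence-limited resolution
diffraction = 1.22 * wavelength / D   # rad, diffraction-limited resolution

rad_to_arcsec = 180.0 / math.pi * 3600.0
print(f"seeing-limited : {seeing_limited * rad_to_arcsec:.2f} arcsec")
print(f"diffraction    : {diffraction * rad_to_arcsec:.4f} arcsec")
```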
This book reports on the development and application of a new uniaxial pressure apparatus that is currently generating considerable interest in the field of materials physics. The author provides practical guidelines for performing such experiments, backed up by finite element simulations. Subsequently, the book reports on two uses of the device. In the first, high pressures are used to tune to a Van Hove singularity in Sr2RuO4, while the effects on the unconventional superconductivity and the normal state properties are investigated. In the second experiment, precise and continuous strain control is used to probe symmetry breaking and novel phase formation in the vicinity of a quantum critical point in Sr3Ru2O7.
This second edition of Mass Metrology: The Newly Defined Kilogram has been thoroughly revised to reflect the recent redefinition of the kilogram in terms of Planck's constant. The necessity of defining the kilogram in terms of physical constants was already underscored in the first edition. However, the kilogram can also be defined in terms of Avogadro's number, using a collection of ions of heavy elements, by the levitation method, or using voltage and watt balances. The book also addresses the concepts of gravitational, inertial and conventional mass, and describes in detail the variation of acceleration due to gravity. Further topics covered in this second edition include: the effect of gravity variations on the reading of electronic balances derived with respect to latitude, altitude and earth topography; the classification of weights by the OIML; and maximum permissible error in different categories of weights prescribed by national and international organizations. The book also discusses group weighing techniques and the use of nanotechnology for the detection of mass differences as small as 10⁻²⁴ g. Last but not least, readers will find details on the XRCD method for defining the kilogram in terms of Planck's constant.
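The watt (Kibble) balance mentioned above equates electrical and mechanical power, U·I = m·g·v, so the mass follows from purely electrical and kinematic measurements; a back-of-the-envelope sketch with round illustrative values:

```python
# Kibble (watt) balance principle:  U * I = m * g * v  =>  m = U * I / (g * v)
U = 1.0       # V, voltage induced in velocity mode (illustrative)
I = 9.81e-3   # A, coil current in weighing mode (illustrative)
g = 9.81      # m/s^2, local gravitational acceleration
v = 1.0e-3    # m/s, coil velocity (illustrative)

m = U * I / (g * v)
print(f"m = {m:.3f} kg")  # -> 1.000 kg with these round numbers
```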
This book highlights recent research advances in the area of operation, management and control of electricity distribution networks. It addresses various aspects of distribution network management, including operation, customer engagement and technology accommodation. Electricity distribution networks are an important part of the power delivery system, and the smart control and management of distribution networks is vital in order to satisfy technical, economic, and customer requirements. A new management philosophy, along with new techniques and methods, is essential to handle the uncertainties, security, and stability associated with the integration of renewable-based distributed generation units, demand forecasting and customer needs. This book discusses these topics in the context of managing the capacity of distribution networks while addressing the future needs of electricity systems. Furthermore, the efficient and economic operation of distribution networks is an essential part of system management for the effective use of resources, and as such the book also addresses operation and control approaches and techniques suitable for future distribution networks.
This book is a translation of a Russian original. In 2007, the authors created a new generation of sensors based on layered composites, whose advantages are manufacturability and thermal stability. The use of these gradient heat flux sensors under laboratory and industrial conditions confirmed their reliability, demonstrated their high information content, and allowed a number of pioneering results to be obtained. All of this is summarized in this book.
This book tells the story of a unique scientific and human adventure, following the life and science of Bruno Touschek, an Austrian-born physicist who conceived and built AdA, the first matter-antimatter colliding-beam storage ring, the ancestor of the Large Hadron Collider at CERN, where the Higgs boson was discovered in 2012. Making extensive use of archival sources and personal correspondence, the author offers for the first time a unified history of European efforts to build modern-day particle accelerators, from the dark times of war-ravaged Europe through the rebuilding of science in Germany, the UK, Italy and France in the 1950s and early 1960s. This book, the result of several years of scholarly research, includes numerous previously unpublished photos as well as original drawings by Bruno Touschek.
This book offers a genuinely practical introduction to the most commonly encountered optical and non-optical systems used for the metrology and characterization of surfaces, including guidance on best practice, calibration, advantages and disadvantages, and interpretation of results. It enables the user to select the best approach in a given context. Most methods in surface metrology are based upon the interaction of light or electromagnetic radiation (UV, NIR, IR), and different optical effects are utilized to get a certain optical response from the surface; some of them record only the intensity reflected or scattered by the surface, others use interference of EM waves to obtain a characteristic response from the surface. The book covers techniques ranging from microscopy (including confocal, SNOM and digital holographic microscopy) through interferometry (including white light, multi-wavelength, grazing incidence and shearing) to spectral reflectometry and ellipsometry. The non-optical methods comprise tactile methods (stylus tip, AFM) as well as capacitive and inductive methods (capacitive sensors, eddy current sensors). The book provides:

- An overview of the working principles
- A description of advantages and disadvantages
- Currently achievable figures for resolution, repeatability, and reproducibility
- Examples of real-world applications

A final chapter discusses examples where the combination of different surface metrology techniques in a multi-sensor system can reasonably contribute to a better understanding of surface properties as well as a faster characterization of surfaces in industrial applications. The book is aimed at scientists and engineers who use such methods for the measurement and characterization of surfaces across a wide range of fields and industries, including electronics, energy, automotive and medical engineering.
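As one concrete example of the interferometric techniques listed above: in phase-shifting interferometry, surface height follows from the measured phase as h = λφ/(4π). A minimal sketch of the standard four-step phase recovery on synthetic data (illustrative only, not the book's code):

```python
import numpy as np

wavelength = 632.8e-9   # m, HeNe laser
true_h = 50e-9          # m, synthetic step height
phi = 4 * np.pi * true_h / wavelength

# Four-step phase shifting: intensities at 0, 90, 180 and 270 degree shifts.
I = [1 + np.cos(phi + k * np.pi / 2) for k in range(4)]
phi_est = np.arctan2(I[3] - I[1], I[0] - I[2])

h_est = wavelength * phi_est / (4 * np.pi)
print(f"recovered height: {h_est * 1e9:.1f} nm")  # -> 50.0 nm
```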
The theoretical foundations of the Standard Model of elementary particles rely on the existence of the Higgs boson, a particle revealed for the first time by the experiments run at the Large Hadron Collider (LHC) in 2012. As the Higgs boson is an unstable particle, search strategies for it were based on its decay products. In this thesis, Francesco Pandolfi conducted a search for the Higgs boson in the H → ZZ → l⁺l⁻qq decay channel with 4.6 fb⁻¹ of 7 TeV proton-proton collision data collected by the Compact Muon Solenoid (CMS) experiment. The presence of jets in the final state poses a series of challenges to the experimenter, both from a technical point of view, as jets are complex objects that necessitate ad-hoc reconstruction techniques, and from an analytical one, as backgrounds with jets are copious at hadron colliders, so analyses must achieve high degrees of background rejection in order to reach competitive sensitivity. This is accomplished by following two directives: the use of an angular likelihood discriminant, capable of separating events likely to originate from the decay of a scalar boson from non-resonant backgrounds, and the use of jet parton flavor tagging, selecting jets compatible with quark hadronization and discarding jets more likely to be initiated by gluons. The events passing the selection requirements in 4.6 fb⁻¹ of data collected by the CMS detector are examined in search of a possible signal compatible with the decay of a heavy Higgs boson. The thesis describes the statistical tools and the results of this analysis. This work is a paradigm for studies of the Higgs boson in final states with jets. Non-expert physicists will enjoy a complete and eminently readable description of a proton-proton collider analysis, while the expert reader will learn the details of the searches done with jets at CMS.
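Reconstructing such a channel hinges on invariant masses built from the lepton and jet four-momenta, m² = E² − |p|²; a minimal sketch of that kinematic building block with made-up four-vectors (not data or code from the thesis):

```python
import math

def invariant_mass(p4s):
    """Invariant mass of a list of four-momenta (E, px, py, pz), in GeV."""
    E  = sum(p[0] for p in p4s)
    px = sum(p[1] for p in p4s)
    py = sum(p[2] for p in p4s)
    pz = sum(p[3] for p in p4s)
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two illustrative jets, roughly back-to-back in the transverse plane:
jets = [(60.0, 55.0, 10.0, 20.0), (50.0, -45.0, -5.0, 15.0)]
print(f"m_jj = {invariant_mass(jets):.1f} GeV")
```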
This third open access volume of the handbook series deals with accelerator physics, design, technology and operations, as well as with beam optics, dynamics and diagnostics. A joint CERN-Springer initiative, the "Particle Physics Reference Library" provides revised and updated contributions based on previously published material in the well-known Landolt-Boernstein series on particle physics, accelerators and detectors (volumes 21A, B1, B2, C), which took stock of the field approximately one decade ago. Central to this new initiative is publication under full open access.
The Transmission Electron Microscope (TEM) is the ultimate tool to see and measure structures on the nanoscale and to probe their elemental composition and electronic structure with sub-nanometer spatial resolution. Recent technological breakthroughs have revolutionized our understanding of materials via use of the TEM, and it promises to become a significant tool in understanding biological and biomolecular systems such as viruses and DNA molecules. This book is a practical guide for scientists who need to use the TEM as a tool to answer questions about physical and chemical phenomena on the nanoscale.
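The TEM's nanoscale resolving power comes from the tiny de Broglie wavelength of fast electrons; a quick, relativistically corrected calculation of that standard formula (general physics, not numbers from the book):

```python
import math

h = 6.626e-34    # J*s, Planck constant
m0 = 9.109e-31   # kg, electron rest mass
e = 1.602e-19    # C, elementary charge
c = 2.998e8      # m/s, speed of light

def electron_wavelength(V_volts: float) -> float:
    """Relativistic de Broglie wavelength of an electron accelerated through V."""
    eV = e * V_volts
    return h / math.sqrt(2 * m0 * eV * (1 + eV / (2 * m0 * c**2)))

for kv in (80, 200, 300):
    print(f"{kv} kV -> lambda = {electron_wavelength(kv * 1e3) * 1e12:.2f} pm")
```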