Welcome to Loot.co.za!
This book presents the Mie theory and its current applications in concise form. It begins with an overview of current theories, computational methods, experimental techniques, and applications of the optics of small particles, together with some biographical information on Gustav Mie, who published his famous paper on the colour of gold colloids in 1908. The Mie solution for light scattering by small spherical particles laid the basis for more advanced scattering theories, and today there are many methods to calculate light scattering and absorption for practically any particle shape and composition. The optics of small particles is of interest in industrial, atmospheric, astronomical and other research. The book covers the latest developments in diverse fields of scattering theory, such as plasmon resonance, multiple scattering and optical forces.
Laser measurement technology has evolved in recent years in a versatile and revolutionary way. Today its methods are indispensable for research and development as well as for production technology. Every physicist and engineer should therefore gain a working knowledge of laser measurement technology. This book fills a gap left by existing textbooks, introducing laser measurement technology in all its aspects in a comprehensible presentation. Numerous figures, graphs and tables allow fast access to the subject. The first part of the book describes the physical and optical fundamentals needed to understand laser measurement technology; the second part explains technically significant measuring methods and presents application examples. The book is aimed at students of the natural and engineering sciences, as well as practising physicists and engineers who wish to familiarize themselves with laser measurement technology and its fascinating potential.
This book gives the background to differential-pressure flow measurement and works through the requirements of the standards, explaining the reasons for them. For those who want to use an orifice plate or a Venturi tube, the standard ISO 5167 and its associated Technical Reports give the required instructions, but they rarely tell users why they should follow them. This book helps users of the ISO standards for orifice plates and Venturi tubes to understand why the standards are as they are, to apply them effectively, and to understand the consequences of deviations from the standards.
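The core of differential-pressure metering described above can be illustrated with the ISO 5167 orifice-plate equation. The sketch below is a deliberate simplification: it assumes a constant discharge coefficient C (where the standard actually prescribes the Reader-Harris/Gallagher correlation) and incompressible flow (expansibility factor of 1). All numbers are illustrative only.

```python
import math

def orifice_mass_flow(d, D, dp, rho, C=0.61, eps=1.0):
    """Mass flow rate [kg/s] from the ISO 5167 orifice-plate equation.

    d:   orifice bore diameter [m]
    D:   upstream pipe diameter [m]
    dp:  differential pressure across the plate [Pa]
    rho: upstream fluid density [kg/m^3]
    C:   discharge coefficient (assumed constant here; ISO 5167
         actually specifies the Reader-Harris/Gallagher correlation)
    eps: expansibility factor (1.0 for incompressible flow)
    """
    beta = d / D                         # diameter ratio
    E = 1.0 / math.sqrt(1.0 - beta**4)   # velocity-of-approach factor
    area = math.pi / 4.0 * d**2          # bore cross-section
    return C * E * eps * area * math.sqrt(2.0 * dp * rho)

# Water through a 50 mm bore in a 100 mm pipe at 25 kPa differential:
qm = orifice_mass_flow(d=0.05, D=0.10, dp=25e3, rho=998.0)
```

Note how the velocity-of-approach factor grows with the diameter ratio beta; this is one of the quantities the standard's installation requirements are designed to keep well characterized.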
This book describes new and efficient calorimetric measurement methods that can be used to follow the chemical kinetics of liquid-phase reaction systems accurately. It describes apparatus and techniques for precisely measuring the rate of heat liberation in discontinuous and continuous reactions, both isothermal and non-isothermal. The methodology presented can be used to follow the development of chemical reactions online, even at industrial scale. Written by an experienced scientist and practitioner with long-standing expertise in chemical engineering, the book contains many practical hints and instructions. The reader will find a sound, compact introduction to the fundamentals, comprehensive technical background information, and instructions for performing kinetic experiments of their own. The book fuses scientific background with long hands-on experience from practice.
This comprehensive volume summarizes and structures the multitude of results obtained at the LHC in its first running period and draws the grand picture of today’s physics at a hadron collider. Topics covered are Standard Model measurements, Higgs and top-quark physics, flavour physics, heavy-ion physics, and searches for supersymmetry and other extensions of the Standard Model. Emphasis is placed on overview and presentation of the lessons learned. Chapters on detectors and the LHC machine and a thorough outlook into the future complement the book. The individual chapters are written by teams of expert authors working at the forefront of LHC research.
These notes describe how to average and fit numerical data that have been obtained either by simulation or measurement. Following an introduction on how to estimate various average values, they discuss how to determine error bars on those estimates, and how to proceed for combinations of measured values. Techniques for fitting data to a given set of models will be described in the second part of these notes. This primer equips readers to properly derive the results covered, presenting the content in a style suitable for a physics audience. It also includes scripts in python, perl and gnuplot for performing a number of tasks in data analysis and fitting, thereby providing readers with a useful reference guide.
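The two tasks the notes cover, averaging with error bars and fitting data to a model, can be sketched in a few lines of Python (the notes themselves ship scripts in Python, Perl and gnuplot; this is an independent illustration, not the authors' code):

```python
import math
import statistics

def mean_and_error(samples):
    """Sample mean and its standard error (std. dev. of the mean)."""
    n = len(samples)
    m = statistics.fmean(samples)
    # standard error = sample standard deviation / sqrt(n)
    se = statistics.stdev(samples) / math.sqrt(n)
    return m, se

def fit_line(xs, ys):
    """Least-squares fit y = a + b*x via the closed-form normal equations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Five repeated measurements of the same quantity:
m, se = mean_and_error([9.8, 10.1, 10.0, 9.9, 10.2])
# Noise-free data lying exactly on y = 1 + 2x:
a, b = fit_line([0, 1, 2, 3], [1.0, 3.0, 5.0, 7.0])
```

Real fits would also propagate the error bars into the fit parameters (e.g. via chi-square weighting), which is exactly the material the second part of the notes develops.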
This thesis presents a study of the origin of the apparently extended X-ray emission associated with the Galactic ridge. The study was carried out with broadband spectra obtained from mapping observations of the Galactic bulge region conducted in 2005-2010 by the Suzaku space X-ray observatory. The spectra were analyzed with a newly constructed X-ray spectral model of an accreting white dwarf binary, one of the proposed candidates for the origin of the Galactic ridge emission in the higher energy band. Fitting the observed Galactic ridge spectra with this model showed that another spectral component is needed to fill the gap, in the lower energy band, between the observed X-ray flux and the component expected from the accreting white dwarf model. This additional soft spectral component was well explained by an X-ray spectral model of normal stars. The result, together with previously reported high-resolution imaging results, strongly supports the idea that the Galactic ridge X-ray emission is an assembly of dim, discrete X-ray point sources.
This volume presents measurement uncertainty and uncertainty budgets in a form accessible to practising engineers and engineering students from across a wide range of disciplines. The book gives a detailed explanation of the methods presented by NIST in the "GUM", the Guide to the Expression of Uncertainty in Measurement. Emphasis is placed on explaining the background and meaning of the topics, while keeping the mathematics at the minimum level necessary. Dr. Colin Ratcliffe, USNA, and Bridget Ratcliffe, Johns Hopkins, develop uncertainty budgets and explain their use. In some examples the budget may show a process is already adequate and where costs can be saved; in other examples it may show the process is inadequate and needs improvement. The book demonstrates how uncertainty budgets help identify the most cost-effective place to make changes. In addition, an extensive fully worked case study leads readers through all issues related to an uncertainty analysis, including a variety of different types of uncertainty budgets. The book is ideal for professional engineers and students concerned with a broad range of measurement assurance challenges in the applied sciences. This book also:
* Facilitates practising engineers' understanding of uncertainty budgets, essential for calculating cost-effective savings in a wide variety of measurement-dependent processes
* Presents uncertainty budgets in an accessible style suitable for all undergraduate STEM courses that include a laboratory component
* Provides a highly adaptable supplement to graduate textbooks for courses where students' work includes reporting on experimental results
* Includes an expanded case study developing uncertainty from transducers through measurands, propagated to the final measurement, which can be used as a template for the analysis of many processes
* Stands as a useful pocket reference for all engineers and experimental scientists
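The heart of a GUM-style uncertainty budget is the root-sum-square combination of independent standard uncertainties, each scaled by its sensitivity coefficient. A minimal sketch, with purely illustrative contributor names and numbers:

```python
import math

def combined_uncertainty(budget):
    """Combine independent standard uncertainties per the GUM:
    u_c = sqrt(sum((c_i * u_i)^2)), where c_i is the sensitivity
    coefficient of contributor i and u_i its standard uncertainty.
    `budget` is a list of (name, sensitivity, standard_uncertainty)."""
    return math.sqrt(sum((c * u) ** 2 for _name, c, u in budget))

# Hypothetical budget for a length measurement, all values in mm:
budget = [
    ("scale calibration",       1.0, 0.010),
    ("temperature drift",       1.0, 0.005),
    ("operator repeatability",  1.0, 0.008),
]
u_c = combined_uncertainty(budget)
U = 2.0 * u_c  # expanded uncertainty, coverage factor k = 2 (approx. 95 %)
```

Laying the budget out as a table like this makes it easy to see which contributor dominates u_c, which is precisely how a budget points to the most cost-effective place to improve a process.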
In this thesis, a measurement of the double-spin asymmetry for electron production from heavy-flavour decays was performed at the Relativistic Heavy Ion Collider (RHIC) in the PHENIX experiment at Brookhaven National Laboratory, in order to measure the polarized gluon distribution function in the small Bjorken-x region (x ~ 0.01). For this experiment, a Hadron Blind Detector (HBD), a position-sensitive gas Cherenkov counter based on Gas Electron Multipliers with CsI-coated surfaces, was employed for the first time. The HBD reduces the background from electron pairs produced by real and virtual photon conversions. Furthermore, the author develops a new analysis method for background reduction, improving the signal-to-background ratio by a factor of roughly 2.0. Using the combination of the HBD and the new analysis method, the double-spin asymmetry of electron production with transverse momentum in the range 0.5 < pT < 3.0 GeV/c was measured and found to be consistent with zero within the statistical uncertainty of about 1%. This result constrains the gluon polarization in the small Bjorken-x region for the first time worldwide.
This book makes the integration of renewable energy into the existing electricity grid accessible to engineers and researchers. It is a self-contained text containing the models of power system devices and the control theory necessary to understand and tune the controllers currently in use. The new research in renewable energy integration is put into perspective by comparing the system dynamics with those of the traditional electricity grid. The emergence of the voltage stability problem is motivated by extensive examples, and various methods to mitigate it are discussed, bringing out their merits clearly. As a solution to the voltage stability problem, the book covers the use of FACTS devices and basic control methods. An important contribution of this book is to introduce advanced control methods for voltage stability. It covers the application of output feedback methods, with special emphasis on how to bound modelling uncertainties and on the use of robust control theory to design controllers for practical power systems. Particular attention is given to designing controllers for FACTS devices to improve the low-voltage ride-through capability of induction generators. As PV is generally connected at the low-voltage distribution level, the book also provides a systematic control design for PV units in distribution systems. The theory is amply illustrated with large IEEE test systems with multiple generators and dynamic loads. Controllers are designed using Matlab and tested using full system models in PSSE.
Nicola Salvi's thesis offers a remarkably cogent view of highly sophisticated NMR methods. Salvi developed these methods in order to characterize the amplitudes and frequency ranges of local motions in biomolecules such as proteins. These local motions play an essential role since they can explain many of the remarkable properties of proteins and enable them to carry out all sorts of vital functions, from enzymatic catalysis to intermolecular recognition and signalling in cells. Salvi's work has led to numerous publications in high-impact journals.
This thesis presents neutron scattering data that contribute to the understanding of four distinct areas of condensed matter physics, including iso-compositional liquid-liquid phase transitions and the glass formation in rare earth doped BaTi2O5. In situ aerodynamic levitation with laser heating was combined with neutron scattering in order to study both liquid-liquid phase transitions in (Y2O3)x(Al2O3)1-x and the atomic and magnetic ordering in liquid Invar. Among several significant results, obtained in this case from small angle neutron scattering, was the absence of a phase transition across a range of temperatures and compositions in the yttria aluminates. As these are a principal system in which liquid-liquid phase transitions have been hypothesized, this is an important contribution in a contentious area.
Dealing with the basics, theory and applications of dynamic pulsed-field-gradient NMR (PFG NMR), this book describes the essential theory of diffusion in heterogeneous media and shows how it can be combined with NMR measurements to extract important information about the system under investigation. This information may include the surface-to-volume ratio, droplet size distributions in emulsions, brine profiles, fat content in foodstuffs, permeability and connectivity in porous materials, and medical applications currently under development. Besides theory and applications, the book provides readers with background knowledge on the experimental set-ups and, most importantly, deals with the numerous pitfalls encountered in PFG NMR work. How to analyse the NMR data, along with some important basic knowledge of the hardware, is explained as well.
The book presents recent advancements in sensors and sensing technology, covering environmental monitoring, structural health monitoring, and dielectric, magnetic, electrochemical, ultrasonic, microfluidic, flow, surface acoustic wave, gas, cloud-computing and biomedical sensing. It will be useful to a variety of readers, including Master's and PhD students, researchers, and practitioners working on sensors and sensing technology, and offers a dedicated, in-depth treatment to help them improve their knowledge of this specific field.
Dimensional metrology is an essential part of modern manufacturing technologies, but the basic theories and measurement methods are no longer sufficient for today's digitized systems. The information exchange between the software components of a dimensional metrology system not only costs a great deal of money, but also causes the entire system to lose data integrity. Information Modeling for Interoperable Dimensional Metrology analyzes interoperability issues in dimensional metrology systems and describes information modeling techniques. It discusses new approaches and data models for solving interoperability problems, as well as introducing process activities, existing and emerging data models, and the key technologies of dimensional metrology systems. Written for researchers in industry and academia, as well as advanced undergraduate and postgraduate students, this book gives both an overview and an in-depth understanding of complete dimensional metrology systems. By covering in detail the theory and main content, techniques, and methods used in dimensional metrology systems, Information Modeling for Interoperable Dimensional Metrology enables readers to solve real-world dimensional measurement problems in modern dimensional metrology practices.
This series of reference books describes sciences of different fields in and around geodesy with independent chapters. Each chapter covers an individual field and describes the history, theory, objective, technology, development, highlights of research and applications. In addition, problems as well as future directions are discussed. The subjects of this reference book include Absolute and Relative Gravimetry, Adaptively Robust Kalman Filters with Applications in Navigation, Airborne Gravity Field Determination, Analytic Orbit Theory, Deformation and Tectonics, Earth Rotation, Equivalence of GPS Algorithms and its Inference, Marine Geodesy, Satellite Laser Ranging, Superconducting Gravimetry and Synthetic Aperture Radar Interferometry. These are individual subjects in and around geodesy and are for the first time combined in a unique book which may be used for teaching or for learning basic principles of many subjects related to geodesy. The material is suitable to provide a general overview of geodetic sciences for high-level geodetic researchers, educators as well as engineers and students. Some of the chapters are written to fill literature blanks of the related areas. Most chapters are written by well-known scientists throughout the world in the related areas. The chapters are ordered by their titles. Summaries of the individual chapters and introductions of their authors and co-authors are as follows. Chapter 1 "Absolute and Relative Gravimetry" provides an overview of the gravimetric methods to determine most accurately the gravity acceleration at given locations.
The book describes the experimental techniques employed to study surfaces and interfaces, with the emphasis on the experimental method. All chapters therefore start with an introduction to the scientific problem, the theory necessary to understand how the technique works, and how to interpret the results. Descriptions of real experimental setups and experimental results on different systems are given to show both the strengths and the limits of each technique. A final part presents new developments and possible extensions of the techniques. The techniques included provide microscopic as well as macroscopic information, and cover most of those used in surface science.
Precision Nanometrology describes the new field of precision nanometrology, which plays an important part in nanoscale manufacturing of semiconductors, optical elements, precision parts and similar items. It pays particular attention to the measurement of surface forms of precision workpieces and to stage motions of precision machines. The first half of the book is dedicated to the description of optical sensors for the measurement of angle and displacement, which are fundamental quantities for precision nanometrology. The second half presents a number of scanning-type measuring systems for surface forms and stage motions. The systems discussed include:
* error separation algorithms and systems for measurement of straightness and roundness,
* the measurement of micro-aspherics,
* systems based on scanning probe microscopy, and
* scanning image-sensor systems.
Precision Nanometrology presents the fundamental and practical technologies of precision nanometrology with a helpful selection of algorithms, instruments and experimental data. It will be beneficial for researchers, engineers and postgraduate students involved in precision engineering, nanotechnology and manufacturing.
Since the discovery of the giant magnetoresistance (GMR) effect in 1988, spintronics has been presented as a new technology paradigm, recognized with the Nobel Prize in Physics in 2007. Initially used in the read heads of hard disk drives, and while competing with flash memories for part of that market, GMR devices have broadened their range of use towards magnetic field sensing in a huge range of scenarios. Applications that were merely potential at the time of the discovery have become real in the last two decades: GMR is here to stay. Selected successful applications of GMR-based sensors in different fields (space, automotive, microelectronics, biotechnology and more) are collected in the present book. While keeping a practical orientation, the fundamentals as well as the current trends and challenges of this technology are also analyzed, with state-of-the-art contributions from academia and industry throughout. The book can be used by starting researchers, postgraduate students and multidisciplinary scientists as a reference text in this fascinating field.
This Dictionary of Weighing Terms is a comprehensive practical guide to the terminology of weighing for all users of weighing instruments in industry and science. It explains more than 1000 terms of weighing technology and related areas; numerous illustrations assist understanding. The Dictionary of Weighing Terms is a joint work of the German Federal Institute of Physics and Metrology (PTB) and METTLER TOLEDO, the weighing instruments manufacturer. Special thanks go to Peter Brandes, Michael Denzel, and Dr. Oliver Mack of PTB, and to Richard Davis of BIPM, who with their technical knowledge have contributed to the success of this work. The Dictionary contains terms from the following fields: fundamentals of weighing, application and use of weighing instruments, international standards, legal requirements for weighing instruments, and weighing accuracy. An index facilitates rapid location of the required term. The authors welcome suggestions and corrections at www.mt.com/weighing-terms. Braunschweig (DE) and Greifensee (CH), Summer 2009, The Authors. From the Foreword: Since its founding in 1875, the International Bureau of Weights and Measures (BIPM) has had a unique role in mass metrology. The definition of the kilogram depends on an artefact conserved and used within our laboratories. The mass embodied in this artefact defines the kilogram, and this information is disseminated throughout the world to promote uniformity of measurements. Although the definition of the kilogram may change in the relatively near future, reflecting the success of new technologies and new requirements, the task of ensuring world-wide uniformity of mass measurements will remain.
This book describes the state-of-the art instruments for measuring the solar irradiance from soft x-ray to the near infrared and the total solar irradiance. Furthermore, the SORCE mission and early results on solar variability are presented along with papers that provide an overview of solar influences on Earth. This collection of papers provides the only detailed description of the SORCE mission and its instruments.
Dealing with Uncertainties is an innovative monograph that lays special emphasis on the deductive approach to uncertainties and on the shape of uncertainty distributions. This perspective has the potential for dealing with the uncertainty of a single data point and with sets of data that have different weights. It is shown that the inductive approach that is commonly used to estimate uncertainties is in fact not suitable for these two cases. The approach that is used to understand the nature of uncertainties is novel in that it is completely decoupled from measurements. Uncertainties which are the consequence of modern science provide a measure of confidence both in scientific data and in information in everyday life. Uncorrelated uncertainties and correlated uncertainties are fully covered and the weakness of using statistical weights in regression analysis is discussed. The text is abundantly illustrated with examples and includes more than 150 problems to help the reader master the subject.
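One of the situations the monograph singles out, sets of data points with different weights, is conventionally handled by the inverse-variance weighted mean. A minimal sketch of that standard procedure (not the book's deductive approach itself):

```python
import math

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean of data points with individual
    uncertainties, plus the uncertainty of that mean:
    w_i = 1/sigma_i^2;  mean = sum(w_i * x_i) / sum(w_i);
    u_mean = 1 / sqrt(sum(w_i))."""
    weights = [1.0 / s**2 for s in sigmas]
    wsum = sum(weights)
    mean = sum(w * x for w, x in zip(weights, values)) / wsum
    return mean, 1.0 / math.sqrt(wsum)

# Two measurements of the same quantity with different precision;
# the more precise point dominates the result:
m, u = weighted_mean([10.0, 10.4], [0.1, 0.2])
```

The pull toward the more precise measurement, and the shrinking of the combined uncertainty below either individual sigma, are exactly the behaviours whose limits the book examines when it questions the use of statistical weights in regression analysis.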
"Natural Gas Hydrates: Experimental Techniques and Their Applications" attempts to broadly integrate the most recent knowledge in the fields of hydrate experimental techniques in the laboratory. The book examines various experimental techniques in order to provide useful parameters for gas hydrate exploration and exploitation. It provides experimental techniques for gas hydrates, including the detection techniques, the thermo-physical properties, permeability and mechanical properties, geochemical abnormalities, stability and dissociation kinetics, exploitation conditions, as well as modern measurement technologies etc. This book will be of interest to experimental scientists who engage in gas hydrate experiments in the laboratory, and is also intended as a reference work for students concerned with gas hydrate research. Yuguang Ye is a distinguished professor of Experimental Geology at Qingdao Institute of Marine Geology, China Geological Survey, China. Professor Changling Liu works at the Qingdao Institute of Marine Geology, China Geological Survey, China.
The ATLAS detector at the CERN Large Hadron Collider is an apparatus of unprecedented complexity, designed to probe physics in proton-proton collisions at centre-of-mass energies up to 14 TeV. It was installed in its underground cavern at the LHC during the period 2004 to 2008. Testing of individual subsystems began immediately with calibration systems and cosmic rays, and by 2008 full detector systems could be operated with the planned infrastructure, readout, and monitoring systems. Several commissioning runs of the full detector were organized in 2008 and 2009. During these runs the detector was operated continuously for several months with its readout triggered by cosmic ray muons. At the same time, regular calibrations of individual detector systems were made. In the course of these runs, signals from tens of millions of cosmic ray events were recorded. These commissioning runs continued until the first beam-beam collisions in late 2009. This volume is a collection of seven performance papers based on data collected during this commissioning period. Five papers deal with the response of individual detector systems. One paper describes the performance of the simulation infrastructure used to model the detector's response to both cosmic rays and to the later beam-beam collisions. The final paper describes measurements drawing on the integrated performance of several detector systems. It studies lepton identification, the response to low energy electrons, muon energy loss in the calorimeters, missing ET effects, and the combined performance for muons when both the muon spectrometer and the inner tracking detector are used. These papers summarize the studies of the ATLAS detector performance and readiness prior to the start of colliding beam data. They are reprinted from The European Physical Journal C where they were published between summer 2010 and spring 2011.
Infrared thermography is a measurement technique that enables non-intrusive measurement of surface temperatures. One of its interesting features is the ability to measure a full two-dimensional map of the surface temperature, and for this reason it has been widely used as a flow visualization technique. Since the temperature measurements can be extremely accurate, it is also possible, by using a heat flux sensor, to measure convective heat transfer coefficient distributions over a surface, making the technique quantitative de facto. Starting from the basic theory of infrared thermography and heat flux sensors, this book guides both the experienced researcher and the young student in the correct application of this powerful technique to various practical problems. A significant number of examples and applications are also examined in detail.
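The step from a temperature map to a heat transfer coefficient map is, at its simplest, Newton's law of cooling applied pixel by pixel. The sketch below assumes a heated-foil style energy balance with illustrative numbers; real sensors require careful accounting of radiative and conductive losses:

```python
def heat_transfer_coefficient(q_imposed, q_losses, t_wall, t_ref):
    """Convective heat transfer coefficient h [W/(m^2 K)] from a
    simple energy balance: the convective flux is the imposed
    (e.g. Joule) heating minus losses, divided by the wall-to-
    reference temperature difference measured by the IR camera."""
    q_conv = q_imposed - q_losses
    return q_conv / (t_wall - t_ref)

# Illustrative numbers: 1200 W/m^2 imposed, 150 W/m^2 lost to
# radiation, wall at 330 K against a 300 K reference flow:
h = heat_transfer_coefficient(1200.0, 150.0, 330.0, 300.0)
```

Evaluating this at every pixel of the thermogram is what turns the flow-visualization image into the quantitative heat transfer map the blurb describes.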
You may like...
Disciple - Walking With God
Rorisang Thandekiso, Nkhensani Manabe
Paperback
(1)