This is the first book to show how to apply the principles of quality assurance to the identification of analytes (qualitative chemical analysis). After presenting the principles of identification and metrological basics, the author focuses on the reliability and errors of chemical identification. This is then applied to practical examples such as EPA methods and EU, FDA, or WADA regulations. Two whole chapters are devoted to the analysis of unknowns and the identification of samples such as foodstuffs or oil pollution. Essential reading for researchers and professionals dealing with the identification of chemical compounds and the reliability of chemical analysis.
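To make "reliability of identification" concrete, here is a hypothetical sketch (not taken from the book) of how false-positive and false-negative rates combine with the frequency at which an analyte actually occurs, via Bayes' theorem; all numbers are invented for illustration:

```python
# Hypothetical illustration: how false-positive and false-negative rates
# combine with prevalence to give the reliability of an identification.
# All numbers are invented for the example, not taken from the book.

def posterior_probability(sensitivity, specificity, prevalence):
    """P(analyte present | positive identification) via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A method that is 99% sensitive and 98% specific, applied to samples in
# which the analyte occurs 5% of the time:
p = posterior_probability(sensitivity=0.99, specificity=0.98, prevalence=0.05)
print(f"P(analyte present | positive result) = {p:.3f}")  # ~0.723
```

Even an apparently excellent method, applied to rare analytes, identifies correctly far less often than its headline error rates suggest, which is why the quality-assurance framing matters.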
This book presents recent advances and developments in control, automation, robotics, and measuring techniques, with contributions by top experts in these fields focused on both theory and industrial practice. Individual chapters offer an in-depth analysis of a specific technical problem, generally followed by a numerical analysis, simulation, and results from an implementation solving a real-world problem. The theoretical results, practical solutions and guidelines presented will be useful both for researchers working in the engineering sciences and for practitioners solving industrial problems.
This book gives a detailed review of ground-based aerosol optical depth measurement, with emphasis on the calibration issue. The review is written in chronological sequence to show how classical Langley calibration has evolved from past to present. It not only compiles existing calibration methods but also presents a novel calibration algorithm for Langley sun-photometry over low-altitude sites, a procedure conventionally performed at high-altitude observatory stations. The proposed algorithm avoids travelling to high altitudes for frequent calibration, which is difficult in both logistical and financial terms. The problem is addressed by combining a clear-sky detection model with a statistical filter, so that measurements taken over low altitudes closely imitate the ideal clear-sky conditions found at high altitude. In this way, possible temporal atmospheric drifts, abundant aerosol loadings and short-interval cloud transits are properly constrained. The authors believe this approach brings practicality and versatility to ground-based measurement of aerosol optical depth, an important quantity in many atmospheric and climate studies, and that the resulting calibration technique will be of real benefit to researchers and scientists working in this area.
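Classical Langley calibration rests on the Beer-Lambert law: the logarithm of the measured signal falls linearly with airmass, so a straight-line fit extrapolated to zero airmass recovers the instrument's extraterrestrial constant V0, while the slope gives the optical depth. A minimal sketch with synthetic data (variable names and values are illustrative, not from the book):

```python
# Minimal sketch of a classical Langley calibration, assuming Beer-Lambert
# attenuation; synthetic data stand in for real sun-photometer readings.
import numpy as np

airmass = np.linspace(2.0, 6.0, 20)    # relative optical airmass m
tau_true, v0_true = 0.12, 1.50         # "unknown" optical depth and V0
rng = np.random.default_rng(0)
signal = v0_true * np.exp(-tau_true * airmass) * rng.normal(1.0, 0.002, airmass.size)

# ln V = ln V0 - tau * m : a straight-line fit recovers both parameters.
slope, intercept = np.polyfit(airmass, np.log(signal), 1)
print(f"extraterrestrial constant V0 ~ {np.exp(intercept):.3f}")
print(f"optical depth tau ~ {-slope:.3f}")
```

The book's contribution is essentially about when such a fit may be trusted: the clear-sky detection model and statistical filter select only those low-altitude measurements for which the straight-line assumption holds.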
Tests of the current understanding of physics at the highest energies achievable in man-made experiments are performed at CERN's Large Hadron Collider. In the theory of the strong force within the Standard Model of particle physics - Quantum Chromodynamics or QCD - confined quarks and gluons from proton-proton scattering manifest themselves as groups of collimated particles. These particles are clustered into physically measurable objects called hadronic jets. As jets are copiously produced at hadron colliders, they are the key physics objects for an early "rediscovery of QCD". This thesis presents the first jet measurement from the ATLAS Collaboration at the LHC and confronts the experimental challenges of precision measurements. Inclusive jet cross section data are then used to improve the knowledge of the momentum distribution of quarks and gluons within the proton and of the magnitude of the strong force.
This volume surveys recent research on autonomous sensor networks from the perspective of enabling technologies that support medical, environmental and military applications. State of the art, as well as emerging concepts in wireless sensor networks, body area networks and ambient assisted living introduce the reader to the field, while subsequent chapters deal in depth with established and related technologies, which render their implementation possible. These range from smart textiles and printed electronic devices to implanted devices and specialized packaging, including the most relevant technological features. The last four chapters are devoted to customization, implementation difficulties and outlook for these technologies in specific applications.
This book by Helmut Wiedemann is a well-established, classic text, providing an in-depth and comprehensive introduction to the field of high-energy particle acceleration and beam dynamics. The present 4th edition has been significantly revised, updated and expanded. The newly conceived Part I is an elementary introduction to the subject matter for undergraduate students. Part II gathers the basic tools in preparation for a more advanced treatment, summarizing the essentials of electrostatics and electrodynamics as well as of particle dynamics in electromagnetic fields. Part III is an extensive primer in beam dynamics, followed, in Part IV, by an introduction to and description of the main beam parameters, including a new chapter on beam emittance and lattice design. Part V is devoted to the treatment of perturbations in beam dynamics. Part VI then discusses the details of charged particle acceleration. Parts VII and VIII introduce the more advanced topics of coupled beam dynamics and describe very intense beams; a number of additional beam instabilities are introduced and reviewed in this new edition. Part IX is an exhaustive treatment of radiation from accelerated charges and introduces important sources of coherent radiation such as synchrotrons and free-electron lasers. The appendices at the end of the book gather useful mathematical and physical formulae, parameters and units. Solutions to many end-of-chapter problems are given. This textbook is suitable for an intensive two-semester course starting at the senior undergraduate level.
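For orientation, the particle dynamics summarized in Part II starts from the Lorentz force, and the linearized transverse (betatron) motion treated in the beam-dynamics parts takes the form of Hill's equation; both are standard results, quoted here as a reminder rather than from the book's text:

```latex
% Lorentz force on a particle of charge q with momentum p and velocity v:
\frac{d\vec{p}}{dt} = q\left(\vec{E} + \vec{v}\times\vec{B}\right)
% Linearized transverse (betatron) motion along the path length s,
% with periodic focusing function k(s):
x''(s) + k(s)\,x(s) = 0
```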
This comprehensive volume considers the interactions of atoms, ions and molecules with charged particles, photons and laser fields. It reflects the present understanding of atomic processes such as electron capture, target and projectile ionisation, photoabsorption and others occurring in most laboratory and astrophysical plasma sources, including many-photon and many-electron processes. The material consists of selected papers written by leading scientists in various fields.
The "Rudolf Moessbauer Story" recounts the history of the discovery of the "Moessbauer Effect" in 1958 by Rudolf Moessbauer, then a graduate student of Heinz Maier-Leibnitz, for which he received the Nobel Prize in 1961 at the age of 32. The development of numerous applications of the Moessbauer Effect in many fields of science, such as physics, chemistry, biology and medicine, is reviewed by experts who contributed to this widespread research. In 1978 Moessbauer turned his research interest to a new field, neutrino oscillations, and later to the study of the properties of the neutrinos emitted by the sun.
This book fulfills the global need to evaluate measurement results along with their associated uncertainty. Together with the details of uncertainty calculations for many physical parameters, probability distributions and their properties are discussed. Definitions of various terms are given that will help practicing metrologists grasp the subject. The book helps to establish international standards for evaluating the quality of raw data obtained from various laboratories and for interpreting the results of national metrology institutes in international intercomparisons. For the routine calibration of instruments, a new idea for the use of pooled variance is introduced. The uncertainty calculations are explained for (i) independent linear inputs, (ii) non-linear inputs and (iii) correlated inputs. The merits and limitations of the Guide to the Expression of Uncertainty in Measurement (GUM) are discussed. Monte Carlo methods for deriving the output distribution from the input distributions are introduced, and the Bayesian alternative for calculating expanded uncertainty is included. The book contains a large number of worked numerical examples.
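As a minimal sketch of the two propagation routes the book compares, consider a measurand P = V·I with independent inputs; the GUM first-order combination and a Monte Carlo propagation should then agree closely (the measurand and all values are invented for illustration):

```python
# Minimal sketch: combined standard uncertainty for independent inputs,
# by the GUM first-order combination and by Monte Carlo propagation.
# The measurand P = V * I and all values are invented for illustration.
import numpy as np

V, u_V = 10.0, 0.05   # volts, with standard uncertainty
I, u_I = 2.0, 0.02    # amperes, with standard uncertainty

# GUM: u_c^2 = (dP/dV)^2 u_V^2 + (dP/dI)^2 u_I^2, with sensitivities I and V.
u_gum = np.hypot(I * u_V, V * u_I)

# Monte Carlo: sample the inputs, propagate, take the sample spread.
rng = np.random.default_rng(1)
samples = rng.normal(V, u_V, 200_000) * rng.normal(I, u_I, 200_000)
u_mc = samples.std()

print(f"GUM:         u_c(P) = {u_gum:.4f} W")   # ~0.2236
print(f"Monte Carlo: u_c(P) = {u_mc:.4f} W")    # agrees closely here
```

For strongly non-linear models or non-Gaussian inputs the two routes can diverge, which is exactly where the book's treatment of Monte Carlo methods becomes relevant.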
Recent state-of-the-art technologies in fabricating low-loss optical and mechanical components have significantly motivated the study of quantum-limited measurements with optomechanical devices. Such research is the main subject of this thesis. In the first part, the author considers various approaches for surpassing the standard quantum limit for force measurements. In the second part, the author proposes different experimental protocols for using optomechanical interactions to explore quantum behaviors of macroscopic mechanical objects. Even though this thesis mostly focuses on large-scale laser interferometer gravitational-wave detectors and related experiments, the general approaches apply equally well for studying small-scale optomechanical devices. The author is the winner of the 2010 Thesis prize awarded by the Gravitational Wave International Committee.
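The benchmark in question is the standard quantum limit (SQL); for a free test mass m probed at measurement frequency Ω it takes the familiar form below, quoted as a reminder (the thesis itself treats the full interferometer case):

```latex
% Standard quantum limit on the position-noise power spectral density
% of a free test mass m at measurement (sideband) frequency \Omega:
S_x^{\mathrm{SQL}}(\Omega) = \frac{2\hbar}{m\,\Omega^{2}}
```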
Michael Schenk evaluates new technologies and methods, such as cryogenic read-out electronics and a UV laser system, developed to optimise the performance of large liquid argon time projection chambers (LArTPCs). Among other topics, the author studies the uniformity of the electric field produced by a Greinacher high-voltage generator operating at cryogenic temperatures, and measures the linear energy transfer (LET) of muons and the longitudinal diffusion coefficient of electrons in liquid argon. The results are obtained by analysing events induced by cosmic-ray muons and UV laser beams. The studies are carried out with ARGONTUBE, a prototype LArTPC in operation at the University of Bern, Switzerland, designed to investigate the feasibility of drift distances of up to five metres for electrons in liquid argon.
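The diffusion measurement rests on a standard relation: an electron cloud drifting for a time t spreads longitudinally as the square root of t, so signal widths recorded at different drift distances determine D_L (quoted for orientation; the thesis analysis is of course more involved):

```latex
% Longitudinal spread of a drifting electron cloud after drift time t,
% for drift velocity v_d and drift distance d = v_d t:
\sigma_L(t) = \sqrt{2\,D_L\,t}
% so \sigma_L grows with the square root of the drift distance.
```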
Building on its heritage in planetary science, remote sensing of the Earth's atmosphere and ionosphere with occultation methods has undergone remarkable developments since the first GPS/Met 'proof of concept' mission in 1995. Signals of Global Navigation Satellite Systems (GNSS) satellites are exploited by radio occultation, while natural signal sources are used in solar, lunar, and stellar occultations. A range of atmospheric variables is provided, reaching from fundamental atmospheric parameters such as density, pressure, and temperature to water vapor, ozone, and other trace gas species. The utility for atmosphere and climate arises from the unique properties of self-calibration, high accuracy and vertical resolution, global coverage, and (if using radio signals) all-weather capability. Occultations have become a valuable data source for atmospheric physics and chemistry, operational meteorology, and climate research, as well as for space weather and planetary science. The 3rd International Workshop on Occultations for Probing Atmosphere and Climate (OPAC-3) was held September 17-21, 2007, in Graz, Austria. OPAC-3 aimed at providing a casual forum and stimulating atmosphere for scientific discussion, co-operation initiatives, and mutual learning and support amongst members of all the different occultation communities. The workshop was attended by 40 participants from 14 different countries who actively contributed to a scientific programme of high quality and to an excellent workshop atmosphere. The programme included 6 invited keynote presentations and 16 invited presentations, complemented by about 20 contributed ones, including 8 posters.
In the last quarter century, delamination has come to mean more than just a failure in adhesion between layers of bonded composite plies that might affect their load-bearing capacity. Ever-increasing computer power has meant that we can now detect and analyze delamination between, for example, cell walls in solid wood. This fast-moving and critically important field of study is covered in a book that provides everyone from manufacturers to research scientists with the state of the art in wood delamination studies. Divided into three sections, the book first details the general aspects of the subject, from basic information including terminology to the theoretical basis for the evaluation of delamination. A settled terminology in this subject area is a first key goal of the book, as the terms which describe delamination in wood and wood-based composites are numerous and often confusing. The second section examines different and highly specialized methods for delamination detection, such as confocal laser scanning microscopy, light microscopy, scanning electron microscopy and ultrasonics. Ways in which NDE (non-destructive evaluation) can be employed to detect and locate defects are also covered. The book's final section focuses on the practical aspects of this defect in a wide range of wood products, covering the spectrum from trees, logs, laminated panels and glued laminated timbers to parquet floors. Intended as a primary reference, this book covers everything from the microscopic, anatomical level of delamination within solid wood sections to an examination of the interface of wood and its surface coatings. It provides readers with the perspectives of both industry and the laboratory, and is thus a highly practical sourcebook for wood engineers working in manufacturing as well as a comprehensively referenced text for materials scientists wrestling with the theory underlying the subject.
The characteristics of electrical contacts have long attracted the attention of researchers, since such contacts are used in every electrical and electronic device. Earlier studies generally considered electrical contacts of large dimensions, having regions of current concentration with diameters substantially larger than the characteristic dimensions of the material: the interatomic distance, the mean free path of electrons, the coherence length in the superconducting state, etc. [110]. The development of microelectronics presented scientists and engineers with the task of studying the characteristics of electrical contacts of ultra-small dimensions. Characteristics of point contacts such as mechanical stability under continuous current loads, the magnitude of electrical fluctuations, inherent sensitivity in radio devices and nonlinear characteristics in connection with electromagnetic radiation cannot be understood and altered in the required way without knowledge of the physical processes occurring in the contacts. Until recently it was thought that the electrical conductivity of contacts with direct conductance (without tunneling or semiconducting barriers) obeyed Ohm's law, and that nonlinearities of the current-voltage characteristics were explained by Joule heating of the metal in the region of the contact. However, studies of the current-voltage characteristics of metallic point contacts at low (liquid helium) temperatures [142] showed that heating effects were negligible in many cases, and that the nonlinearities observed under these conditions instead reflect the energy-dependent probability of inelastic electron scattering, induced by various mechanisms.
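The size regimes contrasted here are captured by two textbook limiting formulas for a contact of radius a: the diffusive (Maxwell) resistance, valid when a far exceeds the electron mean free path ℓ, and the ballistic (Sharvin) resistance, valid when a ≪ ℓ, where heating arguments break down:

```latex
% Diffusive (Maxwell) limit, contact radius a >> mean free path \ell:
R_{\mathrm{M}} = \frac{\rho}{2a}
% Ballistic (Sharvin) limit, a << \ell:
R_{\mathrm{Sh}} = \frac{4\rho\,\ell}{3\pi a^{2}}
```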
The high accuracy of modern astronomical spatial-temporal reference systems has made them considerably complex. This book offers a comprehensive overview of such systems. It begins with a discussion of 'The Problem of Time', including recent developments in the art of clock making (e.g., optical clocks) and various time scales. The authors address the definitions and realization of spatial coordinates by reference to remote celestial objects such as quasars. After an extensive treatment of classical equinox-based coordinates, new paradigms for setting up a celestial reference system are introduced that no longer refer to the translational and rotational motion of the Earth. The role of relativity in the definition and realization of such systems is clarified. The topics presented in this book are complemented by exercises (with solutions). The authors offer a series of files, written in Maple, a standard computer algebra system, to help readers get a feel for the various models and orders of magnitude. Beyond astrometry, the main fields of application of high-precision astronomical spatial-temporal reference systems and frames are navigation (GPS, interplanetary spacecraft navigation) and global geodynamics, which provide a high-precision Celestial Reference System and its link to any terrestrial spatial-temporal reference system. Mankind's urgent environmental questions can only be answered in the context of appropriate reference systems in which both aspects, space and time, are realized with a sufficiently high level of accuracy. This book addresses all those interested in high-precision reference systems and the various techniques (GPS, Very Long Baseline Interferometry, Satellite Laser Ranging, Lunar Laser Ranging) necessary for their realization, including the production and dissemination of time signals.
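As a small taste of the bookkeeping such systems involve, converting between two of the time scales discussed above is a fixed-offset exercise; the sketch below uses well-known constants and is not code from the book's Maple files:

```python
# Minimal sketch: offset between UTC and TT (Terrestrial Time).
# TT is defined to run exactly 32.184 s ahead of TAI; TAI - UTC changes
# only when a leap second is inserted (37 s since 2017-01-01).
TT_MINUS_TAI = 32.184    # seconds, by definition
TAI_MINUS_UTC = 37.0     # seconds, valid since 2017-01-01

def tt_minus_utc() -> float:
    """Offset to add to a UTC epoch, in seconds, to obtain TT."""
    return TAI_MINUS_UTC + TT_MINUS_TAI

print(f"TT - UTC = {tt_minus_utc():.3f} s")  # 69.184 s
```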
This book gathers the proceedings of The Hadron Collider Physics Symposia (HCP) 2005, and reviews the state-of-the-art in the key physics directions of experimental hadron collider research. Topics include QCD physics, precision electroweak physics, c-, b-, and t-quark physics, physics beyond the Standard Model, and heavy ion physics. The present volume serves as a reference for everyone working in the field of accelerator-based high-energy physics.
The search for table-top and repetitive pump schemes during the last decade has been the driving force behind the spectacular advances presented at the 10th International Conference on X-Ray Lasers, held in 2006 in Berlin. The proceedings of this series of conferences constitute a comprehensive source of reference on the acknowledged state-of-the-art in this specific area of laser and plasma physics.
This book provides an in-depth overview of on-chip instrumentation technologies and the various approaches to adding instrumentation to System-on-Chip (ASIC, ASSP, FPGA, etc.) designs, approaches that are collectively becoming known as Design for Debug (DfD). On-chip instruments are hardware-based blocks added to a design for the specific purpose of improving the visibility of internal or embedded portions of the design (a specific instruction flow in a processor or a bus transaction on an on-chip bus, for example), so as to improve the analysis or optimization capabilities for a SoC. DfD is the methodology and infrastructure that surrounds the instrumentation. Coverage includes specific design examples and discussion of implementations and DfD tradeoffs when deciding to design instrumentation or to select a SoC that includes instrumentation. Although the focus is on hardware implementations, software and tools are discussed in some detail.
The object of this NATO Advanced Study Institute was to present a tutorial introduction both to the basic physics of recent spectacular advances achieved in the field of metrology and to the determination of fundamental physical constants. When humans began to quantify their descriptions of natural phenomena, metrology, the science of measurement, developed alongside geometry and mathematics. However, from antiquity to modern times, the role of metrology was mostly restricted to the needs of commercial, social or scientific transactions of local or at most national scope. Beginning with the Renaissance, and particularly in western Europe during the last century, metrology rapidly developed an international character as a result of growing needs for more accurate measurements and common standards in the emerging industrial society. Although the concerns of metrology are deeply rooted in the fundamental sciences, it was, until recently, perceived by much of the scientific community as mostly custodial in character.
Concepts of nonlinear physics are applied to an increasing number of research disciplines. With this volume, the editors offer a selection of articles on nonlinear topics in progress, ranging from physics and chemistry to biology and some applications of social science. The book covers quantum optics, electron crystallization, cellular or flow patterns in fluids and in granular media, biological systems, and the control of brain structures via neuronal excitation. Chemical patterns are looked at both in bulk solutions and on surfaces in heterogeneous systems. From regular structures, the authors turn to the more complex behavior in biology and physics, such as hydrodynamical turbulence, low-dimensional dynamics in solid-state physics, and gravity.
This book provides tools well suited for the quantitative investigation of semiconductor nanostructures by electron microscopy. These tools allow for the accurate determination of the composition of ternary semiconductor nanostructures with a spatial resolution at near-atomic scales. The book focuses on new methods, including strain state analysis and composition evaluation by the lattice fringe analysis (CELFA) technique. The basics of these procedures, as well as their advantages, drawbacks and sources of error, are all discussed. The techniques are applied to quantum wells and dots in order to give insight into kinetic growth effects such as segregation and migration. The first part of the book provides the fundamentals of transmission electron microscopy needed to understand the digital image analysis techniques described in the second part, where the reader will find information on the different methods of composition determination.
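As a hypothetical illustration of the kind of composition evaluation such techniques support: once a local lattice parameter has been extracted from lattice fringes, Vegard's law relates it linearly to composition. The sketch below uses textbook lattice parameters for InxGa1-xAs and deliberately ignores the tetragonal distortion of strained layers, which the strain state analysis described in the book accounts for:

```python
# Hypothetical sketch: estimating the In fraction x of an InxGa1-xAs
# layer from a locally measured lattice parameter via Vegard's law
# (linear interpolation between the binary end members). Strain-induced
# tetragonal distortion is deliberately ignored here.
A_GAAS = 5.6533   # GaAs lattice parameter, angstroms
A_INAS = 6.0583   # InAs lattice parameter, angstroms

def indium_fraction(a_measured: float) -> float:
    """Vegard's law: a(x) = (1 - x)*a_GaAs + x*a_InAs, solved for x."""
    return (a_measured - A_GAAS) / (A_INAS - A_GAAS)

# A lattice parameter of 5.74 angstroms, e.g. from lattice-fringe analysis:
print(f"x = {indium_fraction(5.74):.2f}")  # ~0.21
```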
The fifteenth European Conference on Few-Body Problems in Physics took place during the week of June 5th to 9th, in the lovely village of Peñíscola, approximately midway between Barcelona and Valencia on the Mediterranean coast. This conference continues the tradition initiated in 1972 at Budapest, where the first conference took place, and followed in Graz (1973), Tübingen (1975), Vlieland (1976), Uppsala (1977), Dubna (1979), Sesimbra (1980), Ferrara (1981), Tbilisi (1984), Fontevraud (1987), Uzhgorod (1990), Elba (1991) and Amsterdam (1993). During this week, a total of one hundred and fifty-one scientists exchanged their knowledge and initiatives in the broad field of Few-Body Physics. Even if the name of the conference restricts its domain to Europe, there was an important participation of scientists from non-European countries. A conference with more than twenty years of tradition is already an autonomous being, with a noticeable inertia. Nevertheless, it is reasonable to bend this inertia by trying to introduce some innovation, of course without any damage to the basic structure and objectives of the conference.
"Spreadsheets in Science and Engineering" shows scientists and engineers at all levels how to analyze, validate and calculate data and how the analytical and graphic capabilities of spreadsheet programs (ExcelR) can solve these tasks in their daily work. The examples on the CD-ROM accompanying the book include material of undergraduate to current research level in disciplines ranging from chemistry and chemical engineering to molecular biology and geology.