The expression of uncertainty in measurement poses a challenge, since it involves physical, mathematical, and philosophical issues. The problem is intensified by the limitations of the probabilistic approach used by the current standard, the GUM (Guide to the Expression of Uncertainty in Measurement). This text presents an alternative approach that makes full use of the mathematical theory of evidence to express the uncertainty in measurements. Coverage provides an overview of the current standard, then pinpoints and constructively resolves its limitations. Numerous examples throughout help explain the book's unique approach.
The field of Adaptive Optics (AO) for astronomy has matured in recent years, and diffraction-limited image resolution in the near-infrared is now routinely achieved by ground-based 8-10 m class telescopes. This book presents the proceedings of the ESO Workshop on Science with Adaptive Optics held in the fall of 2003. The book provides an overview of AO instrumentation, data acquisition and reduction strategies, and covers observations of the sun, solar system objects, circumstellar disks, substellar companions, HII regions, starburst environments, late-type stars, the galactic center, active galaxies, and quasars. The contributions present a vivid picture of the multitude of science topics being addressed by AO in observational astronomy.
This book compares and offers a comprehensive overview of nine analytical techniques important in materials science and many other branches of science. All these methods are already well adapted to applications in diverse fields such as medicine, environmental studies, archaeology, and materials science. This clearly presented reference describes and compares the principles of the methods and the various source and detector types.
This book summarizes the experience of many years of teamwork with my group, the beam diagnostics group of GSI. For a long time the group was also responsible for operating the machines and application programming. In my opinion, this connection was very efficient: first, because a beam diagnostic system has to place powerful tools at the operators' disposal; second, because data evaluation and presentation of results for machine operation demand application programs which can be handled not only by skilled experts. On the other hand, accelerator developments and improvements as well as commissioning of new machines by specialists require more complex measurements than those for routine machine operation. A modern beam diagnostic system, including the software tools, has to cover these demands, too. Therefore, this book should motivate physicists, constructors, electronic engineers, and computer experts to work together during the design and daily use of a beam diagnostic system. This book aims to give them ideas and tools for their work. I would not have been able to write this book without a good education in physics and many discussions with competent leaders, mentors, and colleagues. After working about 40 years in teams on accelerators, there are so many people I have to thank that it is impossible to mention them all by name here.
Civil infrastructure systems are generally the most expensive assets in any country, and these systems are deteriorating at an alarming rate. In addition, these systems have a long service life in comparison to most other commercial products. As well, the introduction of intelligent materials and innovative design approaches in these systems is painfully slow due to heavy reliance on traditional construction and maintenance practices, and the conservative nature of design codes. Feedback on the "state of the health" of constructed systems is practically nonexistent. In the quest for lighter, stronger and corrosion-resistant structures, the replacement of ferrous materials by high-strength fibrous ones is being actively pursued in several countries around the world, both for the design of new structures and for the rehabilitation and strengthening of existing ones. In North America, active research in the design of new highway bridges is focused on a number of specialty areas, including the replacement of steel reinforcing bars in concrete deck slabs by randomly distributed low-modulus fibers, and the replacement of steel prestressing cables for concrete components by tendons comprising super-strong fibers. Research is also being conducted on using FRPs to repair and strengthen existing structures.
Research in the field of shock physics and ballistic impact has always been intimately tied to progress in development of facilities for accelerating projectiles to high velocity and instrumentation for recording impact phenomena. The chapters of this book, written by leading US and European experts, cover a broad range of topics and address researchers concerned with questions of material behaviour under impulsive loading and the equations of state of matter, as well as the design of suitable instrumentation such as gas guns and high-speed diagnostics. Applications include high-speed impact dynamics, the inner composition of planets, syntheses of new materials and materials processing. Among the more technologically oriented applications treated is the testing of the flight characteristics of aeroballistic models and the assessment of impacts in the aerospace industry.
In this book, Grabe illustrates the breakdown of traditional error calculus in the face of modern measurement techniques. Revising the Gaussian error calculus ab initio, he treats random and unknown systematic errors on an equal footing from the outset. Furthermore, Grabe also proposes what may be called well-defined measuring conditions, a prerequisite for defining confidence intervals that are consistent with basic statistical concepts. The resulting measurement uncertainties are as robust and reliable as required by modern-day science, engineering and technology.
Pixel detectors are a particularly important class of particle and radiation detection devices. They have an extremely broad spectrum of applications, ranging from high-energy physics to the photo cameras of everyday life. This book is a general-purpose introduction to the fundamental principles of pixel detector technology and semiconductor-based hybrid pixel devices. Although these devices were developed for high-energy ionizing particles and radiation beyond visible light, they are finding new applications in many other areas. This book will therefore benefit all scientists and engineers working in any laboratory involved in developing or using particle detection.
A broad, almost encyclopedic overview of spectroscopic and other analytical techniques useful for investigations of phase boundaries in electrochemistry is presented. The analysis of electrochemical interfaces and interphases on a microscopic, even molecular level is of central importance for an improved understanding of the structure and dynamics of these phase boundaries. This knowledge is needed to improve methods and applications ranging from electrocatalysis, electrochemical energy conversion, biocompatibility of metals, and corrosion protection to galvanic surface treatment and finishing. The book provides an overview as complete as possible and enables readers to choose the methods most suitable for tackling their particular task. It is nevertheless compact and does not flood the reader with the details of review papers.
Nuclear and radioactive agents have been a considerable concern, especially since the early 1990s, and more attention has been focused on radiation detection technologies. This book comprises the selected presentations of the NATO Advanced Training Course held 26-30 May 2008 in Mugla, Turkey. The contributions represent a wide range of documents related to control, monitoring and measurement methods for nuclear and radioactive isotopes and agents, covering both fundamental and applied work dealing with their use for different purposes. The book presents environmental data from many locations in different countries and also contains contributions on detection and monitoring programs by authors from CIS countries. Its basic goal is to deal with recent developments and applications of environmental monitoring and measurement techniques for environmental radionuclides and nuclear agents, as well as the auxiliary techniques. The many recent examples contributed by the authors will be useful in monitoring and measurement studies of radioactive and nuclear agents in the environment, and can help not only in carrying out outdoor and laboratory experiments, but also in protecting against possible sources of radionuclides and nuclear agents. The contributions of the experts and specialists involved in this book assure the highest level of knowledge in the field of techniques for the detection of radioactive and nuclear agents.
Since their discovery by Becquerel and Röntgen, ionising radiation and detectors for ionising radiation have played an ever more important role in medical diagnostics and therapy. The use of scintillating materials in the detection of ionising radiation for medical imaging is the main topic of this book, intended for an audience of physicists and engineers. The book will be useful both to new researchers entering the field and to experts interested in learning about the latest developments. It starts with an overview of the state of the art in using radiation detectors for medical imaging, followed by an in-depth discussion of all aspects of the use of scintillating materials for this application. Possibilities to improve the performance of existing scintillating materials and completely new ideas on how to use scintillating materials are discussed in detail. The first 4 chapters contain a general overview of the applications of radiation detectors in medicine and present a closer look at the 3 most important subfields: X-ray imaging, gamma ray imaging and PET. One chapter is devoted to semiconductor detectors, a promising new area, and two chapters are devoted to recent technical advances in PET. The remaining 5 chapters deal with scintillating materials and their use in medical imaging.
"Neutrinos and Explosive Events in the Universe" brought together experts from diverse disciplines to offer a detailed view of the exciting new work in this part of High Energy Astrophysics. Sponsored by NATO as an Advanced Study Institute, and coordinated under the auspices of the International School of Cosmic Ray Astrophysics (14th biennial course), the ASI featured a full program of lectures and discussion in the ambiance of the Ettore Majorana Centre in Erice, Italy, including visits to the local Dirac and Chalonge museum collections as well as a view of the cultural heritage of southern Sicily. Enrichment presentations on results from the Spitzer Infrared Space Telescope and the Origin of Complexity complemented the program. This course was the best attended in the almost 30 year history of the School, with 121 participants from 22 countries. The program provided a rich experience, both introductory and advanced, of fascinating areas of observational astrophysics: Neutrino Astronomy, High Energy Gamma Ray Astronomy, Particle Astrophysics, and the objects most likely responsible for the signals: explosions and related phenomena, ranging from Supernovae to Black Holes to the Big Bang. Contained in this NATO Science Series volume is a summative formulation of the physics and astrophysics of this newly emerging research area that already has been, and will continue to be, an important contributor to understanding our high energy universe.
Over the last decades, technological progress has brought about a multitude of standardization problems. For instance, compatibility standards ensure the interoperability of goods, which is of decisive importance when users face positive externalities in consumption. Consumers' expectations are key to the problem of whether a new technology will prevail as de-facto standard or not. Early adopters must be confident that the network good will be successful. Thus, it may be worthwhile for firms to influence consumers' expectations. Consisting of three models on various aspects of standardization and expectations, this book aims at deepening our understanding of how standards and expectations interact. The models are applied to problems such as "Inter-Technology vs. Intra-Technology Competition" and "Standardization of Nascent Technologies."
This book systematically summarizes the accuracy, precision, and repeatability levels of field-based tests applied in soccer. It considers such details as the effectiveness of tests for different age categories and sexes. In this book, readers will be able to check all the field-based tests conceived for fitness assessment in soccer through a large systematic review of the literature. In addition, a brief characterization of each test and a presentation of the concurrent validity and repeatability levels for each test are provided. Finally, the book contains a general discussion of the implications of the tests for different methodological approaches to training. It will be of use to sports scientists and practitioners.
A discussion of recently developed experimental methods for noise research in nanoscale electronic devices, conducted by specialists in transport and stochastic phenomena in nanoscale physics. The approach described is to create methods for the experimental observation of noise sources, their localization, their frequency spectrum, and their voltage-current and thermal dependences. Our current knowledge of measurement methods for mesoscopic devices is summarized to identify directions for future research related to downscaling effects, and directions for future research into fluctuation phenomena in quantum dot and quantum wire devices are specified. Nanoscale electronic devices will be the basic components of 21st-century electronics, and from this point of view the signal-to-noise ratio is a very important parameter for device applications. Since noise is also a quality and reliability indicator, these experimental methods will find wide application in the future.
The way science is done has changed radically in recent years. Scientific research and institutions, which have long been characterized by passion, dedication and reliability, have increasingly less capacity for ethical pursuits, and are pressed by hard market laws. From the vocation of a few, science has become the profession of many, possibly too many. These trends come with consequences and risks, such as the rise in fraud and plagiarism, and in particular the sheer volume of scientific publications, often of little relevance. The solution? A slow approach with more emphasis on quality rather than quantity that will help us to rediscover the essential role of the responsible scientist. This work is a critical review and assessment of present-day policies and behavior in scientific production and publication. It touches on the tumultuous growth of scientific journals, in parallel with the growth of self-declared scientists around the world. The author's own reflections and experiences help us to understand the mechanisms of contemporary science. Along with personal reminiscences of times past, the author investigates the loopholes and hoaxes of pretend journals and nonexistent congresses, so common today in the scientific arena. The book also discusses the problems of bibliometric indices, which have resulted in large part from the above distortions of scientific life.
It is now widely recognized that measurement data should be properly analyzed to include an assessment of their associated uncertainty. Since this parameter allows for a meaningful comparison of the measurement results and for an evaluation of their reliability, its expression is important not only in the specialized field of scientific metrology, but also in industry, trade, and commerce. General rules for evaluating and expressing the uncertainty are given in the internationally accepted ISO Guide to the Expression of Uncertainty in Measurement, generally known as the GUM.
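The GUM's central rule, the law of propagation of uncertainty, combines the standard uncertainties of uncorrelated input quantities in quadrature, weighted by sensitivity coefficients. A minimal sketch of that rule (not taken from either book above; the function name and example values are invented for illustration):

```python
import math

def combined_standard_uncertainty(sensitivities, uncertainties):
    """Root-sum-of-squares combination of uncorrelated input uncertainties,
    each weighted by its sensitivity coefficient (partial derivative)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, uncertainties)))

# Hypothetical example: V = I * R with I = 2.0 A (u_I = 0.01 A)
# and R = 5.0 ohm (u_R = 0.05 ohm).
I, R = 2.0, 5.0
u_I, u_R = 0.01, 0.05
# Sensitivity coefficients: dV/dI = R, dV/dR = I
u_V = combined_standard_uncertainty([R, I], [u_I, u_R])
print(round(u_V, 4))  # combined standard uncertainty of V in volts
```

The result would typically be reported alongside V = 10.0 V, with a coverage factor (commonly k = 2) applied to obtain an expanded uncertainty.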
This Open Access book discusses an extension of low-coherence interferometry by dispersion-encoding. The approach is theoretically designed and implemented for applications such as surface profilometry, polymeric cross-linking estimation and the determination of thin-film layer thicknesses. A characterization showed that an axial measurement range of 79.91 µm with an axial resolution of 0.1 nm is achievable. Simultaneously, profiles of up to 1.5 mm in length were obtained in a scan-free manner. This marked a significant improvement over the state of the art in terms of dynamic range. Also, the axial and lateral measurement ranges were partially decoupled while functional parameters such as surface roughness were estimated. The characterization of the degree of polymeric cross-linking was performed as a function of the refractive index. It was acquired in a spatially-resolved manner with a resolution of 3.36 x 10^-5. This was achieved by the development of a novel mathematical analysis approach.
Research and application development in the field of chemical and biochemical sensors continues to grow rapidly. The experience of the last decade has shown, however, that the successful development of such sensors, ones that can also withstand the harsh routine conditions of their many fields of application, is only possible when chemists and engineers cooperate. The aim of this textbook is therefore to introduce chemists as well as engineers, food technologists and biotechnologists to the technology and application of chemical sensors in a strictly systematic but very practice-oriented presentation. The interdisciplinary approach successfully bridges the different ways of thinking in chemistry, physics and engineering.
This volume provides a brief but important summary of the essential tests that need to be performed on all new compounds, whether they are new drugs, pesticides or food additives, before they can be registered for use in the United Kingdom. These basic tests for mutagenicity, originally drawn up by the United Kingdom Environmental Mutagen Society in 1983, have now been fully revised under the auspices of expert working groups from academia and industry and in collaboration with the UK Department of Health. This volume therefore provides the latest official guidelines and recommendations for all scientists involved in the testing and registration of new compounds, not only in the UK but in a wider international context. The four main test procedures for measuring mutagenicity described in this volume are bacterial mutation assays, metaphase chromosome aberration assays in vitro, gene mutation assays in cultured mammalian cells, and in vivo cytogenetics assays. Each of these tests is fully explained and described in practical and procedural detail, with additional information on the presentation and data processing of results.
This book includes a collection of standards-specific case studies. The case studies offer an opportunity to combine the teaching preferences of educators with the goals of the SEC (Standards Education Committee), providing students with real-world insight into the technical, political, and economic arenas of engineering. The book:
* Encourages students to think critically about standards development and technology solutions
* Reinforces the usage of standards as an impetus for innovation
* Helps students understand the dynamics and impacts of standards
A curriculum guide is available to instructors who have adopted the book for a course. To obtain the guide, please send a request to: [email protected].