This book compares and offers a comprehensive overview of nine analytical techniques important in materials science and many other branches of science. All these methods are already well adapted to applications in diverse fields such as medicine, environmental studies, archaeology, and materials science. This clearly presented reference describes and compares the principles of the methods and the various source and detector types.
Research in the field of shock physics and ballistic impact has always been intimately tied to progress in development of facilities for accelerating projectiles to high velocity and instrumentation for recording impact phenomena. The chapters of this book, written by leading US and European experts, cover a broad range of topics and address researchers concerned with questions of material behaviour under impulsive loading and the equations of state of matter, as well as the design of suitable instrumentation such as gas guns and high-speed diagnostics. Applications include high-speed impact dynamics, the inner composition of planets, syntheses of new materials and materials processing. Among the more technologically oriented applications treated is the testing of the flight characteristics of aeroballistic models and the assessment of impacts in the aerospace industry.
In this book, Grabe illustrates the breakdown of traditional error calculus in the face of modern measurement techniques. Revising the Gaussian error calculus ab initio, he treats random and unknown systematic errors on an equal footing from the outset. Furthermore, Grabe proposes what may be called well-defined measuring conditions, a prerequisite for defining confidence intervals that are consistent with basic statistical concepts. The resulting measurement uncertainties are as robust and reliable as required by modern-day science, engineering and technology.
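As a purely schematic illustration of treating random and systematic contributions side by side (this is not necessarily Grabe's own formalism; the symbols below are illustrative only), a result is often quoted with a Student-t random part and a worst-case systematic bound:

```latex
% Schematic only: random (Student-t) part plus a bounded systematic part;
% the symbols \bar{x}, s, n, t_P and f_s are illustrative, not taken from the book.
\[
  x = \bar{x} \pm u, \qquad u = \frac{t_P\, s}{\sqrt{n}} + f_s,
\]
% \bar{x}: mean of n repeated readings, s: their empirical standard deviation,
% t_P: Student factor for the chosen confidence level,
% f_s: bound on the unknown systematic error.
```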
Civil infrastructure systems are generally the most expensive assets in any country, and these systems are deteriorating at an alarming rate. In addition, these systems have a long service life in comparison to most other commercial products. Moreover, the introduction of intelligent materials and innovative design approaches in these systems is painfully slow due to heavy reliance on traditional construction and maintenance practices and the conservative nature of design codes. Feedback on the "state of the health" of constructed systems is practically nonexistent. In the quest for lighter, stronger and corrosion-resistant structures, the replacement of ferrous materials by high-strength fibrous ones is being actively pursued in several countries around the world, both for the design of new structures and for the rehabilitation and strengthening of existing ones. In North America, active research in the design of new highway bridges is focused on a number of specialty areas, including the replacement of steel reinforcing bars in concrete deck slabs by randomly distributed low-modulus fibers, and the replacement of steel prestressing cables for concrete components by tendons comprising super-strong fibers. Research is also being conducted on using FRPs to repair and strengthen existing structures.
Authored by leading international researchers, this monograph introduces and reviews tomographic methods developed for determining the 2D and 3D structure of the ionosphere, and discusses the experimental implementation of these methods. The detailed derivations and explanations make this book an excellent starting point for non-specialists.
R. Fleischer, T. Hurth, M. L. Mangano, Physics Department, CERN, 1211 Geneva, Switzerland. In the history of quantum and particle physics, discrete symmetries and their violation have played an outstanding role. First, the assumption of the conservation of P (parity), C (charge conjugation), CP and CPT (T denotes time reversal) helped theorists to restrict theoretical predictions, such as in Fermi's 1934 seminal paper on weak interactions. In 1957, the observation of P (and C) violation in weak interactions gave a new impact and led to the conjecture that CP was still a conserved symmetry. In 1963, one year before the surprising observation of CP violation in K_L → π⁺π⁻ decays, the concept of quark-flavour mixing was introduced [...] system. In this past decade, the key player has been the B-meson system, and we also witnessed the appearance on stage of the top quark. Thanks to the e⁺e⁻ B factories with their detectors BaBar (SLAC) and Belle (KEK), CP violation is now also firmly seen in B-meson decays, where the "golden" decay B⁰_d → J/ψ K_S shows CP-violating effects at the level of 70%. These effects can be translated into the angle β of the "unitarity triangle" (UT), which characterizes the Kobayashi-Maskawa mechanism of CP violation. Several strategies to determine the other angles of the triangle, α and γ, have been proposed and successfully applied to the B-factory data.
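For orientation, the standard relation behind the statement that these effects "can be translated into the angle β" is the time-dependent CP asymmetry of the golden mode (a textbook result, quoted here up to sign conventions, not taken from the blurb itself):

```latex
\[
  a_{CP}(t) \;=\;
  \frac{\Gamma(\bar{B}^0_d(t)\to J/\psi K_S)-\Gamma(B^0_d(t)\to J/\psi K_S)}
       {\Gamma(\bar{B}^0_d(t)\to J/\psi K_S)+\Gamma(B^0_d(t)\to J/\psi K_S)}
  \;\approx\; \sin 2\beta \,\sin(\Delta M_d\, t),
\]
```

so CP-violating effects "at the level of 70%" correspond to sin 2β ≈ 0.7.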
This will be a required acquisition text for academic libraries. More than ten years after its discovery, still relatively little is known about the top quark, the heaviest known elementary particle. This extensive survey summarizes and reviews top-quark physics based on the precision measurements at the Fermilab Tevatron Collider, as well as examining in detail the sensitivity of these experiments to new physics. Finally, the author provides an overview of top quark physics at the Large Hadron Collider.
A timely and comprehensive survey, Excimer Laser Technology reports on the current status and range of the underlying technology, applications and devices of this commonly used laser source, as well as the future of new technologies, such as F2 laser technology.
Microscale Diagnostic Techniques highlights the most innovative and powerful developments in microscale diagnostics. It provides a resource for scientists and researchers interested in learning about the techniques themselves, including their capabilities and limitations. The fields of Micro- and Nanotechnology have emerged over the past decade as a major focus of modern scientific and engineering research and technology. Driven by advances in microfabrication, the investigation, manipulation and engineering of systems characterized by micrometer and, more recently, nanometer scales have become commonplace throughout all technical disciplines. With these developments, an entirely new collection of experimental techniques has been developed to explore and characterize such systems.
Pixel detectors are a particularly important class of particle and radiation detection devices. They have an extremely broad spectrum of applications, ranging from high-energy physics to the photo cameras of everyday life. This book is a general-purpose introduction to the fundamental principles of pixel detector technology and semiconductor-based hybrid pixel devices. Although these devices were developed for high-energy ionizing particles and radiation beyond visible light, they are finding new applications in many other areas. This book will therefore benefit all scientists and engineers working in any laboratory involved in developing or using particle detectors.
A broad, almost encyclopedic overview of spectroscopic and other analytical techniques useful for investigations of phase boundaries in electrochemistry is presented. The analysis of electrochemical interfaces and interphases on a microscopic, even molecular, level is of central importance for an improved understanding of the structure and dynamics of these phase boundaries. The knowledge gained will be needed to improve methods and applications ranging from electrocatalysis, electrochemical energy conversion, biocompatibility of metals and corrosion protection to galvanic surface treatment and finishing. The book provides an overview as complete as possible and enables readers to choose the methods most suitable for their particular task. It is nevertheless compact and does not flood the reader with the details of review papers.
This book summarizes the experience of many years of teamwork with my group, the beam diagnostics group of GSI. For a long time the group was also responsible for operating the machines and application programming. In my opinion, this connection was very efficient: first, because a beam diagnostic system has to place powerful tools at the operators' disposal; second, because data evaluation and presentation of results for machine operation demand application programs which can be handled not only by skilled experts. On the other hand, accelerator developments and improvements as well as commissioning of new machines by specialists require more complex measurements than those for routine machine operation. A modern beam diagnostic system, including the software tools, has to cover these demands, too. Therefore, this book should motivate physicists, constructors, electronic engineers, and computer experts to work together during the design and daily use of a beam diagnostic system. This book aims to give them ideas and tools for their work. I would not have been able to write this book without a good education in physics and many discussions with competent leaders, mentors, and colleagues. After working about 40 years in teams on accelerators, there are so many people I have to thank that it is impossible to mention them all by name here.
Nuclear and radioactive agents have been a considerable concern, especially since the early 1990s, and increasing attention has been focused on radiation detection technologies. This book comprises selected presentations from the NATO Advanced Training Course held 26-30 May 2008 in Mugla, Turkey. The contributions represent a wide range of documents related to control, monitoring and measurement methods for nuclear and radioactive isotopes and agents, covering both fundamental and applied work on their use for different purposes. The book presents environmental data from many locations in different countries and also contains contributions on the detection and monitoring programs of authors from CIS countries. Its basic goal is to address recent developments and applications of environmental monitoring and measurement techniques for environmental radionuclides and nuclear agents, as well as the auxiliary techniques. The many recent examples contributed by the authors will be useful in monitoring and measurement studies of radioactive and nuclear agents in the present environment, and can help not only in carrying out outdoor and laboratory experiments, but also in the protection of possible sources of radionuclides and nuclear agents. In particular, the contributions of the experts and specialists involved in this book ensure the highest level of knowledge in the field of techniques for the detection of radioactive and nuclear agents.
Since their discovery by Becquerel and Röntgen, ionising radiation and detectors for ionising radiation have played an ever more important role in medical diagnostics and therapy. The use of scintillating materials in the detection of ionising radiation for medical imaging is the main topic of this book, intended for an audience of physicists and engineers. The book will be useful both to new researchers entering the field and to experts interested in learning about the latest developments. It starts with an overview of the state of the art in using radiation detectors for medical imaging, followed by an in-depth discussion of all aspects of the use of scintillating materials for this application. Possibilities to improve the performance of existing scintillating materials and completely new ideas on how to use scintillating materials are discussed in detail. The first four chapters contain a general overview of the applications of radiation detectors in medicine and present a closer look at the three most important subfields: X-ray imaging, gamma-ray imaging and PET. One chapter is devoted to semiconductor detectors, a promising new area, and two chapters are devoted to recent technical advances in PET. The remaining five chapters deal with scintillating materials and their use in medical imaging.
"Neutrinos and Explosive Events in the Universe" brought together experts from diverse disciplines to offer a detailed view of the exciting new work in this part of High Energy Astrophysics. Sponsored by NATO as an Advanced Study Institute, and coordinated under the auspices of the International School of Cosmic Ray Astrophysics (14th biennial course), the ASI featured a full program of lectures and discussion in the ambiance of the Ettore Majorana Centre in Erice, Italy, including visits to the local Dirac and Chalonge museum collections as well as a view of the cultural heritage of southern Sicily. Enri- ment presentations on results from the Spitzer Infrared Space Telescope and the Origin of Complexity complemented the program. This course was the best attended in the almost 30 year history of the School with 121 participants from 22 countries. The program provided a rich ex- rience, both introductory and advanced, to fascinating areas of observational Astrophysics Neutrino Astronomy, High Energy Gamma Ray Astronomy, P- ticle Astrophysics and the objects most likely responsible for the signals - plosions and related phenomena, ranging from Supernovae to Black Holes to the Big Bang. Contained in this NATO Science Series volume is a summative formulation of the physics and astrophysics of this newly emerging research area that already has been, and will continue to be, an important contributor to understanding our high energy universe.
Over the last decades, technological progress has brought about a multitude of standardization problems. For instance, compatibility standards ensure the interoperability of goods, which is of decisive importance when users face positive externalities in consumption. Consumers' expectations are key to the problem of whether a new technology will prevail as de-facto standard or not. Early adopters must be confident that the network good will be successful. Thus, it may be worthwhile for firms to influence consumers' expectations. Consisting of three models on various aspects of standardization and expectations, this book aims at deepening our understanding of how standards and expectations interact. The models are applied to problems such as "Inter-Technology vs. Intra-Technology Competition" and "Standardization of Nascent Technologies."
This book systematically summarizes the accuracy, precision, and repeatability levels of field-based tests applied in soccer. It considers such details as the effectiveness of tests for different age categories and sexes. Readers will be able to review all the field-based tests conceived for fitness assessment in soccer through a large systematic review of the literature. In addition, a brief characterization of each test and the concurrent validity and repeatability levels for each test are provided. Finally, the book contains a general discussion of the implications of the tests for different methodological approaches to training. It will be of use to sports scientists and practitioners.
A discussion of recently developed experimental methods for noise research in nanoscale electronic devices, conducted by specialists in transport and stochastic phenomena in nanoscale physics. The approach described is to create methods for experimental observations of noise sources, their localization and their frequency spectrum, voltage-current and thermal dependences. Our current knowledge of measurement methods for mesoscopic devices is summarized to identify directions for future research, related to downscaling effects. The directions for future research into fluctuation phenomena in quantum dot and quantum wire devices are specified. Nanoscale electronic devices will be the basic components for electronics of the 21st century. From this point of view the signal-to-noise ratio is a very important parameter for the device application. Since the noise is also a quality and reliability indicator, experimental methods will have a wide application in the future.
The way science is done has changed radically in recent years. Scientific research and institutions, which have long been characterized by passion, dedication and reliability, have increasingly less capacity for more ethical pursuits, and are pressed by hard market laws. From the vocation of a few, science has become the profession of many - possibly too many. These trends come with consequences and risks, such as the rise in fraud, plagiarism, and in particular the sheer volume of scientific publications, often of little relevance. The solution? A slow approach with more emphasis on quality rather than quantity that will help us to rediscover the essential role of the responsible scientist. This work is a critical review and assessment of present-day policies and behavior in scientific production and publication. It touches on the tumultuous growth of scientific journals, in parallel with the growth of self-declared scientists over the world. The author's own reflections and experiences help us to understand the mechanisms of contemporary science. Along with personal reminiscences of times past, the author investigates the loopholes and hoaxes of pretend journals and nonexistent congresses, so common today in the scientific arena. The book also discusses the problems of bibliometric indices, which have resulted in large part from the above distortions of scientific life.
It is now widely recognized that measurement data should be properly analyzed to include an assessment of their associated uncertainty. Since this parameter allows for a meaningful comparison of the measurement results and for an evaluation of their reliability, its expression is important not only in the specialized field of scientific metrology, but also in industry, trade, and commerce. General rules for evaluating and expressing the uncertainty are given in the internationally accepted ISO Guide to the Expression of Uncertainty in Measurement, generally known as the GUM.
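As a minimal sketch of how the GUM's law of propagation of uncertainty is applied in practice (the measurement model P = V * I and all numerical values below are illustrative assumptions, not taken from the Guide):

```python
# Minimal sketch of GUM-style uncertainty propagation for an uncorrelated
# measurement model y = f(x1, ..., xn). The example model P = V * I and
# all numbers are illustrative assumptions only.
import math

def combined_standard_uncertainty(sensitivities, std_uncertainties):
    """Root-sum-of-squares of c_i * u(x_i) for uncorrelated input quantities."""
    return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, std_uncertainties)))

# Example: electrical power P = V * I
V, I = 230.0, 5.0             # measured input estimates
u_V, u_I = 0.5, 0.02          # standard uncertainties of the inputs
c_V, c_I = I, V               # sensitivity coefficients dP/dV and dP/dI

u_c = combined_standard_uncertainty([c_V, c_I], [u_V, u_I])
U = 2.0 * u_c                 # expanded uncertainty, coverage factor k = 2

print(f"P = {V * I:.0f} W, u_c = {u_c:.1f} W, U (k=2) = {U:.1f} W")
```

With a coverage factor k = 2 the expanded uncertainty corresponds, for an approximately normal distribution, to a coverage probability of about 95%, which is the convention most commonly used alongside the GUM.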
This Open Access book discusses an extension of low-coherence interferometry by dispersion encoding. The approach is theoretically designed and implemented for applications such as surface profilometry, polymeric cross-linking estimation and the determination of thin-film layer thicknesses. A characterization showed that an axial measurement range of 79.91 µm with an axial resolution of 0.1 nm is achievable. Simultaneously, profiles of up to 1.5 mm in length were obtained in a scan-free manner. This marked a significant improvement over the state of the art in terms of dynamic range. The axial and lateral measurement ranges were also partially decoupled while functional parameters such as surface roughness were estimated. The characterization of the degree of polymeric cross-linking was performed as a function of the refractive index; it was acquired in a spatially resolved manner with a resolution of 3.36 × 10⁻⁵. This was achieved by the development of a novel mathematical analysis approach.
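For orientation, the dynamic-range claim follows from the ratio of axial range to axial resolution; assuming the figures above (an axial range of 79.91 µm and a resolution of 0.1 nm), this gives

```latex
\[
  \text{dynamic range} \;=\; \frac{\text{axial range}}{\text{axial resolution}}
  \;\approx\; \frac{79.91\ \mu\text{m}}{0.1\ \text{nm}} \;\approx\; 8\times 10^{5}.
\]
```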
Research and application development in the field of chemical and biochemical sensors continues to grow rapidly. The experience of the last decade has shown, however, that the successful development of sensors that can also withstand the harsh routine conditions of their many fields of application is only possible when chemists and engineers cooperate. The aim of this textbook is therefore to introduce chemists as well as engineers, food technologists and biotechnologists to the technology and application of chemical sensors in a strictly systematic yet highly practice-oriented presentation. The interdisciplinary approach successfully bridges the different ways of thinking in chemistry, physics and engineering.
This volume provides a brief but important summary of the essential tests that need to be performed on all new compounds (whether they are new drugs, pesticides or food additives) before they can be registered for use in the United Kingdom. These basic tests for mutagenicity, originally drawn up by the United Kingdom Environmental Mutagen Society in 1983, have now been fully revised under the auspices of expert working groups from academia and industry and in collaboration with the UK Department of Health. This volume therefore provides the latest official guidelines and recommendations for all scientists involved in the testing and registration of new compounds, not only in the UK but also in a wider international context. The four main test procedures for measuring mutagenicity described in this volume are bacterial mutation assays, metaphase chromosome aberration assays in vitro, gene mutation assays in cultured mammalian cells, and in vivo cytogenetics assays. Each of these tests is fully explained and described in practical and procedural detail, with additional information on the presentation and data processing of results.
Characterization enables a microscopic understanding of the fundamental properties of materials (Science) to predict their macroscopic behaviour (Engineering). With this focus, Principles of Materials Characterization and Metrology presents a comprehensive discussion of the principles of materials characterization and metrology. Characterization techniques are introduced through elementary concepts of bonding, the electronic structure of molecules and solids, and the arrangement of atoms in crystals. Then the range of probes used in characterization, namely electrons, photons, ions, neutrons and scanning tips, is presented, including their generation and the related beam-solid interactions that determine or limit their use. This is followed by ion-scattering methods, optics, optical diffraction, microscopy, and ellipsometry. Generalization of Fraunhofer diffraction to scattering by a three-dimensional arrangement of atoms in crystals leads to X-ray, electron, and neutron diffraction methods, both from surfaces and the bulk. Discussion of transmission and analytical electron microscopy, including recent developments, is followed by chapters on scanning electron microscopy and scanning probe microscopies. The book concludes with elaborate tables to provide a convenient and easily accessible way of summarizing the key points, features, and inter-relatedness of the different spectroscopy, diffraction, and imaging techniques presented throughout. Principles of Materials Characterization and Metrology uniquely combines a discussion of the physical principles and practical application of these characterization techniques to explain and illustrate the fundamental properties of a wide range of materials in a tool-based approach. Based on forty years of teaching and research, the book incorporates worked examples and tests the reader's knowledge with extensive questions and exercises.
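As a reminder of the relation underlying the X-ray, electron and neutron diffraction methods mentioned above (a standard result, not specific to this book), constructive interference from lattice planes with spacing d satisfies the Bragg condition:

```latex
\[
  n\lambda = 2d\sin\theta,
\]
% lambda: wavelength of the probe (X-rays, electrons or neutrons),
% theta: angle between the incident beam and the lattice planes,
% n: integer diffraction order.
```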
You may like...
Modeling and Application of… - Zhiguang Cheng, Norio Takahashi, … (Hardcover, R4,501)
A Latter-Day Saint Ode to Jesus - The… - Edward Kenneth Watson (Hardcover)
Mathematical Modelling in Solid… - Francesco Dell'Isola, Mircea Sofonea, … (Hardcover, R4,946)
Making a Collection Count - A Holistic… - Holly Hibner, Mary Kelly (Paperback, R1,546)