Metrology is the science of measurements. As such, it deals with
the problem of obtaining knowledge of physical reality through its
quantifiable properties. The problems of measurement and of
measurement accuracy are central to all natural and technical
sciences. Now in its second edition, this monograph conveys the
fundamental theory of measurement and provides some algorithms for
result testing and validation.
Why psychology is in peril as a scientific discipline--and how to
save it. Psychological science has made extraordinary discoveries
about the human mind, but can we trust everything its practitioners
are telling us? In recent years, it has become increasingly
apparent that a lot of research in psychology is based on weak
evidence, questionable practices, and sometimes even fraud. The
Seven Deadly Sins of Psychology diagnoses the ills besetting the
discipline today and proposes sensible, practical solutions to
ensure that it remains a legitimate and reliable science in the
years ahead. In this unflinchingly candid manifesto, Chris Chambers
draws on his own experiences as a working scientist to reveal a
dark side to psychology that few of us ever see. Using the seven
deadly sins as a metaphor, he shows how practitioners are
vulnerable to powerful biases that undercut the scientific method,
how they routinely torture data until it produces outcomes that can
be published in prestigious journals, and how studies are much less
reliable than advertised. He reveals how a culture of secrecy
denies the public and other researchers access to the results of
psychology experiments, how fraudulent academics can operate with
impunity, and how an obsession with bean counting creates perverse
incentives for academics. Left unchecked, these problems threaten
the very future of psychology as a science--but help is here.
Outlining a core set of best practices that can be applied across
the sciences, Chambers demonstrates how all these sins can be
corrected by embracing open science, an emerging philosophy that
seeks to make research and its outcomes as transparent as possible.
Semiconductors and Modern Electronics is a brief introduction to
the physics behind semiconductor technologies. Chuck Winrich, a
physics professor at Babson College, explores the topic of
semiconductors through a qualitative approach to understanding the
theories and models used to explain semiconductor devices.
Applications of semiconductors are explored and understood through
the models developed in the book. The qualitative approach in this
book is intended to bring the advanced ideas behind semiconductors
to the broader audience of students who will not major in physics.
Much of the inspiration for this book comes from Dr. Winrich's
experience teaching a general electronics course to students
majoring in business. The goal of that class, and this book, is to
bring forward the science behind semiconductors, and then to look
at how that science affects the lives of people.
Electrostatic Accelerators have been at the forefront of modern
technology since Sir John Cockcroft and Ernest Walton developed the
first accelerator in 1932; it was the first to achieve nuclear
transmutation and earned them the Nobel Prize in Physics in 1951.
The applications of Cockcroft and Walton's development have been far
reaching, even into our kitchens, where it is employed to generate
the high voltage needed for the magnetron in microwave ovens. Other
Nobel Prize-winning developments related to electrostatic
accelerators that have had a major socio-economic impact are: the
electron microscope, where the beams of electrons are produced by an
electrostatic accelerator; X-rays and computed tomography (CT)
scanners, where the X-rays are produced using an electron
accelerator; and microelectronic technology, where ion implantation
is used to dope the semiconductor chips that form the basis of our
computers, mobile phones and entertainment systems.
Although the Electrostatic Accelerator field is over 90 years old,
and only a handful of accelerators are used for their original
purpose in nuclear physics, the field and the number of
accelerators are growing more rapidly than ever. The objective of
this book is to collect together the basic science and technology
that underlies the Electrostatic Accelerator field so it can serve
as a handbook, reference guide and textbook for accelerator
engineers as well as students and researchers who work with
Electrostatic Accelerators.
Atomic Physics provides a concise treatment of atomic physics and a
basis to prepare for work in other disciplines that are underpinned
by atomic physics such as chemistry, biology and several aspects of
engineering science. The focus is mainly on atomic structure since
this is what is primarily responsible for the physical properties
of atoms. After a brief introduction to some basic concepts, the
perturbation theory approach follows the hierarchy of interactions
starting with the largest. The other interactions of the spin and
angular momentum of the outermost electrons with each other, the
nucleus and external magnetic fields are treated in order of
descending strength. A spectroscopic perspective is generally taken
by relating the observations of atomic radiation emitted or
absorbed to the internal energy levels involved. X-ray spectra are
then discussed in relation to the energy levels of the innermost
electrons. Finally, a brief description is given of some modern,
laser-based spectroscopic methods for the high-resolution study of
the finest details of atomic structure.
This thesis describes the first detection of a nuclear transition
that had been sought for 40 years, and marks the essential first
step toward developing nuclear clocks. Atomic clocks are currently
the most reliable timekeepers. Still, they could potentially be
outperformed by nuclear clocks, based on a nuclear transition
instead of the atomic transitions employed to date. An elusive,
extraordinary state in thorium-229 seems to be the only nuclear
transition suitable for this purpose and feasible using currently
available technology. Despite repeated efforts over the past 40
years, until recently we had not yet successfully detected the
decay of this elusive state. Addressing this gap, the thesis lays
the foundation for the development of a new, better frequency
standard, which will likely have numerous applications in satellite
navigation and rapid data transfer. Further, it makes it possible
to improve the constraints for time variations of fundamental
constants and opens up the field of nuclear coherent control.
Albert Einstein's General Theory of Relativity, published in 1915,
made a remarkable prediction: gravitational radiation. Just like
light (electromagnetic radiation), gravity could travel through
space as a wave and affect any objects it encounters by alternately
compressing and expanding them. However, there was a problem. The
force of gravity is around a trillion, trillion, trillion times
weaker than electromagnetism so the calculated compressions and
expansions were incredibly small, even for gravity waves resulting
from a catastrophic astrophysical event such as a supernova
explosion in our own galaxy. Discouraged by this result, physicists
and astronomers didn't even try to detect these tiny, tiny effects
for over 50 years. Then, in the late 1960s and early 1970s, two
events occurred which started the hunt for gravity waves in
earnest. The first was a report of direct detection of gravity
waves thousands of times stronger than even the most optimistic
calculation. Though ultimately proved wrong, this result started
scientists thinking about what instrumentation might be necessary
to detect these waves. The second was an actual, though indirect,
detection of gravitational radiation due to the effects it had on
the orbital period of two 'neutron stars' orbiting each other.
In this case, the observations were in exact accord with
predictions from Einstein's theory, which confirmed that a direct
search might ultimately be successful. Nevertheless, it took
another 40 years of development of successively more sensitive
detectors before the first real direct effects were observed in
2015, 100 years after gravitational waves were first predicted.
This is the story of that hunt, and the insight it is producing
into an array of topics in modern science, from the creation of the
chemical elements to insights into the properties of gravity
itself.
Measurement techniques form the basis of scientific, engineering,
and industrial innovations. The methods and instruments of
measurement for different fields are constantly improving, and it's
necessary to address not only their significance but also the
challenges and issues associated with them. Strategic Applications
of Measurement Technologies and Instrumentation is a collection of
innovative research on the methods and applications of measurement
techniques in medical and scientific discoveries, as well as modern
industrial applications. The book is divided into two sections with
the first focusing on the significance of measurement strategies in
physics and biomedical applications and the second examining
measurement strategies in industrial applications. Highlighting a
range of topics including material assessment, measurement
strategies, and nanoscale materials, this book is ideally designed
for engineers, academicians, researchers, scientists, software
developers, graduate students, and industry professionals.
The International Linear Collider (ILC) is a mega-scale,
technically complex project, requiring large financial resources
and cooperation of thousands of scientists and engineers from all
over the world. Such a big and expensive project has to be
discussed publicly, and the planned goals have to be clearly
formulated. This book advocates for the project, motivated by the
current situation in particle physics. The natural and most powerful
way of obtaining new knowledge in particle physics is to build a new
collider with a larger energy. Following this approach, the Large
Hadron Collider (LHC) was created and is now operating at the
world-record center-of-mass energy of 13 TeV. Although the design of
colliders with a larger energy of 50-100 TeV has been discussed, the
practical realization of such a project is not possible for another
20-30 years. Of course, many new results are expected from the LHC
over the next decade. However, we must also think about other
opportunities, and in particular about the construction of more
dedicated experiments. There are many potentially promising
projects; however, the most obvious possibility to achieve
significant progress in particle physics in the near future is the
construction of a linear e+e- collider with energies in the range of
250-1000 GeV. Such a project, the ILC, is proposed to be built in
Kitakami, Japan. This book will discuss why
this project is important and which new discoveries can be expected
with this collider.
Metrological data are known to be blurred by the imperfections of
the measuring process. In retrospect, for about two centuries
regular or constant errors were not a focal point of experimental
activities; only irregular or random errors were. Today's notion of
unknown systematic errors is in line with this. Confusingly enough,
the worldwide practiced approach of belatedly admitting those
unknown systematic errors amounts to considering them as random,
too. This book discusses a new error concept that dispenses with the
common practice of randomizing unknown systematic errors. Instead,
unknown systematic errors are treated as what they physically are,
namely as constants unknown with respect to magnitude and sign. The
ideas considered in this book yield a procedure that steadily
localizes the true values of the measurands and consequently ensures
traceability.
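As a purely illustrative sketch of this idea (not a formula quoted
from the book), treating the unknown systematic error as a constant
f bounded by |f| <= f_s, rather than as a random variable, suggests
combining the random and systematic parts arithmetically, so that
the true value x_0 of the measurand is localized within an interval
of the form
\[
  \bar{x} - u \;\le\; x_0 \;\le\; \bar{x} + u ,
  \qquad
  u = \frac{t_P(n-1)\, s}{\sqrt{n}} + f_s ,
\]
where \bar{x} and s are the mean and empirical standard deviation of
n repeat measurements and t_P(n-1) is the Student factor at
confidence level P.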
Since the turn of the century, the increasing availability of
photoelectron imaging experiments, along with the increasing
sophistication of experimental techniques, and the availability of
computational resources for analysis and numerics, has allowed for
significant developments in photoelectron metrology. Quantum
Metrology with Photoelectrons, Volume 1: Foundations discusses the
fundamental concepts along with recent and emerging applications.
The core physics is that of photoionization, and Volume 1 addresses
this topic. The foundational material is presented in part as a
tutorial with extensive numerical examples and also in part as a
collected reference to the relevant theoretical treatments from the
literature for a range of cases. Topics are discussed with an eye
to developing general quantum metrology schemes, in which full
quantum state reconstruction of the photoelectron wavefunction is
the goal. In many cases, code and/or additional resources are
available online. Consequently, it is hoped that readers at all
levels will find something of interest and that the material
provides something rather different from existing textbooks.
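To give a flavour of what such a reconstruction involves (an
illustrative sketch; the notation is generic rather than taken from
the book), the angular part of the photoelectron wavefunction can be
written as a partial-wave expansion,
\[
  \psi(\theta,\phi) \;\propto\; \sum_{l,m} \beta_{l,m}\,
  Y_{l,m}(\theta,\phi) ,
\]
and a "complete" photoionization experiment then amounts to
determining the magnitudes and relative phases of the complex
coefficients \beta_{l,m} from a set of measured photoelectron
angular distributions.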
Two of the most powerful tools used to study magnetic materials are
inelastic neutron scattering and THz spectroscopy. Because the
measured spectra provide a dynamical fingerprint of a magnetic
material, those tools enable scientists to unravel the structure of
complex magnetic states and to determine the microscopic
interactions that produce them. This book discusses the
experimental techniques of inelastic neutron scattering and THz
spectroscopy and provides the theoretical tools required to analyze
their measurements using spin-wave theory. For most materials, this
analysis can resolve the microscopic magnetic interactions such as
exchange, anisotropy, and Dzyaloshinskii-Moriya interactions.
Assuming a background in elementary statistical mechanics and a
familiarity with the quantized harmonic oscillator, this book
presents a comprehensive review of spin-wave theory and its
applications to both inelastic neutron scattering and THz
spectroscopy. Spin-wave theory is used to study several model
magnetic systems, including non-collinear magnets such as spirals
and cycloids that are produced by geometric frustration, competing
exchange interactions, or Dzyaloshinskii-Moriya interactions.
Several case studies utilizing spin-wave theory to analyze
inelastic neutron-scattering and THz spectroscopy measurements are
presented. These include both single crystals and powders and both
oxides and molecule-based magnets. In addition to sketching the
numerical techniques used to fit dynamical spectra based on
microscopic models, this book also contains over 70 exercises that
can be performed by beginning graduate students.
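As a minimal illustration of the kind of result spin-wave theory
delivers (a standard textbook example, not a calculation from the
book), the magnon dispersion of a one-dimensional ferromagnetic
Heisenberg chain with nearest-neighbour exchange J, spin S and
lattice constant a is
\[
  \hbar\,\omega(k) = 2JS\,(1-\cos ka)
  = 4JS\,\sin^2\!\left(\tfrac{ka}{2}\right) ,
\]
so fitting a measured dispersion directly yields the exchange
constant, while anisotropy and Dzyaloshinskii-Moriya terms modify
this form in characteristic ways.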
Time-resolved optical stimulation of luminescence has become
established as an important method for measurement of optically
stimulated luminescence. Its enduring appeal is easy to see with
the number of materials studied growing from the initial focus on
natural minerals such as quartz and feldspar to synthetic
dosimeters such as α-Al2O3:C, BeO and YAlO3:Mn2+. The aim of
time-resolved optical stimulation is to separate in time the
stimulation and emission of luminescence. The luminescence is
stimulated from a sample using a brief light pulse. The ensuing
luminescence can be monitored either during stimulation in the
presence of scattered stimulating light or after the light-pulse.
The time-resolved luminescence spectrum measured in this way can be
resolved into components each with a distinct lifetime. The
lifetimes are linked to physical processes of luminescence and thus
provide a means to study dynamics involving charge transfer between
point-defects in materials. This book is devoted to time-resolved
optically stimulated luminescence and is suitable for researchers
with an interest in the study of point-defects using luminescence
methods. The book first sets the method within the context of the
luminescence field at large and then provides an overview of the
instrumentation used. Much attention is given to models for
time-resolved optically stimulated luminescence, two of which are
analytical and the third of which is based on computational
simulation of experimental results. To bring relevance to the
discussion, the book draws on examples from studies on quartz and
α-Al2O3:C, two materials widely investigated using this method. The
book shows how kinetic analysis for various thermal effects such as
thermal quenching and thermal assistance can be investigated using
time-resolved luminescence. Although the use of light sums is an
obvious choice for this, contemporary work is discussed to show the
versatility of alternative methods such as the dynamic throughput.
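As an illustrative sketch of the decomposition described above (the
symbols are generic, not taken from the book), the time-resolved
luminescence following a stimulation pulse is typically fitted as a
sum of exponential components,
\[
  I(t) = \sum_i A_i\, e^{-t/\tau_i} + I_0 ,
\]
where each lifetime \tau_i is associated with a distinct relaxation
or charge-transfer process and I_0 accounts for any constant
background.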
This book describes modern focused ion beam microscopes and
techniques and how they can be used to aid materials metrology and
as tools for the fabrication of devices that in turn are used in
many other aspects of fundamental metrology. Beginning with a
description of the currently available instruments including the
new addition to the field of plasma-based sources, it then gives an
overview of ion solid interactions and how the different types of
instrument can be applied. Chapters then describe how these
machines can be applied to the field of materials science and
device fabrication giving examples of recent and current activity
in both these areas.
Since the turn of the century, the increasing availability of
photoelectron imaging experiments, along with the increasing
sophistication of experimental techniques, and the availability of
computational resources for analysis and numerics, has allowed for
significant developments in photoelectron metrology. Quantum
Metrology with Photoelectrons, Volume 2: Applications and Advances
discusses the fundamental concepts along with recent and emerging
applications. Volume 2 explores the applications and development of
quantum metrology schemes based on photoelectron measurements. The
author begins with a brief historical background on "complete"
photoionization experiments, followed by the details of state
reconstruction methodologies from experimental measurements. Three
specific applications of quantum metrology schemes are discussed in
detail. In addition, the book provides advances, future directions,
and an outlook including (ongoing) work to generalise these schemes
and extend them to dynamical many-body systems. Volume 2 will be of
interest to readers wishing to see the (sometimes messy) details of
state reconstruction from photoelectron measurements as well as
explore the future prospects for this class of metrology.
Photoemission (also known as photoelectron) spectroscopy refers to
the process in which an electron is removed from a specimen after
the atomic absorption of a photon. The first evidence of this
phenomenon dates back to 1887 but it was not until 1905 that
Einstein offered an explanation of this effect, which is now
referred to as ""the photoelectric effect"".Quantitative Core Level
Photoelectron Spectroscopy: A Primer tackles the pragmatic aspects
of the photoemission process with the aim of introducing the reader
to the concepts and instrumentation that emerge from an
experimental approach. The basic elements implemented for the
technique are discussed and the geometry of the instrumentation is
explained. The book covers each of the features that have been
observed in the X-ray photoemission spectra and provides the tools
necessary for their understanding and correct identification.
Charging effects are covered in the penultimate chapter with the
final chapter bringing closure to the basic uses of the X-ray
photoemission process, as well as guiding the reader through some
of the most popular applications used in current research.
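As a brief illustration of the energetics underlying the technique
(a standard relation, not a formula quoted from the book), the
kinetic energy of a photoelectron measured in X-ray photoelectron
spectroscopy is related to the photon energy, the core-level binding
energy and the spectrometer work function by
\[
  E_{\mathrm{kin}} = h\nu - E_{\mathrm{B}} - \phi ,
\]
so that measuring the kinetic energy at a known photon energy
identifies E_B and hence the emitting element and its chemical
state.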