This book provides a concise survey of modern theoretical concepts of X-ray materials analysis. The principal features of the book are: basics of X-ray scattering, interaction between X-rays and matter, and new theoretical concepts of X-ray scattering. The various X-ray techniques are considered in detail: high-resolution X-ray diffraction, X-ray reflectivity, grazing-incidence small-angle X-ray scattering and X-ray residual stress analysis. All the theoretical methods presented use a unified physical approach. This makes the book especially useful for readers learning and performing data analysis with different techniques. The theory is applicable to studies of bulk materials of all kinds, including single crystals and polycrystals, as well as to surface studies under grazing incidence. The book appeals to researchers and graduate students alike.
This thesis is devoted to ANTARES, the first underwater neutrino telescope in the Mediterranean Sea. As the main scientific analysis, a search for high-energy neutrino emission from the region of the Fermi bubbles has been performed using data from the ANTARES detector. A method for background estimation using off-zones has been developed specially for this measurement. A new likelihood for the limit calculation, which treats observations in the on-zone and in the off-zones in a similar way and also includes different systematic uncertainties, has been constructed. The analysis of 2008–2011 ANTARES data yielded a 1.2 σ excess of events in the Fermi bubble regions, compatible with the no-signal hypothesis. For the optimistic case of no energy cutoff in the flux, the upper limit is within a factor of three of the prediction of the purely hadronic model based on the measured gamma-ray flux. The sensitivity improves as more data are accumulated (a gain of more than 65% in sensitivity is expected once 2012–2016 data are added to the analysis).
In this book the applicability and utility of two statistical approaches for understanding dark energy and dark matter with gravitational lensing measurements are introduced. For cosmological constraints on the nature of dark energy, morphological statistics called Minkowski functionals (MFs), which extract the non-Gaussian information in gravitational lensing, are studied. Measuring lensing MFs from the Canada-France-Hawaii Telescope Lensing Survey (CFHTLenS), the author clearly shows that MFs can be powerful statistics beyond the conventional approach based on the two-point correlation function. Combined with the two-point correlation function, MFs can constrain the equation of state of dark energy with a precision of approximately 3-4% in upcoming surveys with sky coverage of 20,000 square degrees. On the topic of dark matter, the author studied the cross-correlation of gravitational lensing and the extragalactic gamma-ray background (EGB). Dark matter annihilation is among the potential contributors to the EGB. The cross-correlation is a powerful probe of signatures of dark matter annihilation, because both cosmic shear and gamma-ray emission originate directly from the same dark matter distribution in the universe. The first measurement of the cross-correlation using a real data set, obtained from CFHTLenS and the Fermi Large Area Telescope, was performed. Comparing the result with theoretical predictions, an independent constraint was placed on dark matter annihilation. Future lensing surveys will be useful for constraining the canonical value of the annihilation cross section over a wide range of dark matter masses.
The main goal of the book is to provide a systematic and didactic approach to the physics and technology of free-electron lasers. Numerous figures are used to illustrate the underlying ideas and concepts, and links to other fields of physics are provided. After an introduction to undulator radiation and the low-gain FEL, the one-dimensional theory of the high-gain FEL is developed in a systematic way. Particular emphasis is put on explaining and justifying the various assumptions and approximations that are needed to obtain the differential and integral equations governing the FEL dynamics. Analytical and numerical solutions are presented and important FEL parameters are defined, such as gain length, FEL bandwidth and saturation power. One of the most important features of a high-gain FEL, the formation of microbunches, is studied at length. The increase of gain length due to beam energy spread, space charge forces, and three-dimensional effects such as betatron oscillations and optical diffraction is analyzed. The mechanism of Self-Amplified Spontaneous Emission is described theoretically and illustrated with numerous experimental results. Various methods of FEL seeding by coherent external radiation are introduced, together with experimental results. The world's first soft X-ray FEL, the user facility FLASH at DESY, is described in some detail to give an impression of the complexity of such an accelerator-based light source. The last chapter is devoted to the new hard X-ray FELs which generate extremely intense radiation in the Ångström regime. The appendices contain supplementary material and more involved calculations.
The research presented here includes important contributions on the commissioning of the ATLAS experiment and the discovery of the Higgs boson. The thesis describes essential work on the alignment of the inner tracker during the commissioning of the experiment and development of the electron identification algorithm. The subsequent analysis focuses on the search for the Higgs boson in the WW channel, including the development of a method to model the critical W+jet background. In addition, the thesis provides excellent introductions, suitable for non-specialists, to Higgs physics, to the LHC, and to the ATLAS experiment.
This book covers recent research advances in the area of charging strategies that can be employed to accommodate the anticipated high deployment of Plug-in Electric Vehicles (PEVs) in smart grids. Recent literature has focused on various potential issues of uncoordinated charging of PEVs and methods of overcoming such challenges. After an introduction to charging coordination paradigms of PEVs, the book presents various ways in which coordinated control can be accomplished. These innovative approaches include hierarchical coordinated control, model predictive control, optimal control strategies to minimize load variance, smart PEV load management based on load forecasting, integrating renewable energy sources such as photovoltaic arrays to supplement grid power, using wireless communication networks to coordinate the charging load of a smart grid, and using the market price of electricity and customer payments to coordinate the charging load. Hence, this book brings together many new strategies recently proposed by researchers around the world to address the issues related to coordinating the charging load of PEVs in a future smart grid.
Micro-X-ray fluorescence offers the possibility of position-sensitive and non-destructive analysis that can be used for non-homogeneous materials and layer systems. This analytical technique has shown dynamic development in the last 15 years and is used for the analysis of small particles, inclusions, and elemental distributions in a wide range of different applications, both in research and quality control. The first experiments were performed at synchrotrons, but there is a requirement for laboratory instruments which offer fast and immediate access to analytical results. The book discusses the main components of a µ-XRF instrument and the different measurement modes, gives an overview of the various instrument types, considers the special requirements for quantification of non-homogeneous materials, and presents a wide range of applications for single-point and multi-point analysis as well as for distribution analysis in one, two and three dimensions.
This book introduces ellipsometry in nanoscience and nanotechnology, building a bridge between the classical and nanoscale optical behaviour of materials. It delineates the role of the non-destructive and non-invasive optical diagnostics of ellipsometry in improving the science and technology of nanomaterials and related processes by illustrating its exploitation, ranging from fundamental studies of the physics and chemistry of nanostructures to the ultimate goal of turnkey manufacturing control. The book is written for a broad readership: materials scientists, researchers, engineers, as well as students and nanotechnology operators who want to deepen their knowledge about both the basics and applications of ellipsometry to nanoscale phenomena. It starts as a general introduction for people curious to enter the fields of ellipsometry and polarimetry applied to nanomaterials and progresses to articles by experts on specific fields that span from plasmonics and optics to semiconductors and flexible electronics. The core belief reflected in this book is that ellipsometry applied at the nanoscale offers new ways of addressing many current needs. The book also explores forward-looking potential applications.
Housed by a 4 m diameter tunnel of 27 km circumference, with huge underground labs and numerous surface facilities, and set up with a precision of 0.1 mm per kilometer, the Large Electron-Positron Collider (LEP) was not only the largest but also one of the most sophisticated scientific research instruments ever created by Man. Located at CERN, near Geneva, LEP was built during the years 1983 - 1989, was operational until 2000, and corroborated the standard model of particle physics through continuous high precision measurements. The Author, director-general of CERN during the crucial period of the construction of LEP, recounts vividly the convoluted decision-making and technical implementation processes - the tunnel alone being a highly challenging geo- and civil engineering project - and the subsequent extremely fruitful period of scientific research. Finally he describes the difficult decision to close down LEP, at a time when the discovery of the Higgs boson seemed within reach. LEP was eventually dismantled in 2000, enabling the tunnel to be reused for building the next generation machine, the much more powerful Large Hadron Collider (LHC), an upgrade then called LEP3 and foreseen from the beginning. It became operational just as this account was being completed. Written by the main protagonist responsible for making LEP a reality, this is the definitive inside story of a remarkable machine and the many thousands of scientists and engineers from around the world, whose efforts contributed to the new knowledge it produced.
Large-mass bolometers are used in particle physics experiments to search for rare processes, such as neutrinoless double beta decay and dark matter interactions. In the coming years the CUORE experiment (a 1-ton detector composed of 1000 TeO2 crystals operated as bolometers in a large cryostat at 10 mK) will be the particle physics experiment with the highest chance of discovering the Majorana neutrino, a long-standing and fundamental question of particle physics. The study presented in this book was performed on the bolometers of the CUORE experiment. The response function of these detectors is not linear in the energy range of interest, and it changes with the operating temperature, degrading the performance. The nonlinearity appeared to be dominated by the thermistor and the biasing circuit used to read out the bolometer, and was modeled using a few measurable parameters. A method to obtain a linear response is the result of this work. It allows a great improvement in detector operation and data analysis. With a foreword by Fernando Ferroni.
In this book, the anomaly-mediated supersymmetry breaking (AMSB) model is explored by searching for charged winos, via their subsequent decays, in data collected with the ATLAS detector at the Large Hadron Collider (LHC). The author develops a new method, called "re-tracking," to detect charged winos that decay before reaching the Semiconductor Tracker (SCT). Because the nominal tracking algorithm of the ATLAS experiment requires at least seven successive hits in the inner tracking system, the sensitivity to charged winos with lifetimes of a fraction of a nanosecond was limited in past analyses. Re-tracking, however, requires a minimum of three pixel hits and provides fully efficient tracking for charged winos traversing the pixel detector, resulting in roughly 100 times greater efficiency for charged winos with a lifetime of ~0.2 ns than in past searches. The signal topology is characterized by a jet with large transverse momentum (pT), large missing transverse energy, and a high-pT disappearing track. There are three types of background tracks: interacting hadron tracks, charged leptons, and tracks with mismeasured pT. A background estimation based on Monte Carlo (MC) simulation suffers from large uncertainties due to poor statistics and has difficulty simulating the properties of background tracks. Therefore, a data-driven approach has been developed by the author to estimate the background track-pT spectrum. No significant excess above the background expectation is observed for candidate tracks with large transverse momentum, and constraints on the AMSB model are obtained. The author shows that in the AMSB model a charged wino mass below 270 GeV is excluded at 95% confidence level, which also directly constrains the mass of wino dark matter.
Advances in the synthesis of new materials with often complex, nano-scaled structures require increasingly sophisticated experimental techniques that can probe the electronic states, the atomic magnetic moments and the magnetic microstructures responsible for the properties of these materials. At the same time, progress in synchrotron radiation techniques has ensured that these light sources remain a key tool of investigation; for example, synchrotron radiation sources of the third generation are able to support magnetic imaging on a sub-micrometer scale. With the Sixth Mittelwihr School on Magnetism and Synchrotron Radiation, the tradition of teaching the state of the art in modern research developments continues, expressed through the set of extensive lectures provided in this volume. While primarily aimed at postgraduate students and newcomers to the field, this volume will also benefit researchers and lecturers actively working in the field.
This thesis reports on the first studies of Standard Model photon production at the Large Hadron Collider (LHC) using the ATLAS detector. Standard Model photon production is a large background in the search for Higgs bosons decaying into photon pairs, and is thus critical to understand. The thesis explains the techniques used to reconstruct and identify photon candidates using the ATLAS detector, and describes a measurement of the production cross section for isolated prompt photons. The thesis also describes a search for the Higgs boson in which the analysis techniques used in the measurement are exploited to reduce and estimate non-prompt backgrounds in diphoton events.
The book offers a thorough introduction to machine vision. It is organized in two parts. The first part covers image acquisition, which is the crucial component of most automated visual inspection systems. All important methods are described in detail and presented with a reasoned structure. The second part deals with the modeling and processing of image signals and pays particular regard to methods that are relevant for automated visual inspection.
The main goal of this book is to elucidate what kind of experiment must be performed in order to determine the full set of independent parameters which can be extracted and calculated from theory, where electrons, photons, atoms, ions, molecules, or molecular ions may serve as the interacting constituents of matter. The feasibility of such `perfect' and/or `complete' experiments, providing the complete quantum mechanical knowledge of the process, is associated with the enormous potential of modern research techniques, both in experiment and theory. It is difficult to overestimate the role of theory in setting up the complete experiment, starting with the fact that an experiment can be complete only within a certain theoretical framework, and ending with the direct prescription of what should be measured, and under what conditions, to make the experiment `complete'. The language of the related theory is the language of quantum mechanical amplitudes and their relative phases. This book captures the spirit of research in the direction of the complete experiment in atomic and molecular physics, considering some of the basic quantum processes: scattering, Auger decay and photo-ionization. It includes a description of the experimental methods used to realize, step by step, the complete experiment up to the level of the amplitudes and phases. The corresponding arsenal includes, beyond determining the total cross section, the observation of angle- and spin-resolved quantities, photon polarization and correlation parameters, measurements applying coincidence techniques, the preparation of initially polarized targets, and even more sophisticated methods. The `complete' experiment remains, to this day, hard to perform. Therefore, much attention is paid to the results of state-of-the-art experiments providing detailed information on the process, and to their comparison with the related theoretical approaches, among them relativistic multi-configurational Dirac-Fock, convergent close-coupling, Breit-Pauli R-matrix, and relativistic distorted wave approaches, as well as Green's operator methods. This book has been written in honor of Herbert Walther and his major contributions to the field, and also to stimulate advanced Bachelor and Master students by demonstrating that present-day atomic and molecular scattering physics offers an exciting basis for further advancing the field.
This thesis addresses, in a new and elegant way, several measurements and the extraction of so-called double parton scattering. The novelty lies in the combination of measurements with a very smart extraction of double parton scattering results, which is easy to apply and overcomes many of the technical difficulties of older methods. Many new phenomena in particle physics can be observed when particles are collided at the highest energies; one of the highlights in recent years was the discovery of the Higgs boson at the Large Hadron Collider at CERN. Understanding the production mechanism of the Higgs boson at the LHC requires detailed knowledge of the physics of proton-proton collisions. When the density of partons in the protons becomes large, there is a non-negligible probability that more than one parton participates in the interaction, and so-called double parton scattering becomes important. In some cases very particular final-state signatures can be observed, which can be regarded as an indication of such double partonic scattering and in which the different interactions can be separated. Such multiple partonic interactions play an important role when precise predictions for known processes are required.
This book reflects the outcome of the 1st International Workshop on Turbulent Spray Combustion, held in 2009 in Corsica (France). The focus is on reporting the progress of experimental and numerical techniques in two-phase flows, with emphasis on spray combustion. The motivation for studies in this area is that knowledge of the dominant phenomena and their interactions in such flow systems is essential for the development of predictive models and their use in combustor and gas turbine design. This necessitates the development of accurate experimental methods and numerical modelling techniques. The workshop aimed to provide an opportunity for experts and young researchers to present the state of the art, discuss new developments or techniques and exchange ideas in the areas of experimentation, modelling and simulation of reactive multiphase flows. The first two papers reflect the contents of the invited lectures, given by experts in the field of turbulent spray combustion. The first concerns computational issues, while the second deals with experiments. These lectures initiated very interesting and interactive discussions among the researchers, further pursued in contributed poster presentations. Contributions 3 and 4 focus on aspects of the impact of the interaction between fuel evaporation and combustion on spray combustion in the context of gas turbines, while the final article deals with the interaction between evaporation and turbulence.
This book provides a comprehensive overview of the operating principles and technology of electron lenses in supercolliders. Electron lenses are a novel instrument for high energy particle accelerators, particularly for the energy-frontier superconducting hadron colliders, including the Tevatron, RHIC, LHC and future very large hadron colliders. After reviewing the issues surrounding beam dynamics in supercolliders, the book offers an introduction to the electron lens method and its application. Further chapters describe the technology behind the electron lenses which have recently been proposed, built and employed for compensation of beam-beam effects and for collimation of high-energy high-intensity beams, for compensation of space-charge effects and several other applications in accelerators. The book will be an invaluable resource for those involved in the design, construction and operation of the next generation of hadron colliders.
For the first time, the authors provide a comprehensive and consistent presentation of all techniques available in this field. They rigorously analyze the behavior of different electrochemical single and multipotential step techniques for electrodes of different geometries and sizes under transient and stationary conditions. The effects of these electrode features in studies of various electrochemical systems (solution systems, electroactive monolayers, and liquid-liquid interfaces) are discussed. Explicit analytical expressions for the current-potential responses are given for all available cases. Applications of each technique are outlined for the elucidation of reaction mechanisms. Coverage is comprehensive: normal pulse voltammetry, double differential pulse voltammetry, reverse pulse voltammetry and other triple and multipulse techniques, such as staircase voltammetry, differential staircase voltammetry, differential staircase voltcoulometry, cyclic voltammetry, square wave voltammetry and square wave voltcoulometry.
With his Ph.D. thesis, presented here in the Springer Theses series, Paul Fulda won the 2012 GWIC thesis prize awarded by the Gravitational Wave International Committee. The impact of thermal noise on future gravitational wave detectors depends on the size and shape of the interrogating laser beam. It had been known since 2006 that, in theory, higher-order Laguerre-Gauss modes could reduce thermal noise. Paul Fulda's research brings Laguerre-Gauss modes an enormous step forward. His work includes analytical, numerical and experimental work on table-top setups as well as experiments at the Glasgow 10 m prototype interferometer. Using numerical simulations, the LG33 mode was selected as the optical mode to be tested. Further research by Paul and his colleagues since then has concentrated on this mode. Paul has developed and demonstrated simple and effective methods to create this mode with diffractive optics and has successfully demonstrated its compatibility with the essential building blocks of gravitational wave detectors, namely optical cavities, Michelson interferometers and opto-electronic sensing and control systems. Through this work, Laguerre-Gauss modes for interferometers have been transformed from an essentially unknown entity into a well understood option with an experimental basis.
This book is written for scientists involved in the calibration of viscometers. A detailed description of the step-up procedures used to establish the viscosity scale and to obtain sets of master viscometers is given. Uncertainty considerations for standard oils of known viscosity are presented. Modern viscometers based on the principles of tuning forks, ultrasonics, PZT, plate waves, Love waves, micro-cantilevers and vibrating optical fibers are discussed to inspire the reader to further research and to generate improved versions. The primary standard for viscosity is pure water; measurements of its viscosity, and the accuracy and uncertainty achieved, are described. The principles of rotational and oscillation viscometers are explained to enhance knowledge in calibration work. Devices used for specific materials and for viscosity in non-SI units are discussed with respect to the need to correlate viscosity values obtained by various devices. The description of commercial viscometers meets the needs of the user.
The book is a collection of peer-reviewed scientific papers submitted by active researchers to the 1st International Conference on Advancements of Medical Electronics (ICAME2015). The conference was organized jointly by the Department of Biomedical Engineering and the Department of Electronics and Communication Engineering, JIS College of Engineering, West Bengal, India. The primary objective of the conference was to strengthen interdisciplinary research and its applications for the welfare of humanity. A galaxy of academicians, professionals, scientists, statesmen and researchers from different parts of the country and abroad came together and shared their knowledge. The book presents research articles on medical image processing & analysis, biomedical instrumentation & measurements, DSP & clinical applications, and embedded systems & their applications in healthcare. The book can serve as a reference for further research.
This book presents the theory of quantum effects used in metrology and the results of the author's own research in the field of quantum electronics. The book also covers quantum measurement standards used in many branches of metrology: for electrical quantities, mass, length, time and frequency. It represents the first comprehensive survey of quantum metrology problems. As a scientific survey, it promotes a new approach to metrology, with more emphasis on its connection with physics, which is of importance for constantly developing technologies, and nanotechnologies in particular. Presenting practical applications of the effects used in quantum metrology for the construction of quantum standards and sensitive electronic components, the book is useful for a wide audience of physicists and metrologists in the broad sense of both terms. In 2014 a new system of units, the so-called Quantum SI, was introduced. This book helps both the technology and academic communities to understand and adopt the new system.
This book is devoted to the analysis of measurement signals, which requires specific mathematical operations such as convolution, deconvolution, and the Laplace, Fourier, Hilbert, wavelet and Z transforms, all of which are presented in this book. Problems concerning the modulation of signals and the filtering of disturbances, as well as orthogonal signals and their use in digital form for the measurement of current, voltage, power and frequency, are also widely discussed. All the topics covered are presented in detail and illustrated by means of examples in MathCad and LabVIEW. The book provides a useful source for researchers, scientists and engineers who in their daily work are required to deal with problems of measurement and signal processing, and can also be helpful to undergraduate students of electrical engineering.
This book presents a comprehensive review of the most important methods used in the characterisation of piezoelectric, ferroelectric and pyroelectric materials. It covers techniques for the analysis of bulk materials and thick and thin film materials and devices. There is a growing demand by industry to adapt and integrate piezoelectric materials into ever smaller devices and structures. Such applications development requires the joint development of reliable, robust, accurate and – most importantly – relevant and applicable measurement and characterisation methods and models. In the past few years there has been a rapid development of new techniques to model and measure the variety of properties that are deemed important for applications development engineers and scientists. The book has been written by the leaders in the field and many chapters represent established measurement best practice, with a strong emphasis on application of the methods via worked examples and detailed experimental procedural descriptions. Each chapter contains numerous diagrams, images, and measurement data, all of which are fully referenced and indexed. The book is intended to occupy space in the research or technical lab, and will be a valuable and practical resource for students, materials scientists, engineers, and lab technicians.