This thesis demonstrates and investigates novel dual-polarization interferometric fiber-optic gyroscope (IFOG) configurations, which utilize optical compensation between two orthogonal polarizations to suppress errors caused by polarization nonreciprocity. Further, it provides a scheme for dual-polarization two-port IFOGs and details their unique benefits. Dual-polarization IFOGs break through the restriction of the "minimal scheme," which conventional IFOGs are based on. These innovative new IFOGs have unique properties: They require no polarizer and have two ports available for signal detection. As such, they open new avenues for IFOGs to achieve lower costs and higher sensitivity.
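For orientation, the rotation sensitivity of any IFOG, whether conventional or dual-polarization, rests on the Sagnac effect; the relation below is the standard textbook expression, quoted here for background rather than taken from this thesis:

\Delta\phi_S = \frac{2\pi L D}{\lambda c}\,\Omega

where L is the fiber length, D the coil diameter, \lambda the source wavelength, c the speed of light in vacuum, and \Omega the rotation rate about the coil axis.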
This book shows how Bohmian mechanics overcomes the need for a measurement postulate involving wave function collapse. The measuring process plays a very important role in quantum mechanics. It has been widely analyzed within the Copenhagen approach through the Born and von Neumann postulates, with a later extension due to Lüders. In contrast, much less effort has been invested in measurement theory within the Bohmian mechanics framework. The continuous measurement (sharp and fuzzy, or strong and weak) problem is considered here in this framework. The authors begin by generalizing the so-called Mensky approach, which is based on restricted path integrals through quantum corridors. The measuring system is then considered to be an open quantum system following a stochastic Schrödinger equation. Quantum stochastic trajectories (in the Bohmian sense) and their role in basic quantum processes are discussed in detail. The decoherence process is thereby described in terms of classical trajectories issuing from the violation of the noncrossing rule of quantum trajectories.
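As background to the quantum trajectories discussed above, Bohmian mechanics supplements the wave function with particle positions evolving under the guidance equation; the single-particle form shown here is the standard textbook expression, not a result specific to this book:

\frac{dQ}{dt} = \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\bigg|_{x=Q(t)}

where \psi obeys the Schrödinger equation and Q(t) is the trajectory of the particle.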
This book focuses on the development of wellness protocols for smart home monitoring, aiming to forecast the wellness of individuals living in ambient assisted living (AAL) environments. It describes in detail the design and implementation of the heterogeneous wireless sensors and networks on which the protocols are based, as applied to data mining and machine learning. Further, it shows how these sensor and actuator nodes are deployed in the home environment, generating real-time data on object usage and other movements inside the home, and demonstrates that the protocols offer a reliable, efficient, flexible, and economical solution for smart home systems. Documenting the approach from sensing to decision making and information generation, the book addresses various issues concerning interference mitigation, errors, security and the handling of large data volumes. As such, it offers a valuable resource for researchers, students and practitioners interested in interdisciplinary studies at the intersection of wireless sensing and processing, radio communication, the Internet of Things and machine learning, and in how these can be applied to smart home monitoring and assisted living environments.
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations - the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers and scientists in general who study the phenomenon of statistical stability and use statistical methods for high-precision measurements, prediction, and signal processing over long observation intervals.
This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility with the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources in computing the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of vector network analyzer (VNA) calibration techniques. It concludes with an in-depth description of the novel verification artefacts used to assess the performance of VNAs, and offers a comprehensive reference guide for readers ranging from beginners to experts, in both academia and industry, whose work involves network analysis, instrumentation and measurements.
This book gathers, for the first time, an overview of nearly all of the magnetic sensors that exist today. It offers readers thorough and comprehensive coverage, from the basics to the state of the art, and is therefore suitable for both beginners and experts. From the more common and popular AMR magnetometers up to the recently developed NV-center magnetometers, each chapter describes a specific type of sensor and provides all the information necessary to understand its behavior, including theoretical background, noise model, materials, electronics, design and fabrication techniques, etc.
Sensors and Instrumentation, Volume 5: Proceedings of the 34th IMAC, A Conference and Exposition on Dynamics of Multiphysical Systems: From Active Materials to Vibroacoustics, 2016, the fifth volume of ten from the Conference, brings together contributions to this important area of research and engineering. The collection presents early findings and case studies on fundamental and applied aspects of Structural Dynamics, including papers on: Experimental Techniques; Smart Sensing; Rotational Effects; Dynamic Calibration; Systems & Sensing Technologies; Modal Transducers; and Novel Excitation Methods.
This book is an in-depth guide to effective scientific research. Ranging from the philosophical to the practical, it explains at the outset what science can - and can't - achieve, and discusses its relationship to mathematics and laws. The author then pays extensive attention to the scientific method, including experimental design, verification, uncertainty and statistics. A major aim of the book is to help young scientists reflect upon the deeper aims of their work and make the best use of their talents in contributing to progress. To this end, it also includes sections on planning research, on presenting one's findings in writing, as well as on ethics and the responsibilities of scientists.
This book presents a systematic and comprehensive exposition of the theory of measurement accuracy and provides solutions that fill significant and long-standing gaps in the classical theory. It eliminates the shortcomings of the classical theory by including methods for estimating accuracy of single measurements, the most common type of measurement. The book also develops methods of reduction and enumeration for indirect measurements, which do not require Taylor series and produce a precise solution to this problem. It produces grounded methods and recommendations for summation of errors. The monograph also analyzes and critiques two foundation metrological documents, the International Vocabulary of Metrology (VIM) and the Guide to the Expression of Uncertainty in Measurement (GUM), and discusses directions for their revision. This new edition adds a step-by-step guide on how to evaluate measurement accuracy and recommendations on how to calculate systematic error of multiple measurements. There is also an extended section on the method of reduction, which provides an alternative to the least-square method and the method of enumeration. Many sections are also rewritten to improve the structure and usability of the material. The 3rd edition reflects the latest developments in metrology and offers new results, and it is designed to be accessible to readers at various levels and positions, including scientists, engineers, and undergraduate and graduate students. By presenting material from a practical perspective and offering solutions and recommendations for problems that arise in conducting real-life measurements, author Semyon Rabinovich offers an invaluable resource for scientists in any field.
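For context, the classical treatment of an indirect measurement y = f(x_1, \dots, x_N) propagates input uncertainties through a first-order Taylor expansion, giving (for uncorrelated inputs) the familiar combined standard uncertainty; this is the conventional relation that the book's methods of reduction and enumeration are designed to work without:

u_c^2(y) \approx \sum_{i=1}^{N}\left(\frac{\partial f}{\partial x_i}\right)^{2} u^2(x_i)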
This book presents ways of interfacing sensors to the digital world, and discusses the marriage between sensor systems and the IoT, including the opportunities and challenges involved. As sensor output is often affected by noise and interference, the book presents effective schemes for recovering data from a signal that is buried in noise. It also explores interesting applications in the areas of health care, unobtrusive monitoring, and the electronic nose and tongue. It is a valuable resource for engineers and scientists in the area of sensors and interfacing who want to update their knowledge of the latest developments in the field and learn more about sensing applications and challenges.
This Brief describes the calibration of isothermal titration calorimeters (ITCs) and the calculation of stoichiometry, equilibrium constants, enthalpy changes, and rate constants for reactions in solution. A framework and methodology for model development in the analysis of ITC data is presented, together with methods for assessing the uncertainties in determined parameters and test data sets. This book appeals to beginners as well as to researchers and professionals in the field.
This thesis demonstrates the adaptation of existing techniques and principles towards enabling clean and precise measurements of biomolecules interacting with inorganic surfaces. In particular, it includes real-time measurement of serum proteins interacting with engineered nanomaterial. Making meaningful and unambiguous measurements has been an evolving problem in the field of biology and its various allied domains, primarily due to the complex nature of experiments and the large number of possible interferants. The subsequent quantification of interactions between biomolecules and inorganic surfaces solves pressing problems in the rapidly developing fields of lipidomics and nanomedicine.
This book provides the reader with a detailed and captivating account of the story in which, for the first time, physicists ventured to propose a new force of nature beyond the four known ones - the electromagnetic, weak and strong forces, and gravitation - based entirely on the reanalysis of existing experimental data. Back in 1986, Ephraim Fischbach, Sam Aronson, Carrick Talmadge and their collaborators proposed a modification of Newton's law of universal gravitation. Underlying this proposal were three tantalizing pieces of evidence: 1) an energy dependence of the CP (particle-antiparticle and reflection symmetry) parameters, 2) differences between the measurements of G, the universal gravitational constant, in laboratories and in mineshafts, and 3) a reanalysis of the Eötvös experiment, which had previously been used to show that the gravitational mass of an object and its inertial mass were equal to approximately one part in a billion. The reanalysis revealed that, contrary to Galileo's position, the force of gravity was in fact very slightly different for different substances. The resulting Fifth Force hypothesis included this composition dependence and also added a small distance dependence to the inverse-square gravitational force. Over the next four years numerous experiments were performed to test the hypothesis. By 1990 there was overwhelming evidence that the Fifth Force, as initially proposed, did not exist. This book discusses how the Fifth Force hypothesis came to be proposed and how it went on to become a showcase of discovery, pursuit and justification in modern physics, prior to its demise. In this new and significantly expanded edition, the material from the first edition is complemented by two essays, one containing Fischbach's personal reminiscences of the proposal, and a second on the ongoing history and impact of the Fifth Force hypothesis from 1990 to the present.
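The distance and composition dependence mentioned above is usually parameterized as a Yukawa-type correction to the Newtonian potential; the expression below is the standard form used in the Fifth Force literature, with \alpha a composition-dependent strength and \lambda the finite range of the putative new interaction, rather than a formula quoted from this book:

V(r) = -\frac{G_{\infty} m_1 m_2}{r}\left(1 + \alpha\, e^{-r/\lambda}\right)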
This book covers the diagnosis and assessment of the various faults which can occur in a three-phase induction motor, namely rotor broken-bar faults, rotor-mass unbalance faults, stator winding faults, single phasing faults and crawling. Following a brief introduction, the second chapter describes the construction and operation of an induction motor, then reviews the range of known motor faults, some existing techniques for fault analysis, and some useful signal processing techniques. It includes an extensive literature survey to establish the research trends in induction motor fault analysis. Chapters three to seven describe the assessment of each of the five primary fault types. In the third chapter the rotor broken-bar fault is discussed and two methods of diagnosis are described: (i) diagnosis of the fault through radar analysis of the stator current Concordia and (ii) diagnosis through envelope analysis of the motor startup current using the Hilbert and wavelet transforms. In chapter four, rotor-mass unbalance faults are assessed, and diagnosis from both the transient and the steady-state stator current is analyzed using different techniques; an algorithm is also provided for identifying rotor broken-bar and rotor-mass unbalance faults when they occur simultaneously. Chapter five considers stator winding faults and five different analysis techniques, chapter six covers diagnosis of single phasing faults, and chapter seven describes crawling and its diagnosis. Finally, chapter eight focuses on fault assessment, and presents a summary of the book together with a discussion of prospects for future research on fault diagnosis.
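As a generic illustration of the envelope-analysis step mentioned for the broken-bar diagnosis, the short Python sketch below extracts the envelope of a synthetic startup-current-like signal with the Hilbert transform; the sampling rate, signal model and modulation are invented for demonstration and are not taken from the book.

import numpy as np
from scipy.signal import hilbert

# Synthetic stand-in for a motor current: a 50 Hz supply component with a
# slow, small amplitude modulation playing the role of a fault signature.
fs = 5000.0                                   # sampling rate in Hz (assumed)
t = np.arange(0.0, 2.0, 1.0 / fs)
current = (1.0 + 0.05 * np.cos(2 * np.pi * 2.0 * t)) * np.sin(2 * np.pi * 50.0 * t)

analytic = hilbert(current)                   # analytic signal via the Hilbert transform
envelope = np.abs(analytic)                   # instantaneous amplitude, i.e. the envelope

# In a real diagnosis the envelope would then be inspected, for example with a
# wavelet or Fourier transform, for components associated with the fault.
print(envelope[:5])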
This book introduces novel developments in the field of electromagnetic non-destructive testing and evaluation (NDT/E). The topics include electromagnetic ultrasonic guided wave testing, pulsed eddy current testing, remote field eddy current testing, low frequency eddy current testing, metal magnetic memory testing, and magnetic flux leakage testing. Given the increasing concern about the safe maintenance of critical structures in various industries and everyday life, the topics presented here will be of particular interest to readers in the NDT/E field. The book covers both theoretical research and the engineering applications of electromagnetic NDT technology. It can serve as a valuable reference for college students and NDT technicians, and as useful material for qualification training and higher learning for nondestructive testing professionals.
This book introduces the fundamental theory of electromagnetic ultrasonic guided waves, together with its applications. It includes the dispersion characteristics and matching theory of guided waves; the mechanism of generation and theoretical model of electromagnetic ultrasonic guided waves; the interaction mechanism between guided waves and defects; the simulation method for the entire process of electromagnetic ultrasonic guided wave propagation; electromagnetic ultrasonic thickness measurement; pipeline axial guided wave defect detection; and electromagnetic ultrasonic guided wave detection of gas pipeline cracks. The theory and application findings draw on the author's intensive research over the past eight years. The book can be used as a reference on nondestructive testing technology and as an engineering reference work. The specific implementation of the electromagnetic ultrasonic guided wave system presented here will also be of value to other nondestructive test developers.
Maximizing reader insights into the key scientific disciplines of machine tool metrology, this text will prove useful for the industrial practitioner and those interested in the operation of machine tools. The book draws extensively on the published literature and on information obtained from a wide spectrum of manufacturers of plant, equipment and instrumentation before putting forward novel ideas and methodologies. Providing easy-to-understand bullet points and lucid descriptions of metrological and calibration subjects, it aids reader understanding of the topics discussed, while numerous footnotes throughout the chapters add further detail. Featuring extensive photographic support, this book will serve as a key reference text for all those involved in the field.
This thesis reports on an experimental search for an exotic hadron, the Θ+(1540) pentaquark, a genuinely exotic state with a five-quark content of uudds̄. The results strongly constrain the existence of the Θ+. The Θ+ pentaquark was searched for via the π⁻p → K⁻X reaction using a beam momentum of 2.01 GeV/c at the J-PARC hadron experimental facility, taking advantage of high statistics and high resolution compared with previous experiments, some of which had claimed evidence for the Θ+. In order to achieve a good missing-mass resolution of 2 MeV, a beam spectrometer and a superconducting kaon spectrometer were constructed. No clear peak was observed in the missing-mass spectrum of the π⁻p → K⁻X reaction, and the upper limit of the production cross section was found to be less than 0.28 μb/sr at the 90% confidence level in the mass region of 1500-1560 MeV/c². This upper limit is an order of magnitude smaller than that of the previous KEK experiment. By comparison with a theoretical calculation using the effective Lagrangian approach, the decay width of the Θ+ was evaluated: the upper limits on the decay width were estimated to be 0.36 and 1.9 MeV for Θ+ spin-parity 1/2⁺ and 1/2⁻, respectively. These values are very small for the width of an ordinary hadron resonance, so the existence of the Θ+ is strongly constrained and doubtful.
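For readers unfamiliar with missing-mass spectroscopy, the quantity scanned in such a search is reconstructed from the beam pion and outgoing kaon four-momenta with the target proton at rest; the expression below is the generic kinematic relation rather than a formula quoted from the thesis:

M_X^2 c^4 = \left(E_{\pi} + m_p c^2 - E_{K}\right)^2 - \left|\vec{p}_{\pi} - \vec{p}_{K}\right|^2 c^2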
This book explores the microsensing technologies and systems now available to monitor the quality of air and water within the urban environment and examines their role in the creation of sustainable cities against the background of the challenges posed by rapid urbanization. The opening section addresses the theoretical and conceptual background of microsensing networks. The coverage includes detailed description of microsensors, supported by design-specific equations, and clear explanation of the ways in which devices that harvest energy from ambient sources can detect and quantify pollution. The practical application of such systems in addressing environmental impacts within cities and in sustainable urban planning is then discussed with the aid of case studies in developing countries. The book will be of interest to all who wish to understand the benefits of microsensing networks in promoting sustainable cities through better delivery of information on health hazards and improved provision of data to environmental agencies and regulatory bodies in order to assist in monitoring, decision-making, and regulatory enforcement.
The high-redshift galaxies became a distinct research field during the final decade of the 20th century. At that time the Lyman-break technique made it possible to identify significant samples of such objects, and the new generation of 8 to 10-m telescopes yielded the first good spectroscopic data. Today the high-redshift galaxies have developed into one of the important topics of astrophysics, accounting for about 5-10% of the publications in the major scientific journals devoted to astronomy. Because high-redshift galaxies are a rapidly developing field, and since new results are published constantly, writing a book on this topic is challenging. On the other hand, in view of the large number of individual results now in the literature, and in view of the still growing interest in this topic, it appears worthwhile to summarize and evaluate the available data and to provide an introduction for those who wish to enter this field, or who, for various reasons, might be interested in its results. The end of the first decade of the 21st century appears to be a good point in time to attempt such a summary. The current generation of ground-based 8 to 10-m optical telescopes, the Hubble Space Telescope, and the most important large radio telescopes have by now been in operation for about one or two decades. Although these instruments will continue to produce important scientific results for some time to come, many of the initial programs exploiting their unique new possibilities have been completed.
Housed by a 4 m diameter tunnel of 27 km circumference, with huge underground labs and numerous surface facilities, and set up with a precision of 0.1 mm per kilometer, the Large Electron-Positron Collider (LEP) was not only the largest but also one of the most sophisticated scientific research instruments ever created by Man. Located at CERN, near Geneva, LEP was built during the years 1983 - 1989, was operational until 2000, and corroborated the standard model of particle physics through continuous high precision measurements. The Author, director-general of CERN during the crucial period of the construction of LEP, recounts vividly the convoluted decision-making and technical implementation processes - the tunnel alone being a highly challenging geo- and civil engineering project - and the subsequent extremely fruitful period of scientific research. Finally he describes the difficult decision to close down LEP, at a time when the discovery of the Higgs boson seemed within reach. LEP was eventually dismantled in 2000, enabling the tunnel to be reused for building the next generation machine, the much more powerful Large Hadron Collider (LHC), an upgrade then called LEP3 and foreseen from the beginning. It became operational just as this account was being completed. Written by the main protagonist responsible for making LEP a reality, this is the definitive inside story of a remarkable machine and the many thousands of scientists and engineers from around the world, whose efforts contributed to the new knowledge it produced.
The winner of UCL's annual HEP thesis prize, this work describes an analysis of the data from the second flight of the Antarctica Impulsive Transient Antenna (ANITA). ANITA is a balloon-borne experiment that searches for radio signals originating from ultra-high energy neutrinos and cosmic rays interacting with the Antarctic ice or air. The search for ultrahigh energy neutrinos of astrophysical origin is one of the outstanding experimental challenges of the 21st century. The ANITA experiment was designed to be the most sensitive instrument to ultra-high energy neutrinos that originate from the interactions of cosmic rays with the cosmic microwave background. The methodology and results of the neutrino and cosmic ray searches are presented in the thesis.
This book lays out a new, general theory of light propagation and imaging through Earth's turbulent atmosphere. Current theory is based on the - now widely doubted - assumption of Kolmogorov turbulence. The new theory is based on a generalized atmosphere, the turbulence characteristics of which can be established, as needed, from readily measurable properties of point-object, or star, images. The pessimistic resolution predictions of Kolmogorov theory led to lax optical tolerance prescriptions for large ground-based astronomical telescopes which were widely adhered to in the 1970s and 1980s. Around 1990, however, it became clear that much better resolution was actually possible, and Kolmogorov tolerance prescriptions were promptly abandoned. Most large telescopes built before 1990 have had their optics upgraded (e.g., the UKIRT instrument) and now achieve, without adaptive optics (AO), almost an order of magnitude better resolution than before. As well as providing a more comprehensive and precise understanding of imaging through the atmosphere with large telescopes (both with and without AO), the new general theory also finds applications in the areas of laser communications and high-energy laser beam propagation.
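For reference, the conventional Kolmogorov model that the book challenges is usually summarized by its phase structure function, with r_0 the Fried parameter; this is the standard expression of the classical theory, not the generalized form developed in the book:

D_{\varphi}(r) = \left\langle\left[\varphi(\mathbf{x}) - \varphi(\mathbf{x} + \mathbf{r})\right]^{2}\right\rangle = 6.88\,\left(\frac{r}{r_0}\right)^{5/3}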