Maximizing reader insights into the key scientific disciplines of machine tool metrology, this text will prove useful for industrial practitioners and those interested in the operation of machine tools. The book draws extensively on the existing published literature and on information obtained from a wide spectrum of manufacturers of plant, equipment and instrumentation before putting forward novel ideas and methodologies. Providing easy-to-understand bullet points and lucid descriptions of metrological and calibration subjects, the book aids understanding of the topics discussed, while the numerous footnotes used throughout the chapters add further detail. Featuring extensive photographic support, this book will serve as a key reference text for all those involved in the field.
This thesis discusses the physical and information-theoretical limits of optical 3D metrology and, based on these principal considerations, introduces a novel single-shot 3D video camera that works close to these limits. There are serious obstacles to a "perfect" 3D camera: the author explains that it is impossible to achieve a data density better than one third of the available video pixels. Available single-shot 3D cameras so far display a much lower data density, because there is one more obstacle: the object surface must be "encoded" in a non-ambiguous way, commonly by projecting sophisticated patterns. However, encoding consumes space-bandwidth and reduces the output data density. The dissertation explains how this profound dilemma of 3D metrology can be solved, exploiting just two synchronized video cameras and a static projection pattern. The introduced single-shot 3D video camera, designed for macroscopic live scenes, displays an unprecedented quality and density of the 3D point cloud. The lateral resolution and depth precision are limited only by physics. Like a hologram, each movie frame encompasses the full 3D information about the object surface, and the observation perspective can be varied while watching the 3D movie.
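As a back-of-the-envelope illustration of that one-third bound (this arithmetic is not taken from the thesis and assumes, purely for illustration, a standard Full HD sensor of 1920 x 1080 pixels):

\[
N_{\mathrm{3D}} \;\le\; \tfrac{1}{3}\,N_{\mathrm{pixels}} \;=\; \tfrac{1}{3}\times 1920 \times 1080 \;\approx\; 6.9\times 10^{5}\ \text{independent 3D points per frame.}
\]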
Principles of Scientific Methods focuses on the fundamental principles behind scientific methods. The book refers to "science" in a broad sense, including natural science, physics, mathematics, statistics, social science, political science, and engineering science. A principle is often abstract and has broad applicability, while a method is usually concrete and specific. The author uses many concrete examples to explain principles and presents analogies to connect different methods or problems so as to arrive at a general principle or a common notion. When he discusses a particular method, he mainly addresses the big idea behind the method rather than the method itself. The book shows how the principles are applicable not only to scientific research but also to our daily lives. The author explains how scientific methods are used for understanding how and why things happen, making predictions, and learning how to prevent mistakes and solve problems. To study the principles of scientific methods is to think about thinking and to enlighten our understanding of scientific research. Scientific principles are the foundation of scientific methods. In this book, you'll see how the principles reveal the big ideas behind our scientific discoveries and reflect the fundamental beliefs and wisdom of scientists. The principles make the scientific methods coherent and constitute the source of creativity.
The way science is done has changed radically in recent years. Scientific research and institutions, long characterized by passion, dedication and reliability, have less and less capacity for ethical pursuits and are pressed by hard market laws. From the vocation of a few, science has become the profession of many - possibly too many. These trends come with consequences and risks, such as the rise in fraud and plagiarism and, in particular, the sheer volume of scientific publications, often of little relevance. The solution? A slow approach with more emphasis on quality than on quantity that will help us to rediscover the essential role of the responsible scientist. This work is a critical review and assessment of present-day policies and behaviour in scientific production and publication. It touches on the tumultuous growth of scientific journals, in parallel with the growth of self-declared scientists around the world. The author's own reflections and experiences help us to understand the mechanisms of contemporary science. Along with personal reminiscences of times past, the author investigates the loopholes and hoaxes of pretend journals and nonexistent congresses, so common today in the scientific arena. The book also discusses the problems of bibliometric indices, which have resulted in large part from the above distortions of scientific life.
This book covers a wide range of advanced analytical tools, from electrochemical to in-situ/ex-situ material characterization techniques, as well as the modeling of corrosion systems to foster understanding and prediction. When used properly, these tools can enrich our understanding of material performance (metallic materials, coatings, inhibitors) in various environments/contexts (aqueous corrosion, high-temperature corrosion). The book encourages researchers to develop new corrosion-resistant materials and supports them in devising suitable asset integrity strategies. Offering a valuable resource for researchers, industry professionals, and graduate students alike, the book shows them how to apply these valuable analytical tools in their work.
Central to this thesis is the characterisation and exploitation of the electromagnetic properties of light in imaging and measurement systems. To this end, an information-theoretic approach is used to formulate a hitherto lacking, quantitative definition of polarisation resolution, and to establish fundamental precision limits in electromagnetic systems. Furthermore, rigorous modelling tools are developed for the propagation of arbitrary electromagnetic fields, including, for example, stochastic fields exhibiting properties such as partial polarisation, through high numerical aperture optics. Finally, these ideas are applied to the development, characterisation and optimisation of a number of topical optical systems: polarisation imaging, multiplexed optical data storage, and single-molecule measurements. The work has implications for all optical imaging systems where the polarisation of light is of concern.
This is the first book summarizing the theoretical basics of thermal nondestructive testing (TNDT) by combining elements of heat conduction, infrared thermography, and industrial nondestructive testing. The text covers the physical models of TNDT, heat transfer in defective and sound structures, and the thermal properties of materials. Also included are the optimization of TNDT procedures, defect characterization, data processing in TNDT, active and passive TNDT systems, as well as elements of statistical data treatment and decision making. The text contains in-depth descriptions of applications of infrared/thermal testing in aerospace, power production and building, as well as in the conservation of artistic monuments. The book is intended for industrial specialists who are involved in technical diagnostics and nondestructive testing. It may also be useful for academic researchers and for undergraduate, graduate and PhD students.
This immensely practical guide to particle image velocimetry (PIV) provides a condensed yet exhaustive account of most of the information needed for experiments employing the technique. This second edition has updated chapters on the principles, extra information on microscopic, high-speed and three-component measurements, and a description of advanced evaluation techniques. The huge increase in the range of possible applications has also been taken into account: the chapter describing applications of the PIV technique has been expanded accordingly.
This well-illustrated book, by two established historians of school mathematics, documents Thomas Jefferson's quest, after 1775, to introduce a form of decimal currency to the fledgling United States of America. The book describes a remarkable study showing how the United States' decision to adopt a fully decimalized, carefully conceived national currency ultimately had a profound effect on U.S. school mathematics curricula. The book shows, by analyzing a large set of arithmetic textbooks and an even larger set of handwritten cyphering books, that although most eighteenth- and nineteenth-century authors of arithmetic textbooks included sections on vulgar and decimal fractions, most school students who prepared cyphering books did not study either vulgar or decimal fractions. In other words, author-intended school arithmetic curricula were not matched by teacher-implemented school arithmetic curricula. Amazingly, that state of affairs continued even after the U.S. Mint began minting dollars, cents and dimes in the 1790s. In U.S. schools between 1775 and 1810 it was often the case that Federal money was studied but decimal fractions were not. That gradually changed during the first century of the formal existence of the United States of America. By contrast, Chapter 6 reports a comparative analysis of data showing that in Great Britain only a minority of eighteenth- and nineteenth-century school students studied decimal fractions. Clements and Ellerton argue that Jefferson's success in establishing a system of decimalized Federal money had educationally significant effects on implemented school arithmetic curricula in the United States of America. The lens through which Clements and Ellerton have analyzed their large data sets is the lag-time theoretical position that they have developed. That theory posits that the time between when an important mathematical "discovery" is made (or a concept is "created") and when that discovery (or concept) becomes an important part of school mathematics depends on mathematical, social, political and economic factors. Thus, lag time varies from region to region, and from nation to nation. Clements and Ellerton are the first to identify the years after 1775 as the dawn of a new day in U.S. school mathematics; traditionally, historians have argued that nothing in U.S. school mathematics was worthy of serious study until the 1820s. This book emphasizes the importance of the acceptance of decimal currency so far as school mathematics is concerned. It also draws attention to the consequences for school mathematics of the conscious decision of the U.S. Congress not to proceed with Thomas Jefferson's grand scheme for a system of decimalized weights and measures.
This book covers a very broad spectrum of experimental and theoretical activity in particle physics, from the searches for the Higgs boson and physics beyond the Standard Model to detailed studies of Quantum Chromodynamics, the B-physics sector and the properties of hadronic matter at high energy density as realised in heavy-ion collisions. Starting with a basic introduction to the Standard Model and its most likely extensions, the opening section of the book presents an overview of the theoretical and phenomenological framework of hadron collisions and of current theoretical models of frontier physics. In Part II, the discussion of the theory is supplemented by chapters on detector capabilities and search strategies, as well as an overview of the main detector components, the initial calibration procedures, physics samples and early LHC results. Part III completes the volume with a description of the physics behind Monte Carlo event generators and a broad introduction to the main statistical methods used in high energy physics. "LHC Phenomenology" covers all of these topics at a pedagogical level, with the aim of providing young particle physicists with the basic tools required for future work on the various LHC experiments. It will also serve as a useful reference text for those working in the field.
The winner of UCL's annual HEP thesis prize, this work describes an analysis of the data from the second flight of the Antarctic Impulsive Transient Antenna (ANITA). ANITA is a balloon-borne experiment that searches for radio signals originating from ultra-high-energy neutrinos and cosmic rays interacting with the Antarctic ice or air. The search for ultra-high-energy neutrinos of astrophysical origin is one of the outstanding experimental challenges of the 21st century. The ANITA experiment was designed to be the most sensitive instrument to ultra-high-energy neutrinos that originate from the interactions of cosmic rays with the cosmic microwave background. The methodology and results of the neutrino and cosmic-ray searches are presented in this thesis.
This up-to-date treatment of recent developments in geometric inverse problems introduces graduate students and researchers to an exciting area of research. With an emphasis on the two-dimensional case, topics covered include geodesic X-ray transforms, boundary rigidity, tensor tomography, attenuated X-ray transforms and the Calderón problem. The presentation is self-contained and begins with the Radon transform and radial sound speeds as motivating examples. The required geometric background is developed in detail in the context of simple manifolds with boundary. An in-depth analysis of various geodesic X-ray transforms is carried out together with related uniqueness, stability, reconstruction and range characterization results. Highlights include a proof of boundary rigidity for simple surfaces as well as scattering rigidity for connections. The concluding chapter discusses current open problems and related topics. The numerous exercises and examples make this book an excellent self-study resource or text for a one-semester course or seminar.
Applied Photometry, Radiometry, and Measurements of Optical Losses reviews and analyzes the physical concepts of radiation transfer, providing a quantitative foundation for measuring the optical losses that affect the propagation and distribution of light waves in various media and in diverse optical systems and components. A comprehensive analysis of advanced methodologies for low-loss detection is presented in comparison with classic photometric and radiometric observations, and a broad range of techniques is examined and summarized: from interferometric and calorimetric, resonator and polarization, phase-shift and ring-down decay, and wavelength and frequency modulation, to pulse-separation and resonant, acousto-optic and emissive methods, which are subsequently compared with direct and balancing methods for studying free-space and polarization optics, fibers and waveguides. The material focuses on applying optical methods and procedures to the evaluation of transparent, reflecting, scattering, absorbing, and aggregated objects, and to the determination of the power and energy parameters of radiation and the color properties of light.
This ready reference surveys the discipline of standards and standardization, defining common terms, clarifying descriptions, describing how standards could be used to restrain trade, and explaining how international trade is stimulated by the due process provisions of standards-writing organizations. Containing real-world examples provided by experienced standards professionals, Standardization Essentials is a vital, forward-looking reference for mechanical, civil, electrical and electronics, materials, chemical, mineral, cost, quality, reliability, industrial, developmental, safety, forensic, and consulting engineers; standards managers; architects; project managers; and upper-level undergraduate, graduate, and continuing education students in these disciplines. It crystallizes the essential role that standards play in strategic standardization management, purchasing, contractual agreements, and international trade. Covering the costs, benefits, limitations, uses, and abuses of standardization programs, Standardization Essentials:
- Considers whether standards build or bar trade and the use of international standards to leverage world markets
- Presents a case study of conformity assessment related to international technical trade barriers
- Focuses on consumer safety standards for automobile tires and other products
- Addresses implementation of the ISO 9000 and ISO 14000 management system standards in industry
- Highlights voluntary (nongovernmental) and mandatory (governmental) standards and regulations developed by a variety of organizations
- Reveals competition, incongruities, and harmonization among national and international standards
The theoretical foundations of the Standard Model of elementary particles rely on the existence of the Higgs boson, a particle which was revealed for the first time by the experiments run at the Large Hadron Collider (LHC) in 2012. As the Higgs boson is an unstable particle, its search strategies were based on its decay products. In this thesis, Francesco Pandolfi conducted a search for the Higgs boson in the H → ZZ → l⁺l⁻qq decay channel with 4.6 fb⁻¹ of 7 TeV proton-proton collision data collected by the Compact Muon Solenoid (CMS) experiment. The presence of jets in the final state poses a series of challenges to the experimenter, both from a technical point of view, as jets are complex objects and require ad hoc reconstruction techniques, and from an analytical one, as backgrounds with jets are copious at hadron colliders, so analyses must achieve high degrees of background rejection in order to reach competitive sensitivity. This is accomplished by following two directives: the use of an angular likelihood discriminant, capable of separating events likely to originate from the decay of a scalar boson from non-resonant backgrounds, and the use of jet parton-flavor tagging, selecting jets compatible with quark hadronization and discarding jets more likely to be initiated by gluons. The events passing the selection requirements in the 4.6 fb⁻¹ of data collected by the CMS detector are examined in search of a possible signal compatible with the decay of a heavy Higgs boson. The thesis describes the statistical tools and the results of this analysis. This work is a paradigm for studies of the Higgs boson in final states with jets. Non-expert physicists will enjoy a complete and eminently readable description of a proton-proton collider analysis, while the expert reader will learn the details of the searches done with jets at CMS.
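The angular likelihood discriminant mentioned above is, at its core, a likelihood ratio built from the probability densities of the decay angles under the signal and background hypotheses. The following minimal Python sketch illustrates only this general idea with a single invented angular variable and toy densities; it is not the CMS implementation, and all distributions and parameter values are assumptions made purely for illustration.

```python
import math

# Toy one-dimensional angular variable cos(theta*) in [-1, 1].
# The densities below are illustrative assumptions, not the CMS PDFs:
# a signal peaked near cos(theta*) = 0 versus a flat non-resonant background.

def p_signal(cos_theta, sigma=0.4):
    """Hypothetical signal density: a Gaussian centred at zero."""
    return math.exp(-0.5 * (cos_theta / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def p_background(cos_theta):
    """Hypothetical background density: uniform on [-1, 1]."""
    return 0.5 if -1.0 <= cos_theta <= 1.0 else 0.0

def likelihood_discriminant(cos_theta):
    """D = P_sig / (P_sig + P_bkg); values near 1 are signal-like, near 0 background-like."""
    ps, pb = p_signal(cos_theta), p_background(cos_theta)
    return ps / (ps + pb)

for x in (-0.9, -0.3, 0.0, 0.5):
    print(f"cos(theta*) = {x:+.1f}  ->  D = {likelihood_discriminant(x):.3f}")
```

In a real analysis the same construction is applied to the full set of decay angles, and events are kept only if the discriminant exceeds a chosen threshold.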
The scientific method delivers prosperity, yet scientific practice has become subject to corrupting influences from within and without the scientific community. This essential reference is intended to help remedy those threats. The authors identify eight essential criteria for the practice of science and provide checklists to help avoid costly failures in scientific practice. Not only for scientists, this book is for all stakeholders of the broad enterprise of science. Science administrators, research funders, journal editors, and policymakers alike will find practical guidance on how they can encourage scientific research that produces useful discoveries. Journalists, commentators, and lawyers can turn to this text for help with assessing the validity and usefulness of scientific claims. The book provides practical guidance and makes important recommendations for reforms in science policy and science administration. The message of the book is complemented by Nobel Laureate Vernon L. Smith's foreword, and an afterword by Terence Kealey.
This thesis represents one of the most comprehensive and in-depth studies of the use of Lorentz-boosted hadronic final-state systems in the search for signals of Supersymmetry conducted to date at the Large Hadron Collider. A thorough assessment is performed of the observables that provide enhanced sensitivity to new physics signals otherwise hidden under an enormous background of top quark pairs produced by Standard Model processes. This is complemented by an ingenious analysis optimization procedure that extended the reach of the analysis by hundreds of GeV in the mass of these hypothetical new particles. Lastly, the combination of deep, thoughtful physics analysis with the development of high-speed electronics for identifying and selecting these same objects is not only unique but also revolutionary. The Global Feature Extraction system that the author played a critical role in bringing to fruition represents the first dedicated hardware device for selecting these Lorentz-boosted hadronic systems in real time using state-of-the-art processing chips and embedded systems.
This book provides the basic concepts and fundamental principles of dynamic systems, including experimental methods, calibration, signal conditioning, data acquisition and processing, as well as the presentation of results. How to select suitable sensors for a given measurement is also introduced. It is an essential reference for students, lecturers, professionals and any lay readers interested in measurement technology.
This thesis deals with two main procedures performed with the ATLAS detector at the Large Hadron Collider (LHC). The first is the description of the noise in the hadronic calorimeter TileCal, a very valuable technical task. The second is a fruitful physics analysis: the cross-section measurement of the process p + p → Z⁰ → τ⁺τ⁻. The Monte Carlo simulations of the TileCal are described in the first part of the thesis, including a detailed treatment of the electronic noise and of multiple interactions (so-called pile-up). An accurate description of both is crucial for the reconstruction of, for example, jets or hadronic tau-jets. The second part reports a Standard Model measurement of the Z⁰ → τ⁺τ⁻ process, with emphasis on the final state with an electron and a hadronically decaying tau lepton. The Z⁰ → τ⁺τ⁻ channel forms the dominant background in the search for Higgs bosons decaying into tau-lepton pairs, and thus the good understanding achieved here can facilitate more sensitive Higgs detection.
This book presents the state-of-the-art in infrared thermography (IRT) applications with a focus on moisture assessment in buildings. It also offers practical discussions of several case studies, including comparisons of IRT with other surface temperature measurement techniques. In closing, it demonstrates how IRT can be used to assess capillary absorption, and addresses moisture in walls due to wind-driven rain infiltrations, and the drying process. The book equips readers with a deeper understanding of the ideal conditions for accurate IRT assessment and offers practical recommendations.
This book is intended as a guide to the analysis and presentation of experimental results. It develops various techniques for the numerical processing of experimental data, using basic statistical methods and the theory of errors. After presenting the basic theoretical concepts, the book describes the methods by which results can be presented, both numerically and graphically. The book is divided into three parts of roughly equal length, addressing the theory, the analysis of data, and the presentation of results. Examples are given and problems are solved using the Excel, Origin, Python and R software packages. In addition, programs in all four languages are made available to readers, allowing them to analyze and present the results of their own experiments. Subjects are treated at a level appropriate for undergraduate students in the natural sciences, but the book should also appeal to anyone whose work involves dealing with experimental results.
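As a flavour of the kind of numerical processing described above, the sketch below (not taken from the book; the measurement values are invented) uses plain Python to compute the mean, the sample standard deviation and the standard error of the mean for a set of repeated measurements, and reports the result in the usual value ± uncertainty form.

```python
import math

# Invented repeated measurements of the same quantity (e.g. a length in mm).
measurements = [9.81, 9.79, 9.84, 9.80, 9.83, 9.78, 9.82]

n = len(measurements)
mean = sum(measurements) / n

# Sample standard deviation (n - 1 in the denominator) and standard error of the mean.
variance = sum((x - mean) ** 2 for x in measurements) / (n - 1)
std_dev = math.sqrt(variance)
std_error = std_dev / math.sqrt(n)

print(f"mean               = {mean:.3f}")
print(f"standard deviation = {std_dev:.3f}")
print(f"standard error     = {std_error:.3f}")
print(f"result: {mean:.3f} +/- {std_error:.3f}")
```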
This book focuses on the development and set-up of fibre Bragg grating (FBG) and no-core fibre (NCF) sensors. It discusses the properties of the sensors and modelling of the resulting devices, which include electronic, optoelectronic, photovoltaic, and spintronic devices. In addition to providing detailed explanations of the properties of FBG and NCF sensors, it features a wealth of instructive illustrations and tables, helping to visualize the respective devices' functions.
This book brings together reviews from leading international authorities on developments in the study of dark matter and dark energy, as seen from both their cosmological and particle physics sides. Studying the physical and astrophysical properties of the dark components of our Universe is a crucial step towards the ultimate goal of unveiling their nature. The work developed from a doctoral school sponsored by the Italian Society of General Relativity and Gravitation. The book starts with a concise introduction to the standard cosmological model, as well as with a presentation of the theory of linear perturbations around a homogeneous and isotropic background. It covers the particle physics and cosmological aspects of dark matter and (dynamical) dark energy, including a discussion of how modified theories of gravity could provide a possible candidate for dark energy. A detailed presentation is also given of the possible ways of testing the theory in terms of cosmic microwave background, galaxy redshift surveys and weak gravitational lensing observations. Included is a chapter reviewing extensively the direct and indirect methods of detection of the hypothetical dark matter particles. Also included is a self-contained introduction to the techniques and most important results of numerical (e.g. N-body) simulations in cosmology. This volume will be useful to researchers, PhD and graduate students in Astrophysics, Cosmology, Physics and Mathematics who are interested in cosmology, dark matter and dark energy.