The purpose of this book is to present methods for developing, evaluating and maintaining rater-mediated assessment systems. Rater-mediated assessments involve ratings that are assigned by raters to persons responding to constructed-response items (e.g., written essays and teacher portfolios) and other types of performance assessments. This book addresses the following topics: (1) introduction to the principles of invariant measurement, (2) application of the principles of invariant measurement to rater-mediated assessments, (3) description of the lens model for rater judgments, (4) integration of principles of invariant measurement with the lens model of cognitive processes of raters, (5) illustration of substantive and psychometric issues related to rater-mediated assessments in terms of validity, reliability, and fairness, and (6) discussion of theoretical and practical issues related to rater-mediated assessment systems. Invariant measurement is fast becoming the dominant paradigm for assessment systems around the world, and this book provides an invaluable resource for graduate students, measurement practitioners, substantive theorists in the human sciences, and other individuals interested in invariant measurement when judgments are obtained with rating scales.
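For readers who want a concrete anchor (an addition to this description, not the book's own notation): rater-mediated assessment under invariant measurement is commonly formalized with a many-facet Rasch model, in which the log-odds of adjacent rating categories decompose additively into facets:

```latex
% Many-facet Rasch rating scale model (illustrative notation):
\ln\!\frac{P_{nijk}}{P_{nij(k-1)}}
  \;=\; \theta_n \;-\; \delta_i \;-\; \lambda_j \;-\; \tau_k
```

Here \theta_n is the person's location, \delta_i the task difficulty, \lambda_j the rater's severity, and \tau_k the threshold of category k. Invariance requires, for example, that comparisons between persons do not depend on which rater happened to score them.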
This book presents a comprehensive review of the most important methods used in the characterisation of piezoelectric, ferroelectric and pyroelectric materials. It covers techniques for the analysis of bulk materials and thick and thin film materials and devices. There is a growing demand by industry to adapt and integrate piezoelectric materials into ever smaller devices and structures. Such applications development requires the joint development of reliable, robust, accurate and - most importantly - relevant and applicable measurement and characterisation methods and models. In the past few years there has been a rapid development of new techniques to model and measure the variety of properties that are deemed important for applications development engineers and scientists. The book has been written by the leaders in the field, and many chapters represent established measurement best practice, with a strong emphasis on application of the methods via worked examples and detailed experimental procedural descriptions. Each chapter contains numerous diagrams, images, and measurement data, all of which are fully referenced and indexed. The book is intended to occupy a place in the research or technical lab, and will be a valuable and practical resource for students, materials scientists, engineers, and lab technicians.
Maximizing reader insights into the key scientific disciplines of Machine Tool Metrology, this text will prove useful for the industrial practitioner and those interested in the operation of machine tools. Reflecting current industrial practice, this book draws extensively on the published literature and on information obtained from a wide spectrum of manufacturers of plant, equipment and instrumentation before putting forward novel ideas and methodologies. Providing easy-to-understand bullet points and lucid descriptions of metrological and calibration subjects, it aids reader understanding of the topics discussed, while a generous body of footnotes throughout all of the chapters adds further detail to the subject. Featuring extensive photographic support, this book will serve as a key reference text for all those involved in the field.
This thesis discusses the physical and information-theoretical limits of optical 3D metrology and, based on these principal considerations, introduces a novel single-shot 3D video camera that works close to these limits. There are serious obstacles to a "perfect" 3D camera: the author explains that it is impossible to achieve a data density better than one third of the available video pixels. Available single-shot 3D cameras display a much lower data density, however, because there is one more obstacle: the object surface must be "encoded" in a non-ambiguous way, commonly by projecting sophisticated patterns. Yet encoding consumes space-bandwidth and thus reduces the output data density. The dissertation explains how this profound dilemma of 3D metrology can be solved using just two synchronized video cameras and a static projection pattern. The resulting single-shot 3D video camera, designed for macroscopic live scenes, displays an unprecedented quality and density of the 3D point cloud, with lateral resolution and depth precision limited only by physics. Like a hologram, each movie frame encompasses the full 3D information about the object surface, and the observation perspective can be varied while watching the 3D movie.
Principles of Scientific Methods focuses on the fundamental principles behind scientific methods. The book refers to "science" in a broad sense, including natural science, physics, mathematics, statistics, social science, political science, and engineering science. A principle is often abstract and has broad applicability, while a method is usually concrete and specific. The author uses many concrete examples to explain principles and presents analogies to connect different methods or problems so as to arrive at a general principle or a common notion. He mainly discusses a particular method to address the great idea behind it, not the method itself. The book shows how the principles are applicable not only to scientific research but also to our daily lives. The author explains how scientific methods are used for understanding how and why things happen, making predictions, and learning how to prevent mistakes and solve problems. To study the principles of scientific methods is to think about thinking and to enlighten our understanding of scientific research. Scientific principles are the foundation of scientific methods. In this book, you'll see how the principles reveal the big ideas behind our scientific discoveries and reflect the fundamental beliefs and wisdom of scientists. The principles make the scientific methods coherent and constitute the source of creativity.
The way science is done has changed radically in recent years. Scientific research and institutions, long characterized by passion, dedication and reliability, have ever less room for ethical pursuits and are pressed by hard market laws. From the vocation of a few, science has become the profession of many - possibly too many. These trends come with consequences and risks, such as the rise in fraud and plagiarism, and in particular the sheer volume of scientific publications, often of little relevance. The solution? A slow approach, with more emphasis on quality than on quantity, that will help us to rediscover the essential role of the responsible scientist. This work is a critical review and assessment of present-day policies and behavior in scientific production and publication. It touches on the tumultuous growth of scientific journals, in parallel with the growth of self-declared scientists around the world. The author's own reflections and experiences help us to understand the mechanisms of contemporary science. Along with personal reminiscences of times past, the author investigates the loopholes and hoaxes of pretend journals and nonexistent congresses, so common today in the scientific arena. The book also discusses the problems of bibliometric indices, which have resulted in large part from the above distortions of scientific life.
This book covers a wide range of advanced analytical tools, from electrochemical to in-situ/ex-situ material characterization techniques, as well as the modeling of corrosion systems to foster understanding and prediction. When used properly, these tools can enrich our understanding of material performance (metallic materials, coatings, inhibitors) in various environments/contexts (aqueous corrosion, high-temperature corrosion). The book encourages researchers to develop new corrosion-resistant materials and supports them in devising suitable asset integrity strategies. Offering a valuable resource for researchers, industry professionals, and graduate students alike, the book shows readers how to apply these analytical tools in their own work.
Central to this thesis is the characterisation and exploitation of the electromagnetic properties of light in imaging and measurement systems. To this end, an information-theoretic approach is used to formulate a hitherto lacking quantitative definition of polarisation resolution, and to establish fundamental precision limits in electromagnetic systems. Furthermore, rigorous modelling tools are developed for the propagation of arbitrary electromagnetic fields, including for example stochastic fields exhibiting properties such as partial polarisation, through high-numerical-aperture optics. Finally, these ideas are applied to the development, characterisation and optimisation of a number of topical optical systems: polarisation imaging; multiplexed optical data storage; and single-molecule measurements. The work has implications for all optical imaging systems where the polarisation of light is of concern.
This is the first book summarizing the theoretical basics of thermal nondestructive testing (TNDT) by combining elements of heat conduction, infrared thermography, and industrial nondestructive testing. The text covers the physical models of TNDT, heat transfer in defective and sound structures, and the thermal properties of materials. Also included are the optimization of TNDT procedures, defect characterization, data processing in TNDT, and active and passive TNDT systems, as well as elements of statistical data treatment and decision making. The text contains in-depth descriptions of applications of infrared/thermal testing in aerospace, power production and building, as well as in the conservation of artistic monuments. The book is intended for industrial specialists who are involved in technical diagnostics and nondestructive testing. It may also be useful for academic researchers and undergraduate, graduate and PhD university students.
This immensely practical guide to particle image velocimetry (PIV) provides a condensed yet exhaustive account of most of the information needed for experiments employing the technique. This second edition has updated chapters on the principles, extra information on microscopic, high-speed and three-component measurements, and a description of advanced evaluation techniques. What's more, the huge increase in the range of possible applications has been taken into account, and the chapter describing applications of the PIV technique has been expanded accordingly.
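To make the evaluation step concrete (this sketch is an addition, not material from the book): standard PIV estimates local displacement by cross-correlating small interrogation windows from two successive images and locating the correlation peak. A minimal NumPy version, assuming square windows and integer-pixel accuracy, might look like this:

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Estimate the mean particle-image displacement (dy, dx) in pixels
    between two interrogation windows via FFT cross-correlation."""
    a = win_a - win_a.mean()          # remove the mean grey level
    b = win_b - win_b.mean()
    # Correlation theorem: cross-correlation = IFFT(conj(FFT(a)) * FFT(b))
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    corr = np.fft.fftshift(corr)      # put zero displacement at the centre
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    centre = np.array(corr.shape) // 2
    return np.array(peak) - centre    # peak offset = displacement

# Hypothetical usage: a random particle pattern shifted by (3, -2) pixels
rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(piv_displacement(img, np.roll(img, (3, -2), axis=(0, 1))))  # [ 3 -2]
```

Real implementations add sub-pixel peak interpolation, window weighting and outlier validation, which is where the book's "advanced evaluation techniques" come in.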
This well-illustrated book, by two established historians of school mathematics, documents Thomas Jefferson's quest, after 1775, to introduce a form of decimal currency to the fledgling United States of America. The book describes a remarkable study showing how the United States' decision to adopt a fully decimalized, carefully conceived national currency ultimately had a profound effect on U.S. school mathematics curricula. The book shows, by analyzing a large set of arithmetic textbooks and an even larger set of handwritten cyphering books, that although most eighteenth- and nineteenth-century authors of arithmetic textbooks included sections on vulgar and decimal fractions, most school students who prepared cyphering books did not study either vulgar or decimal fractions. In other words, author-intended school arithmetic curricula were not matched by teacher-implemented school arithmetic curricula. Amazingly, that state of affairs continued even after the U.S. Mint began minting dollars, cents and dimes in the 1790s. In U.S. schools between 1775 and 1810 it was often the case that Federal money was studied but decimal fractions were not. That gradually changed during the first century of the formal existence of the United States of America. By contrast, Chapter 6 reports a comparative analysis of data showing that in Great Britain only a minority of eighteenth- and nineteenth-century school students studied decimal fractions. Clements and Ellerton argue that Jefferson's success in establishing a system of decimalized Federal money had educationally significant effects on implemented school arithmetic curricula in the United States of America. The lens through which Clements and Ellerton have analyzed their large data sets has been the lag-time theoretical position which they have developed. That theory posits that the time between when an important mathematical "discovery" is made (or a concept is "created") and when that discovery (or concept) becomes an important part of school mathematics is dependent on mathematical, social, political and economic factors. Thus, lag time varies from region to region, and from nation to nation. Clements and Ellerton are the first to identify the years after 1775 as the dawn of a new day in U.S. school mathematics; traditionally, historians have argued that nothing in U.S. school mathematics was worthy of serious study until the 1820s. This book emphasizes the importance of the acceptance of decimal currency so far as school mathematics is concerned. It also draws attention to the consequences for school mathematics of the conscious decision of the U.S. Congress not to proceed with Thomas Jefferson's grand scheme for a system of decimalized weights and measures.
This book covers a very broad spectrum of experimental and theoretical activity in particle physics, from the searches for the Higgs boson and physics beyond the Standard Model, to detailed studies of Quantum Chromodynamics, the B-physics sectors and the properties of hadronic matter at high energy density as realised in heavy-ion collisions. Starting with a basic introduction to the Standard Model and its most likely extensions, the opening section of the book presents an overview of the theoretical and phenomenological framework of hadron collisions and current theoretical models of frontier physics. In Part II, discussion of the theory is supplemented by chapters on the detector capabilities and search strategies, as well as an overview of the main detector components, the initial calibration procedures, physics samples and early LHC results. Part III completes the volume with a description of the physics behind Monte Carlo event generators and a broad introduction to the main statistical methods used in high energy physics. "LHC Phenomenology" covers all of these topics at a pedagogical level, with the aim of providing young particle physicists with the basic tools required for future work on the various LHC experiments. It will also serve as a useful reference text for those working in the field.
The winner of UCL's annual HEP thesis prize, this work describes an analysis of the data from the second flight of the Antarctica Impulsive Transient Antenna (ANITA). ANITA is a balloon-borne experiment that searches for radio signals originating from ultra-high energy neutrinos and cosmic rays interacting with the Antarctic ice or air. The search for ultra-high energy neutrinos of astrophysical origin is one of the outstanding experimental challenges of the 21st century. The ANITA experiment was designed to be the most sensitive instrument to ultra-high energy neutrinos that originate from the interactions of cosmic rays with the cosmic microwave background. The methodology and results of the neutrino and cosmic ray searches are presented in the thesis.
This up-to-date treatment of recent developments in geometric inverse problems introduces graduate students and researchers to an exciting area of research. With an emphasis on the two-dimensional case, topics covered include geodesic X-ray transforms, boundary rigidity, tensor tomography, attenuated X-ray transforms and the Calderón problem. The presentation is self-contained and begins with the Radon transform and radial sound speeds as motivating examples. The required geometric background is developed in detail in the context of simple manifolds with boundary. An in-depth analysis of various geodesic X-ray transforms is carried out together with related uniqueness, stability, reconstruction and range characterization results. Highlights include a proof of boundary rigidity for simple surfaces as well as scattering rigidity for connections. The concluding chapter discusses current open problems and related topics. The numerous exercises and examples make this book an excellent self-study resource or text for a one-semester course or seminar.
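As a pointer for the reader (an addition, not the book's own text), the Radon transform used as the motivating example integrates a function over straight lines in the plane; in the standard parametrisation by normal angle \theta and signed distance s,

```latex
% Integral of f over the line with unit normal (cos\theta, sin\theta)
% at signed distance s from the origin:
Rf(s,\theta) \;=\; \int_{-\infty}^{\infty}
  f\bigl(s\cos\theta - t\sin\theta,\; s\sin\theta + t\cos\theta\bigr)\,dt
```

The geodesic X-ray transforms studied in the book replace these straight lines by geodesics of a Riemannian metric.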
This ready reference surveys the discipline of standards and standardization, defining common terms, clarifying descriptions, describing how standards could be used to restrain trade, and explaining how international trade is stimulated by the due process provisions of standards-writing organizations. Containing real-world examples provided by experienced standards professionals, Standardization Essentials is a vital, forward-looking reference for mechanical, civil, electrical and electronics, materials, chemical, mineral, cost, quality, reliability, industrial, developmental, safety, forensic, and consulting engineers; standards managers; architects; project managers; and upper-level undergraduate, graduate, and continuing education students in these disciplines. It crystallizes the essential role that standards play in strategic standardization management, purchasing, contractual agreements, and international trade. Covering costs, benefits, limitations, uses, and abuses of standardization programs, Standardization Essentials:
- Considers whether standards build or bar trade, and the use of international standards to leverage world markets
- Presents a case study of conformity assessment related to international technical trade barriers
- Focuses on consumer safety standards for automobile tires and other products
- Addresses implementation of the ISO 9000 and ISO 14000 management system standards in industry
- Highlights voluntary (nongovernmental) and mandatory (governmental) standards and regulations developed by a variety of organizations
- Reveals competition, incongruities, and harmonization among national and international standards
The theoretical foundations of the Standard Model of elementary particles rely on the existence of the Higgs boson, a particle revealed for the first time by the experiments run at the Large Hadron Collider (LHC) in 2012. As the Higgs boson is an unstable particle, search strategies were based on its decay products. In this thesis, Francesco Pandolfi conducted a search for the Higgs boson in the H → ZZ → l⁺l⁻qq decay channel with 4.6 fb⁻¹ of 7 TeV proton-proton collision data collected by the Compact Muon Solenoid (CMS) experiment. The presence of jets in the final state poses a series of challenges to the experimenter, both from a technical point of view, as jets are complex objects and necessitate ad hoc reconstruction techniques, and from an analytical one, as backgrounds with jets are copious at hadron colliders, so analyses must achieve high degrees of background rejection in order to reach competitive sensitivity. This is accomplished by following two directives: the use of an angular likelihood discriminant, capable of separating events likely to originate from the decay of a scalar boson from non-resonant backgrounds, and the use of jet parton-flavor tagging, selecting jets compatible with quark hadronization and discarding jets more likely to be initiated by gluons. The events passing the selection requirements in 4.6 fb⁻¹ of data collected by the CMS detector are examined in search of a possible signal compatible with the decay of a heavy Higgs boson. The thesis describes the statistical tools and the results of this analysis. This work is a paradigm for studies of the Higgs boson in final states with jets. Non-expert physicists will enjoy a complete and eminently readable description of a proton-proton collider analysis, while the expert reader will learn the details of the searches done with jets at CMS.
The scientific method delivers prosperity, yet scientific practice has become subject to corrupting influences from within and without the scientific community. This essential reference is intended to help remedy those threats. The authors identify eight essential criteria for the practice of science and provide checklists to help avoid costly failures in scientific practice. Not only for scientists, this book is for all stakeholders of the broad enterprise of science. Science administrators, research funders, journal editors, and policymakers alike will find practical guidance on how they can encourage scientific research that produces useful discoveries. Journalists, commentators, and lawyers can turn to this text for help with assessing the validity and usefulness of scientific claims. The book provides practical guidance and makes important recommendations for reforms in science policy and science administration. The message of the book is complemented by Nobel Laureate Vernon L. Smith's foreword, and an afterword by Terence Kealey.
This book presents the practical aspects of mass measurements. Concepts of gravitational, inertial and conventional mass are described, along with details of the variation of the acceleration of gravity. The Metre Convention, the International Prototype Kilogram and the BIPM standards are described. The effect of a change of gravity on the indication of electronic balances is derived with respect to latitude, altitude and earth topography. The classification of weights by OIML is discussed. Maximum permissible errors in the different categories of weights prescribed by national and international organizations are presented. Starting with the necessity of redefining the unit kilogram in terms of physical constants, various methods of defining the kilogram in this way are described: via the Avogadro constant, ion accumulation of some heavy elements, levitation, the voltage balance, or the watt balance. The detection of very small masses, of the order of a zeptogram, through nanotechnology is also discussed. The latest recommendations of the CIPM are given.
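As a hedged illustration of the gravity effect mentioned above (a sketch added here, not taken from the book): an electronic balance senses force, so its indication scales with local g, which can be approximated from latitude and altitude with the 1980 international gravity formula plus the free-air correction. The calibration and use sites below are hypothetical.

```python
import math

def local_gravity(latitude_deg, altitude_m):
    """Approximate local acceleration of gravity (m/s^2) from the
    1980 international gravity formula plus the free-air correction."""
    phi = math.radians(latitude_deg)
    g0 = 9.780327 * (1 + 0.0053024 * math.sin(phi) ** 2
                       - 0.0000058 * math.sin(2 * phi) ** 2)
    return g0 - 3.086e-6 * altitude_m  # free-air gradient, ~3.086 um/s^2 per m

# A force-measuring balance calibrated at one site and used at another
# is in error by the relative change in g between the two sites:
g_cal = local_gravity(52.0, 0.0)      # hypothetical calibration site
g_use = local_gravity(0.0, 2000.0)    # hypothetical site of use
print(f"relative indication error: {(g_use - g_cal) / g_cal:.2e}")
```

The few-parts-in-10^3 shift this prints is exactly why such balances must be recalibrated, or gravity-corrected, when moved.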
This thesis represents one of the most comprehensive and in-depth studies of the use of Lorentz-boosted hadronic final-state systems in the search for signals of Supersymmetry conducted to date at the Large Hadron Collider. A thorough assessment is performed of the observables that provide enhanced sensitivity to new physics signals otherwise hidden under an enormous background of top quark pairs produced by Standard Model processes. This is complemented by an ingenious analysis optimization procedure that allowed the reach of this analysis to be extended by hundreds of GeV in the mass of these hypothetical new particles. Lastly, the combination of deep, thoughtful physics analysis with the development of high-speed electronics for identifying and selecting these same objects is not only unique, but also revolutionary. The Global Feature Extraction system that the author played a critical role in bringing to fruition represents the first dedicated hardware device for selecting these Lorentz-boosted hadronic systems in real time using state-of-the-art processing chips and embedded systems.
This book presents the basic concepts and fundamental principles of dynamic systems, including experimental methods, calibration, signal conditioning, data acquisition and processing, as well as the presentation of results. How to select suitable sensors for a given measurement is also introduced. It is an essential reference for students, lecturers, professionals and interested lay readers in measurement technology.
Applied Photometry, Radiometry, and Measurements of Optical Losses reviews and analyzes physical concepts of radiation transfer, providing a quantitative foundation for the measurement of optical losses, which affect the propagation and distribution of light waves in various media and in diverse optical systems and components. A comprehensive analysis of advanced methodologies for low-loss detection is outlined in comparison with classic photometric and radiometric observations, with a broad range of techniques examined and summarized: from interferometric and calorimetric, resonator and polarization, phase-shift and ring-down decay, wavelength and frequency modulation, to pulse-separation and resonant, acousto-optic and emissive methods, subsequently compared with direct and balancing methods for studying free-space and polarization optics, fibers and waveguides. The material focuses on applying optical methods and procedures to the evaluation of transparent, reflecting, scattering, absorbing, and aggregated objects, and to the determination of the power and energy parameters of radiation and the color properties of light.
This thesis deals with two main procedures performed with the ATLAS detector at the Large Hadron Collider (LHC). The noise description in the hadronic calorimeter TileCal represents a very valuable technical job. The second part presents a fruitful physics analysis: the cross-section measurement of the process p + p → Z⁰ → τ⁺τ⁻. The Monte Carlo simulations of the TileCal are described in the first part of the thesis, including a detailed treatment of the electronic noise and of multiple interactions (so-called pile-up). An accurate description of both is crucial for the reconstruction of, e.g., jets or hadronic tau-jets. The second part reports a Standard Model measurement of the Z⁰ → τ⁺τ⁻ process with the emphasis on the final state with an electron and a hadronically decaying tau-lepton. The Z⁰ → τ⁺τ⁻ channel forms the dominant background in the search for Higgs bosons decaying into tau-lepton pairs, and thus the good understanding achieved here can facilitate more sensitive Higgs detection.
A broad class of accelerators rests on the induction principle, whereby the accelerating electrical fields are generated by time-varying magnetic fluxes. Because such machines are particularly suitable for the transport of bright, high-intensity beams of electrons, protons or heavy ions in any geometry (linear or circular), the research and development of induction accelerators is a thriving subfield of accelerator physics. This text is the first comprehensive account of both the fundamentals and the state of the art of the modern conceptual design and implementation of such devices. Accordingly, the first part of the book is devoted to the essential features of, and key technologies used for, induction accelerators, at a level suitable for postgraduate students and newcomers to the field. Subsequent chapters deal with more specialized and advanced topics.
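To fix ideas (standard physics added here, not quoted from the book): the induction principle is Faraday's law, by which the effective gap voltage seen by a beam traversing a path C that links a changing magnetic flux \Phi_B is

```latex
% Faraday's law of induction applied to the beam path C:
V_{\mathrm{gap}} \;=\; \oint_{C} \mathbf{E}\cdot d\boldsymbol{\ell}
  \;=\; -\,\frac{d\Phi_B}{dt}
```

so the accelerating field exists without any DC voltage across a gap, which is part of what allows induction accelerators to drive the bright, high-intensity beams the blurb mentions.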
This book presents the state of the art in infrared thermography (IRT) applications with a focus on moisture assessment in buildings. It also offers practical discussions of several case studies, including comparisons of IRT with other surface temperature measurement techniques. In closing, it demonstrates how IRT can be used to assess capillary absorption, and addresses moisture in walls due to wind-driven rain infiltrations, and the drying process.