There is an increasing demand for dynamic systems to become safer, more reliable and more economical in operation. This requirement extends beyond the normally accepted safety-critical systems, e.g. nuclear reactors, aircraft and many chemical processes, to systems such as autonomous vehicles and some process control systems where system availability is vital. The field of fault diagnosis for dynamic systems (including fault detection and isolation) has become an important topic of research. Many applications of qualitative and quantitative modelling, statistical processing and neural networks are now being planned and developed in complex engineering systems. Issues of Fault Diagnosis for Dynamic Systems has been prepared by experts in fault detection and isolation (FDI) and fault diagnosis with wide-ranging experience. Subjects featured include: - Real plant application studies; - Non-linear observer methods; - Robust approaches to FDI; - The use of parity equations; - Statistical process monitoring; - Qualitative modelling for diagnosis; - Parameter estimation approaches to FDI; - Fault diagnosis for descriptor systems; - FDI in inertial navigation; - Structured approaches to FDI; - Change detection methods; - Bio-medical studies. Researchers and industrial experts will appreciate the combination of practical issues and mathematical theory with many examples. Control engineers will profit from the application studies.
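As an illustration of the residual-based idea underlying several of these FDI approaches (observers, parity equations), the following minimal sketch compares each measurement against a one-step model prediction and flags a fault when the residual exceeds a threshold. The model, gains, threshold and data are invented for illustration and are not taken from the book.

```python
# Minimal residual-based fault detection sketch (illustrative only).
# A first-order model x[k+1] = a*x[k] + b*u[k] predicts the sensor reading;
# a fault is flagged when |measured - predicted| exceeds a fixed threshold.

def detect_faults(measurements, inputs, a=0.9, b=0.1, threshold=0.5):
    """Return indices where the residual exceeds the threshold."""
    faults = []
    x_pred = measurements[0]          # initialize model state from first sample
    for k in range(1, len(measurements)):
        x_pred = a * x_pred + b * inputs[k - 1]   # model prediction
        residual = measurements[k] - x_pred       # innovation / parity residual
        if abs(residual) > threshold:
            faults.append(k)
        x_pred = measurements[k]      # re-synchronize with the measurement
    return faults

# Example: constant input, with an abrupt sensor bias injected at k = 6.
u = [1.0] * 10
y = [0.0, 0.1, 0.19, 0.27, 0.34, 0.41, 1.47, 1.52, 1.57, 1.61]
print(detect_faults(y, u))   # -> [6]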
Branch-and-bound search has been known for a long time and has been widely used in solving a variety of problems in computer-aided design (CAD) and many important optimization problems. In many applications, the classic branch-and-bound search methods perform duplicate computations, or rely on search decision trees that keep track of the branch-and-bound search processes. In CAD and many other technical fields, the computational cost of constructing branch-and-bound search decision trees for large-scale problems is prohibitive, and duplicated computations are intolerable; efficient branch-and-bound methods that avoid both are needed to deal with today's computational challenges. Efficient Branch and Bound Search with Application to Computer-Aided Design describes an efficient branch-and-bound method for logic justification, which is fundamental to automatic test pattern generation (ATPG), redundancy identification, logic synthesis, minimization, verification, and other problems in CAD. The method is called justification equivalence, based on the observation that justification processes may share identical subsequent search decision sequences. With justification equivalence, duplication of computations is avoided in the dynamic branch-and-bound search process without using search decision trees. The book consists of two parts. The first part, containing the first three chapters, provides the theoretical work. The second part deals with applications, particularly ATPG for sequential circuits. This book is particularly useful to readers who are interested in the design and test of digital circuits.
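The flavour of the idea, that branches reaching an identical remaining subproblem should share one result rather than duplicate the search, can be shown with a generic memoized depth-first search. The sketch below solves a small 0/1 knapsack and is only a loose analogy to justification equivalence, not the book's algorithm.

```python
# Generic search with memoized subproblem states (illustrative analogy to
# 'justification equivalence': branches that reach an identical remaining
# subproblem share one result instead of duplicating the search).

from functools import lru_cache

weights = (2, 3, 4, 5)
values  = (3, 4, 5, 6)

@lru_cache(maxsize=None)               # memoization avoids duplicated subtrees
def best(i, capacity):
    """Best achievable value using items i.. with the given remaining capacity."""
    if i == len(weights) or capacity == 0:
        return 0
    skip = best(i + 1, capacity)
    take = 0
    if weights[i] <= capacity:         # prune the 'take' branch when infeasible
        take = values[i] + best(i + 1, capacity - weights[i])
    return max(skip, take)

print(best(0, 5))   # -> 7 (items with weights 2 and 3)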
This book describes important recent developments in fiber optic sensor technology and examines established and emerging applications in a broad range of fields and markets, including power engineering, chemical engineering, bioengineering, biomedical engineering, and environmental monitoring. Particular attention is devoted to niche applications where fiber optic sensors are or soon will be able to compete with conventional approaches. Beyond novel methods for the sensing of traditional parameters such as strain, temperature, and pressure, a variety of new ideas and concepts are proposed and explored. The significance of the advent of extended infrared sensors is discussed, and individual chapters focus on sensing at THz frequencies and optical sensing based on photonic crystal structures. Another important topic is the resonances generated when using thin films in conjunction with optical fibers, and the enormous potential of sensors based on lossy mode resonances, surface plasmon resonances, and long-range surface exciton polaritons. Detailed attention is also paid to fiber Bragg grating sensors and multimode interference sensors. Each chapter is written by an acknowledged expert in the subject under discussion.
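Among the topics covered, fiber Bragg grating sensors follow the well-known relation lambda_B = 2 * n_eff * Lambda, and axial strain shifts the reflected wavelength by approximately (1 - p_e) * strain. The sketch below plugs in typical silica-fiber values; the numbers are illustrative, not taken from the book.

```python
# Fiber Bragg grating (FBG) sensing sketch: the reflected Bragg wavelength is
# lambda_B = 2 * n_eff * Lambda, and axial strain shifts it by
# d(lambda)/lambda_B = (1 - p_e) * strain.  Typical silica-fiber values below.

n_eff  = 1.447          # effective refractive index of the fiber core
period = 535.9e-9       # grating period Lambda in metres
p_e    = 0.22           # effective photo-elastic coefficient of silica

lambda_b = 2 * n_eff * period
strain   = 100e-6       # 100 microstrain applied along the fiber
shift    = (1 - p_e) * strain * lambda_b

print(f"Bragg wavelength: {lambda_b * 1e9:.1f} nm")   # ~1550.9 nm
print(f"Shift at 100 ue:  {shift * 1e12:.0f} pm")     # ~121 pm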
PART I - APPARATUS AND PRINCIPLES USED IN MICRODIFFUSION ANALYSIS
- II. A STANDARD MICRO DIFFUSION APPARATUS OR 'UNIT'
- III. FACTORS INFLUENCING THE ABSORPTION RATE FROM OUTER TO INNER CHAMBER WITH SPECIAL REFERENCE TO AMMONIA
- IV. GENERAL PRINCIPLES GOVERNING THE ABSORPTION TIME IN MICRO DIFFUSION ANALYSIS
- V. PIPETTES (SUITABLE FOR USE WITH THE STANDARD UNITS) AND THEIR DELIVERY ERRORS
- VI. MICRO-BURETTES (SUITABLE FOR USE WITH THE STANDARD UNITS) AND ERRORS INVOLVED IN THEIR USE
- VII. THE MICRODIFFUSION METHOD WITH END-POINT VOLUMES AROUND 20 CUBIC MILLIMETRES
- VIII. COLORIMETRY IN THE MICRODIFFUSION METHODS
PART II - DESCRIPTION OF METHODS WITH THE STANDARD UNITS
- IX. AMMONIA. GENERAL METHOD USING STANDARD ACID AS ABSORBENT
- X. AMMONIA. GENERAL METHOD (USING THE BORIC-HCL PROCEDURE)
- XI. SPECIAL FACTORS INFLUENCING THE RATE OF AMMONIA ABSORPTION
- XII. OTHER METHODS FOR DETERMINING THE ABSORBED AMMONIA IN THE MICRO DIFFUSION PROCEDURE
- XIII. AMMONIA. BIOLOGICAL DETERMINATIONS
- XIV. TOTAL NITROGEN
- XVII. UREA (BLOOD AND URINE)
- XIX. ADENOSINETRIPHOSPHORIC ACID, ADENYLIC ACID, ADENOSINE, ETC.
- XX. NITRATE, NITRITE AND AMIDE NITROGEN
- XXII. MONOAMINE OXIDASE AND HISTAMINASE IN TISSUES
- XXIII. DETERMINATION OF VOLATILE AMINES
- XXIV. CARBONATES AND BICARBONATE
- XXV. BLOOD GLUCOSE AND FERMENTABLE SUGAR IN NORMAL URINE
- XXVI. DETERMINATION OF CARBONIC ANHYDRASE
- XXVII. OXIDATION RATES OF ORGANIC SUBSTANCES WITH A STANDARD OXIDANT WITH APPLICATION TO DETERMINATION OF MINUTE AMOUNTS OF CALCIUM AS OXALATE
- XXVIII. ACETIC ACID AND OTHER LOWER FATTY ACIDS
- XXIX. ASSAY OF ACETYLCHOLINESTERASE
- XXX. CYANIDE, AZIDE, SULPHIDE, PHENOLS
- XXXI. METHANOL AND ISOPROPANOL GROUP
- XXXII. ETHANOL
- XXXIII. ETHANOL FROM URETHANE
- XXXIV. FORMALDEHYDE
- XXXV. FORMALDEHYDOGENIC STEROIDS (PERIODIC ACID AS OXIDANT)
- XXXVI. FORMALDEHYDOGENIC STEROIDS (SODIUM BISMUTHATE AS OXIDANT)
- XXXVII. GLYCINE (FORMALDEHYDE PRODUCED BY NINHYDRIN OXIDATION)
- XXXVIII. ACETALDEHYDE (SEMICARBAZIDE ABSORPTION)
- XXXIX. ACETALDEHYDE FROM LACTIC ACID AND THREONINE WITH BISULPHITE ABSORPTION
- XL. ACETONE (INCLUDING A RAPID CLINICAL METHOD USING THE NESSLER SOLUTION)
- XLI. THE HALOGENS (INTRODUCTORY)
- XLII. CHLORIDE (BY OXIDATION TO CHLORINE AND ABSORPTION INTO IODIDE)
- XLIII. CHLORIDE (BY OXIDATION TO CHLORINE AND ABSORPTION INTO FAST GREEN)
- XLIV. BROMIDE
- XLV. IODIDES AND HALOGEN MIXTURES
- XLVI. SERIAL DETERMINATION OF ORGANICALLY BOUND HALOGEN
- XLVII. VOLATILE HALOGENATED HYDROCARBONS (CHLOROFORM, TRICHLORETHYLENE AND CARBON TETRACHLORIDE)
- XLVIII. CARBON MONOXIDE
- XLIX. A RAPID CLINICAL METHOD FOR CARBON MONOXIDE DETERMINATION
- LI. TOTAL MOLECULAR CONCENTRATION IN FLUID SAMPLES OF ABOUT 3-4 MILLIGRAMS
- LII. SEPARATION OF CRYSTALS AND 'GUMS' BY MICRODIFFUSION
QUALITATIVE MICRO-DIFFUSION ANALYSIS
- LIII. SOME CONSIDERATIONS ON QUALITATIVE MICRO-DIFFUSION ANALYSIS
PART III - THE ERROR OF VOLUMETRIC TITRATION
- LIV. INTRODUCTORY
- LV. THE VARIABLE GLASS ERROR
- LVI. THE TOTAL VARIABLE GLASS ERROR AND ITS CONTROL
- LVII. THE VARIABLE CHEMICAL ERROR IN TITRATION
- LVIII. THE RATIONALE OF MICRO TITRATION
- LIX. THE CONSTANT GLASS ERROR
- LX. THE CONSTANT CHEMICAL ERROR
- LXI. VOLUMETRIC ERROR IN KJELDAHL NITROGEN ANALYSES
- LXIII. UREA EXCRETION AS RENAL FUNCTION TEST
Full TOC available on website
The object of this book is to provide a comprehensive treatment of the principal issues in modern instrumentation, but without attempting an encyclopedic reference. It thus discusses the basic theory and physical principles underlying the operation of the various sensors as well as the practical aspects of their operation and their incorporation into larger systems. The intent is to cover the most important topics in electronics, sensors, measurements, and acquisition systems, always keeping in mind the needs of practicing scientists and engineers. The presentation focuses on systems controlled by desktop personal computers running a high-level program and operating through internal cards or an external bus connected to instruments, rather than the specialized microprocessors discussed in older texts. The book will thus be useful to students in a wide variety of experimental sciences and engineering disciplines, including physics, chemistry, mechanical, nuclear, and electrical engineering, experimental psychology, biology, and geophysics.
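As a hedged sketch of the PC-through-a-bus instrument setups the book describes, the snippet below queries an instrument over a VISA bus using the third-party pyvisa library and standard SCPI commands. The resource address and the specific commands are illustrative assumptions, not details from the text.

```python
# Hypothetical data-acquisition sketch: a desktop PC querying a bus-connected
# instrument via the third-party pyvisa library and standard SCPI commands.
# The GPIB address below is illustrative and must match real hardware.

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("GPIB0::12::INSTR")   # hypothetical instrument address

print(inst.query("*IDN?"))                    # standard SCPI identification query
voltage = float(inst.query("MEAS:VOLT:DC?"))  # common SCPI DC-voltage measurement
print(f"reading: {voltage:.6f} V")

inst.close()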
For Sophomore/Junior-level courses in Automatic Control Systems, Process Controls, and Instrumentation and Measurement. This text is designed to provide students with an understanding and appreciation of some of the essential concepts behind control system elements and operations, without the need for advanced math and theory. It also presents some of the practical details of how elements of a control system are designed and operated, such as would be gained from on-the-job experience. This edition includes treatment of modern fieldbus approaches to networked and distributed control systems. This middle ground of knowledge enables students to design the elements of a control system from a practical, working perspective, and comprehend how these elements affect overall system operation and tuning.
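As a hedged illustration of the kind of control-system element such a course treats, the sketch below implements a minimal discrete PID loop around an invented first-order process; the gains and plant model are illustrative only, not taken from the text.

```python
# Minimal discrete PID loop regulating a first-order process (illustrative;
# gains and plant are invented for the example).

def pid_step(error, state, kp=2.0, ki=0.5, kd=0.1, dt=0.1):
    integral, prev_error = state
    integral += error * dt                       # accumulate the I term
    derivative = (error - prev_error) / dt       # finite-difference D term
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

setpoint, y, state = 1.0, 0.0, (0.0, 0.0)
for _ in range(50):
    u, state = pid_step(setpoint - y, state)
    y += 0.1 * (u - y)       # simple first-order plant: dy/dt = u - y
print(round(y, 3))           # settles near the setpoint 1.0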
This book contains selected contributions from the 6th CIRP International Seminar on Computer-Aided Tolerancing, which was held on 22-24 March, 1999, at the University of Twente, Enschede, The Netherlands. This volume presents the theory and application of consistent tolerancing. Until recently CADCAM systems did not even address the issue of tolerances and focused purely on nominal geometry. Therefore, CAD data was only of limited use for the downstream processes. The latest generation of CADCAM systems incorporates functionality for tolerance specification. However, the lack of consistency in existing tolerancing standards and everyday tolerancing practice still lead to ill-defined products, excessive manufacturing costs and unexpected failures. Research and improvement of education in tolerancing are hot items today. Global Consistency of Tolerances gives an excellent overview of the recent developments in the field of Computer-Aided Tolerancing, including such topics as tolerance specification; tolerance analysis; tolerance synthesis; tolerance representation; geometric product specification; functional product analysis; statistical tolerancing; education of tolerancing; computational metrology; tolerancing standards; and industrial applications and CAT systems. This book is well suited to users of new generation CADCAM systems who want to use the available tolerancing possibilities properly. It can also be used as a starting point for research activities.
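Among the listed topics, statistical tolerancing admits a compact illustration: a worst-case stack-up sums the part tolerances, while the root-sum-square (RSS) estimate assumes independent variations. The sketch below uses invented dimensions, not data from the proceedings.

```python
# Tolerance stack-up sketch: worst-case vs. statistical (RSS) analysis of a
# linear chain of dimensions (illustrative numbers).

import math

tolerances = [0.05, 0.10, 0.02, 0.08]    # +/- tolerances of parts in the chain

worst_case = sum(tolerances)                         # all parts at their limits
rss = math.sqrt(sum(t ** 2 for t in tolerances))     # root-sum-square estimate

print(f"worst case: +/-{worst_case:.3f} mm")   # +/-0.250 mm
print(f"RSS:        +/-{rss:.3f} mm")          # +/-0.139 mm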
Throughout the 1980s and 1990s, the theory and practice of testing electronic products has changed considerably. Quality and testing have become inextricably linked and both are fundamental to the generation of revenue to a company, helping the company to remain profitable and therefore survive. Testing plays an important role in assessing the quality of a product. The tester acts as a filter, separating good products from bad. Unfortunately, the tester can pass bad products and fail good products, and the generation of high quality tests has become complex and time consuming. To achieve significant reduction in time and cost of testing, the role and responsibility of testing has to be considered across an entire organization and product development process. Testability Concepts for Digital ICs: The Macro Test Approach considers testability aspects for digital ICs. The strategy taken is to integrate the testability aspects into the design and manufacturing of ICs and, for each IC design project, to give a precise definition of the boundary conditions, responsibilities, interfaces and communications between persons, and quality targets. Macro Test, a design-for-testability approach, provides a manageable test program route. Using the Macro Test approach, one can explore alternative solutions to satisfy pre-defined levels of performance (e.g. defect detection, defect location, test application) within a pre-defined cost budget and time scale. Testability Concepts for Digital ICs is the first book to present a tried and proven method of using a Macro approach to testing complex ICs and is of particular interest to all test engineers, IC designers and managers concerned with producing high-quality ICs.
This book is a comprehensive and practical guide to the use of ultrasonic techniques for the characterization of fluids. Focusing on ultrasonic velocimetry, the author covers the basic topics and techniques necessary for successful ultrasound measurements on emulsions, dispersions, multiphase media, and viscoelastic/viscoplastic materials. Advanced techniques such as scattering, particle sizing, and automation are also presented. As a handbook for industrial and scientific use, Ultrasonic Techniques for Fluids Characterization is an indispensable guide to chemists and chemical engineers using ultrasound for research or process monitoring in the chemical, food processing, pharmaceutical, cosmetic, biotechnology, and fuels industries.
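The core velocimetry measurement reduces to time-of-flight: with a known acoustic path and a measured transit time, the sound speed follows directly. A pulse-echo sketch with invented values:

```python
# Time-of-flight sketch for ultrasonic velocimetry: sound speed follows from
# a known path length and the measured transit time (illustrative values).

path_length = 0.025          # metres between transducer and reflector
transit_time = 33.7e-6       # seconds for a pulse-echo round trip

# Pulse-echo: the pulse travels the path twice (there and back).
speed = 2 * path_length / transit_time
print(f"speed of sound: {speed:.0f} m/s")   # ~1484 m/s, near water at room temp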
Provides a comprehensive guide to measurements with lasers. Examines the design of optical and laser-based instruments. Reviews the development of measurement strategies. Includes two new chapters on self-mixing interferometry and quantum sensing. Includes end-of-chapter problems.
In the current push to convert to renewable sources of energy, many issues raised years ago on the economics and the difficulties of siting energy storage are once again being raised today. When large amounts of wind, solar, and other renewable energy sources are added to existing electrical grids, efficient and manageable energy storage becomes a crucial component to allowing a range of eco-friendly resources to play a significant role in our energy system. In order to fulfill our intended goal of diminishing dependence on non-renewable sources of energy and reducing our carbon footprint, we must find a way to store and convert these novel resources into practical solutions. Based on the efforts of a University of Colorado team devoted to increasing the use of renewable energy production within the current electrical power grid, Large Energy Storage Systems Handbook examines a number of ways that energy can be stored and converted back to electricity. Examining how to enhance renewable generation energy storage relative to economic and carbon impact, this book discusses issues of reliability, siting, economics, and efficiency. Chapters include the practicalities of energy storage, generation, and absorption of electrical power; the difficulties of intermittent generation; and the use of pumped and underground pumped hydroelectric energy storage. The book highlights the storage of compressed air, battery energy, solar thermal, and natural gas sources of energy. Heavily referenced and easily accessible to policy makers, developers, and students alike, this book provides contributions from those active in the field for coverage of many important topics. With this book as a foundation, these pioneers can develop the capacity of power grids to handle high renewable energy generation penetration and provide a brighter future for generations to come.
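The pumped hydroelectric storage discussed in the book lends itself to a back-of-envelope calculation: the recoverable energy of an upper reservoir is E = rho * g * V * h, scaled by round-trip efficiency. The figures below are invented for illustration, not taken from the handbook.

```python
# Back-of-envelope pumped-hydro storage sketch: recoverable energy is
# E = rho * g * V * h * efficiency (illustrative numbers).

rho = 1000.0        # water density, kg/m^3
g = 9.81            # gravitational acceleration, m/s^2
volume = 1.0e6      # reservoir volume, m^3
head = 300.0        # height difference between reservoirs, m
efficiency = 0.75   # assumed round-trip efficiency

energy_joules = rho * g * volume * head * efficiency
energy_mwh = energy_joules / 3.6e9
print(f"recoverable energy: {energy_mwh:.0f} MWh")   # ~613 MWh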
Computer Aided Tolerancing (CAT) is an important topic in any field of design and production where parts move relative to one another and/or are assembled together. Geometric variations from specified dimensions and form always occur when parts are manufactured. Improvements in production systems can cause the amounts of the variations to become smaller, but their presence does not disappear. To shorten the time from concept to market of a product, it has been increasingly important to take clearances and the tolerancing of manufacturing variations into consideration right from the beginning, at the stage of design. Hence, geometric models are defined that represent both the complete array of geometric variations possible during manufacture and also the influence of geometry on the function of individual parts and on assemblies of them...
This book is a compilation of selected papers from the 3rd International Symposium on Software Reliability, Industrial Safety, Cyber Security and Physical Protection of Nuclear Power Plants, held in Harbin, China on 15th-17th August 2018. The symposium discussed the status quo, technical advances and development direction of digital instrument control technology, software reliability, information security and physical protection in the process of nuclear power development. Offering technical insights and know-how from leading experts, this book is a valuable resource for both practitioners and academics working in the field of nuclear instrumentation, control systems and other safety-critical systems, as well as nuclear power plant managers, public officials, and regulatory authorities.
Metrology and Properties of Engineering Surfaces provides in a single volume a comprehensive and authoritative treatment of the crucial topics involved in the metrology and properties of engineering surfaces. The subject matter is a central issue in manufacturing technology, since the quality and reliability of manufactured components depend greatly upon the selection and qualities of the appropriate materials as ascertained through measurement. The book can in broad terms be split into two parts; the first deals with the metrology of engineering surfaces and covers the important issues relating to the measurement and characterization of surfaces in both two and three dimensions. This covers topics such as filtering, power spectral densities, autocorrelation functions and the use of Fractals in topography. A significant proportion is dedicated to the calibration of scanning probe microscopes using the latest techniques. The remainder of the book deals with the properties of engineering surfaces and covers a wide range of topics including hardness (measurement and relevance), surface damage and the machining of brittle surfaces, the characterization of automobile cylinder bores using different techniques including artificial neural networks and the design and use of polymer bearings in microelectromechanical devices. Edited by three practitioners with a wide knowledge of the subject and the community, Metrology and Properties of Engineering Surfaces brings together leading academics and practitioners in a comprehensive and insightful treatment of the subject. The book is an essential reference work both for researchers working and teaching in the technology and for industrial users who need to be aware of current developments of the technology and new areas of application.
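Two of the characterization tools mentioned, roughness parameters and autocorrelation functions, can be illustrated on a toy profile. The heights below are invented, not the book's data.

```python
# Surface profile sketch: arithmetic-mean roughness Ra, RMS roughness Rq, and
# a normalized autocorrelation at one lag (illustrative profile).

import math

z = [0.2, -0.1, 0.4, -0.3, 0.1, -0.2, 0.3, -0.4]   # profile heights, um
mean = sum(z) / len(z)
dev = [v - mean for v in z]                         # centered profile

ra = sum(abs(d) for d in dev) / len(dev)                 # arithmetic mean deviation
rq = math.sqrt(sum(d * d for d in dev) / len(dev))       # RMS deviation

def autocorr(dev, lag):
    """Normalized autocorrelation of the centered profile at a given lag."""
    num = sum(dev[i] * dev[i + lag] for i in range(len(dev) - lag))
    den = sum(d * d for d in dev)
    return num / den

print(f"Ra = {ra:.3f} um, Rq = {rq:.3f} um, r(1) = {autocorr(dev, 1):.2f}")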
These two volumes present the proceedings of the International Conference on Technology and Instrumentation in Particle Physics 2017 (TIPP2017), which was held in Beijing, China from 22 to 26 May 2017. Gathering selected articles on the basis of their quality and originality, it highlights the latest developments and research trends in detectors and instrumentation for all branches of particle physics, particle astrophysics and closely related fields. This first volume focuses on the main themes of gaseous detectors, semiconductor detectors, experimental detector systems, calorimeters, particle identification, photon detectors, dark matter detectors and neutrino detectors. TIPP2017 is the fourth in a series of international conferences on detectors and instrumentation, held under the auspices of the International Union of Pure and Applied Physics (IUPAP). The event brings together experts from the scientific and industrial communities to discuss their current efforts and plan for the future. The conference's aim is to provide a stimulating atmosphere for scientists and engineers from around the world.
Theory and practice of tolerances are very important for designing and manufacturing engineering artifacts on a rational basis. Tolerance specifies a degree of "discrepancy" between an idealized object and its physical realization. Such discrepancy inevitably comes into our product realization processes because of practical cost considerations or our inability to fully control manufacturing processes. The major product and production characteristics affected by tolerances are product quality and cost. Achieving high-precision machines requires tight tolerance specification, but this will normally increase product cost. In order to optimally compromise the conflicting requirements of quality and cost, it is essential to take into account the total product life cycle throughout product planning, design, manufacturing, maintenance and recycling. For example, in order to construct durable products under severe working conditions, low sensitivity of product functionality with respect to tolerances is required. In the future, re-use of components or parts will become important, and tolerance synthesis with respect to this aspect will be an interesting research topic.
Showcasing the most influential developments, experiments, and architectures impacting the digital, surveillance, automotive, industrial, and medical sciences, Image Processing Technologies tracks the evolution and advancement of computer vision and image processing (CVIP) technologies, examining methods and algorithms for image analysis, optimization, segmentation, and restoration. It focuses on recent approaches and techniques in CVIP applications development and explores various coding methods for individual types of 3-D images. This text/reference brings researchers and specialists up-to-date on the latest innovations affecting multiple image processing environments.
Offers a working knowledge of the origin and development of the more traditional technology flowmeters: differential pressure and primary elements, positive displacement, turbine, open channel, and variable area. Describes how these conventional meters still fit into what is being called Industry 4.0. Discusses the advantages and disadvantages of conventional technology meters and provides a rationale for retaining or displacing these meters. Focuses on the origin, development, operating principles, and applications of the meters. Explores the development of each conventional flowmeter type, including the roles of companies like Siemens, ABB, Emerson, Foxboro, KROHNE, and Endress+Hauser.
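The differential-pressure principle behind the first meter family can be made concrete with the standard orifice-plate equation; the coefficient and dimensions below are typical textbook values, not figures from this book.

```python
# Differential-pressure flow sketch: an orifice plate infers volumetric flow
# from the pressure drop, Q = Cd / sqrt(1 - beta^4) * A_o * sqrt(2*dp/rho).
# Numbers are illustrative.

import math

cd = 0.61              # discharge coefficient, typical for a sharp-edged orifice
d_orifice = 0.05       # orifice diameter, m
d_pipe = 0.10          # pipe diameter, m
dp = 25_000.0          # differential pressure, Pa
rho = 1000.0           # fluid density (water), kg/m^3

beta = d_orifice / d_pipe                      # diameter ratio
area = math.pi * d_orifice ** 2 / 4            # orifice bore area
q = cd / math.sqrt(1 - beta ** 4) * area * math.sqrt(2 * dp / rho)
print(f"flow: {q * 1000:.1f} L/s")             # ~8.7 L/s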
The ability to arrange precisely designed patterns of nanoparticles into a desired spatial configuration is the key to creating novel nanoscale devices that take advantage of the unique properties of nanomaterials. While two-dimensional arrays of nanoparticles have been demonstrated successfully by various techniques, a controlled way of building ordered arrays of three-dimensional (3D) nanoparticle structures remains challenging. This book describes a new technique, the 'nanoscopic lens', which is able to produce a variety of 3D nanostructures in a controlled manner. It explains the nanoscopic lens technique and how it can serve as the foundation for developing not only a variety of optical, magnetic and electronic devices but also a wide range of bio-nanoelectronic devices.
Comprising specially selected papers on the subject of Computational Methods and Experimental Measurements, this book includes research from scientists, researchers and specialists who perform experiments, develop computer codes and carry out measurements on prototypes. Improvements in computational methods have generated an ever-increasing expansion of computational simulations that permeate all fields of science and technology. Validating the results of these improvements requires committed and accurate experiments, which have undergone continuous development. Current experimental techniques have become more complex and sophisticated, so that they require the intensive use of computers, both for running experiments and for acquiring and processing the resulting data. This title explores new experimental and computational methods and covers various topics such as: Computer-aided Models; Image Analysis Applications; Noise Filtration of Shockwave Propagation; Finite Element Simulations.
This book reviews the advances in data gathering and processing in the biotech laboratory environment, and it sheds new light on the various aspects that are necessary for the implementation of intelligent laboratory architecture and infrastructure. Smart technologies are increasingly dominating our everyday lives and have become an indispensable part of the industrial environment. The laboratory environment, which has long been rather conservative, has also set out to adopt smart technologies with regard to Industry 4.0 and the Internet of Things (IoT) for the laboratory. Due to the heterogeneity of the existing infrastructure and the often complex work processes, standardization is slow, e.g. in implementing device interfaces or standardized driver protocols, which are urgently needed to generate the standardized data streams essential for post-processing of data. Divided into 9 chapters, this book offers an authoritative overview of the diverse aspects in the generation and recording of uniform data sets in the laboratory and in the processing of the data, enabling seamless processing towards machine learning and artificial intelligence. In the first part of the book, readers will find more about high-throughput systems, automation, robotics, and the evolution of technology in the laboratory. The second part of the book is devoted to standardization in lab automation, in which readers will learn more about some regulatory aspects, the SiLA2 standards, the OPC LADS (Laboratory and Analytical Device Standard), and FAIR data infrastructure.
This reference is a guide to the concepts, technology, uses, costs and vocabulary of fractional T-1 services. The book illustrates fractional T-1 capabilities and limitations by explaining basic T-1 networking, common fractional T-1 access methods, equipment interfacing and troubleshooting. The text describes the advantages to be gained with these services, the various service alternatives and the cost/benefit considerations.
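The basic T-1 arithmetic the book explains is simple to state: a full T-1 multiplexes 24 DS0 channels of 64 kbit/s each (1.536 Mbit/s of payload; 1.544 Mbit/s with the 8 kbit/s framing overhead), and a fractional service leases a subset of those channels. A small sketch:

```python
# Fractional T-1 bandwidth arithmetic: a T-1 carries 24 DS0 channels at
# 64 kbit/s; a fractional service leases some subset of those channels.

DS0_KBPS = 64
FULL_T1_CHANNELS = 24
FRAMING_KBPS = 8

def fractional_t1_kbps(channels):
    """Payload bandwidth of a fractional T-1 using the given DS0 count."""
    return channels * DS0_KBPS

print(fractional_t1_kbps(6))                        # 384 kbit/s
print(FULL_T1_CHANNELS * DS0_KBPS + FRAMING_KBPS)   # 1544 kbit/s line rate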
This book is basically concerned with approaches for improving safety in man-made systems. We call these approaches, collectively, fault monitoring, since they are concerned primarily with detecting faults occurring in the components of such systems, be they sensors, actuators, controlled plants or entire structures. The common feature of these approaches is the intention to detect an abrupt change in some characteristic property of the considered object by monitoring the behavior of the system. This change may be a slow-evolving effect or a complete breakdown. In this sense, fault monitoring touches upon, and occasionally overlaps with, other areas of control engineering such as adaptive control, robust controller design, reliability and safety engineering, ergonomics and man-machine interfacing, etc. In fact, a system safety problem could be attacked from any of the above angles of view. In this book, we do not touch upon these areas unless there is a strong relationship between the fault monitoring approaches discussed and the aforementioned fields. When we set out to write this book, our aim was to include as much material as possible in a most rigorous, unified and concise format. This would include state-of-the-art methods as well as more classical techniques still in use today. As we proceeded in gathering material, however, it soon became apparent that these were contradicting design criteria and a trade-off had to be made. We believe that the completeness vs. ...
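One classical scheme in this change-detection family is the two-sided CUSUM test, sketched below with invented parameters as an illustration of the general idea rather than this book's specific methods: it flags the first sample at which the cumulative deviation from a target exceeds a threshold.

```python
# Two-sided CUSUM sketch for detecting an abrupt change in a monitored signal
# (illustrative parameters; one classical technique among those surveyed).

def cusum(samples, target=0.0, drift=0.05, threshold=1.0):
    """Return the index of the first detected change, or None."""
    pos = neg = 0.0
    for k, x in enumerate(samples):
        pos = max(0.0, pos + (x - target) - drift)   # tracks upward shifts
        neg = max(0.0, neg - (x - target) - drift)   # tracks downward shifts
        if pos > threshold or neg > threshold:
            return k
    return None

# Zero-mean signal with an abrupt +0.5 shift starting at index 10.
data = [0.02, -0.05, 0.04, -0.01, 0.03, -0.04, 0.01, 0.02, -0.03, 0.0,
        0.52, 0.45, 0.55, 0.48, 0.51]
print(cusum(data))   # -> 12, a few samples after the shift at index 10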