An authoritative guide to the most recent advances in statistical methods for quantifying reliability. Statistical Methods for Reliability Data, Second Edition (SMRD2) is an essential guide to the most widely used and recently developed statistical methods for reliability data analysis and reliability test planning. Written by three experts in the area, SMRD2 updates and extends long-established statistical techniques and shows how to apply powerful graphical, numerical, and simulation-based methods to a range of applications in reliability. SMRD2 is a comprehensive resource that describes maximum likelihood and Bayesian methods for solving practical problems that arise in product reliability and similar areas of application. SMRD2 illustrates methods with numerous applications, and all the data sets are available on the book's website. SMRD2 also contains an extensive collection of exercises that enhance its use as a course textbook. The SMRD2 website contains valuable resources, including R packages, Stan model codes, presentation slides, technical notes, information about commercial software for reliability data analysis, and csv files for the 93 data sets used in the book's examples and exercises. The importance of statistical methods in the area of engineering reliability continues to grow, and SMRD2 offers an updated guide for exploring, modeling, and drawing conclusions from reliability data.
SMRD2 features:
- Contains a wealth of information on modern methods and techniques for reliability data analysis
- Offers discussions on the practical problem-solving power of various Bayesian inference methods
- Provides examples of Bayesian data analysis performed using the R interface to the Stan system, based on Stan models that are available on the book's website
- Includes helpful technical-problem and data-analysis exercise sets at the end of every chapter
- Presents illustrative computer graphics that highlight data, results of analyses, and technical concepts
Written for engineers and statisticians in industry and academia, Statistical Methods for Reliability Data, Second Edition offers an authoritative guide to this important topic.
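The book's own examples use R and Stan (available on its website). As a language-neutral illustration of the kind of analysis described, and not taken from the book, here is a minimal sketch of fitting a two-parameter Weibull distribution to complete (uncensored) failure-time data by maximum likelihood:

```python
import math

def weibull_mle(times, tol=1e-9):
    """Fit a two-parameter Weibull(shape, scale) distribution to
    complete (uncensored) failure times by maximum likelihood."""
    logs = [math.log(t) for t in times]
    mean_log = sum(logs) / len(logs)

    def score(k):
        # Profile-likelihood score for the shape parameter;
        # it is increasing in k, so a simple bisection finds its root.
        tk = [t ** k for t in times]
        return sum(x * g for x, g in zip(tk, logs)) / sum(tk) - 1.0 / k - mean_log

    lo, hi = 1e-3, 100.0  # bracket for the root of score(k)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if score(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    shape = 0.5 * (lo + hi)
    # With the shape fixed, the scale has a closed-form ML estimate.
    scale = (sum(t ** shape for t in times) / len(times)) ** (1.0 / shape)
    return shape, scale
```

Note that real reliability data are usually censored, which changes the likelihood; handling censoring, and the Bayesian alternatives, is exactly what the book's R/Stan machinery covers.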
STATISTICAL QUALITY CONTROL Provides a basic understanding of statistical quality control (SQC) and demonstrates how to apply the techniques of SQC to improve the quality of products in various sectors This book introduces Statistical Quality Control and the elements of Six Sigma Methodology, illustrating the widespread applications that both have for a multitude of areas, including manufacturing, finance, transportation, and more. It places emphasis on both the theory and application of various SQC techniques and offers a large number of examples using data encountered in real-life situations to support each theoretical concept. Statistical Quality Control: Using MINITAB, R, JMP and Python begins with a brief discussion of the different types of data encountered in various fields of statistical applications and introduces graphical and numerical tools needed to conduct preliminary analysis of the data. It then discusses the basic concept of statistical quality control (SQC) and Six Sigma Methodology and examines the different types of sampling methods encountered when sampling schemes are used to study certain populations. The book also covers Phase I Control Charts for variables and attributes; Phase II Control Charts to detect small shifts; the various types of Process Capability Indices (CPI); certain aspects of Measurement System Analysis (MSA); various aspects of PRE-control; and more.
This helpful guide also:
- Focuses on the learning and understanding of statistical quality control for second- and third-year undergraduates and practitioners in the field
- Discusses aspects of Six Sigma Methodology
- Teaches readers to use MINITAB, R, JMP and Python to create and analyze charts
- Requires no previous knowledge of statistical theory
- Is supplemented by an instructor-only book companion site featuring data sets and a solutions manual to all problems, as well as a student book companion site that includes data sets and a solutions manual to all odd-numbered problems
Statistical Quality Control: Using MINITAB, R, JMP and Python is an excellent book for students studying engineering, statistics, management studies, and other related fields who are interested in learning various techniques of statistical quality control. It also serves as a desk reference for practitioners who work to improve quality in various sectors, such as manufacturing, service, transportation, medical, oil, and financial institutions. It is also useful for those who use Six Sigma techniques to improve the quality of products in such areas.
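As an illustration of the Phase I control charts the book covers, here is a minimal sketch, not taken from the book, of computing Shewhart X-bar and R chart control limits for subgroups of size 5, using the standard tabulated constants A2, D3 and D4 for that subgroup size:

```python
# Standard Shewhart chart constants for subgroups of size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Compute (LCL, center line, UCL) for the X-bar chart and the
    R chart from a list of equally sized rational subgroups."""
    means = [sum(s) / len(s) for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    xbar_bar = sum(means) / len(means)   # grand mean
    r_bar = sum(ranges) / len(ranges)    # average range
    return {
        "xbar": (xbar_bar - A2 * r_bar, xbar_bar, xbar_bar + A2 * r_bar),
        "r": (D3 * r_bar, r_bar, D4 * r_bar),
    }
```

A point plotting outside these limits signals a possible assignable cause; the constants differ for other subgroup sizes and are tabulated in standard SQC references.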
This textbook presents the principal methods of stress analysis for the design of frame structures, beginning with a description of the basic criteria for probabilistic safety verification used in modern codes. The Force Method and the Displacement Method are dealt with, together with their applications to more common structural situations. A special chapter is dedicated to the second order analysis required for slender structures and for the elaboration of instability problems. In turn, a thorough set of numerical examples rounds out the text. Given its scope, the book offers an ideal learning resource for students of Civil and Building Engineering and Architecture, and a valuable reference guide for practicing structural design professionals.
Safety management becomes efficient and cost-effective with the help of this handbook and the Ariadne SMS software. The authors describe the digitization of accident investigators' expert knowledge and its automated processing. The entire body of accident data is classified in a structured way using algorithmically evaluable keys. Accident descriptors and error variables are processed into risk profiles and prevention measures on the basis of the three-level model of accident causation. Ariadne SMS is based on current web IT and is innovative through its use of self-learning networks. Automated pattern and speech recognition methods generate valid risk predictions and simulations of how measures affect the risk distribution. Sources of error are eliminated, processing steps are saved, and information is transparent and available at any time. Practical use by the German Federal Armed Forces (Bundeswehr) and the statutory accident insurance associations (Berufsgenossenschaften) led to considerable savings. Applications in medicine and corporate management, in environmental and disaster protection, and in insurance are possible.
Fatigue testing of components is required, among other things, in order to calculate their service life. Various methods are used for this purpose, each with its own advantages and disadvantages. For cyclic loading, cycle counting methods are suitable. This book describes and evaluates them. The authors provide a compact overview of the state of the art and offer recommendations for practical application.
In this second edition, the methods section has been expanded to include value stream mapping, which has established itself as a standard tool for analyzing and improving processes. New additions are a chapter on the improvement kata, currently being discussed as a holistic improvement method, and a chapter on IT support for processes, which provides an insight into common systems for process automation. Despite efforts to reduce costs, most companies invest in optimizing their workflows and organization. Those who define, design and implement processes optimally can not only satisfy customers better, but also respond to cost and competitive pressure with "lean", value-adding processes. There is no company without processes. Only when the actions of individual employees are coordinated along a workflow or process chain can a company act successfully. This coordination is a highly complex task that companies must solve efficiently. One option is to establish continuous process management. For concepts such as Six Sigma, Kaizen and Total Quality Management to be applied efficiently, a basic understanding of process management is required. Confident application of these methods is increasingly becoming a key qualification, not only for students and graduates but also for practitioners, in the shift from "thinking in functions" to "thinking in processes".
Due to their complex designs and often small lot sizes, power electronic systems frequently involve cost-intensive assembly. By considering a wide range of aspects, from design through the production of power electronics to complementary services and employee qualification, the book provides decision-making aids for successful products in global competition. The underlying concept is the combination of technological content and organizational methods for the integral design of product and assembly.
This textbook presents methodologies and applications associated with multiple criteria decision analysis (MCDA), especially for those students with an interest in industrial engineering. With respect to methodology, the book covers (1) problem structuring methods; (2) methods for ranking multi-dimensional deterministic outcomes including multiattribute value theory, the analytic hierarchy process, the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), and outranking techniques; (3) goal programming; (4) methods for describing preference structures over single and multi-dimensional probabilistic outcomes (e.g., utility functions); (5) decision trees and influence diagrams; (6) methods for determining input probability distributions for decision trees, influence diagrams, and general simulation models; and (7) the use of simulation modeling for decision analysis.
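As an illustration of one of the ranking methods listed, here is a minimal TOPSIS sketch, not taken from the book; the decision matrix, weights and benefit flags in the usage example are hypothetical inputs:

```python
import math

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS: matrix[i][j] scores alternative i on criterion j,
    weights sum to 1, benefit[j] is True when larger is better.
    Returns closeness-to-ideal scores in [0, 1] (higher is better)."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each column, then apply criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    cols = [[v[i][j] for i in range(m)] for j in range(n)]
    ideal = [max(c) if b else min(c) for c, b in zip(cols, benefit)]
    anti = [min(c) if b else max(c) for c, b in zip(cols, benefit)]
    scores = []
    for i in range(m):
        d_plus = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_minus = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores
```

For example, `topsis([[5, 5], [1, 1], [3, 3]], [0.5, 0.5], [True, True])` scores the dominating first alternative 1.0 and the dominated second alternative 0.0.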
This book gathers the proceedings of the 12th International Conference on Measurement and Quality Control - Cyber Physical Issues (IMEKO TC 14 2019), held in Belgrade, Serbia, on 4-7 June 2019. The event marks the latest in a series of high-level conferences that bring together experts from academia and industry to exchange knowledge, ideas, experiences, research findings, and information in the field of measurement of geometrical quantities. The book addresses a wide range of topics, including: 3D measurement of GPS characteristics, measurement of gears and threads, measurement of roughness, micro- and nano-metrology, laser metrology for precision measurements, cyber physical metrology, optical measurement techniques, industrial computed tomography, multisensor techniques, intelligent measurement systems, evaluating measurement uncertainty, dimensional management in industry, product quality assurance methods, and big data analytics. By providing updates on key issues and highlighting recent advances in measurement and quality control, the book supports the transfer of vital knowledge to the next generation of academics and practitioners.
This is a comprehensive, user-friendly and hands-on book that serves as a single source of reference on tools and techniques for all quality practitioners. Implementing Six Sigma and Lean covers the basics of how to manage for consistently high quality and gives good coverage of both simple tools and advanced techniques that can be used in all businesses. The book provides guidance on how to use these tools in different situations, such as new start-up companies, stalled projects, and the constant achievement of high quality in well-established quality regimes. Case studies are included that encourage the reader to respond to practical situations, and these provide a good learning resource for courses. Summaries of key elements, questions and exercises appear at the end of each chapter.
This book is a compilation of perspectives provided by several winners of the ASQ Feigenbaum Medal, which is awarded each year to an individual under the age of 35 who has made a significant contribution to the field of Quality. As such, it serves as a valuable reference book in this area. It is primarily based on the medalists' vision to "refresh" and "re-think" the quality concepts that have been used over the past century and the future development of the topic. Maximizing readers' understanding of the ways in which Quality is created, it provides insights from pioneers in this field from around the globe and anticipates how and what Quality will be in the future, as well as how people and organizations can benefit from it today.
This is a practical book for health and IT professionals who need to ensure that patient safety is prioritized in the design and implementation of clinical information technology. Healthcare professionals are increasingly reliant on information technology to deliver care and inform their clinical decision making. Health IT provides enormous benefits in efficiency, communication and decision making. However, a number of high-profile UK and US studies have concluded that when Health IT is poorly designed or sub-optimally implemented then patient safety can be compromised. Manufacturers and healthcare organizations are increasingly required to demonstrate that their Health IT solutions are proactively assured. Surprisingly, the majority of systems are not subject to regulation, so there is little in the way of practical guidance as to how risk management can be achieved. The book fills that gap. The author, a doctor and IT professional, harnesses his two decades of experience to characterize the hazards that health technology can introduce. Risk can never be eliminated, but by drawing on lessons from other safety-critical industries the book systematically sets out how clinical risk can be strategically controlled. The book proposes the employment of a Safety Case to articulate and justify residual risk so that not only is risk proactively managed but it is seen to be managed. These simple techniques drive product quality and allow a technology's benefits to be realized without compromising patient safety.
This book introduces condition-based maintenance (CBM) and data-driven prognostics and health management (PHM) in detail, first explaining the PHM design approach from a systems engineering perspective, then summarizing and elaborating on the data-driven methodology for feature construction, as well as feature-based fault diagnosis and prognosis. The book includes a wealth of illustrations and tables to help explain the algorithms, as well as practical examples showing how to use these tools to solve problems for which analytic solutions are poorly suited. It equips readers to apply the concepts discussed in order to analyze and solve a variety of problems in PHM system design, feature construction, fault diagnosis and prognosis.
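As an illustration of the feature-construction step described, here is a minimal sketch, not taken from the book, of classic time-domain condition-monitoring features (RMS, kurtosis, crest factor) computed from a sampled vibration signal:

```python
import math

def condition_features(signal):
    """Classic time-domain health-monitoring features of a signal.
    Rising kurtosis and crest factor are common early indicators of
    impulsive faults (e.g. bearing damage); RMS tracks overall energy."""
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    var = sum((x - mean) ** 2 for x in signal) / n
    kurtosis = (sum((x - mean) ** 4 for x in signal) / n) / var ** 2
    crest = max(abs(x) for x in signal) / rms
    return {"rms": rms, "kurtosis": kurtosis, "crest": crest}
```

For a pure sine wave these take known values (RMS = amplitude / sqrt(2), kurtosis = 1.5, crest factor = sqrt(2)), which makes the function easy to sanity-check.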
This is the first English language book that systematically introduces the spatial and temporal patterns of major natural disasters in China from 1949 to 2014. It also reveals natural disaster formation mechanisms and processes, quantifies vulnerability to these disasters, evaluates disaster risks, summarizes the key strategies of integrated disaster risk governance, and analyzes large-scale disaster response cases in recent years in China. The book can be a good reference for researchers, students, and practitioners in the field of natural disaster risk management and risk governance for improving the understanding of natural disasters in China.
"...a comprehensive and well written book, which...will be useful reading for both researchers entering the field and experienced specialists looking for new ideas....a valuable and long-lasting contribution to experimental mechanics." - Stepan Lomov, KU Leuven This expert volume, an enhanced Habilitation thesis by the head of the Materials Testing Research Group at the University of Augsburg, provides detailed coverage of a range of inspection methods for in situ characterization of fiber-reinforced composites. The failure behavior of fiber-reinforced composites is a complex evolution of microscopic damage phenomena. Beyond the use of classical testing methods, the ability to monitor the progression of damage in situ offers new ways to interpret the material's failure modes. Methods covered include digital image correlation, acoustic emission, electromagnetic emission, computed tomography, thermography, shearography, and promising method combinations. For each method, the discussion includes operational principles and practical applications for quality control as well as a thoughtful assessment of the method's strengths and weaknesses, so that the reader is equipped to decide which method or methods are most appropriate in a given situation. The book includes extensive appendices covering common experimental parameters influencing comparability of acoustic emission measurements; materials properties for modeling; and an overview of terms and abbreviations.
This book offers a thorough and systematic introduction to the modified failure mode and effect analysis (FMEA) models based on uncertainty theories (e.g. fuzzy logic, intuitionistic fuzzy sets, D numbers and 2-tuple linguistic variables) and various multi-criteria decision making (MCDM) approaches such as distance-based MCDM, compromise ranking MCDM and hybrid MCDM, etc. As such, it provides essential FMEA methods and practical examples that can be considered in applying FMEA to enhance the reliability and safety of products and services. The book offers a valuable guide for practitioners and researchers working in the fields of quality management, decision making, information science, management science, engineering, etc. It can also be used as a textbook for postgraduate and senior undergraduate students.
This book describes the prerequisites for the placing on the market and the safe use of machinery in compliance with the relevant EU Directives, especially the Machinery Directive 2006/42. It provides readers with high-level knowledge concerning the Essential Health and Safety Requirements (EHSR) that machinery must fulfill. The approach and principles of the Machinery Directive were most recently acknowledged worldwide in the ILO code of practice on safe machinery, released in 2013. The book addresses that code, as well as providing valuable insight into other EU Product and Workplace legislation. Focusing on the key aspect of safe machinery, the machinery safety risk assessment, it helps readers understand the more difficult aspects of risk assessments and equips them to tackle problems at the manufacturing stage and in different use scenarios, introducing them to risk reduction techniques and functional safety aspects.
Structural Health Monitoring, Damage Detection & Mechatronics, Volume 7: Proceedings of the 34th IMAC, A Conference and Exposition on Dynamics of Multiphysical Systems: From Active Materials to Vibroacoustics, 2016, the seventh volume of ten from the Conference, brings together contributions to this important area of research and engineering. The collection presents early findings and case studies on fundamental and applied aspects of Structural Dynamics, including papers on: Structural Health Monitoring; Damage Detection; Numerical Modeling; Mechatronics; System Identification; and Active Controls.
This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability and modelling for open source software (OSS), the book introduces several methods of reliability assessment for OSS, including component-oriented reliability analysis based on the analytic hierarchy process (AHP), analytic network process (ANP) and non-homogeneous Poisson process (NHPP) models, as well as stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
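As an illustration of NHPP software reliability modelling, here is a minimal sketch, not taken from the book, of the classic Goel-Okumoto mean value function together with a deliberately crude grid-search least-squares fit; the parameter grids in the usage below are hypothetical:

```python
import math

def goel_okumoto(t, a, b):
    """Goel-Okumoto NHPP mean value function: the expected cumulative
    number of faults detected by time t, where a is the total expected
    fault content and b the per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def fit_grid(times, counts, a_grid, b_grid):
    """Deliberately crude illustration: pick the (a, b) pair on a grid
    that minimizes the sum of squared errors against observed counts.
    Real tools use maximum likelihood instead."""
    best = min(
        (sum((goel_okumoto(t, a, b) - c) ** 2 for t, c in zip(times, counts)), a, b)
        for a in a_grid for b in b_grid
    )
    return best[1], best[2]
```

The fitted model then yields predictions such as the expected number of remaining faults, `a - goel_okumoto(t, a, b)`, at the end of testing.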
This book addresses a key issue in today's society: the safer transport of dangerous goods, taking into account people, the environment and economics. In particular, it offers a potential approach to identifying the issues, developing the models, providing the methods and recommending the tools to address the risks and vulnerabilities involved. We believe this can only be achieved by assessing those risks in a comprehensive, quantifiable and integrated manner. Examining both rail and road transportation, the book is divided into three sections, covering: the mature and accepted (by both academia and practitioners) methodology of risk assessment; vulnerability assessment, a novel approach proposed as a vital complement to risk; and guidance and support for building the tools that turn methods and equations into Decision Support Systems. Throughout the book, the authors do not endeavor to provide THE solution. Instead, the book offers insightful food for thought for students, researchers, practitioners and policymakers alike.
This book details how safety (i.e. the absence of unacceptable risks) is ensured in areas where potentially explosive atmospheres (ATEX) can arise. The book also offers readers essential information on how to comply with the newest (April 2016) EU legislation when the presence of ATEX cannot be avoided. By presenting general guidance on issues arising out of the EU ATEX legislation - especially on zone classification, explosion risk assessment, equipment categorization, Ex-marking and related technical/chemical aspects - the book provides equipment manufacturers, responsible employers, and others with the essential knowledge they need to be able to understand the different - and often complicated - aspects of ATEX and to implement the necessary safety precautions. As such, it represents a valuable resource for all those concerned with maintaining high levels of safety in ATEX environments.
The book introduces basic risk concepts and then goes on to discuss risk management and analysis processes and steps. The main emphasis is on methods that fulfill the requirements of one or several risk management steps. The focus is on risk analysis methods including statistical-empirical analyses, probabilistic and parametrized models, engineering approaches and simulative methods, e.g. for fragment and blast propagation or hazard density computation. Risk management is essential for improving all resilience management steps: preparation, prevention, protection, response and recovery. The methods investigate types of event and scenario, as well as frequency, exposure, avoidance, hazard propagation, damage and risks of events. Further methods are presented for context assessment, risk visualization, communication, comparison and assessment, as well as for selecting mitigation measures. The processes and methods are demonstrated using detailed results and overviews of security research projects, in particular in the application domains of transport, aviation, airport security, explosive threats, and urban security and safety. Topics include: sufficient control of emerging and novel hazards and risks, occupational safety, identification of minimum (functional) safety requirements, engineering methods for countering malevolent or terrorist events, security research challenges, interdisciplinary approaches to risk control and management, risk-based change and improvement management, and support of rational decision-making. The book addresses advanced bachelor's, master's and doctoral students, as well as scientists, researchers and developers in academia, industry, and small and medium enterprises working in the emerging field of security and safety engineering.
This book introduces the concept of holistic design and development of cyber-physical systems to achieve their safe and secure operation. It shows that, by following the standards for embedded systems safety and using appropriate hardware and software components, inherently safe system architectures can be devised and certified. While the standards already enable testing and certification of inherently safe and sound hardware, this is still not the case with software. The book demonstrates that Specification PEARL (SPEARL) addresses this issue and proposes appropriate solutions from the viewpoints of software engineering as well as concrete program components. By doing so, it reduces the complexity of cyber-physical system design in an innovative way. Three ultimate goals are followed in the course of defining this new PEARL standard, namely: 1. simplicity over complexity, 2. inherent real-time ability, and 3. conformity to safety integrity and security capability levels.
This book is devoted to the analysis of causal inference, which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.
This book promotes and describes the application of objective and effective decision making in asset management based on mathematical models and practical techniques that can be easily implemented in organizations. This comprehensive and timely publication will be an essential reference source, building on available literature in the field of asset management while laying the groundwork for further research breakthroughs in this field. The text provides the resources necessary for managers, technology developers, scientists and engineers to adopt and implement better decision making based on models and techniques that contribute to recognizing risks and uncertainties and, in general terms, to the important role of asset management to increase competitiveness in organizations.