New Edition! Completely Revised and Updated
This book provides descriptions of current laboratory accreditation schemes and explains why these schemes fall short of assuring data purchasers that the data produced from accredited laboratories are always quality products. The book then presents a system for laboratory accreditation in conjunction with data certification that assures data purchasers their data are useful for the purposes for which they are intended. Simple quality assurance and quality control techniques, in addition to concepts of total quality management, are described and then applied to the environmental laboratory industry. This "System For Success" was developed from real problems and real solutions within the industry and represents an integration of proven techniques that offers a better way to ensure that quality laboratory data are obtained. Laboratory Accreditation: A Workable Solution is a must for government officials, environmental professionals, independent environmental laboratories, hazardous waste disposal industries, chemical manufacturers, QA professionals, and testing laboratories.
This new manual is an indispensable working lab guide and reference for water/wastewater quality analysis. Based on procedures from "Standard Methods", "Methods for Chemical Analysis of Water and Waste (EPA)", and other pertinent references, the Water and Wastewater Examination Manual is an excellent complement to these sources that you will want to keep at your fingertips. It is written especially for water quality laboratory technicians and for water/wastewater operators, managers, and supervisors, who will use this practical manual every day. Procedures are included for parameters frequently used in water quality analysis.
Covers experiment planning, execution, analysis, and reporting. This single-source resource guides readers in planning and conducting credible experiments for engineering, science, industrial processes, agriculture, and business. The text takes experimenters all the way through conducting a high-impact experiment, from initial conception, through execution of the experiment, to a defensible final report. It prepares the reader to anticipate the choices faced during each stage. Filled with real-world examples from engineering science and industry, Planning and Executing Credible Experiments: A Guidebook for Engineering, Science, Industrial Processes, Agriculture, and Business offers chapters that challenge experimenters at each stage of planning and execution and emphasizes uncertainty analysis as a design tool in addition to its role for reporting results. Tested over decades at Stanford University and internationally, the text employs two powerful, free, open-source software tools: GOSSET to optimize experiment design, and R for statistical computing and graphics. A website accompanies the text, providing additional resources and software downloads.
- A comprehensive guide to experiment planning, execution, and analysis
- Leads from initial conception, through the experiment's launch, to final report
- Prepares the reader to anticipate the choices faced throughout an experiment
- Hones the motivating question
- Employs principles and techniques from Design of Experiments (DoE)
- Selects experiment designs to obtain the most information from fewer experimental runs
- Offers chapters that propose questions that an experimenter will need to ask and answer during each stage of planning and execution
- Demonstrates how uncertainty analysis guides and strengthens each stage
- Includes examples from real-life industrial experiments
- Accompanied by a website hosting open-source software
Planning and Executing Credible Experiments is an excellent resource for graduates and senior undergraduates, as well as professionals, across a wide variety of engineering disciplines.
Laboratory studies in hemostasis have traditionally focused on abnormalities of platelet function or the quantitative and qualitative disorders that affect the proteins involved in blood coagulation. However, over the last 10 years there has been an explosion in our understanding of the molecular bases that underlie many of the inherited and acquired disorders of hemostasis. Many of these disorders are now routinely diagnosed and assessed by methods that involve genotypic analysis. Indeed, in the late 1990s the distinction between molecular methods for research and for routine diagnosis is becoming increasingly blurred. The techniques and approaches that are used in hemostasis are manifold and published in isolation in a variety of publications. The aim, therefore, of this volume, Hemostasis and Thrombosis Protocols, is to pull together, into a single volume, the variety of techniques that are frequently used in the field of hemostasis. We have targeted this volume at laboratories that wish to move into the field of molecular hemostasis or that may already have some experience in this area but wish to develop new areas of research and diagnosis. The chapters are wide-ranging and hopefully provide a broad overview of the differing applications in which these standard techniques can be used. Though the articles may appear relatively specific, the techniques contained within them are applicable to the study of many different disorders, and we hope that they provide a series of ideas and concepts well suited to problem solving.
With Bayesian statistics rapidly becoming accepted as a way to solve applied statistical problems, the need for a comprehensive, up-to-date source on the latest advances in this field has arisen. Presenting the basic theory of a large variety of linear models from a Bayesian viewpoint, Bayesian Analysis of Linear Models fills this need. Plus, this definitive volume contains something traditional - a review of Bayesian techniques and methods of estimation, hypothesis testing, and forecasting as applied to the standard populations ... something innovative - a new approach to mixed models and models not generally studied by statisticians, such as linear dynamic systems and changing-parameter models ... and something practical - clear graphs, easy-to-understand examples, end-of-chapter problems, numerous references, and a distribution appendix. Comprehensible, unique, and in-depth, Bayesian Analysis of Linear Models is the definitive monograph for statisticians, econometricians, and engineers. In addition, this text is ideal for students in graduate-level courses such as linear models, econometrics, and Bayesian inference.
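For readers curious what a Bayesian treatment of a linear model looks like in practice, the short sketch below works through the standard conjugate posterior update for regression coefficients with a Gaussian prior and known noise variance. It is purely illustrative and is not code from the book; the data, prior settings, and variable names are hypothetical, and Python/NumPy is used only for convenience.

```python
# A minimal, self-contained sketch of conjugate Bayesian linear regression
# (Gaussian prior on the coefficients, known noise variance). Illustrative
# only; the data, prior settings, and variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: y = X @ beta_true + Gaussian noise
n, p = 200, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0, 0.5])
sigma2 = 0.25                                   # assumed-known noise variance
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Prior: beta ~ N(0, tau2 * I)
tau2 = 10.0
prior_precision = np.eye(p) / tau2

# Standard conjugate update:
#   Sigma_post = (X'X / sigma2 + prior_precision)^(-1)
#   mu_post    = Sigma_post @ X'y / sigma2
Sigma_post = np.linalg.inv(X.T @ X / sigma2 + prior_precision)
mu_post = Sigma_post @ (X.T @ y) / sigma2

print("posterior mean of beta:", np.round(mu_post, 2))
print("posterior std devs:    ", np.round(np.sqrt(np.diag(Sigma_post)), 3))
```

The closed-form update keeps the sketch short; mixed models and changing-parameter models of the kind described above would generally call for more elaborate computational machinery.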
This book gives pertinent information on the high pressure liquid chromatography (HPLC) analyses of all the compounds of interest in nucleic acid metabolism. It aids chromatographers, biochemists, biomedical researchers, and chemists by providing information on applications of the HPLC technique.
Science Sifting is designed primarily as a textbook for students interested in research and as a general reference book for existing career scientists. The aim of this book is to help budding scientists broaden their capacities to access and use information from diverse sources to the benefit of their research careers. The book describes why the capacity to access and integrate both linear and nonlinear information has been an important historic feature of pivotal scientific breakthroughs. Yet, it is a process that our students are rarely, if ever, taught in universities. This book goes beyond simply describing the features of great scientific breakthroughs. It discusses the basis for accessing and using nonlinear information in the linear research context. It also provides a series of tools and exercises that can be used to enhance access to nonlinear information for application to research and other endeavors. Topics covered include focal points in scientific breakthroughs, the use of concept maps in research, the use of different vantage points, information as patterns, fractals for the scientist, memory storage and access points, and synchronicities. Young researchers need useful tools to help with a more holistic approach to their research careers. This book provides such tools to support flexibility and creativity across a long-term research career. Roald Hoffmann, winner of the 1981 Nobel Prize in Chemistry, has contributed to Science Sifting. More information on Professor Hoffmann can be found at .
This book describes a comprehensive regression analysis approach to the conduct of scientific research. It outlines the theoretical principles underlying the techniques used in regression analysis and illustrates their application to a variety of data sets.
Complex mathematical and computational models are used in all areas of society and technology, and yet model-based science is increasingly contested or refuted, especially when models are applied to controversial themes in domains such as health, the environment, or the economy. More stringent standards of proof are demanded from model-based numbers, especially when these numbers represent potential financial losses, threats to human health, or the state of the environment. Quantitative sensitivity analysis is generally agreed to be one such standard. Mathematical models are good at mapping assumptions into inferences. A modeller makes assumptions about laws pertaining to the system, about its status, and about a plethora of other, often arcane, system variables and internal model settings. To what extent can we rely on the model-based inference when most of these assumptions are fraught with uncertainties? Global Sensitivity Analysis offers an accessible treatment of such problems via quantitative sensitivity analysis, beginning with first principles and guiding the reader through the full range of recommended practices with a rich set of solved exercises. The text explains the motivation for sensitivity analysis, reviews the required statistical concepts, and provides a guide to potential applications. The book:
- Provides a self-contained treatment of the subject, allowing readers to learn and practice global sensitivity analysis without further materials
- Presents ways to frame the analysis, interpret its results, and avoid potential pitfalls
- Features numerous exercises and solved problems to help illustrate the applications
- Is authored by leading sensitivity analysis practitioners, combining a range of disciplinary backgrounds
Postgraduate students and practitioners in a wide range of subjects, including statistics, mathematics, engineering, physics, chemistry, environmental sciences, biology, toxicology, actuarial sciences, and econometrics, will find much of use here. This book will prove equally valuable to engineers working on risk analysis and to financial analysts concerned with pricing and hedging.
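As a purely illustrative aside (not taken from the book), the sketch below estimates first-order Sobol sensitivity indices for a toy three-input model using a standard Monte Carlo pick-freeze estimator. The model, sample sizes, and names are all hypothetical, and Python/NumPy is used only for convenience.

```python
# Minimal sketch of variance-based (first-order Sobol) sensitivity analysis.
# Illustrative only: the toy model and all names below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy model with three inputs, each uniform on [0, 1]:
    # y = x1 + 2*x2^2 + x1*x3
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + x[:, 0] * x[:, 2]

n, k = 100_000, 3                  # Monte Carlo sample size, number of inputs
A = rng.uniform(size=(n, k))       # two independent input sample matrices
B = rng.uniform(size=(n, k))
y_A, y_B = model(A), model(B)
var_y = y_A.var()

# Pick-freeze: rebuild B with column i taken from A, one input at a time.
for i in range(k):
    AB_i = B.copy()
    AB_i[:, i] = A[:, i]
    # First-order index: share of output variance attributable to input i alone.
    S_i = np.mean(y_A * (model(AB_i) - y_B)) / var_y
    print(f"S_{i + 1} ≈ {S_i:.2f}")
```

Indices near 1 indicate an input that dominates the output variance on its own; indices near 0 indicate an input that matters only through interactions, or not at all.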
The first edition of this classic book has become the authoritative reference for physicists desiring to master the finer points of statistical data analysis. This second edition contains all the important material of the first, much of it unavailable from any other sources. In addition, many chapters have been updated with considerable new material, especially in areas concerning the theory and practice of confidence intervals, including the important Feldman-Cousins method. Both frequentist and Bayesian methodologies are presented, with a strong emphasis on techniques useful to physicists and other scientists in the interpretation of experimental data and comparison with scientific theories. This is a valuable textbook for advanced graduate students in the physical sciences as well as a reference for active researchers.
Written by an author with more than 40 years of teaching experience in the field, Experiments in Pharmaceutical Chemistry, Second Edition responds to a critical classroom need for material on directed laboratory investigations in biological and pharmaceutical chemistry. This new edition supplies 75 experiments, expanding the range of topics to 22 major areas of pharmaceutical chemistry, including biochemical groups, botanical classes important to pharmacy, and major drug classifications.
Sections contain introductions to basic concepts underlying the fields addressed and a specific bibliography relating to each field. Each experiment provides detailed instructions in a user-friendly format, and can be carried out, in most cases, without the need for expensive instrumentation. This comprehensive laboratory manual offers much-needed instructional material for teaching laboratory classes in pharmaceutical chemistry. The breadth of subject matter covered provides a variety of choices for structuring a laboratory course.
This book deals exclusively and comprehensively with the role of proficiency testing in the quality assurance of analytical data. It covers in detail proficiency testing schemes from the perspectives of scheme organisers, participant laboratories and the ultimate end-users of analytical data. A wide variety of topics are addressed including the organisation, effectiveness, applicability, and the costs and benefits of proficiency testing. Procedures for the evaluation and interpretation of laboratory proficiency, and the relation of proficiency testing to other quality assurance measures are also discussed. Proficiency Testing in Analytical Chemistry is an important addition to the literature on proficiency testing and is essential reading for practising analytical chemists and all organisations and individuals with an interest in the quality of analytical data.
Why do Japanese artists team up with engineers in order to create so-called "Device Art"? What is a nanoscientist's motivation in approaching the artworld? In the past few years, there has been a remarkable increase in attempts to foster the exchange between art, technology, and science - an exchange taking place in academies, museums, or even in research laboratories. Media art has proven especially important in the dialogue between these cultural fields. This book is a contribution to the current debate on "art & science", interdisciplinarity, and the discourse of innovation. It critically assesses artistic positions that appear as the ongoing attempt to localize art's position within technological and societal change - between now and the future.
Paul Feyerabend famously asked, what's so great about science? One answer is that it has been surprisingly successful in getting things right about the natural world, more successful than non-scientific or pre-scientific systems, religion, or philosophy. Science has been able to formulate theories that have successfully predicted novel observations. It has produced theories about parts of reality that were not observable or accessible at the time those theories were first advanced, but the claims about those inaccessible areas have since turned out to be true. And science has, on occasion, advanced on more or less a priori grounds theories that subsequently turned out to be highly empirically successful. In this book the philosopher of science John Wright delves deep into science's methodology to offer an explanation for this remarkable success story.
Basic principles of applied life sciences, such as recombinant DNA technology, are used in most life sciences industries marketing bio-formulations to design more effective protein-based drugs such as erythropoietin and fast-acting insulin. In recent times, genetically engineered host cells from mammals, other animals, and plants have also been used in life sciences industries to manufacture biologics. This book discusses the most basic as well as advanced issues concerning biological products, for successfully managing a life sciences industry. It elucidates the life cycle of biological molecules, from the conceptual development of different types of biopolymers to their subsequent transfer from conical flasks in the laboratory to life sciences industries for large-scale production and marketing. It focuses on sustainable longevity in the life cycle of commercial biopolymers. The cumulative facts and figures in this volume will help inspire life sciences industry promoters to monitor the value chain transfer process of biologics for better profitability. Additionally, it will serve as a reference for students and researchers interested in entrepreneurial ventures or in their own start-up projects for the commercialization of biologics.
This book is focused on the current status of industrial pollution, its sources and characteristics, and its management through various advanced treatment technologies. It covers the recycling, reuse, and recovery of waste for the production of value-added products. The book explores industrial wastewater pollution and its treatment through various advanced technologies, as well as the sources and characteristics of solid waste and its management for environmental safety. It discusses new methods and technologies to combat waste-related pollution and focuses on the use of recycled products. This book is of value to students, researchers, scientists, industry professionals, and practitioners in the fields of environmental science and engineering, microbiology, biotechnology, and toxicology; it is also useful for global and local authorities and policy makers responsible for the management of liquid and solid wastes.
Spark scientific curiosity from a young age with this six-level course, built on an enquiry-based approach and active learning. Collins International Primary Science fully meets the requirements of the Cambridge Primary Science Curriculum Framework from 2020 and has been carefully developed for a range of international contexts. The course is organised into four main strands: Biology, Chemistry, Physics, and Earth and Space, and the skills detailed under the 'Thinking and Working Scientifically' strand are introduced and taught in the context of those areas. For each Workbook at Stages 1 to 6, we offer:
- A write-in Workbook linked to the Student's Book
- New language development activities that help build science vocabulary
- Earth and Space content covering the new curriculum framework
- Thinking and Working Scientifically material that deepens and enhances the delivery of science skills
- Active learning through practical activities that don't require specialist equipment or labs
- Scaffolding that allows students of varying abilities to work with common content and meet learning objectives
- Support for Cambridge Global Perspectives (TM) with activities that develop and practise key skills
- Learner support as part of a set of resources for the Cambridge Primary Science curriculum framework (0097) from 2020
This series is endorsed by Cambridge Assessment International Education to support the new curriculum framework 0097 from 2020.
A renowned philosopher's final work, illuminating how the logical empiricist tradition has failed to appreciate the role of actual experiments in forming its philosophy of science. The logical empiricist treatment of physics dominated twentieth-century philosophy of science. But the logical empiricist tradition, for all it accomplished, does not do justice to the way in which empirical evidence functions in modern physics. In his final work, the late philosopher of science William Demopoulos contends that philosophers have failed to provide an adequate epistemology of science because they have failed to appreciate the tightly woven character of theory and evidence. As a consequence, theory comes apart from evidence. This trouble is nowhere more evident than in theorizing about particle and quantum physics. Arguing that we must consider actual experiments as they have unfolded across history, Demopoulos provides a new epistemology of theories and evidence, albeit one that stands on the shoulders of giants. On Theories finds clarity in Isaac Newton's suspicion of mere "hypotheses." Newton's methodology lies in the background of Jean Perrin's experimental investigations of molecular reality and of the subatomic investigations of J. J. Thomson and Robert Millikan. Demopoulos extends this account to offer novel insights into the distinctive nature of quantum reality, where a logico-mathematical reconstruction of Bohrian complementarity meets John Stewart Bell's empirical analysis of Einstein's "local realism." On Theories ultimately provides a new interpretation of quantum probabilities as themselves objectively representing empirical reality.
This detailed book highlights recent advances in molecular imaging techniques and protocols, designed to be immediately applicable in global bio-laboratories. The chapters are categorized into seven major groups according to the reporter materials, such as imaging with passive optical readouts, activatable bioluminescent probes, functional substrates and luciferases, organic fluorescent probes, BRET probes, FRET probes, as well as with advanced instrumentation. Written for the highly successful Methods in Molecular Biology series, chapters include introductions to their respective topics, lists of the necessary materials and reagents, step-by-step, readily reproducible laboratory protocols, and tips on troubleshooting and avoiding known pitfalls. Authoritative and practical, Live Cell Imaging: Methods and Protocols aims to direct and inspire researchers into creating smarter, next-generation imaging techniques that are truly quantitative, highly sensitive, and readily comprehended, in the effort to engender deeper understanding of biological systems and break new ground in the research fields of life science.
In the 1980s sonochemistry was considered to be a rather restricted branch of chemistry involving the ways in which ultrasound could improve synthetic procedures, predominantly in heterogeneous systems and particularly for organometallic reactions. Within a few years the subject began to expand into other disciplines including food technology, environmental protection and the extraction of natural materials. Scientific interest grew and led to the formation of the European Society of Sonochemistry in 1990 and the launch of a new journal, Ultrasonics Sonochemistry, in 1994. The subject continues to develop as an exciting and multi-disciplinary science with the participation of not only chemists but also physicists, engineers and biologists. The resulting cross-fertilisation of ideas has led to the rapid growth of interdisciplinary research and provided an ideal way for young researchers to expand their knowledge and appreciation of the ways in which different sciences can interact. It expands scientific knowledge through an opening of the closed doors that sometimes restrict the more specialist sciences. The journey of exploration in sonochemistry and its expansion into new fields of science and engineering is recounted in "Sonochemistry Evolution and Expansion", written by two pioneers in the field. It is unlike other texts about sonochemistry in that it follows the chronological developments in several very different applications of sonochemistry through the research experiences of the two authors, Tim Mason and Mircea Vinatoru.
- Designed for chemists and chemical engineers
- Written by two experts and practitioners in the subject
- Volume 1 covers the historical background and evolution of sonochemistry
- Volume 2 explains the wider applications and expansion of the subject
Volume 1: Fundamentals and Evolution. This volume traces the evolution of sonochemistry from the very beginning, when the effects of acoustic cavitation were first reported almost as a scientific curiosity. The major developments of the subject from the 1980s are described by the authors, who became active participants in the field during that period. A chapter is devoted to ultrasonically assisted extraction (UAE), which illustrates the different ways in which sonochemical technologies can be applied in both batch and flow modes, leading to the development of large-scale processing. The chapter on environmental protection shows the wide range of applications of sonochemistry in this important field for both biological and chemical decontamination.
Electron Magnetic Resonance: Applications in Physical Sciences and Biology, Volume 50, describes the principles and recent trends in different experimental methods of Electron Magnetic Resonance (EMR) spectroscopy. In addition to principles, experimental methods and applications, each chapter contains a complete list of references that guide the reader to the relevant literature. The book is intended for skilled and novice researchers alike, in academia and professional fields, as well as for scientists and students, wherever they are based. It is useful for both beginners and experts in the field of Electron Spin Resonance who are looking for recent experimental methods and EMR techniques.
This completely revised edition explores novel discoveries in bacterial genomic research, with a focus on technical and computational improvements as well as methods used for bacterial pangenome analysis, which relies on microbiome studies and metagenomic data. Beginning with up-to-date sequencing methods, the book continues with sections covering methods for deep phylogenetic analysis, the role of metagenomic data in understanding the genomics of the many yet uncultured bacteria, progress in genome-to-phenome inference, as well as computational genomic tools. Written for the highly successful Methods in Molecular Biology series, chapters include the type of practical detail necessary for reproducible results in the lab. Authoritative and up-to-date, Bacterial Pangenomics: Methods and Protocols, Second Edition serves as an ideal guide for both highly qualified investigators in bacterial genomics and for less experienced researchers, including students and teachers, who could use a reference for approaching genomic analysis and genome data.
The Sunday Times Top Ten Bestseller. Have you ever wondered if a severed head retains consciousness long enough to see what happened to it? Or whether your dog would run to fetch help if you fell down a disused mineshaft? And what would happen if you were to give an elephant the largest ever single dose of LSD? The chances are that someone, somewhere has conducted a scientific experiment to find out... 'Excellent accounts of some of the most important and interesting experiments in biology and psychology' Simon Singh. If left to their own devices, would babies instinctively choose a well-balanced diet? Discover the secret of how to sleep on planes. Which really tastes better in a blind tasting - Coke or Pepsi?