Innovations in crystallographic instrumentation and the rapid development of methods of diffraction measurement have led to a vast improvement in our ability to determine crystal and molecular structure. This up-to-date resource will allow the reader to harness the potential of X-ray diffraction instruments. The different sources of X-radiation used in crystallography, including synchrotron radiation, are introduced, along with a systematic review of detectors for X-rays and of the basic instruments for single-crystal and powder diffractometry. The principles of the diffraction experiment are discussed and related to their practical application, with a comparative description of different scan procedures. Diffraction data collection and processing are also reviewed, and methods for error correction are described. This book will provide a useful guide for researchers and students starting in this area of science, as well as for skilled crystallographers.
Computer Techniques for Image Processing in Electron Microscopy, Volume 214 in the Advances in Imaging and Electron Physics series, presents the latest advances in the field, covering Image Formation Theory, The Discrete Fourier Transform, Analytic Images, The Image and Diffraction Plane Problem: Uniqueness, The Image and Diffraction Plane Problem: Numerical Methods, The Image and Diffraction Plane Problem: Computational Trials, Alternative Data for the Phase Determination, The Hardware of Digital Image Handling, Basic Software for Digital Image Handling, Improc, and much more.
Electron Magnetic Resonance: Applications in Physical Sciences and Biology, Volume 50, describes the principles of and recent trends in the different experimental methods of Electron Magnetic Resonance (EMR) spectroscopy. In addition to principles, experimental methods and applications, each chapter contains a complete list of references that guides the reader to the relevant literature. The book is intended for skilled and novice researchers alike, whether in academia or professional fields, and for scientists and students worldwide. It is useful for both beginners and experts in the field of Electron Spin Resonance who are looking for recent experimental EMR methods.
Since its discovery, Atomic Force Microscopy (AFM) has become a technique of choice for non-destructive surface characterization with sub-molecular resolution. The AFM has also emerged as a problem-solving tool in applications relevant to particle-solid and particle-liquid interactions; the design, fabrication, and characterization of new materials; and the development of new technologies for processing and modification of materials. This volume is a comprehensive review of AFM techniques and their application in adhesion studies. It is intended for both researchers and students in engineering disciplines, physics and biology. Over 100 authors contributed to this book, summarizing the current status of research on measurements of colloidal particle-solid adhesion and molecular forces and on solid surface imaging and mapping, and discussing the contact mechanics models applicable to particle-substrate and particle-particle systems.
Stereology, or quantitative microscopy, is a basic research tool in science and technology. The emergence of design-based methods has greatly increased the power, flexibility, adaptability, and scope of stereology applications, establishing a closer connection between statistics and quantitative microscopy. Despite its scientific importance, modern stereology remains largely unknown to the statistical community, with valuable information either widely scattered or inaccessible to newcomers to the field. Now is the perfect time for a book that enables biostatisticians and statistical consultants to give beneficial advice to researchers in microscopy. Stereology for Statisticians sets out the principles of stereology from a statistical viewpoint, focusing on both basic theory and practical implications. This book discusses ways to effectively communicate statistical issues to clients, draws attention to common methodological errors, and provides references to essential literature. The first full text on design-based stereology, it opens with a review of classical and modern stereology, followed by a treatment of mathematical foundations such as geometry, probability, and statistical inference. The book then presents core techniques, including estimation of absolute geometrical quantities, relative quantities, and statistical inference for populations of discrete objects. The final chapters discuss implementing techniques in practical sampling designs, summarize understanding of the variance of stereological estimators, and describe open problems for further research.
Million-copy bestselling author of The Elements, Molecules, and Reactions, Theodore Gray applies his trademark mix of engaging stories, real-time experiments, and stunning photography to the inner workings of machines, big and small, revealing the extraordinary science, beauty, and rich history of everyday things. Theodore Gray has become a household name among fans, both young and old, of popular science and mechanics. He's an incorrigible tinkerer with a constant curiosity for how things work. Gray's readers love how he always brings the perfect combination of know-how, humour and derring-do to every project or demonstration, be it scientific or mechanical. In How Things Work he explores the mechanical underpinnings of dozens of types of machines and mechanisms, from the cotton gin to the wristwatch to an industrial loom. Filled with stunning original photographs in Gray's inimitable style, How Things Work is a must-have exploration of stuff - large and small - for any builder, maker or lover of mechanical things.
Developing microscale chemistry experiments, using small quantities of chemicals and simple equipment, has been a recent initiative in the UK. Microscale chemistry experiments have several advantages over conventional experiments: they use small quantities of chemicals and simple equipment, which reduces costs; the disposal of chemicals is easier due to the small quantities; safety hazards are often reduced, and many experiments can be done quickly; using plastic apparatus means glassware breakages are minimised; and practical work is possible outside a laboratory. Microscale Chemistry is a book of such experiments designed for use in schools and colleges, and the ideas behind the experiments in it come from many sources, including chemistry teachers from all around the world. Current trends suggest that, with further environmental legislation likely, the need for microscale chemistry teaching techniques and experiments will grow. This book should serve as a guide in this process.
Image processing is fast becoming a valuable tool for analyzing multidimensional data in all areas of natural science. Since the publication of the best-selling first edition of this handbook, the field of image processing has matured in many of its aspects from ad hoc, empirical approaches to a sound science based on established mathematical and physical principles. The Practical Handbook on Image Processing for Scientific and Technical Applications, Second Edition builds a sound basic knowledge of image processing, provides a critically evaluated collection of the best algorithms, and demonstrates those algorithms with real-world applications from many fields. It covers all aspects of image processing, from image formation to image analysis, and gives an up-to-date review of advanced concepts. Organized according to the hierarchy of tasks, each chapter includes a summary, an outline of the background the task requires, and a section of practical tips that help you avoid common errors and save valuable research time. New in the Second Edition:
* Expanded application areas, now including technical fields such as the automotive industry, quality inspection, and materials science
* More practical tips in each chapter
* Discussion of digital camera interfaces and a comparison of CMOS vs. CCD cameras
* A section on wavelets
* Advanced techniques for reconstruction, new segmentation methods based on global optimization and anisotropic diffusion, and additional classification techniques, including neural networks, polynomial classification, and support vector machines
Just as digital image processing provides the key to studying complex scientific problems that researchers once could only dream of tackling, this handbook unlocks the intricacies of image processing and brings its power and potential within the grasp of researchers across the spectrum of scientific and engineering disciplines.
Although methods and techniques that will help solve various analytical problems do exist, they are often difficult to perform. Using polarized light microscopy as the method of choice, Color Atlas and Manual of Microscopy for Criminalists, Chemists, and Conservators offers swift, simple, yet irrefutable analytical tests and testing procedures that can be used to identify organic and inorganic particles. Seasoned forensic microscopists Nicholas Petraco and Thomas Kubic have lent their expertise as consultants to forensic scientists, analytical chemists, art historians, pathologists, customs agents, detectives, gemologists, numismatists, and art conservators. Now they share their extensive photomicrograph collection of minute specimens along with clear, concise, and simple methods to help solve your analytical problems.
In this volume, the author demystifies the Design of Experiments (DOE). He begins with a clear explanation of the traditional experimentation process. He then covers the concept of variation and the importance of experimentation, and follows through with applications. Stamatis also discusses full and fractional factorials. The strength of this volume lies in the fact that it not only introduces the concept of robustness but also addresses "Robust Designs", with discussions of the Taguchi methodology of experimentation. Throughout, the author ties these concepts into the Six Sigma philosophy and shows readers how to use them in their organizations.
Private landowners or Federal Agencies responsible for cleaning up radiological environments are faced with the challenge of clearly defining the nature and extent of radiological contamination, implementing remedial alternatives, and then statistically verifying that cleanup objectives have been met. Sampling and Surveying Radiological Environments provides the how-tos for designing and implementing cost-effective and defensible sampling programs in radiological environments, such as those found in the vicinity of uranium mine sites, nuclear weapons production facilities, nuclear reactors, radioactive waste storage and disposal facilities, and nuclear accidents. It includes downloadable resources that walk you through the EPA's Data Quality Objectives (DQO) procedures and provides electronic templates you can complete and print. Sampling and Surveying Radiological Environments addresses all of the major topics that will assist you in designing and implementing statistically defensible sampling programs in radiological environments, including:
* A summary of the major environmental laws and regulations that apply to radiological sites, and advice on regulatory interfacing
* Internet addresses where you can find the regulations pertaining to each State
* Theory of radiation detection and definitions of common radiological terminology
* Statistics and statistical software that apply to the environmental industry
* Details on commercially available radiological instrumentation and detection systems
* Building decontamination and decommissioning, radiological and chemical equipment decontamination procedures, and tank/drum/remote characterization
* Standard operating procedures for collecting environmental media samples
* Guidance on sample preparation, documentation, and shipment
* Guidance on data verification/validation, radiological data management, and data quality assessment (DQA)
'G. Adams in Fleet Street London' is the signature on some of the finest scientific instruments of the eighteenth century. This book is the first comprehensive study of the instrument-making business run by the Adams family, from its foundation in 1734 to bankruptcy in 1817. It is based on detailed research in the archival sources as well as examination of extant instruments and publications by George Adams senior and his two sons, George junior and Dudley. Separate chapters are devoted to George senior's family background, his royal connections, and his new globes; George junior's numerous publications, and his dealings with van Marum; and to Dudley's dabbling with 'medico-electrical therapeutics'. The book is richly illustrated with plates from the Adams's own publications and with examples of instruments ranging from unique museum pieces - such as the 'Prince of Wales' microscope - and globes to the more common, even mundane, items of the kind seen in salesrooms and dealers - the surveying, navigational and military instruments that formed the backbone of the business. The appendices include facsimiles of trade catalogues and an annotated short-title listing of the Adams family's publications, which also covers American and Continental editions, as well as the posthumous ones by W. & S. Jones.
Toxicology has made tremendous strides in the sophistication of the models used to identify and understand the mechanisms of agents that can harm or kill humans and other higher organisms. Non-animal, or in vitro, models started to gain significant use in the 1960s. As a result of increased concern over animal welfare, economic factors, and the need for greater sensitivity and understanding of mechanisms, interest in in vitro models has risen. This volume demonstrates that there now exists a broad range of in vitro models for use in either identifying or understanding most forms of toxicity. The availability of in vitro models spans both the full range of endpoints (irritation, sensitization, lethality, mutagenicity, and developmental toxicity) and the full spectrum of target organ systems (including the skin, eye, heart, liver, kidney and nervous system). Chapters are devoted to each of these speciality areas from the perspective of presenting the principal models and their uses and limitations.
Analytical techniques are powerful tools in a chemist's armoury and this book identifies some of the most important chemical techniques currently in use, along with their applications. Aimed at those with some familiarity with modern chemical techniques, as well as those completely new to them, the book covers much of the basic theory without emphasising the mathematics and physics involved. Where appropriate, descriptions of the instrumentation and sample preparation are included, as are problems with example solutions. More advanced ideas are presented in highlighted boxes, so the novice can happily skip these if desired. Modern Chemical Techniques is based on a series of 'hands-on' symposia that enabled individuals to update their chemical skills and learn about the newest methods, techniques, and instrumentation available. The resource material presented at the symposia is published here, developed and extended into an accessible, illustrated book, making the valuable information it contained available to a much wider audience.
For all the emphasis on particle counting, there are extremely few publications on the technology - until now. A Practical Guide to Particle Counting for Drinking Water Treatment is a user's manual poised to remedy this problem with insight into every area of particle counting, for both the system designer and the treatment operator. The coverage features an overview of particle counting, including the basic principles of operation, application in the treatment process, and the fundamentals of installation, operation, maintenance, data collection, and system integration. It provides an understanding of the general equipment specifications that help you make intelligent choices in equipment selection for a given application, and of the underlying technology, to help you make the most of your particle counting system.
Making a clear distinction between nano- and micro-mechanical testing on physical grounds, this monograph describes the basics and applications of the supermicroscopies AFM and SNOM, and of nanomechanical testing on rough and technical natural surfaces in the submicron range, down to a lateral resolution of a few nm. New or improved instrumentation, new physical laws, and unforeseen new applications in all branches of the natural sciences (around physics, chemistry, mineralogy, materials science, biology and medicine) and nanotechnology are covered. It outlines the handling of natural and technical samples in relation to that of flat standard samples and emphasizes new special features. Pitfalls and sources of error are clearly demonstrated, as well as their efficient remedy when going from molecularly flat to rough surfaces. The academic or industrial scientist learns how to apply these principles in tackling scientific or manufacturing tasks that involve roughness far from that of standard samples.
Experts from The Jackson Laboratory and around the world provide practical advice on everything from how to establish a colony to where to go for specific mutations. Systematic Approach to Evaluation of Mouse Mutations includes information on medical photography, grafting procedures, gene mapping, and the evaluation of the special biological characteristics of the mice.
Topics in Electron Diffraction and Microscopy of Materials celebrates the retirement of Professor Michael Whelan from the University of Oxford. Professor Whelan taught many of today's heads of department and was a pioneer in the development and use of electron microscopy. His collaborators and colleagues, each one of whom has made important advances in the use of microscopy to study materials, have contributed to this cohesive work.
Since the initial discovery of the G protein-coupled receptor system that regulates cyclic AMP production, the G protein field has rapidly expanded. Cell surface receptors that couple to heterotrimeric G proteins, the G protein-coupled receptors (GPCRs), number in the hundreds and bind to a wide diversity of ligands including biogenic amines (e.g., adrenaline), lipid derivatives (e.g., lysophosphatidic acid), peptides (e.g., opioid peptides), proteins (e.g., thyroid-stimulating hormone), and odorants, to name a few. The GPCR system is found throughout biology, in organisms as simple as yeast, in more complex organisms such as Dictyostelium discoideum (slime mold) and Caenorhabditis elegans (nematode worm), and of course in humans. GPCRs and their associated G protein systems are the subject of intense academic research, and because of their involvement in human biology and disease, the pharmaceutical industry has large research initiatives dedicated to the study of GPCRs. By some estimates, more than 50% of the pharmaceuticals on the market are targeted at GPCRs. The G protein/G protein-coupled receptor system consists of a receptor (GPCR), a heterotrimeric G protein consisting of α, β, and γ subunits, and an effector. G protein effector molecules, such as enzymes or ion channels, respond to activation by the G protein to generate second messengers or changes in membrane potential that lead to alterations in cell physiology.
The goal of this book is to make some underutilized but potentially very useful methods in experimental design and analysis available to ecologists, and to encourage better use of standard statistical techniques. Ecology has become more and more an experimental science in both basic and applied work, but experiments in the field and in the laboratory often present formidable statistical difficulties. Organized around providing solutions to ecological problems, this book offers ways to improve the statistical aspects of conducting manipulative ecological experiments, from setting them up to interpreting and reporting the results. An abundance of tools, including advanced approaches, are made available to ecologists in step-by-step examples, with computer code provided for common statistical packages. This is an essential how-to guide for the working ecologist and for graduate students preparing for research and teaching careers in the field of ecology.
It is now more than ten years since Dr. Alec Jeffreys (now Professor Sir Alec Jeffreys, FRS) reported in Nature that the investigation of certain minisatellite regions in the human genome could produce what he termed DNA fingerprints and provide useful information in the fields of paternity testing and forensic analysis. Since that time we have witnessed a revolution in the field of forensic identification. A total change of technology, from serological or electrophoretic analysis of protein polymorphisms to direct investigation of the underlying DNA polymorphisms, has occurred in a short space of time. In addition, the evolution and development of the DNA systems themselves has been rapid and spectacular. In the last decade we have progressed from the multilocus DNA fingerprints, through single locus systems based on the same Southern blot RFLP technology, to a host of systems based on the PCR technique. These include Allele Specific Oligonucleotide (ASO)-primed systems detected by dot blots, the "binary" genotypes produced by mapping variations within VNTR repeats demonstrated by minisatellite variant repeat (MVR) analysis, and yet other fragment-length polymorphisms in the form of Short Tandem Repeat (STR) loci. Hand in hand with the increasing range of systems available has been the development of new instrumentation to facilitate their analysis and allow us to explore the possibilities of high volume testing in the form of mass screening and offender databases.
Over the last two decades, advances in the design, miniaturization, and analytical capabilities of portable X-ray fluorescence (pXRF) instrumentation have led to its rapid and widespread adoption in a remarkably diverse range of applications in research and industrial fields. The impetus for this volume was that, as pXRF continues to grow into mainstream use, analysts should be increasingly empowered with the right information to safely and effectively employ pXRF as part of their analytical toolkit. This volume provides introductory and advanced-level users alike with readings on topics ranging from the basic principles of pXRF and qualitative and quantitative approaches, through to machine learning and artificial intelligence for enhanced applications. It also includes fundamental guidance on calibrations, the mathematics of calculating uncertainties, and an extensive reference index of all elements and their interactions with X-rays. Contributing authors have provided a wealth of information and case studies in industry-specific chapters. These sections delve into detail on current standard practices in industry and research, including examples from the agricultural and geo-exploration sectors, research in art and archaeology, and industrial and regulatory applications for metals. As pXRF continues to grow in use in industrial and academic settings, it is essential that practitioners continue to learn, share, and implement informed and effective use of this technique. This volume serves as an accessible guidebook and go-to reference manual that helps new and experienced users of pXRF achieve this goal.
Covering all aspects of humidity measurement and instrumentation, this work includes rudiments and theory, common applications, the advantages and limitations of frequently used sensors and techniques, and guidelines for installation, maintenance and calibration. The accompanying disk is intended for easy conversion of humidity parameters and units.
Response Surfaces: Designs and Analyses, Second Edition presents techniques for designing experiments that yield adequate and reliable measurements of one or several responses of interest, for fitting and testing the suitability of empirical models used for acquiring information from the experiments, and for utilizing the experimental results to make decisions concerning the system under investigation.
Let this down-to-earth book be your guide to the statistical integrity of your work. Without relying on the detailed and complex mathematical explanations found in many other statistical texts, Principles of Experimental Design for the Life Sciences teaches how to design, conduct, and interpret top-notch life science studies. Learn about the planning of biomedical studies, the principles of statistical design, sample size estimation, common designs in biological experiments, sequential clinical trials, high-dimensional designs and process optimization, and the correspondence between objectives, design, and analysis. Each of these important topics is presented in an understandable and non-technical manner, free of statistical jargon and formulas. Written by a biostatistical consultant with 25 years of experience, Principles of Experimental Design for the Life Sciences is filled with real-life examples from the author's work that you can quickly and easily apply to your own. These examples illustrate the main concepts of experimental design and cover a broad range of application areas in both clinical and nonclinical research. With this one innovative, helpful book you can improve your understanding of statistics, enhance your confidence in your results, and, at long last, shake off those statistical shackles!
You may like...
Teacher Education - The Key to Effective… by Delbert Long, Rodney Riegle (Hardcover): R2,553 (Discovery Miles 25 530)
Readings for Learning to Teach in the… by Susan Capel, Marilyn Leask, … (Hardcover): R4,238 (Discovery Miles 42 380)