Since its discovery, Atomic Force Microscopy (AFM) has become a technique of choice for non-destructive surface characterization with sub-molecular resolution. The AFM has also emerged as a problem-solving tool in applications relevant to particle-solid and particle-liquid interactions; the design, fabrication, and characterization of new materials; and the development of new technologies for processing and modifying materials. This volume is a comprehensive review of AFM techniques and their application in adhesion studies. It is intended for both researchers and students in engineering disciplines, physics, and biology. More than 100 authors contributed to this book, summarizing the current status of research on measuring colloidal particle-solid adhesion and molecular forces, on imaging and mapping solid surfaces, and discussing the contact mechanics models applicable to particle-substrate and particle-particle systems.
Stereology, or quantitative microscopy, is a basic research tool in science and technology. The emergence of design-based methods has greatly increased the power, flexibility, adaptability, and scope of stereology applications, establishing a closer connection between statistics and quantitative microscopy. Despite its scientific importance, modern stereology remains largely unknown to the statistical community, with valuable information either widely scattered or inaccessible to newcomers to the field. Now is the perfect time for a book that enables biostatisticians and statistical consultants to give beneficial advice to researchers in microscopy. Stereology for Statisticians sets out the principles of stereology from a statistical viewpoint, focusing on both basic theory and practical implications. This book discusses ways to effectively communicate statistical issues to clients, draws attention to common methodological errors, and provides references to essential literature. The first full text on design-based stereology, it opens with a review of classical and modern stereology, followed by a treatment of mathematical foundations such as geometry, probability, and statistical inference. The book then presents core techniques, including estimation of absolute geometrical quantities, relative quantities, and statistical inference for populations of discrete objects. The final chapters discuss implementing techniques in practical sampling designs, summarize understanding of the variance of stereological estimators, and describe open problems for further research.
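As a toy illustration of the point-counting idea behind many design-based stereological estimators (an invented sketch, not an example from the book), the snippet below estimates the area of a planar region from the points of a square grid that land inside it; the function name, region, and grid spacing are all hypothetical choices:

```python
import math

def point_count_area(in_region, spacing, span):
    """Estimate the area of a planar region by counting the points of a
    square grid (distance `spacing` apart, covering [-span, span]^2)
    that fall inside it. Each hit contributes spacing**2 to the estimate."""
    hits = 0
    steps = int(span / spacing)
    for i in range(-steps, steps + 1):
        for j in range(-steps, steps + 1):
            if in_region(i * spacing, j * spacing):
                hits += 1
    return hits * spacing ** 2

# Example: a unit disc, whose true area is pi.
estimate = point_count_area(lambda x, y: x * x + y * y <= 1.0,
                            spacing=0.05, span=1.1)
print(estimate)  # close to pi (~3.14)
```

In practice the grid is placed with a uniformly random offset so that the estimator is unbiased; the fixed grid here is only for determinism.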
Million-copy bestselling author of The Elements, Molecules, and Reactions Theodore Gray applies his trademark mix of engaging stories, real-time experiments, and stunning photography to the inner workings of machines, big and small, revealing the extraordinary science, beauty, and rich history of everyday things. Theodore Gray has become a household name among fans, both young and old, of popular science and mechanics. He's an incorrigible tinkerer with a constant curiosity for how things work. Gray's readers love how he always brings the perfect combination of know-how, humour and derring-do to every project or demonstration, be it scientific or mechanical. In How Things Work he explores the mechanical underpinnings of dozens of types of machines and mechanisms, from the cotton gin to the wristwatch to an industrial loom. Filled with stunning original photographs in Gray's inimitable style, How Things Work is a must-have exploration of stuff - large and small - for any builder, maker or lover of mechanical things.
Image processing is fast becoming a valuable tool for analyzing multidimensional data in all areas of natural science. Since the publication of the best-selling first edition of this handbook, the field of image processing has matured in many of its aspects from ad hoc, empirical approaches to a sound science based on established mathematical and physical principles. The Practical Handbook on Image Processing for Scientific and Technical Applications, Second Edition builds a sound basic knowledge of image processing, provides a critically evaluated collection of the best algorithms, and demonstrates those algorithms with real-world applications from many fields. It covers all aspects of image processing, from image formation to image analysis, and gives an up-to-date review of advanced concepts. Organized according to the hierarchy of tasks, each chapter includes a summary, an outline of the background the task requires, and a section of practical tips that help you avoid common errors and save valuable research time. New in the Second Edition:
- Expanded application areas now include technical fields such as the automotive industry, quality inspection, and materials science
- More practical tips in each chapter
- Discussion of digital camera interfaces and a comparison of CMOS vs. CCD cameras
- A section on wavelets
- Advanced techniques for reconstruction, new segmentation methods based on global optimization and anisotropic diffusion, and additional classification techniques, including neural networks, polynomial classification, and support vector machines
Just as digital image processing provides the key to studying complex scientific problems that researchers once could only dream of tackling, this handbook unlocks the intricacies of image processing and brings its power and potential within the grasp of researchers across the spectrum of scientific and engineering disciplines.
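As a minimal, hypothetical illustration of the kind of smoothing step such a handbook covers (not code from the book), here is a 3x3 mean filter in plain Python; the function name and toy image are assumptions:

```python
def box_filter(image):
    """Smooth a 2D grayscale image (list of lists) with a 3x3 mean
    filter, a basic noise-reduction operation; border pixels are
    left unchanged for simplicity."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(image[y + dy][x + dx]
                            for dy in (-1, 0, 1)
                            for dx in (-1, 0, 1)) / 9.0
    return out

noisy = [[0, 0, 0],
         [0, 9, 0],
         [0, 0, 0]]
print(box_filter(noisy)[1][1])  # 1.0 -- the isolated spike is averaged away
```

Real pipelines would use separable or edge-preserving filters (e.g. anisotropic diffusion, as the blurb mentions); the box filter is only the simplest instance of the idea.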
Although methods and techniques that will help solve various analytical problems do exist, they are often difficult to perform. Using polarized light microscopy as the method of choice, Color Atlas and Manual of Microscopy for Criminalists, Chemists, and Conservators offers swift, simple, yet irrefutable analytical tests and testing procedures that can be used to identify organic and inorganic particles. Seasoned forensic microscopists Nicholas Petraco and Thomas Kubic have lent their expertise as consultants to forensic scientists, analytical chemists, art historians, pathologists, customs agents, detectives, gemologists, numismatists, and art conservators. Now they share their extensive photomicrograph collection of minute specimens along with clear, concise, and simple methods to help solve your analytical problems.
There is currently a high level of interest in Laboratory Information Management Systems (LIMS), which, when successfully implemented, can revitalize the operations of a laboratory and contribute significantly to the effectiveness and efficiency of the overall enterprise. LIMS describes the strategy, planning, resources, and activities needed to integrate LIMS and its supporting technologies into an organization. It covers all aspects of implementation and management and has the benefit of not being product specific. This book will not date as it is not restricted to a particular software product, hardware platform, or technical automation approach. Instead it deals with the issues, expertise, organization, and resources that contribute to the successful implementation of LIMS. The author has wide experience of automated laboratory systems in the chemical, pharmaceutical, environmental, and biotechnology industries, and for the past 15 years has been intimately involved in every aspect of LIMS implementations including justification, system selection, installation, project management, developing, training, validation, performance optimization, and maintenance. LIMS contains numerous illustrations and tables to highlight concisely the major points and concepts discussed in each chapter. The book is essential reading for laboratory, information systems and project managers responsible for the implementation of LIMS and, as it does not require any previous knowledge of computers or laboratory information management systems, is easily accessible to all.
In this volume, the author demystifies the Design of Experiments (DOE). He begins with a clear explanation of the traditional experimentation process. He then covers the concept of variation and the importance of experimentation and follows through with applications. Stamatis also discusses full and fractional factorials. The strength of this volume lies in the fact that not only does it introduce the concept of robustness, it also addresses "Robust Designs" with discussions on the Taguchi methodology of experimentation. Throughout, the author ties these concepts into the Six Sigma philosophy and shows readers how to apply them in their own organizations.
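The distinction between full and fractional factorials can be sketched in a few lines (an illustrative example, not taken from the book): a full two-level design for three factors enumerates all eight runs, while a half fraction keeps only those satisfying a defining relation, here assumed to be I = ABC:

```python
from itertools import product

# Full 2^3 factorial: every combination of low (-1) and high (+1)
# levels for three factors A, B, C -- 8 runs in total.
full = list(product([-1, 1], repeat=3))

# Half fraction defined by the relation I = ABC: keep only runs
# where the product A*B*C equals +1. This halves the experiment
# from 8 runs to 4, at the cost of confounding each main effect
# with a two-factor interaction.
half = [run for run in full if run[0] * run[1] * run[2] == 1]

print(len(full), len(half))  # 8 4
```
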
Private landowners or Federal Agencies responsible for cleaning up radiological environments are faced with the challenge of clearly defining the nature and extent of radiological contamination, implementing remedial alternatives, then statistically verifying that cleanup objectives have been met. Sampling and Surveying Radiological Environments provides the how-tos for designing and implementing cost-effective and defensible sampling programs in radiological environments, such as those found in the vicinity of uranium mine sites, nuclear weapons production facilities, nuclear reactors, radioactive waste storage and disposal facilities, and nuclear accidents. It includes downloadable resources that walk you through the EPA's Data Quality Objectives (DQO) procedures and provides electronic templates you can complete and print. Sampling and Surveying Radiological Environments addresses all of the major topics that will assist you in designing and implementing statistically defensible sampling programs in radiological environments, including:
- Summary of the major environmental laws and regulations that apply to radiological sites, and advice on regulatory interfacing
- Internet addresses where you can find regulations pertaining to each State
- Theory of radiation detection and definitions of common radiological terminology
- Statistics and statistical software that apply to the environmental industry
- Details on commercially available radiological instrumentation and detection systems
- Building decontamination and decommissioning, radiological and chemical equipment decontamination procedures, and tank/drum/remote characterization
- Standard operating procedures for collecting environmental media samples
- Guidance on sample preparation, documentation, and shipment
- Guidance on data verification/validation, radiological data management, and data quality assessment (DQA)
'G. Adams in Fleet Street London' is the signature on some of the finest scientific instruments of the eighteenth century. This book is the first comprehensive study of the instrument-making business run by the Adams family, from its foundation in 1734 to bankruptcy in 1817. It is based on detailed research in the archival sources as well as examination of extant instruments and publications by George Adams senior and his two sons, George junior and Dudley. Separate chapters are devoted to George senior's family background, his royal connections, and his new globes; George junior's numerous publications, and his dealings with van Marum; and to Dudley's dabbling with 'medico-electrical therapeutics'. The book is richly illustrated with plates from the Adams's own publications and with examples of instruments ranging from unique museum pieces - such as the 'Prince of Wales' microscope - and globes to the more common, even mundane, items of the kind seen in salesrooms and dealers - the surveying, navigational and military instruments that formed the backbone of the business. The appendices include facsimiles of trade catalogues and an annotated short-title listing of the Adams family's publications, which also covers American and Continental editions, as well as the posthumous ones by W. & S. Jones.
Toxicology has made tremendous strides in the sophistication of the models used to identify and understand the mechanisms of agents that can harm or kill humans and other higher organisms. Non-animal, or in vitro, models started to gain significant use in the 1960s. As a result of the increased concern over animal welfare, economic factors, and the need for greater sensitivity and understanding of mechanisms, interest in in vitro models has risen. This volume demonstrates that there now exists a broad range of in vitro models for use in either identifying or understanding most forms of toxicity. The availability of in vitro models spans both the full range of endpoints (irritation, sensitization, lethality, mutagenicity, and developmental toxicity) and the full spectrum of target organ systems (including the skin, eye, heart, liver, kidney and nervous system). Chapters are devoted to each of these speciality areas from a perspective of presenting the principal models and their uses and limitations.
Drawing a clear distinction, for physical reasons, between nano- and micro-mechanical testing, this monograph describes the basics and applications of the supermicroscopies AFM and SNOM, and of nanomechanical testing on rough and technical natural surfaces in the submicron range, down to a lateral resolution of a few nm. New or improved instrumentation, new physical laws, and unforeseen new applications in all branches of the natural sciences (around physics, chemistry, mineralogy, materials science, biology and medicine) and nanotechnology are covered, as are the sources of pitfalls and errors. It outlines the handling of natural and technical samples in relation to that of flat standard samples and emphasizes new special features. Pitfalls and sources of error are clearly demonstrated, as is their efficient remedy when going from molecularly flat to rough surfaces. The academic or industrial scientist learns how to apply these principles to scientific or manufacturing tasks involving roughness far from that of standard samples.
Experts from The Jackson Laboratory and around the world provide practical advice on everything from how to establish a colony to where to go for specific mutations. Systematic Approach to Evaluation of Mouse Mutations includes information on medical photography, grafting procedures, how to map the genes and evaluate the special biological characteristics of the mice.
Topics in Electron Diffraction and Microscopy of Materials celebrates the retirement of Professor Michael Whelan from the University of Oxford. Professor Whelan taught many of today's heads of department and was a pioneer in the development and use of electron microscopy. His collaborators and colleagues, each one of whom has made important advances in the use of microscopy to study materials, have contributed to this cohesive work.
Since the initial discovery of the G protein-coupled receptor system that regulates cyclic AMP production, the G protein field has rapidly expanded. Cell surface receptors that couple to heterotrimeric G proteins, the G protein-coupled receptors (GPCRs), number in the hundreds and bind to a wide diversity of ligands including biogenic amines (e.g., adrenaline), lipid derivatives (e.g., lysophosphatidic acid), peptides (e.g., opioid peptides), proteins (e.g., thyroid-stimulating hormone), and odorants, to name a few. The GPCR system is found throughout biology, in organisms as simple as yeast, in more complex organisms such as Dictyostelium discoideum (slime mold) and Caenorhabditis elegans (nematode worm), and of course in humans. GPCRs and their associated G protein systems are the subject of intense academic research and, because of their involvement in human biology and disease, the pharmaceutical industry has large research initiatives dedicated to the study of GPCRs. By some estimates, more than 50% of the pharmaceuticals on the market are targeted at GPCRs. The G protein/G protein-coupled receptor system consists of a receptor (GPCR), a heterotrimeric G protein consisting of α, β, and γ subunits, and an effector. G protein effector molecules, such as enzymes or ion channels, respond to activation by the G protein to generate second messengers or changes in membrane potential that lead to alterations in cell physiology.
The goal of this book is to make some underutilized but potentially very useful methods in experimental design and analysis available to ecologists, and to encourage better use of standard statistical techniques. Ecology has become more and more an experimental science in both basic and applied work, but experiments in the field and in the laboratory often present formidable statistical difficulties. Organized around providing solutions to ecological problems, this book offers ways to improve the statistical aspects of conducting manipulative ecological experiments, from setting them up to interpreting and reporting the results. An abundance of tools, including advanced approaches, are made available to ecologists in step-by-step examples, with computer code provided for common statistical packages. This is an essential how-to guide for the working ecologist and for graduate students preparing for research and teaching careers in the field of ecology.
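One underutilized resampling approach of the kind ecologists can apply to awkward field data is the two-sample permutation test; the sketch below (invented for illustration, not from the book, with hypothetical plant-growth data) shuffles the pooled observations to ask how often a difference in group means as large as the observed one arises by chance:

```python
import random

def permutation_test(a, b, n_perm=5000, seed=1):
    """Two-sample randomization test for a difference in means:
    repeatedly shuffle the pooled observations into two groups and
    record how often the shuffled difference is at least as extreme
    as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            extreme += 1
    return extreme / n_perm

# Hypothetical plant-growth measurements from two field treatments.
control = [4.2, 3.9, 4.5, 4.1, 3.8, 4.3]
treated = [5.1, 5.4, 4.9, 5.6, 5.2, 5.0]
p = permutation_test(control, treated)
print(p < 0.05)  # True -- such a separation is very unlikely by chance
```

Because the test conditions on the observed data rather than on a distributional assumption, it is attractive for the small, irregular samples common in manipulative field experiments.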
It is now more than ten years since Dr. Alec Jeffreys (now Professor Sir Alec Jeffreys, FRS) reported in Nature that the investigation of certain minisatellite regions in the human genome could produce what he termed DNA fingerprints and provide useful information in the fields of paternity testing and forensic analysis. Since that time we have witnessed a revolution in the field of forensic identification. A total change of technology, from serological or electrophoretic analysis of protein polymorphisms to direct investigation of the underlying DNA polymorphisms, has occurred in a short space of time. In addition, the evolution and development of the DNA systems themselves has been rapid and spectacular. In the last decade we have progressed from the multilocus DNA fingerprints, through single locus systems based on the same Southern blot RFLP technology, to a host of systems based on the PCR technique. These include Allele Specific Oligonucleotide (ASO)-primed systems detected by dot blots, the "binary" genotypes produced by mapping variations within VNTR repeats demonstrated by minisatellite variant repeat (MVR) analysis, and yet other fragment-length polymorphisms in the form of Short Tandem Repeat (STR) loci. Hand in hand with the increasing range of systems available has been the development of new instrumentation to facilitate their analysis and allow us to explore the possibilities of high volume testing in the form of mass screening and offender databases.
Covering all aspects of humidity measurement and instrumentation, this work includes rudiments and theory, common applications, advantages and limitations of frequently used sensors and techniques, and guidelines for installation, maintenance, and calibration. The accompanying disk is intended for easy conversion of humidity parameters and units.
Response Surfaces: Designs and Analyses, Second Edition presents techniques for designing experiments that yield adequate and reliable measurements of one or several responses of interest, for fitting and testing the suitability of empirical models used for acquiring information from the experiments, and for utilizing the experimental results to make decisions concerning the system under investigation.
Let this down-to-earth book be your guide to the statistical integrity of your work. Without relying on the detailed and complex mathematical explanations found in many other statistical texts, Principles of Experimental Design for the Life Sciences teaches how to design, conduct, and interpret top-notch life science studies. Learn about the planning of biomedical studies, the principles of statistical design, sample size estimation, common designs in biological experiments, sequential clinical trials, high dimensional designs and process optimization, and the correspondence between objectives, design, and analysis. Each of these important topics is presented in an understandable and non-technical manner, free of statistical jargon and formulas. Written by a biostatistical consultant with 25 years of experience, Principles of Experimental Design for the Life Sciences is filled with real-life examples from the author's work that you can quickly and easily apply to your own. These examples illustrate the main concepts of experimental design and cover a broad range of application areas in both clinical and nonclinical research. With this one innovative, helpful book you can improve your understanding of statistics, enhance your confidence in your results, and, at long last, shake off those statistical shackles!
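The sample size estimation mentioned above can be illustrated with the standard normal-approximation formula for comparing two group means (this particular formula and its defaults are an assumption of this note, not quoted from the book):

```python
import math

def n_per_group(delta, sigma, z_alpha=1.959964, z_beta=0.841621):
    """Sample size per group for a two-sample comparison of means.
    The default quantiles correspond to two-sided alpha = 0.05 and
    80% power: n = 2 * (z_alpha + z_beta)**2 * sigma**2 / delta**2,
    rounded up to a whole subject."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)

# Detecting a difference of half a standard deviation requires
# roughly 63 subjects per group at these settings.
print(n_per_group(delta=0.5, sigma=1.0))  # 63
```

Note how the requirement grows with the square of sigma/delta: halving the detectable difference quadruples the sample size.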
"When you first view Rose-Lynn Fisher's photographs, you might think you're looking down at the world from an airplane, at dunes, skyscrapers or shorelines. In fact, you're looking at her tears. . . . [There's] poetry in the idea that our emotional terrain bears visual resemblance to the physical world; that our tears can look like the vistas we see out an airplane window. Fisher's images are the only remaining trace of these places, which exist during a moment of intense feeling-and then vanish." -NPR "[A] delicate, intimate book. . . . In The Topography of Tears photographer Rose-Lynn Fisher shows us a place where language strains to express grief, longing, pride, frustration, joy, the confrontation with something beautiful, the confrontation with an onion." -Boston Globe Does a tear shed while chopping onions look different from a tear of happiness? In this powerful collection of images, an award-winning photographer trains her optical microscope and camera on her own tears and those of men, women, and children, released in moments of grief, pain, gratitude, and joy, and captured upon glass slides. These duotone photographs reveal the beauty of recurring patterns in nature and present evocative, crystalline imagery for contemplation. Underscored by poetic captions, they translate the mysterious act of crying into an atlas mapping the structure and magnificence of our interior lives. Rose-Lynn Fisher is an artist and author of the International Photography Award-winning studies Bee and The Topography of Tears. Her photographs are exhibited in galleries, festivals, and museums across the world and have been featured by the Dr. Oz Show, NPR, Smithsonian, Harper's, New Yorker, Time, Wired, Reader's Digest, Discover, Brain Pickings, and elsewhere. She received her BFA from Otis Art Institute and lives in Los Angeles.
This work elucidates the power of modern nuclear magnetic resonance (NMR) techniques to solve a wide range of practical problems that arise in both academic and industrial settings. This edition provides current information regarding the implementation and interpretation of NMR experiments, and contains material on: three- and four-dimensional NMR; the NMR analysis of peptides, proteins, carbohydrates and oligonucleotides; and more.
The Second Edition of this bestseller brings together basic plant pathology methods published in diverse and often abstract publications. The Second Edition is updated and expanded with numerous new figures, new culture media, and additional methods for working with a greater number of organisms. Methods are easy to use and eliminate the need to seek out original articles. This reference allows for easy identification of methods appropriate for specific problems and facilities. Scientific names of pathogens and some of their hosts are updated in this edition. The book also acts as a research source providing more than 1,800 literature citations.
Proceedings of the 51st Course of the International School of Subnuclear Physics on 'Reflections on the next step for LHC', Erice, 24 June - 3 July 2013.
Why do Japanese artists team up with engineers in order to create so-called "Device Art"? What is a nanoscientist's motivation in approaching the artworld? In the past few years, there has been a remarkable increase in attempts to foster the exchange between art, technology, and science - an exchange taking place in academies, museums, or even in research laboratories. Media art has proven especially important in the dialogue between these cultural fields. This book is a contribution to the current debate on "art & science", interdisciplinarity, and the discourse of innovation. It critically assesses artistic positions that appear as the ongoing attempt to localize art's position within technological and societal change - between now and the future.
This book focuses primarily on the atomic force microscope and serves as a reference for students, postdocs, and researchers using atomic force microscopes for the first time. In addition, this book can serve as the primary text for a semester-long introductory course in atomic force microscopy. There are a few algebra-based mathematical relationships included in the book that describe the mechanical properties, behaviors, and intermolecular forces associated with probes used in atomic force microscopy. Relevant figures, tables, and illustrations also appear in each chapter in an effort to provide additional information and points of interest. This book includes suggested laboratory investigations that provide opportunities to explore the versatility of the atomic force microscope. These laboratory exercises include opportunities for experimenters to explore force curves, surface roughness, friction loops, conductivity imaging, and phase imaging.
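As a small, hypothetical illustration of the algebra-based relationships involved in analyzing an AFM force curve (not an exercise from the book), the sketch below converts cantilever deflection to force via Hooke's law; the spring constant and deflection values are invented:

```python
def deflection_to_force(deflections_nm, k):
    """Convert cantilever deflection (nm) to force (nN) via Hooke's
    law, F = k * d, the basic step in turning a raw AFM deflection
    trace into a force curve. k is the cantilever spring constant
    in N/m (numerically equal to nN/nm)."""
    return [k * d for d in deflections_nm]

# Hypothetical approach-curve deflections with a 0.5 N/m cantilever.
forces = deflection_to_force([0.0, 1.5, 3.0, 6.0], k=0.5)
print(forces)  # [0.0, 0.75, 1.5, 3.0]
```
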