This companion to The New Statistical Analysis of Data by Anderson and Finn provides a hands-on guide to data analysis using SPSS. Included with this guide are instructions for obtaining the data sets to be analysed via the World Wide Web. First, the authors provide a brief review of using SPSS; then, following the organisation of The New Statistical Analysis of Data, readers participate in analysing many of the data sets discussed in the book. In so doing, students learn how to conduct reasonably sophisticated statistical analyses using SPSS while at the same time gaining insight into the nature and purpose of statistical investigation.
Since their rapid proliferation in the late 1960s and early 1970s, quadrupole mass spectrometers have had a profound impact across the physical sciences. Geometrically simple, yet behaviorally complex, these dynamic mass analyzers continue to facilitate remarkable breakthroughs in fields ranging from biochemical analysis to process control technology. Long regarded as the standard introduction to the field, Quadrupole Mass Spectrometry and Its Applications provides today's engineers and scientists with an authoritative, wide-ranging overview of the development and uses of quadrupoles. Beginning with the basic operating principles of quadrupole devices, the book moves from general explanations of the actions of radio-frequency fields to descriptions of their utilization in quadrupole mass filters, monopoles, three-dimensional quadrupole ion traps, and various time-of-flight spectrometers. A concluding series of chapters examines early applications of quadrupoles in atomic physics, gas chromatography, upper atmospheric research, medicine, and environmental studies. Superb writing from the field's foremost scientists along with the continued central role of quadrupoles in contemporary research make this volume as timely and relevant as ever.
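As a point of orientation for the "actions of radio-frequency fields" mentioned above (a standard textbook form, given here for context and not quoted from this volume), ion motion in an ideal quadrupole field driven by a DC voltage U and an RF amplitude V at angular frequency \Omega obeys the Mathieu equation:
\frac{d^2 u}{d\xi^2} + \left(a_u - 2 q_u \cos 2\xi\right) u = 0, \qquad u = x,\ y, \qquad \xi = \frac{\Omega t}{2},
a_x = -a_y = \frac{8 e U}{m r_0^2 \Omega^2}, \qquad q_x = -q_y = \frac{4 e V}{m r_0^2 \Omega^2}.
Here e is the ion charge, m its mass, and r_0 the field radius. Only ions whose (a, q) values fall inside a stability region of the Mathieu diagram (the apex of the first region lies near q of about 0.706) follow bounded trajectories and are transmitted; scanning U and V at a fixed U/V ratio therefore selects ions by mass-to-charge ratio, which is the operating principle of the quadrupole mass filter.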
...ism (i.e., Saccharomyces carlsbergensis, or brewer's yeast) and one of its corresponding enzymes. The experiments on this organism and enzyme are not limited to the materials suggested and can be easily adapted to the desired technical level and available budget. Similarly, the subsequent cloning experiments suggest the use of particular vectors and strains, but, as indicated, alternative materials can be used to successfully perform the laboratory exercises. We would like to thank the corporate sponsors of the Biotechnology Training Institute for providing the materials and expertise for the development of our programs, and thus for the materials in this manual. These sponsors include: * Barnstead/Thermolyne, Dubuque, IA * Beckman Instruments, Somerset, NJ * Bio-Rad Laboratories, Hercules, CA * Boehringer Mannheim Corporation, Indianapolis, IN * Corning Costar Corporation, Cambridge, MA * FMC BioProducts, Rockland, ME * Kodak Laboratory Products, New Haven, CT * Labconco, Kansas City, MO * MJ Research, Cambridge, MA * Olympus Instruments, Lake Success, NY * Pharmacia Biotech, Piscataway, NJ * Savant, Inc., Farmingdale, NY * VWR Scientific, Philadelphia, PA. We would also like to thank the following individuals for their input, comments, and suggestions: Tom Slyker, Bernie Janoson, Steven Piccoli, John Ford, Jeff Garelik, Yanan Tian, and Douglas Beecher. Special thanks to Alan Williams for his critique of the chromatography experiments and Shannon Gentile for her work in the laboratory. We would especially like to thank Maryann Burden for her comments and encouragement.
The intent of this work is to bring together in a single volume the techniques that are most widely used in the study of protein stability and protein folding. Over the last decade our understanding of how proteins fold and what makes the folded conformation stable has advanced rapidly. The development of recombinant DNA techniques has made possible the production of large quantities of virtually any protein, as well as the production of proteins with altered amino acid sequence. Improvements in instrumentation, and the development and refinement of new techniques for studying these recombinant proteins, have been central to the progress made in this field. To give the reader adequate background information about the subject, the first two chapters of this book review two different, yet related, aspects of protein stability. The first chapter presents a review of our current understanding of the forces involved in determining the conformational stability of proteins as well as their three-dimensional folds. The second chapter deals with the chemical stability of proteins and the pathways by which their covalent structure can degrade. The remainder of the book is devoted to techniques used in the study of these two major areas of protein stability, as well as several areas of active research. Some techniques, such as X-ray crystallography and mass spectrometry, although used in the study of protein stability, are beyond the scope of this book and will not be covered extensively.
This monograph presents the still young, but already large and very active, interdisciplinary realm of computer supported cooperative work (CSCW) in a systematic and well-balanced way. Besides technical progress, the cultural, social, legal, psychological and economic aspects of CSCW are also discussed. The book makes accessible a wealth of information and culminates in the development and detailed discussion of a "Collaboratory" suitable to fulfil the needs of scientific cooperation in Europe.
Peptide synthesis has emerged as one of the most powerful tools in biochemical, pharmacological, immunological, and biophysical laboratories. Recent improvements include general solid-phase methodology, new protecting groups, and automated equipment. These advances have allowed the facile synthesis of increasingly more complex peptides. Many of these new and improved methods for the synthesis of peptides and peptide-related substances have been reported in various publications, but never compiled in a convenient handbook. Like other volumes in this series, Peptide Synthesis Protocols concentrates on the practical aspects of these procedures, providing the researcher with detailed descriptions and helpful tips about potential problems. This volume is not intended to serve as a basic guide to standard Merrifield-type solid-phase strategy, but rather to provide the researcher with some of the most recent applications in the field of peptide science. A companion volume, Peptide Analysis Protocols, will detail methodology for the characterization of new synthetic peptides. Development of new methods and applications has continued actively even as this volume was in preparation. Owing to the number of contributors to this volume, it was necessary to establish a cutoff for publication purposes. We feel that all of the protocols presented are timely and up-to-date. Several promising new strategies, such as allyloxycarbonyl-based syntheses, were being developed at the time this volume was in the editing stages and will be included in future editions.
It is now twenty years since Cohen and Boyer's first steps into DNA cloning. In the time since then, there has been an ever increasing acceleration in the development and application of the cloning methodology. With the recent development of the polymerase chain reaction, a second generation of the technology has been born, enabling the isolation of DNA (and in particular, genes) with little more information than partial knowledge of the sequence. In fact, DNA sequencing is now so advanced that it can almost be carried out on an industrial scale. As a consequence of these advances, it now appears feasible to sequence whole genomes, including one the size of the human genome. What are we going to do with this information? The future of basic molecular biology must lie in the ability to analyze DNA (and especially the genes within it) starting at the DNA level. It is for these problems that Protocols for Gene Analysis attempts to offer solutions. So you have a piece of DNA, possibly a gene--what do you do next? The first section of this book contains a number of "basic" techniques that are required for further manipulation of the DNA. This section is not intended to be a comprehensive collection of methods, but merely to serve as an up-to-date set of techniques. I refer you to other volumes in the Methods in Molecular Biology series for further recombinant DNA techniques.
A treatment of the experimental techniques and instrumentation most often used in nuclear and particle physics experiments, as well as in many other fields, providing useful results and formulae, technical know-how and informative details. In this second edition the text has been revised, and the sections on Cherenkov radiation and radiation protection have been updated and extended.
Computational Fluid Dynamics research, especially for aeronautics, continues to be a rewarding and industrially relevant field of applied science in which to work. An enthusiastic international community of expert CFD workers continues to push forward the frontiers of knowledge in increasing numbers. Applications of CFD technology in many other sectors of industry are being successfully tackled. The aerospace industry has made significant investments and has enjoyed considerable benefits from the application of CFD to its products over the last two decades. This era began with the pioneering work of Murman and others that took us into the transonic (potential flow) regime for the first time in the early 1970s. We have also seen momentous developments of the digital computer in this period into vector and parallel supercomputing. Very significant advances in all aspects of the methodology have been made, to the point where we are on the threshold of calculating solutions of the Reynolds-averaged Navier-Stokes equations for complete aircraft configurations. However, significant problems and challenges remain in the areas of physical modelling, numerics and computing technology. The long term industrial requirements are captured in the U.S. Government's 'Grand Challenge' for 'Aerospace Vehicle Design' for the 1990s: 'Massively parallel computing systems and advanced parallel software technology and algorithms will enable the development and validation of multidisciplinary, coupled methods. These methods will allow the numerical simulation and design optimisation of complete aerospace vehicle systems throughout the flight envelope'.
The scientist's understanding of the cell at the molecular level has advanced rapidly over the last twenty years. This improved understanding has led to the development of many new laboratory methods that increasingly allow old problems to be tackled in new ways. Thus the modern scientist cannot specialize in just one field of knowledge, but must be aware of many disciplines. To aid the process of investigation, the Methods in Molecular Biology series has brought together many protocols and has highlighted the useful variations and the pitfalls of the different methods. However, protocols frequently cannot be simply taken from the shelf. Thus the starting sample for a chosen protocol may be unavailable in the correct state or form, or the products of the procedure may require a different sort of processing. Therefore the scientist needs more detailed information on the nature and requirements of the enzymes being used. This information, though usually available in the literature, is often widely dispersed and frequently occurs in older volumes of journals; not everyone has comprehensive library facilities available. Also, many scientists searching out such information are not trained enzymologists and may be unaware of some of the parameters that are important in a specific enzyme reaction.
Professor John D. Roberts published a highly readable book on Molecular Orbital Calculations directed toward chemists in 1962. That timely book is the model for this book. This book is directed toward senior undergraduate and beginning graduate students, as well as practicing bench chemists, who wish to develop conceptual tools for understanding chemical phenomena. Although ab initio and more advanced semi-empirical MO methods are regarded as being more reliable than HMO in an absolute sense, there is good evidence that HMO provides reliable relative answers, particularly when comparing related molecular species. Thus, HMO can be used to rationalize electronic structure in π-systems, aromaticity, and the shapes of simple molecular orbitals. Experimentalists still use HMO to gain insight into subtle electronic interactions for the interpretation of UV and photoelectron spectra. Herein, it will be shown that one can use graph theory to streamline HMO computational efforts and to arrive at answers quickly, without the aid of group theory or of a computer program that the experimentalist does not understand. The merging of mathematical graph theory with chemical theory is the formalization of what most chemists do in a more or less intuitive mode. Chemists currently use graphical images to embody chemical information in compact form, which can be transformed into algebraic sets. Chemical graph theory provides simple descriptive interpretations of complicated quantum mechanical calculations and is thereby, in and of itself, an important discipline of study.
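As a minimal illustration of the graph-theoretic shortcut described above (a generic Python sketch, not code from the book; benzene is chosen here purely as an example), the Hueckel MO energies of a conjugated hydrocarbon are E = alpha + x*beta, where the values x are simply the eigenvalues of the molecular graph's adjacency matrix:

import numpy as np

# Adjacency matrix of the benzene carbon skeleton (a six-membered ring):
# A[i, j] = 1 when atoms i and j are bonded, 0 otherwise.
n = 6
A = np.zeros((n, n))
for i in range(n):
    j = (i + 1) % n
    A[i, j] = A[j, i] = 1.0

# Hueckel energies: E_k = alpha + x_k * beta, with x_k the eigenvalues of A.
x = np.sort(np.linalg.eigvalsh(A))[::-1]
print(x)  # benzene gives [2, 1, 1, -1, -1, -2], i.e. alpha+2beta, alpha+beta (doubly degenerate), ...

The same few lines of linear algebra handle any π-system once its adjacency matrix is written down, which is the kind of streamlining the text describes.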
Geophysical measurements are not done for the sake of art only. The ultimate goal is to solve well-defined geological, tectonic or structural problems. For this purpose, the data have to be interpreted, translated into a physical model of the subsurface. ... This book describes some of the most important common features of different geophysical data sets. (from the Introduction) Users at universities, as well as practitioners in exploration, physics or environmental sciences, wherever signal processing is necessary, will benefit from this textbook.
Nucleic acid hybridization techniques allow the detection of specific DNA or RNA sequences. This book is a clear and concise guide to the techniques used for preparing DNA and RNA for membrane hybridization. These include Southern blotting of DNA, northern blotting of RNA, dot/slot blotting, Benton-Davis screening of recombinant bacteriophage and Grunstein-Hogness screening of recombinant plasmids. It also discusses the pros and cons of using nitrocellulose filters and nylon membranes in these procedures. The book demystifies the laboratory manuals by explaining the rationale for each step in the published protocols and points out potential pitfalls, with tips on how to avoid them.
Most cells will survive removal from the natural microenvironment of their in vivo tissue and placement into a sterile culture dish under optimal conditions. Not only do they survive, but they also multiply and express differentiated properties in such a culture dish. A few cells do this in suspension, but most will need some kind of mechanical support substituting for their natural connections with other cells. The surface of a culture dish, which might have to be coated, is usually sufficient. The recent trend toward standardization of conditions and the existence of commercial enterprises with adequate funds and specializing in the needs of scientists were responsible for the tremendous proliferation of cell culture techniques in all fields of research over the last 20 years. No longer does a scientist have to concentrate all his/her efforts on that technology; the new trends make it feasible to employ cell culture techniques as only one of the many methods available in a small corner of a larger research laboratory. Some areas of research depend more heavily than others on cell culture techniques. Neuroscience is one of the areas that has developed hand in hand with the proliferation of cell culture methodology. Molecular biological aspects, cell differentiation and development, neurophysiological and neurochemical studies, as well as investigations into the nature of various diseases are now to a large extent dependent on the use of cell cultures.
The Fundamentals of Scientific Research: An Introductory Laboratory Manual is a laboratory manual geared towards first semester undergraduates enrolled in general biology courses focusing on cell biology. This laboratory curriculum centers on studying a single organism throughout the entire semester: Serratia marcescens (S. marcescens), a bacterium unique in its production of the red pigment prodigiosin. The manual divides the laboratory course into two modules. The first module familiarizes students with the organism and lab equipment through growth curves, Lowry protein assays, quantification of prodigiosin and ATP production, and complementation studies to understand the biochemical pathway responsible for prodigiosin production. Students learn to use Microsoft Excel to prepare and present data in graphical format, and to reduce their data to meaningful numbers that can be compared across experiments. The second module requires that the students employ UV mutagenesis to generate hyper-pigmented mutants of S. marcescens for further characterization. Students use experimental data and protocols learned in the first module to help them develop their own hypotheses and experimental protocols, and to analyze their own data. Before each lab, students are required to answer questions designed to probe their understanding of required pre-laboratory reading materials. Questions also guide the students through the development of hypotheses and predictions. Following each laboratory, students answer a series of post-laboratory questions that guide them through the presentation and analysis of their data and through placing their data in the context of the primary literature. Students are also asked to review their initial hypotheses and predictions to determine whether their conclusions support them. A formal laboratory report is also to be completed after each module, in a format similar to that of the primary scientific literature. The Fundamentals of Scientific Research: An Introductory Laboratory Manual is an invaluable resource for undergraduates majoring in the life sciences.
AN INTRODUCTION TO VEGETATION ANALYSIS: Principles, practice and interpretation. D. R. Causton, Department of Botany and Microbiology, University College of Wales, Aberystwyth. London, UNWIN HYMAN, Boston, Sydney, Wellington. (c) D. R. Causton, 1988. This book is copyright under the Berne Convention. No reproduction without permission. All rights reserved. Published by the Academic Division of Unwin Hyman Ltd, 15/17 Broadwick Street, London W1V 1FP, UK; Allen & Unwin Inc., 8 Winchester Place, Winchester, Mass. 01890, USA; Allen & Unwin (Australia) Ltd, 8 Napier Street, North Sydney, NSW 2060, Australia; Allen & Unwin (New Zealand) Ltd in association with the Port Nicholson Press Ltd, 60 Cambridge Terrace, Wellington, New Zealand. First published in 1988. British Library Cataloguing in Publication Data: Causton, David R. An introduction to vegetation analysis: principles, practice and interpretation. 1. Botany-Ecology-Mathematics. I. Title. 581.5'247 QK901. ISBN-13: 978-0-04-581025-3; e-ISBN-13: 978-94-011-7981-2; DOI: 10.1007/978-94-011-7981-2. Library of Congress Cataloging-in-Publication Data: Causton, David R. An introduction to vegetation analysis. Bibliography: p. Includes index. 1. Botany-Ecology-Methodology. 2. Plant communities-Research-Methodology. 3. Vegetation surveys. 4. Vegetation classification. I. Title. QK901.C33 1987 581.5 87-19327. ISBN-13: 978-0-04-581025-3. Typeset in 10 on 12 point Times by Mathematical Composition Setters Ltd, Salisbury, and Biddles of Guildford. Preface: This book has been written to help students and their teachers, at various levels, to understand the principles, some of the methods, and ways of interpreting vegetational and environmental data acquired in the field.
Many chemists - especially those most brilliant in their field - fail to appreciate the power of planned experimentation. They dislike the mathematical aspects of statistical analysis. In addition, these otherwise very capable chemists tend to dismiss predictive models based only on empirical data. Ironically, in the hands of subject matter experts like these elite chemists, the statistical methods of mixture design and analysis provide the means for rapidly converging on optimal compositions. What differentiates Formulation Simplified from standard statistical texts on mixture design is that the authors make the topic relatively easy and fun to read. They provide a whole new collection of insightful original studies that illustrate the essentials of mixture design and analysis. Solid industrial examples are offered as problems at the end of many chapters for those who are serious about trying the new tools on their own. Statistical software to do the computations can be freely accessed via a web site developed in support of this book.
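To make the idea of a mixture design concrete (an illustrative Python sketch only, using a hypothetical three-component blend; it is not an example taken from the book), a Scheffe {q, m} simplex-lattice design places candidate formulations at every combination of proportions j/m that sums to one:

from itertools import product

# Scheffe {q, m} simplex-lattice: each of the q component proportions takes a
# value j/m (j = 0..m), and every row of the design must sum to 1.
def simplex_lattice(q: int, m: int):
    levels = [j / m for j in range(m + 1)]
    return [pt for pt in product(levels, repeat=q) if abs(sum(pt) - 1.0) < 1e-9]

# {3, 2} design: the three pure components plus the three 50/50 binary blends.
for blend in simplex_lattice(3, 2):
    print(blend)

Fitting a Scheffe polynomial to responses measured at blends like these is then an ordinary regression problem, which is one way the rapid convergence on optimal compositions described above is pursued in practice.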
This definitive new book should appeal to everyone who produces, uses, or evaluates scientific data. Dr. Taylor's book provides guidance for the development and implementation of a credible quality assurance program, and it gives chemists and clinical chemists, medical and chemical researchers, and all scientists and managers the means to ensure accurate and reliable work. Chapters are presented in a logical progression, starting with the concept of quality assurance and moving through the principles of good measurement and the principles of quality assurance to the evaluation of measurement quality. Each chapter has a degree of independence, so that it may be consulted separately from the others.
Complex mathematical and computational models are used in all areas of society and technology, and yet model-based science is increasingly contested or refuted, especially when models are applied to controversial themes in domains such as health, the environment or the economy. More stringent standards of proof are demanded from model-based numbers, especially when these numbers represent potential financial losses, threats to human health or the state of the environment. Quantitative sensitivity analysis is generally agreed to be one such standard. Mathematical models are good at mapping assumptions into inferences. A modeller makes assumptions about laws pertaining to the system, about its status and about a plethora of other, often arcane, system variables and internal model settings. To what extent can we rely on the model-based inference when most of these assumptions are fraught with uncertainties? Global Sensitivity Analysis offers an accessible treatment of such problems via quantitative sensitivity analysis, beginning with first principles and guiding the reader through the full range of recommended practices with a rich set of solved exercises. The text explains the motivation for sensitivity analysis, reviews the required statistical concepts, and provides a guide to potential applications. The book provides a self-contained treatment of the subject, allowing readers to learn and practice global sensitivity analysis without further materials; presents ways to frame the analysis, interpret its results, and avoid potential pitfalls; features numerous exercises and solved problems to help illustrate the applications; and is authored by leading sensitivity analysis practitioners combining a range of disciplinary backgrounds. Postgraduate students and practitioners in a wide range of subjects, including statistics, mathematics, engineering, physics, chemistry, environmental sciences, biology, toxicology, actuarial sciences, and econometrics, will find much of use here. This book will prove equally valuable to engineers working on risk analysis and to financial analysts concerned with pricing and hedging.
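To give a sense of what quantitative, variance-based sensitivity analysis means in practice (a minimal Python sketch assuming a toy linear model and a Saltelli-style two-matrix estimator; none of this is taken from the book), the first-order index S_i measures the fraction of output variance explained by input X_i alone:

import numpy as np

rng = np.random.default_rng(0)
N, k = 100_000, 2

def model(x):
    # Toy model Y = 2*X1 + X2 with independent uniform inputs;
    # the exact first-order indices are S1 = 0.8 and S2 = 0.2.
    return 2.0 * x[:, 0] + x[:, 1]

A = rng.uniform(size=(N, k))          # two independent sample matrices
B = rng.uniform(size=(N, k))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(k):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]              # A with column i replaced by B's column i
    yAB_i = model(AB_i)
    S_i = np.mean(yB * (yAB_i - yA)) / var_y   # first-order index estimator
    print(f"S_{i + 1} ~ {S_i:.3f}")

The estimates converge to 0.8 and 0.2 as N grows; for real models one would use dedicated tooling, but the two-matrix resampling structure above captures the essence of variance-based methods.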
This is the first book devoted to the use of X-ray beam techniques to study the magnetic properties of materials. It covers both experimental and theoretical issues. The three main topics are dichroism, elastic scattering (both non-resonant and resonant diffraction) and spectroscopy. In the past decade there has been an expansion of activity in the field, driven by the availability of intense, tuneable and highly polarized X-ray beams from synchrotron facilities. The pace of events is likely to continue with the start of new (3rd generation) facilities, including the European Synchrotron Radiation Facility, Grenoble, and the Advanced Light Source, Argonne National Laboratory, USA.
This book focuses on the use of novel electron microscopy techniques to further our understanding of the physics behind electron-light interactions. It introduces and discusses the methodologies for advancing the field of electron microscopy towards better control of electron dynamics with significantly improved temporal resolution, and explores the burgeoning field of nano-optics - the physics of light-matter interaction at the nanoscale - whose practical applications span numerous fields such as energy conversion, control of chemical reactions, optically induced phase transitions, quantum cryptography, and data processing. In addition to describing analytical and numerical techniques for exploring the theoretical basis of electron-light interactions, the book showcases a number of relevant case studies, such as optical modes in gold tapers probed by electron beams and investigations of optical excitations in the topological insulator Bi2Se3. The experiments featured provide an impetus to develop more relevant theoretical models, benchmark current approximations, and build further characterization tools based on coherent electron-light interactions.