Multivariate Calibration
Harald Martens, Chemist, Norwegian Food Research Institute, Aas, Norway, and Norwegian Computing Center, Oslo, Norway; Tormod Næs, Statistician, Norwegian Food Research Institute, Aas, Norway
The aim of this interdisciplinary book is to present an up-to-date view of multivariate calibration of analytical instruments, for use in research, development and routine laboratory and process operation. The book is intended to show practitioners in chemistry and technology how to extract the quantitative and understandable information embedded in non-selective, overwhelming and apparently useless measurements by multivariate data analysis. Multivariate calibration is the process of learning how to combine data from several channels in order to overcome selectivity problems, gain new insight and allow automatic outlier detection. Multivariate calibration is the basis for the present success of high-speed near-infrared (NIR) diffuse spectroscopy of intact samples. But the technique is very general: it has shown similar advantages in, for instance, UV, Vis and IR spectrophotometry (transmittance, reflectance and fluorescence), X-ray diffraction, NMR, MS, thermal analysis, chromatography (GC, HPLC), electrophoresis and image analysis (tomography, microscopy), as well as other techniques. The book is written at two levels: the main level is structured as a tutorial on the practical use of multivariate calibration techniques. It is intended for university courses and self-study for chemists and technologists, giving one complete and versatile approach, based mainly on data-compression methodology in self-modelling PLS regression, with considerations of experimental design, data pre-processing and model validation. A second, more methodological, level is intended for statisticians and specialists in chemometrics. It compares several alternative calibration methods, validation approaches and ways to optimize the models.
The book also outlines some cognitive changes needed in analytical chemistry, and suggests ways to overcome some communication problems between statistics and chemistry and technology.
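The data-compression regression mentioned in the description above can be sketched in a few lines. The following is a minimal single-response PLS (PLS1, NIPALS-style deflation) fit in Python; it is not code from the book, and the toy "spectra", noise level and all names are illustrative assumptions.

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Minimal PLS1 fit: extract components by NIPALS-style deflation."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector
        t = Xc @ w                        # scores for this component
        tt = t @ t
        p = Xc.T @ t / tt                 # X loadings
        q.append(t @ yc / tt)             # y loading
        Xc = Xc - np.outer(t, p)          # deflate X and y
        yc = yc - q[-1] * t
        W.append(w)
        P.append(p)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # overall regression coefficients
    return x_mean, y_mean, B

def pls1_predict(model, Xnew):
    x_mean, y_mean, B = model
    return y_mean + (np.asarray(Xnew, float) - x_mean) @ B

# Toy example: six "spectra" with four channels; the response is a
# linear combination of channels plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))
y = X @ np.array([1.0, 0.5, 0.0, -0.5]) + 0.01 * rng.normal(size=6)
model = pls1_fit(X, y, n_components=2)
print(pls1_predict(model, X))
```

With as many components as the rank of the centered data, PLS1 reproduces the least-squares fit; using fewer components is what gives the method its robustness against collinear, non-selective channels.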
Since the creation of the term "Scientific Computing" and of its German counterpart "Wissenschaftliches Rechnen" (whoever has to be blamed for that), scientists from outside the field have been confused about the somewhat strange distinction between scientific and non-scientific computations. And the insiders, i.e. those who are, at least, convinced of always computing in a very scientific way, are far from being happy with this summary of their daily work, even if further characterizations like "High Performance" or "Engineering" try to make things clearer - usually with very modest success, however. Moreover, to increase the unfortunate confusion of terms, who knows the differences between "Computational Science and Engineering", as indicated in the title of the series these proceedings were given the honour to be published in, and "Scientific and Engineering Computing", as chosen for the title of our book? Actually, though the protagonists of scientific computing persist in its independence as a scientific discipline (and rightly so, of course), the ideas behind the term diverge wildly. Consequently, the variety of answers one can get to the question "What is scientific computing?" is really impressive and ranges from the (serious) "nothing else but numerical analysis" up to the more mocking "consuming as much CPU-time as possible on the most powerful number crunchers accessible".
Since the pioneering work of U. S. von Euler, G. O. Burr, B. Samuelsson, and others in the field of eicosanoids, research in this area continues to grow rapidly. Novel eicosanoids are being discovered even as enzymes that catalyze the synthesis of well-established eicosanoids are being critically studied with respect to their regulation and function. The novice in this field will most likely encounter three areas of intense research activity: regulation of expression and function of enzymes, i.e., phospholipases, cyclooxygenases, and lipoxygenases involved in the syntheses of established eicosanoids; characterization and distribution in tissues of eicosanoid receptors; and discovery and biologic roles of novel eicosanoids. This book is a compilation of chapters addressing these three areas. Most chapters of Eicosanoid Protocols address the first area, giving particular emphasis to the cyclooxygenases and their two isoforms. This was done intentionally, because the discovery of the constitutive and inducible isoforms of this enzyme has introduced new concepts in the pathobiology of inflammation and in the use of nonsteroidal anti-inflammatory drugs. Although receptors of most established eicosanoids have been characterized and cloned, only one chapter (on the thromboxane A receptor) was devoted to this area.
This companion to The New Statistical Analysis of Data by Anderson and Finn provides a hands-on guide to data analysis using SPSS. Included with this guide are instructions for obtaining the data sets to be analysed via the World Wide Web. First, the authors provide a brief review of using SPSS; then, corresponding to the organisation of The New Statistical Analysis of Data, readers participate in analysing many of the data sets discussed in the book. In so doing, students learn how to conduct reasonably sophisticated statistical analyses using SPSS while gaining insight into the nature and purpose of statistical investigation.
Matrix isolation is a technique used for studying short-lived atoms and molecules at very low temperatures. This book offers detailed practical advice on how to carry out matrix-isolation experiments, and is a unique introduction to the subject. It is an essential practical text that covers a range of topics, from how to build a matrix-isolation laboratory from scratch, to detailed instructions for carrying out experiments.
Bestselling author Theodore Gray has spent more than a decade dreaming up, executing, photographing, and writing about extreme scientific experiments, which he then published between 2009 and 2014 in his monthly Popular Science column "Gray Matter." Previously published in book form by Black Dog in two separate volumes (Mad Science and Mad Science 2), these experiments, plus 5 more all-new ones, are now combined in one complete book. Packaged in a smaller, chunkier format, Completely Mad Science is 432 pages of dazzling chemical demonstrations, illustrated in spectacular full-color photographs. Some of the completely mad experiments in the book include: casting a model fish out of mercury (demonstrating how this element behaves very differently depending upon temperature); the famous Flaming Bacon Lance that can cut through steel (demonstrating the amount of energy contained in fatty foods like bacon); creating nylon thread out of pure liquid by combining molecules of hexamethylenediamine and sebacoyl chloride; making homemade ice cream using a fire extinguisher and a pillow case; powering your iPhone using 150 pennies and an apple; and many, many more. It's the ultimate collection for Gray's millions of fans.
organism (i.e., Saccharomyces carlsbergensis, or brewer's yeast) and one of its corresponding enzymes. The experiments on this organism and enzyme are not limited to the materials suggested and can be easily adapted to the desired technical level and available budget. Similarly, the subsequent cloning experiments suggest the use of particular vectors and strains, but, as indicated, alternative materials can be used to successfully perform the laboratory exercises. We would like to thank the corporate sponsors of the Biotechnology Training Institute for providing the materials and expertise for the development of our programs, and thus for the materials in this manual. These sponsors include: * Barnstead/Thermolyne, Dubuque, IA * Beckman Instruments, Somerset, NJ * Bio-Rad Laboratories, Hercules, CA * Boehringer Mannheim Corporation, Indianapolis, IN * Corning Costar Corporation, Cambridge, MA * FMC BioProducts, Rockland, ME * Kodak Laboratory Products, New Haven, CT * Labconco, Kansas City, MO * MJ Research, Cambridge, MA * Olympus Instruments, Lake Success, NY * Pharmacia Biotech, Piscataway, NJ * Savant, Inc., Farmingdale, NY * VWR Scientific, Philadelphia, PA We would also like to thank the following individuals for their input, comments, and suggestions: Tom Slyker, Bernie Janoson, Steven Piccoli, John Ford, Jeff Garelik, Yanan Tian, and Douglas Beecher. Special thanks to Alan Williams for his critique of the chromatography experiments and Shannon Gentile for her work in the laboratory. We would especially like to thank Maryann Burden for her comments and encouragement.
This monograph presents the still young, but already large and very
active interdisciplinary realm of computer supported cooperative
work (CSCW) in a systematic and well-balanced way. Besides
technical progress, the cultural, social, legal, psychological
and economic aspects of CSCW are also discussed. The book makes
accessible a wealth of information and culminates in the
development and detailed discussion of a "Collaboratory" suitable
to fulfil the needs of scientific cooperation in Europe.
The intent of this work is to bring together in a single volume the techniques that are most widely used in the study of protein stability and protein folding. Over the last decade our understanding of how proteins fold and what makes the folded conformation stable has advanced rapidly. The development of recombinant DNA techniques has made possible the production of large quantities of virtually any protein, as well as the production of proteins with altered amino acid sequence. Improvements in instrumentation, and the development and refinement of new techniques for studying these recombinant proteins, have been central to the progress made in this field. To give the reader adequate background information about the subject, the first two chapters of this book review two different, yet related, aspects of protein stability. The first chapter presents a review of our current understanding of the forces involved in determining the conformational stability of proteins as well as their three-dimensional folds. The second chapter deals with the chemical stability of proteins and the pathways by which their covalent structure can degrade. The remainder of the book is devoted to techniques used in the study of these two major areas of protein stability, as well as several areas of active research. Although some techniques, such as X-ray crystallography and mass spectrometry, are used in the study of protein stability, they are beyond the scope of this book and will not be covered extensively.
Computational Fluid Dynamics research, especially for aeronautics, continues to be a rewarding and industrially relevant field of applied science in which to work. An enthusiastic international community of expert CFD workers continues to push forward the frontiers of knowledge in increasing numbers. Applications of CFD technology in many other sectors of industry are being successfully tackled. The aerospace industry has made significant investments and has enjoyed considerable benefits from the application of CFD to its products over the last two decades. This era began with the pioneering work of Murman and others that took us into the transonic (potential flow) regime for the first time in the early 1970s. We have also seen momentous developments of the digital computer in this period into vector and parallel supercomputing. Very significant advances in all aspects of the methodology have been made, to the point where we are on the threshold of calculating solutions of the Reynolds-averaged Navier-Stokes equations for complete aircraft configurations. However, significant problems and challenges remain in the areas of physical modelling, numerics and computing technology. The long-term industrial requirements are captured in the U.S. Government's 'Grand Challenge' for 'Aerospace Vehicle Design' for the 1990s: 'Massively parallel computing systems and advanced parallel software technology and algorithms will enable the development and validation of multidisciplinary, coupled methods. These methods will allow the numerical simulation and design optimisation of complete aerospace vehicle systems throughout the flight envelope'.
Peptide synthesis has emerged as one of the most powerful tools in biochemical, pharmacological, immunological, and biophysical laboratories. Recent improvements include general solid-phase methodology, new protecting groups, and automated equipment. These advances have allowed the facile synthesis of increasingly more complex peptides. Many of these new and improved methods for the synthesis of peptides and peptide-related substances have been reported in various publications, but never compiled in a convenient handbook. Like other volumes in this series, Peptide Synthesis Protocols concentrates on the practical aspects of these procedures, providing the researcher with detailed descriptions and helpful tips about potential problems. This volume is not intended to serve as a basic guide to standard Merrifield-type solid-phase strategy, but rather to provide the researcher with some of the most recent applications in the field of peptide science. A companion volume, Peptide Analysis Protocols, will detail methodology for the characterization of new synthetic peptides. Development of new methods and applications has continued actively even as this volume was in preparation. Owing to the number of contributors to this volume, it was necessary to establish a cutoff for publication purposes. We feel that all of the protocols presented are timely and up-to-date. Several promising new strategies, such as allyloxycarbonyl-based syntheses, were being developed at the time this volume was in the editing stages and will be included in future editions.
The scientist's understanding of the cell at the molecular level has advanced rapidly over the last twenty years. This improved understanding has led to the development of many new laboratory methods that increasingly allow old problems to be tackled in new ways. Thus the modern scientist cannot specialize in just one field of knowledge, but must be aware of many disciplines. To aid the process of investigation, the Methods in Molecular Biology series has brought together many protocols and has highlighted the useful variations and the pitfalls of the different methods. However, protocols frequently cannot be simply taken from the shelf. Thus the starting sample for a chosen protocol may be unavailable in the correct state or form, or the products of the procedure may require a different sort of processing. Therefore the scientist needs more detailed information on the nature and requirements of the enzymes being used. This information, though usually available in the literature, is often widely dispersed and frequently occurs in older volumes of journals; not everyone has comprehensive library facilities available. Also, many scientists searching out such information are not trained enzymologists and may be unaware of some of the parameters that are important in a specific enzyme reaction.
A treatment of the experimental techniques and instrumentation most often used in nuclear and particle physics experiments, as well as in various other experiments, providing useful results and formulae, technical know-how and informative details. This second edition has been revised, and the sections on Cherenkov radiation and radiation protection have been updated and extended.
Professor John D. Roberts published a highly readable book on Molecular Orbital Calculations directed toward chemists in 1962. That timely book is the model for this book. This book is directed toward senior undergraduate and beginning graduate students, as well as practicing bench chemists who have a desire to develop conceptual tools for understanding chemical phenomena. Although ab initio and more advanced semi-empirical MO methods are regarded as being more reliable than HMO in an absolute sense, there is good evidence that HMO provides reliable relative answers, particularly when comparing related molecular species. Thus, HMO can be used to rationalize electronic structure in π-systems, aromaticity, and the shape of simple molecular orbitals. Experimentalists still use HMO to gain insight into subtle electronic interactions for interpretation of UV and photoelectron spectra. Herein, it will be shown that one can use graph theory to streamline HMO computational efforts and to arrive at answers quickly, without the aid of group theory or of a computer program of which the experimentalist has no understanding. The merging of mathematical graph theory with chemical theory is the formalization of what most chemists do in a more or less intuitive mode. Chemists currently use graphical images to embody chemical information in compact form, which can be transformed into algebraic sets. Chemical graph theory provides simple descriptive interpretations of complicated quantum mechanical calculations and is thereby, in and of itself, an important discipline of study.
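The graph-theory connection described above can be made concrete: in the Hückel approximation, the Hamiltonian of a conjugated π-system is H = αI + βA, where A is the adjacency matrix of the molecular graph, so the orbital energies α + xβ come straight from the eigenvalues x of A. A minimal sketch (not code from the book; the molecule encodings below are illustrative):

```python
import numpy as np

def huckel_levels(adjacency):
    """Hückel eigenvalues x (orbital energies alpha + x*beta), sorted
    descending; since beta < 0, larger x means lower orbital energy."""
    x = np.linalg.eigvalsh(np.asarray(adjacency, float))
    return np.sort(x)[::-1]

# 1,3-butadiene: four sp2 carbons in a chain (the path graph P4)
butadiene = [[0, 1, 0, 0],
             [1, 0, 1, 0],
             [0, 1, 0, 1],
             [0, 0, 1, 0]]
print(huckel_levels(butadiene))   # approx [1.618, 0.618, -0.618, -1.618]

# benzene: six carbons in a ring (the cycle graph C6)
benzene = [[0, 1, 0, 0, 0, 1],
           [1, 0, 1, 0, 0, 0],
           [0, 1, 0, 1, 0, 0],
           [0, 0, 1, 0, 1, 0],
           [0, 0, 0, 1, 0, 1],
           [1, 0, 0, 0, 1, 0]]
print(huckel_levels(benzene))     # approx [2, 1, 1, -1, -1, -2]
```

The degenerate pair at x = 1 and the fully bonding level at x = 2 reproduce the familiar benzene Hückel diagram, which is the kind of result the graph-theoretical approach delivers without any group theory.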
It is now twenty years since Cohen and Boyer's first steps into DNA cloning. In the time since then, there has been an ever increasing acceleration in the development and application of the cloning methodology. With the recent development of the polymerase chain reaction, a second generation of the technology has been born, enabling the isolation of DNA (and in particular, genes) with little more information than partial knowledge of the sequence. In fact, DNA sequencing is now so advanced that it can almost be carried out on the industrial scale. As a consequence of these advances, it now appears feasible to sequence whole genomes, including one the size of the human genome. What are we going to do with this information? The future of basic molecular biology must lie in the ability to analyze DNA (and especially the genes within it) starting at the DNA level. It is for these problems that Protocols for Gene Analysis attempts to offer solutions. So you have a piece of DNA, possibly a gene--what do you do next? The first section of this book contains a number of "basic" techniques that are required for further manipulation of the DNA. This section is not intended to be a comprehensive collection of methods, but merely to serve as an up-to-date set of techniques. I refer you to other volumes in the Methods in Molecular Biology series for further recombinant DNA techniques.
Geophysical measurements are not done for the sake of art only. The ultimate goal is to solve some well-defined geological, tectonic or structural problem. For this purpose, the data have to be interpreted, translated into a physical model of the subsurface. ... This book describes some of the most important common features of different geophysical data sets. (from the Introduction) Users at universities, as well as practitioners in exploration, physics or environmental sciences, wherever signal processing is necessary, will benefit from this textbook.
Handbook of Radioactivity Analysis: Radiation Physics and Detectors, Volume One, and Radioanalytical Applications, Volume Two, Fourth Edition, constitute an authoritative reference on the principles, practical techniques and procedures for the accurate measurement of radioactivity - everything from the very low levels encountered in the environment, to higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, and fuel cycle facilities, and in the implementation of nuclear forensic analysis and nuclear safeguards. It includes sample preparation techniques for all types of matrices found in the environment, including soil, water, air, plant matter and animal tissue, and surface swipes. Users will find the latest advances in the applications of radioactivity analysis across various fields, including environmental monitoring, radiochemical standardization, high-resolution beta imaging, automated radiochemical separation, nuclear forensics, and more.
Nucleic acid hybridization techniques allow the detection of
specific DNA or RNA sequences. This book is a clear and concise
guide to the techniques used for preparing DNA and RNA for membrane
hybridization. These include Southern blotting of DNA, northern
blotting of RNA, dot/slot blotting, Benton and Davis screening of
recombinant bacteriophage and Grunstein-Hogness screening of
recombinant plasmids. It also discusses the pros and cons of using
nitrocellulose filters and nylon membranes in these procedures. The
book demystifies the laboratory manuals by explaining the rationale
for each step in the published protocols and points out potential
pitfalls with tips on how to avoid them.
Most cells will survive removal from the natural microenvironment of their in vivo tissue and placement into a sterile culture dish under optimal conditions. Not only do they survive, but they also multiply and express differentiated properties in such a culture dish. A few cells do this in suspension, but most will need some kind of mechanical support substituting for their natural connections with other cells. The surface of a culture dish, which might have to be coated, is usually sufficient. The recent trend to standardization of conditions and the existence of commercial enterprises with adequate funds and specializing in the needs of scientists were responsible for the tremendous proliferation of cell culture techniques in all fields of research in the last 20 years. No longer does a scientist have to concentrate all his/her efforts on that technology; the new trends make it feasible to employ cell culture techniques as only one of the many methods available in a small corner of a larger research laboratory. Some areas of research depend more heavily than others on cell culture techniques. Neuroscience is one of the areas that has developed hand in hand with the proliferation of cell culture methodology. Molecular biological aspects, cell differentiation and development, neurophysiological and neurochemical studies, as well as investigations into the nature of various diseases are now to a large extent dependent on the use of cell cultures.
This book presents a wide range of tested and proven protocols relevant to a number of fields within biotechnology, used in laboratory experiments in everyday phycological (seaweed) research. A major focus is on bioenergy-related aspects of this emerging technology. The protocols are written in a clear and concise manner using simple language, permitting even nonspecialists to adequately understand the significance of this research. The book also contains all necessary notes and guidelines for successful execution of these experiments.
This history of the thermometer includes the controversy about its invention, the story of the different scales, Fahrenheit and centigrade, and the history of the gradual scientific, and then popular, understanding of the concept of temperature. Not until 1800 did people interested in thermometers begin to see clearly what they were measuring, and the impetus for improving thermometry came largely from the study of the weather--the liquid-in-glass thermometer became the meteorologist's instrument before that of the chemist or physicist. This excellent introductory study follows the development of indicating and recording thermometers until recent times, emphasizing meteorological applications.
Purification of Laboratory Chemicals, Eighth Edition, tabulates methods taken from literature for purifying thousands of individual commercially available chemicals. To help in applying this information, the more common processes currently used for purification in chemical laboratories and new methods are discussed. For dealing with substances not separately listed, a chapter is included setting out the usual methods for purifying specific classes of compounds.
An Introduction to Vegetation Analysis: Principles, Practice and Interpretation
D. R. Causton, Department of Botany and Microbiology, University College of Wales, Aberystwyth
Published by the Academic Division of Unwin Hyman Ltd, 15/17 Broadwick Street, London W1V 1FP, UK; Allen & Unwin Inc., 8 Winchester Place, Winchester, Mass. 01890, USA; Allen & Unwin (Australia) Ltd, 8 Napier Street, North Sydney, NSW 2060, Australia; Allen & Unwin (New Zealand) Ltd in association with the Port Nicholson Press Ltd, 60 Cambridge Terrace, Wellington, New Zealand. First published in 1988. (c) D. R. Causton, 1988. This book is copyright under the Berne Convention. No reproduction without permission. All rights reserved.
British Library Cataloguing in Publication Data: Causton, David R. An introduction to vegetation analysis: principles, practice and interpretation. 1. Botany-Ecology-Mathematics. I. Title. 581.5'247 QK901. ISBN-13: 978-0-04-581025-3. e-ISBN-13: 978-94-011-7981-2. DOI: 10.1007/978-94-011-7981-2.
Library of Congress Cataloging-in-Publication Data: Causton, David R. An introduction to vegetation analysis. Bibliography: p. Includes index. 1. Botany-Ecology-Methodology. 2. Plant communities-Research-Methodology. 3. Vegetation surveys. 4. Vegetation classification. I. Title. QK901.C33 1987 581.5 87-19327. ISBN-13: 978-0-04-581025-3.
Typeset in 10 on 12 point Times by Mathematical Composition Setters Ltd, Salisbury, and Biddles of Guildford.
Preface: This book has been written to help students and their teachers, at various levels, to understand the principles, some of the methods, and ways of interpreting vegetational and environmental data acquired in the field.
Basic principles of applied life sciences, such as recombinant DNA technology, are used in most life sciences industries marketing bio-formulations to design more effective protein-based drugs, such as erythropoietin and fast-acting insulin. In recent times, genetically engineered mammalian, animal and plant host cells are also being used in life sciences industries to manufacture biologics. This book discusses the most basic as well as advanced issues concerning biological products for successfully managing a life sciences industry. It elucidates the life cycle of biological molecules, from the conceptual development of different types of biopolymers to their subsequent transfer from the conical flasks of the laboratory to life sciences industries for large-scale production and marketing. It focuses on sustainable longevity in the life cycle of commercial biopolymers. The cumulative facts and figures in this volume will help inspire life sciences industry promoters to monitor the value-chain transfer process of biologics for better profitability. Additionally, it will serve as a perusal document for students and researchers interested in entrepreneurial ventures or having their own start-up projects for the commercialization of biologics.