The investigation and manipulation of matter on the atomic scale have been revolutionized by scanning tunneling microscopy and related scanning probe techniques. This book is the first to provide a clear and comprehensive introduction to this subject. Beginning with the theoretical background of scanning tunneling microscopy, the design and instrumentation of practical STM and associated systems are described in detail, including topographic imaging, local tunneling barrier height measurements, tunneling spectroscopy, and local potentiometry. A treatment of the experimental techniques used in scanning force microscopy and other scanning probe techniques rounds out this section. The second part discusses representative applications of these techniques in fields such as condensed matter physics, chemistry, materials science, biology, and nanotechnology, so this book will be extremely valuable to upper-division students and researchers in these areas.
Ecological Methods, by the late T. R. E. Southwood and revised over the years by P. A. Henderson, has developed into a classic reference work for the field biologist. It provides a handbook of ecological methods and analytical techniques pertinent to the study of animals, with an emphasis on non-microscopic animals in both terrestrial and aquatic environments. It remains unique in the breadth of the methods presented and in the depth of the literature cited, stretching right back to the earliest days of ecological research. The universal availability of R as an open source package has radically changed the way ecologists analyse their data. In response, Southwood's classic text has been thoroughly revised to be more relevant and useful to a new generation of ecologists, making the vast resource of R packages more readily available to the wider ecological community. By focusing on the use of R for data analysis, supported by worked examples, the book is now more accessible than previous editions to students requiring support and ideas for their projects. Southwood's Ecological Methods provides a crucial resource for both graduate students and research scientists in applied ecology, wildlife ecology, fisheries, agriculture, conservation biology, and habitat ecology. It will also be useful to the many professional ecologists, wildlife biologists, conservation biologists and practitioners requiring an authoritative overview of ecological methodology.
Drawing on state-of-the-art cellular and molecular techniques as well as new and sophisticated imaging and information technologies, this comprehensive, three-volume collection of cutting-edge protocols provides readily reproducible methods for studying and analyzing the events of embryonic development. Volume 1 (ISBN: 0-89603-574-3) contains techniques for establishing and characterizing several widely used experimental model systems, for the study of developmental patterns and morphogenesis, and for the examination of embryo structure and function. There are also step-by-step methods for the analysis of cell lineage, the production and use of chimeras, and the experimental and molecular manipulation of embryos, including the application of viral vectors. Volume 2 (ISBN: 0-89603-575-1) describes state-of-the-art methods for the study of organogenesis, the analysis of abnormal development and teratology, the screening and mapping of novel genes and mutations, and the application of transgenesis, including the production of transgenic animals and gene knockouts. No less innovative, Volume 3 (ISBN: 0-89603-576-X) introduces powerful techniques for the manipulation of developmental gene expression and function, the analysis of gene expression, the characterization of tissue morphogenesis and development, the in vitro study of differentiation and development, and the genetic analysis of developmental models of disease. Highly practical and richly annotated, the three volumes of Developmental Biology Protocols describe multiple experimental systems and detail techniques adopted from the broadest array of biomedical disciplines.
This book is a very simple introduction for those who would like to learn about the particle accelerators or 'atom-smashers' used in hospitals, industry and large research institutes where physicists probe deep into the nature of matter itself. The reader with a basic knowledge of mathematics and physics will discover a wide spectrum of technologies.
Flow cytometry is now well established in research laboratories and is gaining increasing use in clinical medicine and pathology. The technique enables multiple simultaneous light scatter and fluorescence measurements to be made at the individual cell level at very rapid rates and results in very large quantities of data being collected. Data, however, is just a series of numbers which have to be converted to information which, in turn, must be shown to have meaning. This is the most important single aspect of flow cytometry but it has received relatively little attention. One of the frequently voiced advantages of the technology is that it produces 'good statistics' because large numbers of cells have been analysed. However, it is not very often that confidence limits are placed on results, hence the reader has little or no feel for the inherent variability in the information produced. This book covers very basic number handling techniques, regression analysis, probability functions, statistical tests and methods of analysing dynamic processes. All those who use flow cytometry in their research will find this book an invaluable guide to interpreting the data produced by flow cytometers.
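The blurb's point about confidence limits can be made concrete. As an illustration not taken from the book, the sketch below places a Wilson score interval around the fraction of cells falling in a fluorescence-positive gate; `wilson_interval` and the cell counts are hypothetical, chosen only to show how the same estimated percentage carries very different uncertainty at different cell numbers:

```python
import math

def wilson_interval(positives, total, z=1.96):
    """Wilson score confidence interval (default 95%) for a proportion."""
    p = positives / total
    denom = 1 + z**2 / total
    centre = (p + z**2 / (2 * total)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2))
    return centre - half, centre + half

# 10,000 cells analysed, 1,200 fall in the fluorescence-positive gate
lo, hi = wilson_interval(1200, 10_000)
print(f"positive fraction 12.0%, 95% CI: {lo:.1%} to {hi:.1%}")

# The same 12% estimate from only 100 cells is far less certain
lo_small, hi_small = wilson_interval(12, 100)
print(f"positive fraction 12.0% (n=100), 95% CI: {lo_small:.1%} to {hi_small:.1%}")
```

The large-N interval is only a fraction of a percentage point wide, which is the sense in which flow cytometry yields "good statistics"; quoting the interval, not just the percentage, is what gives the reader a feel for the variability.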
Since antibodies tagged with markers were developed, immunocytochemistry has become the method of choice for identifying tissue substances, and in situ hybridisation for localising nucleic acids in tissue. Resin-embedded tissue is routinely used and new techniques are constantly introduced, so the novice entering these fields faces a breathtaking variety of methods. This lab manual covers the embedding of tissue, from epoxy resin methods to the more sensitive procedures employing the acrylics. The possibilities and results are discussed so that an understanding of the techniques can be acquired and appropriate choices made. The various resins available and all steps involved in tissue processing, beginning with fixation, as well as the great variety of labelling methods and markers commonly used for "on-section" cytochemistry and immunocytochemistry, are described, including detailed protocols for their application.
Since the creation of the term "Scientific Computing" and of its German counterpart "Wissenschaftliches Rechnen" (whoever is to be blamed for that), scientists from outside the field have been confused about the somewhat strange distinction between scientific and non-scientific computations. And the insiders, i.e. those who are, at least, convinced of always computing in a very scientific way, are far from happy with this summary of their daily work, even if further characterizations like "High Performance" or "Engineering" try to make things clearer, usually with very modest success, however. Moreover, to increase the unfortunate confusion of terms, who knows the difference between "Computational Science and Engineering", as indicated in the title of the series these proceedings have the honour to be published in, and "Scientific and Engineering Computing", as chosen for the title of our book? Actually, though the protagonists of scientific computing persist in its independence as a scientific discipline (and rightly so, of course), the ideas behind the term diverge wildly. Consequently, the variety of answers one can get to the question "What is scientific computing?" is really impressive, ranging from the (serious) "nothing else but numerical analysis" up to the more mocking "consuming as much CPU time as possible on the most powerful number crunchers accessible".
Since the pioneering work of U. S. von Euler, G. O. Burr, B. Samuelsson, and others in the field of eicosanoids, research in this area has continued to grow rapidly. Novel eicosanoids are being discovered even as the enzymes that catalyze the synthesis of well-established eicosanoids are being critically studied with respect to their regulation and function. The novice in this field will most likely encounter three areas of intense research activity: regulation of the expression and function of the enzymes involved in the syntheses of established eicosanoids, i.e., phospholipases, cyclooxygenases, and lipoxygenases; characterization and tissue distribution of eicosanoid receptors; and discovery and biologic roles of novel eicosanoids. This book is a compilation of chapters addressing these three areas. Most chapters of Eicosanoid Protocols address the first area, giving particular emphasis to the cyclooxygenases and their two isoforms. This was done intentionally, because the discovery of the constitutive and inducible isoforms of this enzyme has introduced new concepts in the pathobiology of inflammation and in the use of nonsteroidal anti-inflammatory drugs. Although receptors of most established eicosanoids have been characterized and cloned, only one chapter (on the thromboxane A receptor) is devoted to this area.
This companion to The New Statistical Analysis of Data by Anderson and Finn provides a hands-on guide to data analysis using SPSS. Included with this guide are instructions for obtaining, via the World Wide Web, the data sets to be analysed. The authors first provide a brief review of using SPSS; then, following the organisation of The New Statistical Analysis of Data, readers participate in analysing many of the data sets discussed in the book. In so doing, students learn how to conduct reasonably sophisticated statistical analyses using SPSS while gaining insight into the nature and purpose of statistical investigation.
Matrix isolation is a technique used for studying short-lived atoms and molecules at very low temperatures. This book offers detailed practical advice on how to carry out matrix-isolation experiments, and is a unique introduction to the subject. It is an essential practical text that covers a range of topics, from how to build a matrix-isolation laboratory from scratch, to detailed instructions for carrying out experiments.
ism (i.e., Saccharomyces carlsbergensis, or brewer's yeast) and one of its corresponding enzymes. The experiments on this organism and enzyme are not limited to the materials suggested and can easily be adapted to the desired technical level and available budget. Similarly, the subsequent cloning experiments suggest the use of particular vectors and strains, but, as indicated, alternative materials can be used to successfully perform the laboratory exercises. We would like to thank the corporate sponsors of the Biotechnology Training Institute for providing the materials and expertise for the development of our programs, and thus for the materials in this manual. These sponsors include: Barnstead/Thermolyne, Dubuque, IA; Beckman Instruments, Somerset, NJ; Bio-Rad Laboratories, Hercules, CA; Boehringer Mannheim Corporation, Indianapolis, IN; Corning Costar Corporation, Cambridge, MA; FMC BioProducts, Rockland, ME; Kodak Laboratory Products, New Haven, CT; Labconco, Kansas City, MO; MJ Research, Cambridge, MA; Olympus Instruments, Lake Success, NY; Pharmacia Biotech, Piscataway, NJ; Savant, Inc., Farmingdale, NY; and VWR Scientific, Philadelphia, PA. We would also like to thank the following individuals for their input, comments, and suggestions: Tom Slyker, Bernie Janoson, Steven Piccoli, John Ford, Jeff Garelik, Yanan Tian, and Douglas Beecher. Special thanks to Alan Williams for his critique of the chromatography experiments and Shannon Gentile for her work in the laboratory. We would especially like to thank Maryann Burden for her comments and encouragement.
The intent of this work is to bring together in a single volume the techniques most widely used in the study of protein stability and protein folding. Over the last decade our understanding of how proteins fold and what makes the folded conformation stable has advanced rapidly. The development of recombinant DNA techniques has made possible the production of large quantities of virtually any protein, as well as the production of proteins with altered amino acid sequences. Improvements in instrumentation, and the development and refinement of new techniques for studying these recombinant proteins, have been central to the progress made in this field. To give the reader adequate background information about the subject, the first two chapters of this book review two different, yet related, aspects of protein stability. The first chapter presents a review of our current understanding of the forces involved in determining the conformational stability of proteins as well as their three-dimensional folds. The second chapter deals with the chemical stability of proteins and the pathways by which their covalent structure can degrade. The remainder of the book is devoted to techniques used in the study of these two major areas of protein stability, as well as several areas of active research. Some techniques, such as X-ray crystallography and mass spectrometry, though used in the study of protein stability, are beyond the scope of this book and will not be covered extensively.
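To give one concrete flavour of conformational-stability measurements: a standard two-state analysis (folded ⇌ unfolded) converts an observed unfolded fraction into a free energy of unfolding via ΔG = −RT ln K. The sketch below is only an illustration of that textbook relation, under the stated two-state assumption, and is not a procedure taken from the book:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def delta_g_unfolding(fraction_unfolded, temp_kelvin):
    """Free energy of unfolding from the observed unfolded fraction,
    assuming a simple two-state folded <-> unfolded equilibrium."""
    f_u = fraction_unfolded
    K = f_u / (1 - f_u)                   # equilibrium constant for unfolding
    return -R * temp_kelvin * math.log(K)  # delta G in J/mol; positive favours folded

# At 298 K, an observed 10% unfolded population implies the folded
# state is favoured by roughly 5.4 kJ/mol.
dg = delta_g_unfolding(0.10, 298.0)
print(f"dG_unfold = {dg / 1000:.1f} kJ/mol")
```

At the denaturation midpoint (50% unfolded) K = 1 and ΔG = 0, which is why stability curves are often anchored at the transition midpoint.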
Computational Fluid Dynamics research, especially for aeronautics, continues to be a rewarding and industrially relevant field of applied science in which to work. An enthusiastic international community of expert CFD workers continues to push forward the frontiers of knowledge in increasing numbers. Applications of CFD technology in many other sectors of industry are being successfully tackled. The aerospace industry has made significant investments in, and enjoyed considerable benefits from, the application of CFD to its products over the last two decades. This era began with the pioneering work of Murman and others that took us into the transonic (potential flow) regime for the first time in the early 1970s. We have also seen momentous developments of the digital computer in this period into vector and parallel supercomputing. Very significant advances in all aspects of the methodology have been made, to the point where we are on the threshold of calculating solutions of the Reynolds-averaged Navier-Stokes equations for complete aircraft configurations. However, significant problems and challenges remain in the areas of physical modelling, numerics, and computing technology. The long-term industrial requirements are captured in the U.S. Government's 'Grand Challenge' for 'Aerospace Vehicle Design' for the 1990s: 'Massively parallel computing systems and advanced parallel software technology and algorithms will enable the development and validation of multidisciplinary, coupled methods. These methods will allow the numerical simulation and design optimisation of complete aerospace vehicle systems throughout the flight envelope'.
This monograph presents the still young, but already large and very active, interdisciplinary realm of computer-supported cooperative work (CSCW) in a systematic and well-balanced way. Besides technical progress, the cultural, social, legal, psychological, and economic aspects of CSCW are also discussed. The book makes accessible a wealth of information and culminates in the development and detailed discussion of a "Collaboratory" suitable to fulfil the needs of scientific cooperation in Europe.
Peptide synthesis has emerged as one of the most powerful tools in biochemical, pharmacological, immunological, and biophysical laboratories. Recent improvements include general solid-phase methodology, new protecting groups, and automated equipment. These advances have allowed the facile synthesis of increasingly complex peptides. Many of these new and improved methods for the synthesis of peptides and peptide-related substances have been reported in various publications, but never compiled in a convenient handbook. Like other volumes in this series, Peptide Synthesis Protocols concentrates on the practical aspects of these procedures, providing the researcher with detailed descriptions and helpful tips about potential problems. This volume is not intended to serve as a basic guide to standard Merrifield-type solid-phase strategy, but rather to provide the researcher with some of the most recent applications in the field of peptide science. A companion volume, Peptide Analysis Protocols, will detail methodology for the characterization of new synthetic peptides. Development of new methods and applications continued actively even as this volume was in preparation. Owing to the number of contributors to this volume, it was necessary to establish a cutoff for publication purposes. We feel that all of the protocols presented are timely and up-to-date. Several promising new strategies, such as allyloxycarbonyl-based syntheses, were being developed while this volume was in the editing stages and will be included in future editions.
Professor John D. Roberts published a highly readable book on Molecular Orbital Calculations directed toward chemists in 1962. That timely book is the model for this book. The audience this book is directed toward comprises senior undergraduate and beginning graduate students, as well as practicing bench chemists who wish to develop conceptual tools for understanding chemical phenomena. Although ab initio and more advanced semi-empirical MO methods are regarded as more reliable than HMO in an absolute sense, there is good evidence that HMO provides reliable relative answers, particularly when comparing related molecular species. Thus, HMO can be used to rationalize electronic structure in π-systems, aromaticity, and the shape of simple molecular orbitals. Experimentalists still use HMO to gain insight into subtle electronic interactions for the interpretation of UV and photoelectron spectra. Herein, it will be shown that one can use graph theory to streamline one's HMO computational efforts and to arrive at answers quickly, without the aid of group theory or of a computer program of which the experimentalist has no understanding. The merging of mathematical graph theory with chemical theory is the formalization of what most chemists do in a more or less intuitive mode. Chemists currently use graphical images to embody chemical information in compact form, which can be transformed into algebraic sets. Chemical graph theory provides simple descriptive interpretations of complicated quantum mechanical calculations and is thereby, in and of itself, an important discipline of study.
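The graph-theoretic connection described here can be stated in one line: in Hückel theory the π MO energies are E = α + xβ, where the x values are the eigenvalues of the molecular graph's adjacency matrix. A minimal sketch for benzene (illustrative only, not code from the book):

```python
import numpy as np

# Adjacency matrix of the benzene carbon skeleton: a 6-cycle graph in
# which entry (i, j) is 1 when carbons i and j are bonded.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

# Hueckel pi energies are E = alpha + x*beta for each eigenvalue x.
x = np.sort(np.linalg.eigvalsh(A))[::-1]
print("benzene HMO eigenvalues:", np.round(x, 3))
# eigenvalues 2, 1, 1, -1, -1, -2: energies alpha+2*beta, alpha+beta
# (doubly degenerate), alpha-beta (doubly degenerate), alpha-2*beta
```

The doubly degenerate pairs reproduce the familiar benzene MO diagram, obtained here with nothing more than a symmetric eigenvalue solver, no group theory required.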
It is now twenty years since Cohen and Boyer's first steps into DNA cloning. In the time since then, there has been an ever-increasing acceleration in the development and application of the cloning methodology. With the recent development of the polymerase chain reaction, a second generation of the technology has been born, enabling the isolation of DNA (and in particular, genes) with little more information than partial knowledge of the sequence. In fact, DNA sequencing is now so advanced that it can almost be carried out on an industrial scale. As a consequence of these advances, it now appears feasible to sequence whole genomes, including one the size of the human genome. What are we going to do with this information? The future of basic molecular biology must lie in the ability to analyze DNA (and especially the genes within it) starting at the DNA level. It is for these problems that Protocols for Gene Analysis attempts to offer solutions. So you have a piece of DNA, possibly a gene: what do you do next? The first section of this book contains a number of "basic" techniques that are required for further manipulation of the DNA. This section is not intended to be a comprehensive collection of methods, but merely to serve as an up-to-date set of techniques. I refer you to other volumes in the Methods in Molecular Biology series for further recombinant DNA techniques.
Geophysical measurements are not done for the sake of art alone. The ultimate goal is to solve some well-defined geological, tectonic, or structural problem. For this purpose, the data have to be interpreted, that is, translated into a physical model of the subsurface. ... This book describes some of the most important common features of different geophysical data sets. (from the Introduction) Users at universities, as well as practitioners in exploration, physics, or the environmental sciences, wherever signal processing is necessary, will benefit from this textbook.
A treatment of the experimental techniques and instrumentation most often used in nuclear and particle physics experiments as well as in various other experiments, providing useful results and formulae, technical know-how and informative details. This second edition has been revised, while sections on Cherenkov radiation and radiation protection have been updated and extended.
The scientist's understanding of the cell at the molecular level has advanced rapidly over the last twenty years. This improved understanding has led to the development of many new laboratory methods that increasingly allow old problems to be tackled in new ways. Thus the modern scientist cannot specialize in just one field of knowledge, but must be aware of many disciplines. To aid the process of investigation, the Methods in Molecular Biology series has brought together many protocols and has highlighted the useful variations and the pitfalls of the different methods. However, protocols frequently cannot simply be taken from the shelf: the starting sample for a chosen protocol may be unavailable in the correct state or form, or the products of the procedure may require a different sort of processing. The scientist therefore needs more detailed information on the nature and requirements of the enzymes being used. This information, though usually available in the literature, is often widely dispersed and frequently occurs in older volumes of journals; not everyone has comprehensive library facilities available. Also, many scientists searching out such information are not trained enzymologists and may be unaware of some of the parameters that are important in a specific enzyme reaction.
Separation Methods in Drug Synthesis and Purification, Second Edition, Volume Eight, provides an update on the analytical techniques used in drug synthesis and purification. Unlike other books on either separation science or drug synthesis, this volume combines the two to explain the basic principles of each separation technique and to compare them. New sections in this volume include enantiomer separation using capillary electrophoresis (CE) and capillary electrochromatography, the computer simulation of chromatographic separation for accelerating method development, the application of chromatography and capillary electrophoresis as surrogates for biological processes, and new developments in the established techniques of chromatography and preparative methods.
Most cells will survive removal from the natural microenvironment of their in vivo tissue and placement into a sterile culture dish under optimal conditions. Not only do they survive, but they also multiply and express differentiated properties in such a culture dish. A few cells do this in suspension, but most need some kind of mechanical support substituting for their natural connections with other cells. The surface of a culture dish, which might have to be coated, is usually sufficient. The recent trend toward standardization of conditions, and the existence of commercial enterprises with adequate funds specializing in the needs of scientists, were responsible for the tremendous proliferation of cell culture techniques in all fields of research over the last 20 years. No longer does a scientist have to concentrate all his or her efforts on that technology; the new trends make it feasible to employ cell culture techniques as only one of the many methods available in a small corner of a larger research laboratory. Some areas of research depend more heavily than others on cell culture techniques. Neuroscience is one of the areas that has developed hand in hand with the proliferation of cell culture methodology. Molecular biological aspects, cell differentiation and development, neurophysiological and neurochemical studies, as well as investigations into the nature of various diseases, are now to a large extent dependent on the use of cell cultures.
Nucleic acid hybridization techniques allow the detection of specific DNA or RNA sequences. This book is a clear and concise guide to the techniques used for preparing DNA and RNA for membrane hybridization. These include Southern blotting of DNA, northern blotting of RNA, dot/slot blotting, Benton-and-Davis screening of recombinant bacteriophage, and Grunstein-Hogness screening of recombinant plasmids. It also discusses the pros and cons of using nitrocellulose filters and nylon membranes in these procedures. The book demystifies the laboratory manuals by explaining the rationale for each step in the published protocols and points out potential pitfalls with tips on how to avoid them.