Synchrotron radiation has been a revolutionary and invaluable research tool for a wide range of scientists, including chemists, biologists, physicists, materials scientists, and geophysicists. It has also found multidisciplinary applications with problems ranging from archeology through cultural heritage to paleontology. The subject of this book is x-ray spectroscopy using synchrotron radiation, and the target audience is both current and potential users of synchrotron facilities. The first half of the book introduces readers to the fundamentals of storage ring operations, the qualities of the synchrotron radiation produced, the x-ray optics required to transport this radiation, and the detectors used for measurements. The second half of the book describes the important spectroscopic techniques that use synchrotron x-rays, including chapters on x-ray absorption, x-ray fluorescence, resonant and non-resonant inelastic x-ray scattering, nuclear spectroscopies, and x-ray photoemission. A final chapter surveys the exciting developments of free electron laser sources, which promise a second revolution in x-ray science. Thanks to the detailed descriptions in the book, prospective users will be able to quickly begin working with these techniques. Experienced users will find useful summaries, key equations, and exhaustive references to key papers in the field, as well as outlines of the historical developments in the field. Along with plentiful illustrations, this work includes access to supplemental Mathematica notebooks, which can be used for some of the more complex calculations and as a teaching aid. This book should appeal to graduate students, postdoctoral researchers, and senior scientists alike.
This book deals with secondary electron energy spectroscopy in the scanning electron microscope (SEM). The SEM is a widely used instrument for scientific and engineering research, and its low-energy scattered electrons, known as secondary electrons, are used mainly for nanoscale topographic imaging. This book demonstrates the advantages of carrying out precision energy spectroscopy of these secondary electrons, in addition to using them for imaging. It shows how secondary electron energy spectroscopy can transform the SEM into a powerful analytical tool that maps valuable materials science information at the nanoscale, superimposing it onto the instrument's normal topographic imaging. The SEM can then be used to quantify and identify materials, acquire bulk density-of-states information, capture dopant density distributions in semiconductor specimens, and map surface charge distributions.
Providing the knowledge and practical experience to begin analysing scientific data, this book is ideal for physical sciences students wishing to improve their data handling skills. The book focuses on explaining and developing the practice and understanding of basic statistical analysis, concentrating on a few core ideas, such as the visual display of information, modelling using the likelihood function, and simulating random data. Key concepts are developed through a combination of graphical explanations, worked examples, example computer code and case studies using real data. Students will develop an understanding of the ideas behind statistical methods and gain experience in applying them in practice. Further resources are available at www.cambridge.org/9781107607590, including data files for the case studies so students can practise analysing data, and exercises to test students' understanding.
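The core workflow this blurb describes, simulating random data and modelling it with the likelihood function, can be sketched in a few lines of Python. This is an illustrative sketch only, not the book's own code or data: simulate Gaussian measurements, then scan a grid of candidate means and keep the one that maximises the log-likelihood.

```python
import math
import random

def log_likelihood(mu, sigma, data):
    """Gaussian log-likelihood of the data for a given mean and width."""
    return sum(
        -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)
        for x in data
    )

random.seed(42)
# Simulated measurements: true mean 10, true standard deviation 2.
data = [random.gauss(10.0, 2.0) for _ in range(1000)]

# Scan candidate means from 8.00 to 11.99 and keep the likelihood maximiser.
candidates = [8 + 0.01 * i for i in range(400)]
best_mu = max(candidates, key=lambda mu: log_likelihood(mu, 2.0, data))
print(round(best_mu, 2))  # close to the true mean of 10
```

For a Gaussian model with known width, the maximum-likelihood estimate of the mean is simply the sample mean, so the grid search lands on the candidate nearest it; the same scan-and-maximise pattern carries over to models without a closed-form answer.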
Offers assistance to those involved in planning new laboratories or expanding existing facilities. Emphasis throughout is on finding economical solutions without sacrificing quality.
As a response to the climate crisis and its effects on marine ecosystems and coastal populations, this book proposes concrete, science-driven solutions for establishing transformation pathways toward Sustainable Blue Growth, supported by technically and socially innovative approaches. It proposes investment options and management solutions with the potential to make our seas and oceans resilient to crises (climate, financial, health) by laying the foundations for a green/blue, circular economy that is anchored in science-driven solutions and geared toward public well-being. Now is the time to usher in systemic economic change, and the good news is that we have our blueprint: the combination of the UN's 2030 Agenda (17 SDGs) and the European Commission's European Green Deal. There is no doubt that the Earth's survival will depend on the protection and sustainable management of our seas and oceans and the resources they provide. This is recognized by the Joint Communication on International Ocean Governance, which is an integral part of the EU's response to the United Nations' 2030 Agenda for Sustainable Development, and in particular to the targets set out by Sustainable Development Goal 14 (SDG 14) to "conserve and sustainably use the oceans, seas and marine resources". The analytical framework and science-driven management solutions proposed in this book can accelerate the transition to sustainable management of our seas and oceans by turning current challenges into opportunities for sustainable economic growth that is environmentally resilient and leaves no one behind.
This book is both an introduction and a demonstration of how Visual Basic for Applications (VBA) can greatly enhance Microsoft Excel (R) by giving users the ability to create their own functions within a worksheet and to create subroutines to perform repetitive actions. The book is written so readers are encouraged to experiment with VBA programming with examples using fairly simple physics or non-complicated mathematics such as root finding and numerical integration. Tested Excel (R) workbooks are available for each chapter and there is nothing to buy or install.
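The kinds of calculation the blurb mentions, such as root finding, can be sketched briefly; the book's own examples are VBA workbooks, so the Python below is an illustrative translation of one such technique (bisection), not material from the book.

```python
def bisect(f, a, b, tol=1e-10):
    """Find a root of f in [a, b] by repeated interval halving.

    Requires f(a) and f(b) to have opposite signs, so the interval
    is guaranteed to bracket a root of a continuous function.
    """
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must bracket a root")
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m          # root lies in the left half
        else:
            a, fa = m, f(m)  # root lies in the right half
    return 0.5 * (a + b)

# Root of x^2 - 2 on [1, 2]: the square root of 2.
print(bisect(lambda x: x * x - 2.0, 1.0, 2.0))  # ≈ 1.41421356...
```

The same loop translates almost line for line into a VBA function callable from a worksheet cell, which is the pattern the book builds on.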
An award-winning science journalist details the quest to isolate and understand dark matter, and shows how that search has helped us to understand the universe we inhabit. When you train a telescope on outer space, you can see luminous galaxies, nebulae, stars, and planets. But if you add all that together, it constitutes only 15 percent of the matter in the universe. Despite decades of research, the nature of the remaining 85 percent is unknown. We call it dark matter. In The Elephant in the Universe, Govert Schilling explores the fascinating history of the search for dark matter. Evidence for its existence comes from a wealth of astronomical observations. Theories and computer simulations of the evolution of the universe are also suggestive: they can be reconciled with astronomical measurements only if dark matter is a dominant component of nature. Physicists have devised huge, sensitive instruments to search for dark matter, which may be unlike anything else in the cosmos, perhaps some unknown elementary particle. Yet so far dark matter has escaped every experiment. Indeed, dark matter is so elusive that some scientists are beginning to suspect there might be something wrong with our theories about gravity or with the current paradigms of cosmology. Schilling interviews both believers and heretics and paints a colorful picture of the history and current status of dark matter research, with astronomers and physicists alike trying to make sense of theory and observation. Taking a holistic view of dark matter as a problem, an opportunity, and an example of science in action, The Elephant in the Universe is a vivid tale of scientists puzzling their way toward the true nature of the universe.
As this book, Antibacterial Peptide Protocols, will attest, my enthusiasm for the field of antibacterial peptides is based on a conviction (and I am unashamed to say, prejudice) that these substances are in essence antibiotics produced by the host that then participate in host defense against infectious agents. Because of their capacity to exert antibiotic-like action against pathogenic microorganisms (bacteria, fungi, parasites, and viruses), there is reason to believe that these agents will soon be used clinically to treat infectious diseases. In fact, in recent years, biotechnology companies have been formed for the sole purpose of developing antibacterial peptides for clinical use. It should be emphasized that antibacterial peptides will likely play a major role in the treatment of infectious diseases, particularly with the increasing problem of multidrug-resistant microbes and the relative dearth of new antibiotics being provided by pharmaceutical companies. The topic of this volume of Methods in Molecular Biology, the diverse methods used in research on antibacterial peptides, is thus quite timely. As the subject of antibacterial peptides develops into its own discipline (something strongly suggested by the explosion in the number of papers published over the past decade), it is essential that reliable techniques and strategies be made available not only to those of us in the field, but also to the newcomers and researchers in complementary disciplines.
* Inclusion of realistic 3D simulations that behave very much like the real thing. This isn't just setting a value and reading something off the screen: the simulations incorporate the physicality of the experiments, which might mean positioning yourself so that a moving needle can be read accurately, using your position to remove parallax.
* Based on academic research into teaching online.
* Coverage of all of the required practicals (AQA).
* Inclusion of background information on each experiment.
* Detailed accounts of how to perform each experiment for real or with the simulation.
* The simulations have the Association for Science's Green Tick of approval.
This volume contains papers presented at the NATO Advanced Research Workshop (ARW) on Photons and Local Probes. The workshop had two predecessors. The first was the NATO ARW on Near Field Optics, held in October 1992 at Arc et Senans and organized by Daniel Courjon and Dieter Pohl. The other predecessor was a workshop on Photons and Scanning Probe Microscopies held at the University of Konstanz in July 1992. The workshop on Photons and Local Probes was held at the Loechnerhaus on Reichenau Island in Lake Constance, from September 11 to 17, 1994. Reichenau Island was an important place in Europe in the Middle Ages; even the tomb of one of the Carolingian emperors, Charles the Fat, is located there. At this workshop more than 60 scientists from Europe and the United States met to communicate their latest results in the field of local probes in combination with optical techniques. In eight sessions, 31 talks as well as 9 posters were presented. Among those, 31 publications were submitted for publication in the NATO proceedings. They were accepted after a strict, but constructive refereeing process.
The new techniques of molecular cytogenetics, mainly fluorescence in situ hybridization (FISH) of DNA probes to metaphase chromosomes or interphase nuclei, have been developed in the past two decades. Many FISH techniques have been implemented for diagnostic services, whereas some others are mainly used for investigational purposes. Several hundred FISH probes and hybridization kits are now commercially available, and the list is growing rapidly. FISH has been widely used as a powerful diagnostic tool in many areas of medicine including pediatrics, medical genetics, maternal-fetal medicine, reproductive medicine, pathology, hematology, and oncology. Frequently, a physician may be puzzled by the variety of FISH techniques and wonder what test to order. It is not uncommon that a sample is referred to a laboratory for FISH without indicating a specific test. On the other hand, a cytogeneticist or a technologist in a laboratory must determine, case by case, which procedure to perform and which probe to use for an informative result. To obtain the best results, one must use the right DNA probes and have reliable protocols and measures of quality assurance in place. Also, one must have sufficient knowledge in both traditional and molecular cytogenetics, as well as the particular areas of medicine for which the test is used, in order to appropriately interpret the FISH results and to correlate them with clinical diagnosis, treatment, and prognosis.
This is the fifth volume of "Advances in Sonochemistry", the first having been published in 1990. The definition of sonochemistry has developed to include not only the ways in which ultrasound has been harnessed to effect chemistry but also its uses in materials processing. Subjects included range from chemical dosimetry to ultrasound in microbiology to ultrasound in the extraction of plant materials and in leather technology.
Drawing a clear distinction, for physical reasons, between nano- and micro-mechanical testing, this monograph describes the basics and applications of the supermicroscopies AFM and SNOM, and of nanomechanical testing on rough and technical natural surfaces in the submicron range, down to a lateral resolution of a few nm. New or improved instrumentation, new physical laws and unforeseen new applications in all branches of natural sciences (around physics, chemistry, mineralogy, materials science, biology and medicine) and nanotechnology are covered, as well as the sources of pitfalls and errors. It outlines the handling of natural and technical samples in relation to flat standard samples and emphasizes new special features. Pitfalls and sources of errors are clearly demonstrated, as is their efficient remedy when going from molecularly flat to rough surfaces. The academic or industrial scientist learns how to apply these principles to scientific or manufacturing tasks involving surfaces far rougher than standard samples.
The Transmission Electron Microscope (TEM) is the ultimate tool to see and measure structures on the nanoscale and to probe their elemental composition and electronic structure with sub-nanometer spatial resolution. Recent technological breakthroughs have revolutionized our understanding of materials via use of the TEM, and it promises to become a significant tool in understanding biological and biomolecular systems such as viruses and DNA molecules. This book is a practical guide for scientists who need to use the TEM as a tool to answer questions about physical and chemical phenomena on the nanoscale.
In its simplest form, the scientific method can be thought of as learning from our mistakes and trying to correct them. True scientists try to think rationally, never adopt dogmatic opinions and are always willing to listen to opposing views. They never claim to know the absolute truth but are relentless in their search for it. In this timely book, the author describes the fundamentals of critical scientific thinking. The book further examines the correct use of the scientific method and how to apply it to current events and scientific topics to obtain honest assessments. Current controversies discussed include climate change and COVID-related lockdowns. Additional features include: Demonstrates the use of the scientific method to assist with objective analysis of issues. Argues that induction plays a role but that the true method for advancing knowledge is hypothesis-deduction. Explores current hot topics within the framework of the scientific method. Outlines common misunderstandings of the scientific method. Applying the Scientific Method to Learn from Mistakes and Approach Truth is approachable enough for the general public and recommended for university and advanced high school science educators and their students.
The papers in this volume arose out of two workshops entitled "Confinement and Remediation of Environmental Hazards" and "Resource Recovery," held as part of the IMA 1999-2000 program year. These workshops brought together mathematicians, engineers and scientists to summarize recent theoretical, computational, and experimental advances in the theory of phenomena in porous media. The first workshop focused on the mathematical problems which arise in groundwater transport of contamination, and the spreading, confinement and remediation of biological, chemical and radioactive waste. In the second conference, the processes underlying petroleum recovery and the geological time scale of deformation, flow and reaction in porous media were discussed. Simulation techniques for complex domains with widely ranging spatial resolution and types of physics were presented. Probability functional methods for determining the most probable state of the subsurface and the related uncertainty were discussed. Practical examples included breakout from chemical and radioactive waste repositories, confinement by injection of pore-plugging material, and bioremediation of petroleum and other wastes. This volume will be of interest to subsurface science practitioners who would like a view of recent mathematical and experimental efforts to examine subsurface phenomena related to resource recovery and remediation issues.
Though many separation processes are available for use in today's analytical laboratory, chromatographic methods are the most widely used. The applications of chromatography have grown explosively in the last four decades, owing to the development of new techniques and to the expanding need of scientists for better methods of separating complex mixtures. With its comprehensive, unified approach, this book will greatly assist the novice in need of a reference to chromatographic techniques, as well as the specialist suddenly faced with the need to switch from one technique to another.
This book should be on the shelf of every practising statistician who designs experiments. Good design considers units and treatments first, and then allocates treatments to units. It does not choose from a menu of named designs. This approach requires a notation for units that does not depend on the treatments applied. Most structure on the set of observational units, or on the set of treatments, can be defined by factors. This book develops a coherent framework for thinking about factors and their relationships, including the use of Hasse diagrams. These are used to elucidate structure, calculate degrees of freedom and allocate treatment subspaces to appropriate strata. Based on a one-term course the author has taught since 1989, the book is ideal for advanced undergraduate and beginning graduate courses. Examples, exercises and discussion questions are drawn from a wide range of real applications: from drug development, to agriculture, to manufacturing.
This book explores how machine learning can be used to improve the efficiency of expensive fundamental science experiments. The first part introduces the Belle and Belle II experiments, providing a detailed description of the Belle to Belle II data conversion tool, currently used by many analysts. The second part covers machine learning in high-energy physics, discussing the Belle II machine learning infrastructure and selected algorithms in detail. Furthermore, it examines several machine learning techniques that can be used to control and reduce systematic uncertainties. The third part investigates the important exclusive B tagging technique, unique to physics experiments operating at the resonances, and studies in depth the novel Full Event Interpretation algorithm, which doubles the maximum tag-side efficiency of its predecessor. The fourth part presents a complete measurement of the branching fraction of the rare leptonic B decay B → τν, which is used to validate the algorithms discussed in previous parts.
The 3rd International Multidisciplinary Microscopy Congress (InterM2015), held from 19 to 23 October 2015, focused on the latest developments concerning applications of microscopy in the biological, physical and chemical sciences at all dimensional scales, advances in instrumentation and techniques, and educational materials on microscopy. These proceedings gather 17 peer-reviewed technical papers submitted by leading academic and research institutions from nine countries, representing some of the most cutting-edge research available.
The first book to chronicle how innovation in laboratory designs for botanical research energized the emergence of physiological plant ecology as a vibrant subdiscipline. Laboratory innovation since the mid-twentieth century has powered advances in the study of plant adaptation, evolution, and ecosystem function. The phytotron, an integrated complex of controlled-environment greenhouse and laboratory spaces, invented by Frits W. Went in the 1950s, set off a worldwide laboratory movement and transformed the plant sciences. Sharon Kingsland explores this revolution through a comparative study of work in the United States, France, Australia, Israel, the USSR, and Hungary. These advances in botanical research energized physiological plant ecology. Case studies explore the development of phytotron spinoffs such as mobile laboratories, rhizotrons, and ecotrons. Scientific problems include the significance of plant emissions of volatile organic compounds, symbiosis between plants and soil fungi, and the discovery of new pathways for photosynthesis as an adaptation to hot, dry climates. The advancement of knowledge through synthesis is a running theme: linking disciplines, combining laboratory and field research, and moving across ecological scales from leaf to ecosystem. The book also charts the history of modern scientific responses to the emerging crisis of food insecurity in the era of global warming.
This volume is the first of its kind on focusing gamma-ray telescopes. Forty-eight refereed papers provide a comprehensive overview of the scientific potential and technical challenges of this nascent tool for nuclear astrophysics. The book features articles dealing with pivotal technologies such as grazing incidence mirrors, multilayer coatings, Laue and Fresnel lenses, and even an optic using the curvature of space-time.
This spectacularly illustrated book celebrates the structural beauty of everyday materials and the space-age technologies used to probe their surface features and internal structures. It introduces the reader to the various instruments and their uses: scanning electron, ion, and tunneling microscopies, acoustic microscopy and transmission electron microscopy. The book describes how images are processed and analyzed, and how modern materials science is based on these techniques and their ability to "see" materials at the atomic level. The book includes hundreds of illustrations and 32 pages of beautiful color plates depicting the complex microscopic realm within such everyday materials as the metals used in cars and planes, polymer fabrics, ceramics, and the ubiquitous silicon semiconductors, without which society today would fall into disarray and confusion. The many full-color and black-and-white illustrations make this book a pleasure for the eye, in addition to being a useful resource for scientists, students, researchers, and engineers involved in solid-state physics, materials science, geology, and chemistry.