The first part of this text provides an overview of the physics of lasers and describes some of the more common types of lasers and their applications. The production of laser light requires the formation of a resonant cavity where stimulated emission of radiation occurs. The light produced in this way is intense, coherent and monochromatic. Applications of lasers include CD/DVD players, laser printers and fiber optic communication devices. While these devices depend largely on the monochromaticity and coherence of the light that lasers produce, other well-known applications, such as laser machining and laser fusion, depend on the intensity of laser light. The second part of the book describes the phenomenon of Bose-Einstein condensation. These condensates represent a state of matter that exists in some dilute gases at very low temperatures, as first predicted by Satyendra Nath Bose and Albert Einstein. Bose-Einstein condensates were first observed experimentally in 1995 by Eric Cornell and Carl Wieman at the University of Colorado, and shortly thereafter by Wolfgang Ketterle at the Massachusetts Institute of Technology. The experimental techniques used to create a Bose-Einstein condensate provide an interesting and unconventional application of lasers: the cooling and confinement of a dilute gas at very low temperature.
Semiconductors and Modern Electronics is a brief introduction to the physics behind semiconductor technologies. Chuck Winrich explores semiconductors through a qualitative approach to the theories and models used to explain semiconductor devices, an approach intended to bring the advanced ideas behind semiconductors to a broader audience of students who will not major in physics. Applications of semiconductors are explored and understood through the models developed in the book. Much of the inspiration for this text comes from Winrich's experience teaching a general electronics course to students majoring in business. The goal of that class, and of this work, is to bring forward the science behind semiconductors and then to look at how that science affects people's lives.
The Nuclear Nonproliferation Treaty (NPT) is the cornerstone of nonproliferation and disarmament efforts, yet its negotiation and success were not inevitable. This book aims to address the developments that led to the negotiation of the treaty, examine its implementation, and address the challenges that the NPT faces going forward. It begins with an overview of precursor efforts to establish international limits on nuclear weapons and why these efforts failed. It also looks at the changes in the political environment and the technical advances which together increased the threat of proliferation and drove states to negotiate the NPT. The second chapter considers the negotiation of the treaty itself, looking at the gap between US and Soviet positions on key areas such as alliance control of nuclear weapons and how the two governments found common ground on nonproliferation language. It also explores the critical role played by the non-aligned movement in pushing for the inclusion of disarmament provisions that would become the foundation for Article VI of the treaty, and the hesitancy of nuclear-armed states to support disarmament language and timelines. Chapter 3 focuses on the implementation of the NPT and its initial successes in heading off states with nuclear weapons research programs. It addresses how the treaty responded to challenges such as the dissolution of the Soviet Union and the gaps exposed by the illicit nuclear weapons programs in Iraq and North Korea in the early 1990s. Chapter 3 also includes a section on the 1995 debate over extending the treaty indefinitely and the compromises reached to satisfy the concerns of the non-nuclear weapon states. Finally, Chapter 4 addresses some of the outstanding challenges to the NPT that remain unresolved, such as the continued failure to convene a conference on a Middle East WMD-free zone, the lack of specified consequences for withdrawing from the NPT, and the risk of repurposing civilian nuclear technology transferred under the treaty for weapons purposes. It also looks at how the ban treaty under negotiation in the United Nations would support or undermine the NPT's objectives.
A venerable tradition in the metaphysics of science commends ontological reduction: the practice of analyzing theoretical entities into further and further proper parts, with the understanding that the original entity is nothing but the sum of these. This tradition implicitly subscribes to the principle that all the real action of the universe (also referred to as its "causation") happens at the smallest scales, at the scale of microphysics. A vast majority of metaphysicians and philosophers of science, covering a wide swath of the spectrum from reductionists to emergentists, defend this principle. It provides one pillar of the most prominent theory of science, to the effect that the sciences are organized in a hierarchy according to the scales of measurement occupied by the phenomena they study. On this view, the fundamentality of a science is reckoned inversely to its position on that scale. This venerable tradition has been justly and vigorously countered, most notably in physics: it is countered in quantum theory, in theories of radiation and superconduction, and most spectacularly in renormalization theories of the structure of matter. But these counters, and the profound revisions they prompt, lie just below the philosophical radar. This book illuminates these counters to the traditional principle in order to assemble them in support of a vaster (and at its core Aristotelian) philosophical vision of sciences that are not organized within a hierarchy. In so doing, the book articulates the principle that the universe is active at absolutely all scales of measurement. This vision, as the book shows, is warranted by philosophical treatment of cardinal issues in the philosophy of science: fundamentality, causation, scientific innovation, dependence and independence, and the proprieties of explanation.
In a groundbreaking examination of the antislavery origins of liberal Protestantism, Molly Oshatz contends that the antebellum slavery debates forced antislavery Protestants to adopt an historicist understanding of truth and morality. Unlike earlier debates over slavery, the antebellum slavery debates revolved around the question of whether or not slavery was a sin in the abstract. Unable to use the letter of the Bible to answer the proslavery claim that slavery was not a sin in and of itself, antislavery Protestants, including William Ellery Channing, Francis Wayland, Moses Stuart, Leonard Bacon, and Horace Bushnell, argued that biblical principles opposed slavery and that God revealed slavery's sinfulness through the gradual unfolding of these principles. Although they believed that slavery was a sin, antislavery Protestants' sympathy for individual slaveholders and their knowledge of the Bible made them reluctant to denounce all slaveholders as sinners. In order to reconcile slavery's sinfulness with their commitments to the Bible and to the Union, antislavery Protestants defined slavery as a social rather than an individual sin. Oshatz demonstrates that the antislavery notions of progressive revelation and social sin had radical implications for Protestant theology. Oshatz carries her study through the Civil War to reveal how emancipation confirmed for northern Protestants the antislavery notion that God revealed His will through history. She describes how after the war, a new generation of liberal theologians, including Newman Smyth, Charles Briggs, and George Harris, drew on the example of antislavery and emancipation to respond to evolution and historical biblical criticism. The theological innovations rooted in the slavery debates came to fruition in liberal Protestantism's acceptance of the historical and evolutionary nature of religious truth.
This is the fifth volume of "Advances in Sonochemistry", the first having been published in 1990. The definition of sonochemistry has developed to include not only the ways in which ultrasound has been harnessed to effect chemistry but also its uses in material processing. Subjects included range from chemical dosimetry to ultrasound in microbiology to ultrasound in the extraction of plant materials and in leather technology.
This book provides a rigorous, physics-focused introduction to set theory that is geared towards natural science majors. It concentrates on the specific set-theoretic knowledge and skills that students will need in calculus and in natural science topics in general.
The sea is steadily rising, presently at 3.4 mm per year, and countering sea-level-related surges is already costing billions in Venice, on the Thames River and in New York City. Experts anticipate an accelerated rise, and credible predictions for sea-level rise by the year 2100 range from 12 inches to above six feet. Study of the Earth's geologic history, through ice-core samples, links sea-level rise to temperature rise. Since the lifetime of carbon dioxide in the atmosphere is measured in centuries, and it has upset the balance of incoming and outgoing energy, the Earth's temperature will continue to rise even if carbon burning ceases. Engineering the Earth's solar input appears increasingly attractive and practical as a means to lower the Earth's temperature and, thus, to lower the sea level. The cost of engineering the climate appears small, comparable even to the already-incurred costs of sea-level rise represented by civil engineering projects in London, Venice and New York City. Feasible deployment of geoengineering, accompanied by some reduction in carbon burning, is predicted to lower the sea level by on the order of one foot by 2100, which would negate the expected rise and provide an immense economic benefit. The accompanying lower global temperature would reduce the severity of extreme weather and restore habitability to lethally hot parts of the world.
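As a rough aid to the figures quoted above, the short Python sketch below simply extrapolates the present 3.4 mm-per-year rate linearly to 2100 (the 2025 start year and the no-acceleration assumption are illustrative, not from the book); the roughly ten inches it yields sits just under the low end of the 12-inch-to-six-foot range, the rest of which reflects the expected acceleration.

    # Hypothetical back-of-the-envelope check, assuming the current 3.4 mm/yr
    # rate continues unchanged from 2025 to 2100 (real projections assume the
    # rate accelerates, which is why the quoted range starts near 12 inches).
    MM_PER_INCH = 25.4

    def linear_rise_inches(rate_mm_per_year: float, start_year: int, end_year: int) -> float:
        """Total sea-level rise, in inches, at a constant (non-accelerating) rate."""
        return rate_mm_per_year * (end_year - start_year) / MM_PER_INCH

    print(f"{linear_rise_inches(3.4, 2025, 2100):.0f} inches")  # prints "10 inches"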
Electrostatic accelerators have been at the forefront of modern technology since 1932, when Sir John Cockcroft and Ernest Walton developed the first accelerator. Although the electrostatic accelerator field is more than 90 years old, the field and the number of accelerators are growing more rapidly than ever. This book provides an overview of the basic science and technology that underlies the electrostatic accelerator field so that it can serve as a reference guide and textbook for accelerator engineers as well as for students and researchers who work with electrostatic accelerators.
An examination of the ways that digital technologies play an increasingly important role in the lives of precarious workers, far beyond gig economy apps like Uber and Lyft. Over the past three decades, digital technologies like smartphones and laptops have transformed the way we work in the US. At the same time, workers at both ends of the income ladder have experienced rising levels of job insecurity and anxiety about their economic futures. In Left to Our Own Devices, Julia Ticona explores the ways that workers use their digital technologies to navigate insecure and flexible labor markets. Through 100 interviews with high- and low-wage precarious workers across the US, she explores the surprisingly similar "digital hustles" they use to find work and maintain a sense of dignity and identity. Ticona then reveals how the digital hustle ultimately reproduces inequalities between workers at either end of polarized labor markets. A moving and accessible look at the intimate consequences of contemporary capitalism, Left to Our Own Devices will be of interest to sociologists, communication and media studies scholars, as well as to a general audience of readers interested in digital technologies, inequality, and the future of work in the US.
This inquiry into the technical advances that shaped the 20th century follows the evolution of all the principal innovations introduced before 1913 (as detailed in the first volume) as well as the origins and elaborations of all fundamental 20th-century advances. The history of the 20th century is rooted in the amazing technical advances of 1871-1913, but the century differs so remarkably from the preceding 100 years because of several unprecedented combinations. The 20th century followed the path defined during the half century preceding the beginning of World War I, but it traveled along that path at a very different pace, with different ambitions and intents. The new century's developments elevated both the magnitudes of output and the spatial distribution of mass industrial production to new and, in many ways, virtually incomparable levels. Twentieth-century science and engineering conquered and perfected a number of fundamental challenges that had remained unresolved before 1913, and that to many critics appeared insoluble. This book is organized in topical chapters dealing with electricity, engines, materials and syntheses, and information techniques. It concludes with an extended examination of the contradictory consequences of our admirable technical progress, confronting the accomplishments and perils of systems that brought liberating simplicity as well as overwhelming complexity, that created unprecedented affluence and equally unprecedented economic gaps, that greatly increased both our security and our fears as well as our understanding and our ignorance, and that provided the means for greater protection of the biosphere while concurrently undermining some of the key biophysical foundations of life on Earth.
The Oxford Handbook of German Philosophy in the Nineteenth Century is the first collective critical study of this important period in intellectual history. The volume is divided into four parts. The first part explores individual philosophers, including Fichte, Hegel, Schopenhauer, Marx, and Nietzsche, amongst other great thinkers of the period. The second addresses key philosophical movements: Idealism, Romanticism, Neo-Kantianism, and Existentialism. The essays in the third part engage with different areas of philosophy that received particular attention at this time, including philosophy of nature, philosophy of mind, philosophy of language, philosophy of history, and hermeneutics. Finally, the contributors turn to discuss central philosophical topics, from skepticism to materialism, from dialectics to ideas of historical and cultural Otherness, and from the reception of antiquity to atheism. Written by a team of leading experts, this Handbook will be an essential resource for anyone working in the area and will lead the direction of future research.
Though many separation processes are available for use in today's analytical laboratory, chromatographic methods are the most widely used. The applications of chromatography have grown explosively in the last four decades, owing to the development of new techniques and to the expanding need of scientists for better methods of separating complex mixtures. With its comprehensive, unified approach, this book will greatly assist the novice in need of a reference to chromatographic techniques, as well as the specialist suddenly faced with the need to switch from one technique to another.
This book is meant to be a companion volume for the ACS Symposium Series Book entitled Nuts and Bolts of Chemical Education Research. In the Nuts and Bolts book (edited by Diane M. Bunce and Renee Cole), readers were presented with information on how to conduct quality chemical education research. In the Myth book, exemplars of chemical education research are featured. In the cases where the chapter in the book is describing research that has already been published (typically in the Journal of Chemical Education), additional information is provided either in terms of research questions investigated that were not reported in the published article or background information on decisions made in the research that helped the investigation. The main focus of this type of discussion is to engage the reader in the reality of doing chemical education research including a discussion of the authors' motivation. It is expected that these two books could be used as textbooks for graduate chemical education courses showing how to do chemical education research and then providing examples of quality research.
The role of chance changed in the nineteenth century, and American literature changed with it. Long dismissed as a nominal concept, chance was increasingly treated as a natural force to be managed but never mastered. New theories of chance sparked religious and philosophical controversies while revolutionizing the sciences as probabilistic methods spread from mathematics, economics, and sociology to physics and evolutionary biology. Chance also became more visible in everyday life as Americans struggled to control its power through weather forecasting, insurance, game theory, statistics, military science, and financial strategy. Uncertain Chances shows how the rise of chance shaped the way nineteenth-century American writers faced questions of doubt and belief. Poe in his detective fiction critiques probabilistic methods. Melville in Moby-Dick and beyond struggles to vindicate moral action under conditions of chance. Douglass and other African American authors fight against statistical racism. Thoreau learns to appreciate the play between nature's randomness and order. Dickinson works faithfully to render poetically the affective experience of chance-surprise. These and other nineteenth-century writers dramatize the inescapable dangers and wonderful possibilities of chance. Their writings even help to navigate extremes that remain with us today: fundamentalism and relativism, determinism and chaos, terrorism and risk management, the rational confidence of the Enlightenment and the debilitating doubts of modernity.
Dalton's theory of the atom is generally considered to be what made the atom a scientifically fruitful concept in chemistry. To be sure, by Dalton's time the atom had already had a two-millennium history as a philosophical idea, and corpuscular thought had long been viable in natural philosophy (that is, in what we would today call physics).
In 1687 Isaac Newton ushered in a new scientific era in which laws of nature could be used to predict the movements of matter with almost perfect precision. Newton's physics also posed a profound challenge to our self-understanding, however, for the very same laws that keep airplanes in the air and rivers flowing downhill tell us that it is in principle possible to predict what each of us will do every second of our entire lives, given the early conditions of the universe. Can it really be that, even while you toss and turn late at night in the throes of an important decision and it seems as though the scales of fate hang in the balance, your decision is a foregone conclusion? Can it really be that everything you have done and everything you ever will do is determined by facts that were in place long before you were born? This problem is one of the staples of philosophical discussion. It is discussed by everyone from freshmen in their first philosophy class to theoretical physicists in bars after conferences. And yet there is no topic that remains more unsettling, and less well understood. If you want to get behind the facade, past the bare statement of determinism, and really try to understand what physics is telling us in its own terms, read this book. The problem of free will raises all kinds of questions. What does it mean to make a decision, and what does it mean to say that our actions are determined? What are laws of nature? What are causes? What sorts of things are we, when viewed through the lenses of physics, and how do we fit into the natural order? Ismael provides a deeply informed account of what physics tells us about ourselves. The result is a vision that is abstract, alien, illuminating, and, Ismael argues, affirmative of most of what we all believe about our own freedom. Written in a jargon-free style, How Physics Makes Us Free provides an accessible and innovative take on a central question of human existence.
Inspired by the opportunities and challenges presented by rapid advances in the retrieval of chemical and other scientific information, several speakers presented at a symposium, The History of the Future of Chemical Information, on Aug. 20, 2012, at the 244th Meeting of the American Chemical Society in Philadelphia, PA. Storage and retrieval is of undeniable value to the conduct of chemical research. The participants believe that past practices in this field have not only contributed to the increasingly rapid evolution of the field but continue to do so, hence the somewhat unusual title. Even with archival access to several of the presentations, a number of the presenters felt that broader access to this information would be of value. Thus, the presenters decided to create an ACS Symposium book based on the topic, with the conviction that it would be valuable to chemists of all disciplines. The past is a moving target, depending on the vagaries of technology, economics, politics and how researchers and professionals choose to build on it. The aim of The History of the Future of Chemical Information is to critically examine trajectories in chemistry, information and communication as determined by the authors in the light of current and possible future practices of the chemical information profession. Along with some additional areas primarily related to present and future directions, this collection contains most of the topics covered in the meeting symposium. Most of the original authors agreed to write chapters for this book. Much of the historical and even current material is scattered throughout the literature, so the authors strove to gather this information into a discrete source. Faced with the rapid evolution of such aspects as mobile access to information, cloud computing, and public resource production, this book will not only be of interest but will also provide valuable insight into this rapidly evolving field, both to practitioners within the field of chemical information and to chemists everywhere, whose need for current and accurate information on chemistry and related fields is increasingly important.
Students taught with inquiry-based methods have been shown to make significant progress in their ability to formulate hypotheses, make proper assumptions, design and execute investigations, understand variables, record data, and synthesize new knowledge. This text presents a series of experiments intended to serve as a solid basis for a first-year chemistry or physical sciences course, using an inquiry-based approach. Each experiment provides: 1) instructions for the experiment; 2) in-depth teacher's notes; and 3) a sample lab report.
From the beginnings of industrial capitalism to contemporary disputes over evolution, nature has long been part of the public debate over the social good. As such, many natural scientists throughout American history have understood their work as a cultural activity contributing to social stability and their field as a powerful tool for enhancing the quality of American life. In the late Victorian era, interwar period, and post-war decades, massive social change, economic collapse and recovery, and the aftermath of war prompted natural scientists to offer up a civic-minded natural science concerned with the political well-being of American society. In Science and the Social Good, John P. Herron explores the evolving internal and external forces influencing the design and purpose of American natural science by focusing on three representative scientists: geologist Clarence King, forester Robert Marshall, and biologist Rachel Carson, who purposefully considered the social outcomes of their work.
By Parallel Reasoning is the first comprehensive philosophical examination of analogical reasoning in more than forty years, designed to formulate and justify standards for the critical evaluation of analogical arguments. It proposes a normative theory with special focus on the use of analogies in mathematics and science.
In recent decades there has been an explosion in work in the social and physical sciences describing the similarities between human and nonhuman as well as human and non-animal thinking. This work has explicitly decentered the brain as the sole, self-contained space of thought, and it has found thinking to be an activity that operates not only across bodies but also across bodily or cellular membranes, as well as multifaceted organic and inorganic environments. For example, researchers have looked at the replication and spread of slime molds (playfully asking what would happen if they colonized the earth) to suggest that they exhibit 'smart behavior' in the way they move, as a potential way of considering the spread of disease across the globe. Other scholars have applied this model of non-human thought to the reach of data mining and global surveillance. In The Biopolitics of Alphabets and Embryos, Ruth Miller argues that these types of phenomena are also useful models for thinking about the growth, reproduction, and spread of political thought and democratic processes. Giving slime, data and unbounded entities their political due, Miller stresses their thinking power and political significance and thus challenges the anthropocentrism of mainstream democratic theories. Miller emphasizes the non-human as highly organized, systemic and productive of democratic growth and replication. She examines developments such as global surveillance, embryonic stem cell research, and cloning, which have been characterized as threats to the privacy, dignity, and integrity of the rational, maximizing and freedom-loving democratic citizen. By shifting her level of analysis from the politics of self-determining subjects to the realm of material environments and information systems, Miller asks what might happen if these alternative, nonhuman thought processes were to become the normative thought processes of democratic engagement.