This volume, from an international authority on the subject, deals with the physical and instrumentation aspects of measurement science, the availability of major measurement tools, and how to use them. This book not only lays out basic concepts of electronic measurement systems, but also provides numerous examples and exercises for the student.
What happens when the Dalai Lama meets with leading physicists and a historian? This book is the carefully edited record of the fascinating discussions at a Mind and Life conference in which five leading physicists and a historian (David Finkelstein, George Greenstein, Piet Hut, Arthur Zajonc, Anton Zeilinger, and Tu Weiming) discussed with the Dalai Lama current thought in theoretical quantum physics, in the context of Buddhist philosophy. It is a contribution to the science-religion interface, and a useful explanation of our basic understanding of quantum reality, couched at a level that intelligent readers without a deep involvement in science can grasp. In the tradition of other popular books on resonances between modern quantum physics and Zen or Buddhist mystical traditions--notably The Dancing Wu Li Masters and The Tao of Physics--this book gives a clear and useful update of the genuine correspondences between these two rather disparate approaches to understanding the nature of reality.
The authors of this book argue that there is a great divide between species that makes extrapolation of biochemical research from one group to another utterly invalid. In their previous book, "Sacred Cows and Golden Geese: The Human Cost of Experiments on Animals", the Greeks showed how an amorphous but insidious network of drug manufacturers, researchers dependent on government grants to earn their living, even cage-manufacturers - among others benefiting from "white-coat welfare" - have perpetuated animal research in spite of its total unpredictability when applied to humans. (Cancer in mice, for example, has long been cured. Chimps live long and relatively healthy lives with AIDS. There is no animal form of Alzheimer's disease.) In doing so, the Greeks aimed to blow the lid off the "specious science" we have been culturally conditioned to accept. Taking these revelations one step further, this book uses accessible language to provide the scientific underpinning for the Greeks' philosophy of "do no harm to any animal, human or not," by examining paediatrics, diseases of the brain, new surgical techniques, in vitro research, the Human Genome and Proteome Projects, and an array of scien
The powerful potential of digital media to engage citizens in political actions has now crossed our news screens many times. But scholarly focus has tended to be on "networked," anti-institutional forms of collective action, to the neglect of advocacy and service organizations. This book investigates the changing fortunes of the citizen-civil society relationship by exploring how social changes and innovations in communication technology are transforming the information expectations and preferences of many citizens, especially young citizens. In doing so, it is the first work to bring together theories of civic identity change with research on civic organizations. Specifically, it argues that a shift in "information styles" may help to explain the disjuncture felt by many young people when it comes to institutional participation and politics. The book theorizes two paradigms of information style: a dutiful style, which was rooted in the society, communication system and citizen norms of the modern era, and an actualizing style, which constitutes the set of information practices and expectations of the young citizens of late modernity for whom interactive digital media are the norm. Hypothesizing that civil society institutions have difficulty adapting to the norms and practices of the actualizing information style, two empirical studies apply the dutiful/actualizing framework to innovative content analyses of organizations' online communications-on their websites, and through Facebook. Results demonstrate that with intriguing exceptions, most major civil society organizations use digital media more in line with dutiful information norms than actualizing ones: they tend to broadcast strategic messages to an audience of receivers, rather than encouraging participation or exchange among an active set of participants. 
The book concludes with a discussion of the tensions inherent in bureaucratic organizations trying to adapt to an actualizing information style, and recommendations for how they may more successfully do so.
Artificial intelligence (AI) is often discussed as something extraordinary, a dream-or a nightmare-that awakens metaphysical questions on human life. Yet far from a distant technology of the future, the true power of AI lies in its subtle revolution of ordinary life. From voice assistants like Siri to natural language processors, AI technologies use cultural biases and modern psychology to fit specific characteristics of how users perceive and navigate the external world, thereby projecting the illusion of intelligence. Integrating media studies, science and technology studies, and social psychology, Deceitful Media examines the rise of artificial intelligence throughout history and exposes the very human fallacies behind this technology. Focusing specifically on communicative AIs, Natale argues that what we call "AI" is not a form of intelligence but rather a reflection of the human user. Using the term "banal deception," he reveals that deception forms the basis of all human-computer interactions rooted in AI technologies, as technologies like voice assistants utilize the dynamics of projection and stereotyping as a means for aligning with our existing habits and social conventions. By exploiting the human instinct to connect, AI reveals our collective vulnerabilities to deception, showing that what machines are primarily changing is not other technology but ourselves as humans. Deceitful Media illustrates how AI has continued a tradition of technologies that mobilize our liability to deception and shows that only by better understanding our vulnerabilities to deception can we become more sophisticated consumers of interactive media.
This review provides a quantitative and qualitative assessment of Southeast Asian countries' capacity in S&T and innovation.
This book presents a brief compilation of results from nearly a century of research on the globular star clusters in the Andromeda Galaxy (M31). It explores the techniques and limitations of the observations, the successes and challenges of the models, and the paradigm for the formation of M31 that has gradually emerged. These results will eventually be superseded by new data, better analysis techniques, and more complex models. However, the emphasis of this book is on the techniques, thought processes, and connections with other studies.
There have been many recent discussions of the replication crisis in psychology and other social sciences. This has been attributed, in part, to the fact that researchers hesitate to submit null results and journals fail to publish such results. In this book, Allan Franklin and Ronald Laymon analyze what constitutes a null result and present evidence, spanning a 400-year history, that null results play significant roles in physics. They begin with Galileo's experiments on falling bodies and conclude with tests of the weak equivalence principle in general relativity, the search for physics beyond the Standard Model, and the search for neutrinoless double beta decay, all in the 21st century. As these case studies make evident, null results have refuted theories, confirmed theories, provided evidence for potential new theories to explain, introduced new experimental techniques, corrected previous incorrect or misinterpreted results, and have been used to explore previously unstudied phenomena. What makes these many roles possible is the development of increasingly more accurate replications of a zero value result and the value of these replications for the effective treatment of systematic uncertainty. The book concludes with a brief analysis of certain fundamental differences between physics and social psychology in the role played by replication where these differences explain the absence of a replication crisis in physics.
A deeper understanding of neutrinos, with the goal to reveal their nature and exact role within particle physics, is at the frontier of current research. This book reviews the field in a concise fashion and highlights the most pressing issues, in addition to the strongest areas of topical interest. The text provides a clear, self-contained, and logical treatment of the fundamental physics aspects appropriate for graduate students. Starting with the relevant basics of the Standard Model (SM), neutrinos are introduced and the quantum mechanical effect of oscillations is explained in detail. A strong focus is then set on the phenomenon of lepton number violation, especially in neutrinoless double beta (0νββ) decay, as the crucial probe to understand the nature of neutrinos. The role of neutrinos in astrophysics - expected to be of increasing importance for future research - is then described. Finally, models to explain the neutrino properties are outlined. The central theme of the book is the nature of neutrino masses, and the above topics revolve around this issue.
This book may be used as a companion for introductory laboratory courses, as well as possible STEM projects. It covers essential Microsoft EXCEL(R) computational skills while analyzing introductory physics projects. Topics of numerical analysis include: multiple graphs on the same sheet, calculation of descriptive statistical parameters, a 3-point interpolation, the Euler and the Runge-Kutta methods to solve equations of motion, the Fourier transform to calculate the normal modes of a double pendulum, matrix calculations to solve coupled linear equations of a DC circuit, animation of waves and Lissajous figures, electric and magnetic field calculations from the Poisson equation and its 3D surface graphs, variational calculus such as Fermat's least traveling time principle, and the least action principle. Nelson's stochastic quantum dynamics is also introduced to draw quantum particle trajectories.
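To give a flavor of one of the methods listed above, here is a minimal sketch (not taken from the book, which works in Excel) of the Euler method applied to a simple equation of motion: a unit-mass, unit-stiffness oscillator with x'' = -x. The function name and step count are illustrative choices, not the book's.

```python
import math

def euler(x, v, dt, steps):
    """Explicit Euler integration of x'' = -x: advance position and
    velocity together using the values from the previous step."""
    for _ in range(steps):
        x, v = x + v * dt, v - x * dt
    return x, v

# Starting at x=1, v=0 and integrating over one full period (2*pi),
# the trajectory should return close to the starting point; explicit
# Euler slightly over-estimates the amplitude at finite step size.
x, v = euler(1.0, 0.0, 2 * math.pi / 10000, 10000)
print(round(x, 2), round(v, 2))
```

Shrinking the step size reduces the amplitude drift, which is exactly the kind of behavior the book's spreadsheet exercises let students observe graphically.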
This book brings together two broad themes that have generated a great deal of interest and excitement in the scientific and technical community in the last 100 years or so: quantum tunnelling and nonlinear dynamical systems. It applies these themes to nanostructured solid state heterostructures operating at room temperature to gain insight into novel photonic devices, systems and applications.
For a physicist noise is not just about sounds. It refers to any random physical process that blurs measurements and, in so doing, stands in the way of scientific knowledge. This short book deals with the most common types of noise, their properties, and some of their unexpected virtues. The text assumes that the reader knows the basics of probability theory and explains the most useful mathematical concepts related to noise. Finally, it aims at making this subject more widely known, and stimulating interest in its study in young physicists.
The goal of this book is to introduce a reader to a new philosophy of teaching and learning physics - Investigative Science Learning Environment, or ISLE (pronounced as a small island). ISLE is an example of an "intentional" approach to curriculum design and learning activities (MacMillan and Garrison 1988 A Logical Theory of Teaching: Erotetics and Intentionality). Intentionality means that the process through which the learning occurs is as crucial for learning as the final outcome or learned content. In ISLE, the process through which students learn mirrors the practice of physics.
This book provides a detailed overview of cancer theranostics applications of magnetic iron oxide nanoparticles. Their synthesis, characterization, multifunctionality, disease targeting, biodistribution, pharmacokinetics and toxicity are highlighted, along with current examples of clinical trials of magnetic nanoparticles in cancer theranostics, and their future scopes and challenges.
The emergence of Shaken Baby Syndrome (SBS) presents an object lesson in the dangers that lie at the intersection of science and criminal law. As often occurs in the context of scientific knowledge, understandings of SBS have evolved. We now know that the diagnostic triad alone does not prove beyond a reasonable doubt that an infant was abused, or that the last person with the baby was responsible for the baby's condition. Nevertheless, our legal system has failed to absorb this new consensus. As a result, innocent parents and caregivers remain incarcerated and, perhaps more perplexingly, triad-only prosecutions continue even to this day. Flawed Convictions: Shaken Baby Syndrome and the Inertia of Injustice is the first book to survey the scientific, cultural, and legal history of Shaken Baby Syndrome from inception to formal dissolution. It exposes extraordinary failings in the criminal justice system's treatment of what is, in essence, a medical diagnosis of murder. The story of SBS highlights fundamental inadequacies in the legal response to science-dependent prosecution. A proposed restructuring of the law contends with the uncertainty of scientific knowledge.
First published nearly 40 years ago, this definitive work has been completely rewritten for its fourth edition. Crystallization is used at some stage in nearly all process industries as a method of production, purification or recovery of solid materials.
Practically every display technology in use today relies on the flat, energy-efficient construction made possible by liquid crystals. These displays provide visually-crisp, vibrantly-colored images that a short time ago were thought only possible in science fiction. Liquid crystals are known mainly for their use in display technologies, but they also provide many diverse and useful applications: adaptive optics, electro-optical devices, films, lasers, photovoltaics, privacy windows, skin cleansers and soaps, and thermometers. The striking images of liquid crystals changing color under polarized lighting conditions are even on display in many museums and art galleries - true examples of science meeting art. Yet, although liquid crystals provide us with visually stunning displays, fascinating applications, and are a rich and fruitful source of interdisciplinary research, their full potential may remain untapped.
The first part of this text provides an overview of the physics of lasers and it describes some of the more common types of lasers and their applications. The production of laser light requires the formation of a resonant cavity where stimulated emission of radiation occurs. The light produced in this way is intense, coherent and monochromatic. Applications of lasers include CD/DVD players, laser printers and fiber optic communication devices. While these devices depend largely on the monochromaticity and coherence of the light that lasers produce, other well-known applications, such as laser machining and laser fusion, depend on the intensity of laser light. The second part of the book describes the phenomenon of Bose-Einstein condensation. These condensates represent a state of matter that exists in some dilute gases at very low temperature, as predicted first by Satyendra Nath Bose and Albert Einstein. Bose-Einstein condensates were first observed experimentally in 1995 by Eric Cornell and Carl Wieman at the University of Colorado, and shortly thereafter by Wolfgang Ketterle at the Massachusetts Institute of Technology. The experimental techniques used to create a Bose-Einstein condensate provide an interesting and unconventional application of lasers: the cooling and confinement of a dilute gas at very low temperature.
Semiconductors and Modern Electronics is a brief introduction to the physics behind semiconductor technologies. Chuck Winrich explores the topic of semiconductors from a qualitative approach to understanding the theories and models used to explain semiconductor devices, which is intended to bring the advanced ideas behind semiconductors to a broader audience of students who will not major in physics. Applications of semiconductors are explored and understood through the models developed in the book. Much of the inspiration for this text comes from Winrich's experience teaching a general electronics course to students majoring in business. The goal of that class, and this work, is to bring forward the science behind semiconductors, and then to look at how that science affects the lives of people.
The nuclear Nonproliferation Treaty (NPT) is the cornerstone of nonproliferation and disarmament efforts, yet its negotiation and success were not inevitable. This book aims to address the developments that led to the negotiation of the treaty, examine its implementation, and address challenges that the NPT faces going forward. It begins with an overview of precursor efforts to establish international limits on nuclear weapons and why these efforts failed. It also looks at the changes in the political environment and technical advances, which together increased the threat of proliferation and drove states to negotiate the NPT. The second chapter considers the negotiation of the treaty itself and looks at the gap between US and Soviet positions on key areas like alliance control of nuclear weapons, and how the two governments found common ground on nonproliferation language. It also explores the critical role played by the non-aligned movement to push inclusion of disarmament provisions that would become the foundation for Article VI of the treaty, and the hesitancy of nuclear-armed states to support disarmament language and timelines. Chapter 3 of the book focuses on implementation of the NPT and its initial successes in heading off states with nuclear weapons research programs. It addresses how the treaty responded to challenges like the dissolution of the Soviet Union and gaps identified by the illicit nuclear weapons programs in Iraq and North Korea in the early 1990s. Chapter 3 also includes a section on the debate in 1995 over extending the treaty indefinitely, and the compromises reached to satisfy the concerns of the non-nuclear weapon states.
Finally, Chapter 4 addresses some of the outstanding challenges to the NPT that remain unresolved, such as the continued failure to convene a conference on the Middle East WMD-free zone and to specify the consequences of withdrawing from the NPT, and the repurposing of civilian nuclear technology transferred under the treaty for weapons purposes. It also looks at how the ban treaty under negotiation in the United Nations will support or undermine the NPT's objectives.
A venerable tradition in the metaphysics of science commends ontological reduction: the practice of analysis of theoretical entities into further and further proper parts, with the understanding that the original entity is nothing but the sum of these. This tradition implicitly subscribes to the principle that all the real action of the universe (also referred to as its "causation") happens at the smallest scales-at the scale of microphysics. A vast majority of metaphysicians and philosophers of science, covering a wide swath of the spectrum from reductionists to emergentists, defend this principle. It provides one pillar of the most prominent theory of science, to the effect that the sciences are organized in a hierarchy, according to the scales of measurement occupied by the phenomena they study. On this view, the fundamentality of a science is reckoned inversely to its position on that scale. This venerable tradition has been justly and vigorously countered-in physics, most notably: it is countered in quantum theory, in theories of radiation and superconduction, and most spectacularly in renormalization theories of the structure of matter. But these counters-and the profound revisions they prompt-lie just below the philosophical radar. This book illuminates these counters to the traditional principle, in order to assemble them in support of a vaster (and at its core Aristotelian) philosophical vision of sciences that are not organized within a hierarchy. In so doing, the book articulates the principle that the universe is active at absolutely all scales of measurement. This vision, as the book shows, is warranted by philosophical treatment of cardinal issues in the philosophy of science: fundamentality, causation, scientific innovation, dependence and independence, and the proprieties of explanation.
In a groundbreaking examination of the antislavery origins of liberal Protestantism, Molly Oshatz contends that the antebellum slavery debates forced antislavery Protestants to adopt an historicist understanding of truth and morality. Unlike earlier debates over slavery, the antebellum slavery debates revolved around the question of whether or not slavery was a sin in the abstract. Unable to use the letter of the Bible to answer the proslavery claim that slavery was not a sin in and of itself, antislavery Protestants, including William Ellery Channing, Francis Wayland, Moses Stuart, Leonard Bacon, and Horace Bushnell, argued that biblical principles opposed slavery and that God revealed slavery's sinfulness through the gradual unfolding of these principles. Although they believed that slavery was a sin, antislavery Protestants' sympathy for individual slaveholders and their knowledge of the Bible made them reluctant to denounce all slaveholders as sinners. In order to reconcile slavery's sinfulness with their commitments to the Bible and to the Union, antislavery Protestants defined slavery as a social rather than an individual sin. Oshatz demonstrates that the antislavery notions of progressive revelation and social sin had radical implications for Protestant theology. Oshatz carries her study through the Civil War to reveal how emancipation confirmed for northern Protestants the antislavery notion that God revealed His will through history. She describes how after the war, a new generation of liberal theologians, including Newman Smyth, Charles Briggs, and George Harris, drew on the example of antislavery and emancipation to respond to evolution and historical biblical criticism. The theological innovations rooted in the slavery debates came to fruition in liberal Protestantism's acceptance of the historical and evolutionary nature of religious truth.
This book provides a rigorous, physics-focused introduction to set theory that is geared towards natural science majors. The science major is presented with a robust introduction to set theory, which concentrates on the specific knowledge and skills that will be needed in calculus topics and natural science topics in general.
This is the fifth volume of "Advances in Sonochemistry", the first having been published in 1990. The definition of sonochemistry has developed to include not only the ways in which ultrasound has been harnessed to effect chemistry but also its uses in material processing. Subjects included range from chemical dosimetry to ultrasound in microbiology to ultrasound in the extraction of plant materials and in leather technology.
The sea is steadily rising, presently at 3.4 mm per year, and it is already costing billions in Venice, on the Thames river and in New York City, to counter sea-level-related surges. Experts anticipate an accelerated rise, and credible predictions for sea-level rise by the year 2100 range from 12 inches to above six feet. Study of the Earth's geologic history, through ice-core samples, links sea-level rise to temperature rise. Since the lifetime of carbon dioxide in the atmosphere is measured in centuries, and it has upset the balance of incoming and outgoing energy, the Earth's temperature will continue to rise, even if carbon burning ceases. Engineering the Earth's solar input appears increasingly attractive and practical as a means to lower the Earth's temperature and, thus, to lower the sea level. The cost of engineering the climate appears small; comparable, even, to the already-incurred costs of sea-level rise represented by civil engineering projects in London, Venice and New York City. Feasible deployment of geoengineering, accompanied by some reduction in carbon burning, is predicted to lower the sea level by the order of one foot by 2100, which negates the expected rise and would provide an immense economic benefit. The accompanying lower global temperature would reduce the severity of extreme weather and restore habitability to lethally hot parts of the world.