In 1918, South Africa’s climate seemed to be drying up. White farmers claimed that rainfall was dwindling, while nineteenth-century missionaries and explorers had found riverbeds, seashells, and other evidence of a verdant past deep in the Kalahari Desert. Government experts insisted, however, that the rains weren’t disappearing; the land, long susceptible to periodic drought, had been further degraded by settler farmers’ agricultural practices—an explanation that white South Africans rejected. So when the geologist Ernest Schwarz blamed the land itself, the farmers listened. Schwarz held that erosion and topography had created arid conditions, that rainfall was declining, and that agriculture was not to blame. As a solution, he proposed diverting two rivers to the Kalahari’s basins, creating a lush country where white South Africans could thrive. This plan, which became known as the Kalahari Thirstland Redemption Scheme, was rejected by most scientists. But it found support among white South Africans who worried that struggling farmers undermined an image of racial superiority.
Green Lands for White Men explores how white agriculturalists in southern Africa grappled with a parched and changing terrain as they sought to consolidate control over a black population. Meredith McKittrick’s timely history of the Redemption Scheme reveals the environment to have been central to South African understandings of race.
While Schwarz’s plan was never implemented, it enjoyed sufficient support to prompt government research into its feasibility and years of debate. McKittrick shows how white farmers rallied around a plan that represented their interests over those of the South African state and delves into the reasons behind this schism between expert opinion and public perception. This backlash against the predominant scientific view, McKittrick argues, displayed the depth of popular mistrust in an expanding scientific elite.
A detailed look at the intersection of a settler society, climate change, white nationalism, and expert credibility, Green Lands for White Men examines the reverberations of a scheme that ultimately failed but influenced ideas about race and the environment in South Africa for decades to come.
Low-Energy Nuclear Reactions and New Energy is a summary of
selected experimental and theoretical research performed over the
last 19 years that gives profound and unambiguous evidence for
low-energy nuclear reactions (LENR), historically known as cold
fusion. In 1989, the subject was announced with great fanfare, to
the chagrin of many people in the science community. However, the
significant claim of its discoverers, Martin Fleischmann and
Stanley Pons, that electrochemical cells using heavy water and
palladium produce excess heat without harmful neutron emissions or
strong gamma radiation, has held strong.
In recent years, LENR, within the field of condensed matter nuclear
science, has begun to attract widespread attention and is regarded
as a potential alternative and renewable energy source to confront
climate change and energy scarcity. The aim of the research is to
collect experimental findings for LENR in order to present
reasonable explanations and a conclusive theoretical and practical
working model.
The goal of the field is directed toward the fabrication of LENR
devices with unique commercial potential demonstrating an
alternative energy source that does not produce greenhouse gases,
long-lived radiation, or strong prompt radiation. The idea of LENR
has led to endless discussions about the kinetic impossibility of
intense nuclear reactions given the high Coulomb barrier.
However, recent theoretical work may soon shed light on this
mystery.
Understanding this process is one of the most challenging, and
perhaps most important, issues in the scientific world. This book
includes previously unpublished studies and new, controversial
theories that approach LENR, with access to new sources and
experimental results. The book offers insight into this
controversial subject and will help readers re-evaluate their
perspective on LENR as a possible alternative energy source.
It is fitting that Book I of the series should be on the subject of
finite elements. The finite element method is now well established
as an engineering tool with wide application. At the same time it
has attracted considerable attention from mathematicians over the
last ten years, so that a large body of mathematical theory now
exists.
This book focuses on the fundamentals of rock mechanics as a basis
for the safe and economical design and construction of tunnels,
dam foundations and slopes in jointed and anisotropic rock.
It is divided into four main parts:
- Fundamentals and models
- Analysis and design methods
- Exploration, testing and monitoring
- Applications and case histories.
The rock mechanical models presented account for the influence of
discontinuities on the stress-strain behavior and the permeability
of jointed rock masses.
This book is for:
- Civil and mining engineers
- Geologists
- Students in related fields
This textbook is for senior and graduate engineers and should be used in senior and advanced design classes. It follows Suh's earlier book, Principles of Design (OUP, 1990). Suh has proposed axiomatic design as a means of creating a science base for the field of design.
This volume provides an up-to-the-minute review of the open economy approach to analysing environmental problems and policies, which has produced a wealth of research over the past decade. It contains non-technical, issue-oriented, and comprehensive surveys written by specialists in international and environmental economics. The volume will appeal to scholars and students of economics and political science.
Chemometrics and Chemoinformatics gives chemists and other
scientists an introduction to the field of chemometrics and
chemoinformatics. Chemometrics is an approach to analytical
chemistry based on the idea of indirect observation. Measurements
related to the chemical composition of a substance are taken, and
the value of a property of interest is inferred from them through
some mathematical relation. Basically, chemometrics is a process:
measurements are made, data are collected, and information is
obtained and periodically assessed to acquire knowledge. This, in
turn, has led to a new approach for solving scientific problems:
(1) measure a phenomenon or process using chemical instrumentation
that generates data inexpensively, (2) analyze the multivariate
data, (3) iterate if necessary, (4) create and test the model, and
(5) develop fundamental multivariate understanding of the process.
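The indirect-observation idea behind these steps can be sketched in a few lines. In this toy example, a linear model is fitted to synthetic multivariate "spectral" measurements and then used to infer a property of interest from a new measurement; all data, variable names, and parameters are invented for illustration and are not from the book.

```python
import numpy as np

# Illustrative sketch of "indirect observation" (assumed, synthetic data):
# infer a property of interest from multivariate measurements via a
# fitted linear model.

rng = np.random.default_rng(0)

# (1) "Measure": synthetic spectra for 50 samples at 10 wavelengths,
# generated as concentration x pure-component spectrum plus noise.
true_conc = rng.uniform(0.0, 1.0, size=50)        # property of interest
pure_spectrum = rng.uniform(0.5, 1.5, size=10)    # response per wavelength
X = np.outer(true_conc, pure_spectrum)
X += rng.normal(0.0, 0.01, size=X.shape)          # measurement noise

# (2), (4) Analyze the multivariate data and fit a predictive model.
coef, *_ = np.linalg.lstsq(X, true_conc, rcond=None)

# (5) Use the model: infer the concentration of a new, unseen sample.
new_sample = 0.7 * pure_spectrum
estimate = float(new_sample @ coef)
print(f"inferred concentration: {estimate:.3f}")  # near the true 0.7
```

In practice chemometric calibration uses factor-based methods such as principal component or partial least squares regression rather than plain least squares, but the inferential structure is the same.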
Chemoinformatics is a subfield of chemometrics that encompasses
the analysis, visualization, and use of chemical structural
information as a surrogate variable for other data or information.
The boundaries of chemoinformatics have not yet been defined. Only
recently has this term been coined. Chemoinformatics takes
advantage of techniques from many disciplines such as molecular
modeling, chemical information, and computational chemistry. The
reason for the interest in chemoinformatics is the development of
experimental techniques such as combinatorial chemistry and
high-throughput screening, which require a chemist to analyze
unprecedented volumes of data. Access to appropriate algorithms is
crucial if such experimental techniques are to be effectively
exploited for discovery. Many chemists want to use chemoinformatic
methods in their work but lack the knowledge required to decide
which techniques are the most appropriate.
In recent decades there has been an explosion in work in the social
and physical sciences describing the similarities between human and
nonhuman as well as human and non-animal thinking. This work has
explicitly decentered the brain as the sole, self-contained space
of thought, and it has found thinking to be an activity that
operates not only across bodies but also across bodily or cellular
membranes, as well as multifaceted organic and inorganic
environments. For example, researchers have looked at the
replication and spread of slime molds (playfully asking what would
happen if they colonized the earth) to suggest that the way they
move exhibits 'smart behavior', offering a potential model for
the spread of disease across the globe. Other scholars
have applied this model of non-human thought to the reach of data
mining and global surveillance. In The Biopolitics of Alphabets and
Embryos, Ruth Miller argues that these types of phenomena are also
useful models for thinking about the growth, reproduction, and
spread of political thought and democratic processes. Giving slime,
data and unbounded entities their political dues, Miller stresses
their thinking power and political significance and thus challenges
the anthropocentrism of mainstream democratic theories. Miller
emphasizes the non-human as highly organized, systemic and
productive of democratic growth and replication. She examines
developments such as global surveillance, embryonic stem cell
research, and cloning, which have been characterized as threats to
the privacy, dignity, and integrity of the rational, maximizing and
freedom-loving democratic citizen. By shifting her level of
analysis from the politics of self-determining subjects to the
realm of material environments and information systems, Miller asks
what might happen if these alternative, nonhuman thought processes
become the normative thought processes of democratic engagement.
Gain insight into the mechanical properties and performance of
engineering ceramics and composites. This collection of articles
documents the Mechanical Behavior and Performance of Ceramics
& Composites symposium, which included over 100 presentations
representing 10 countries. The symposium addressed cutting-edge
topics in the mechanical properties and reliability of ceramics and
composites and their correlations to processing, microstructure,
and environmental effects.
The 37th International Symposium on the Scientific Basis for
Nuclear Waste Management (Materials Research Society Symposium
Proceedings Volume 1665) was held in Barcelona (Catalonia, Spain),
September 30-October 3, 2013. The symposium was officially opened
by Dr. Antoni Gurgui, commissioner of the Consejo de Seguridad
Nuclear (Nuclear Safety Council) in Spain. About 80 attendees from 12
countries listened to 51 presentations and discussed 29 posters
during the three and a half days of scientific sessions. The
symposium covered the following topics: national and international
programs; performance assessment/geological disposal; radionuclide
solubility, speciation, sorption and migration; corrosion studies
of zircaloy, container and carbon steel; high-level waste; and
ceramic and advanced materials.
Forestry Economics introduces students and practitioners to all
aspects of the management and economics of forestry. The book
adopts the approach of managerial economics textbooks and applies
this to the unique processes and problems faced by managers of
forests. While most forestry economics books are written by
economists for future economists, what many future forest and
natural resource managers need is to understand what economic
information is and how to use it to make better business and
management decisions. John E. Wagner draws on his twenty years of
experience teaching and working in the field of forest resource
economics to present students with an accessible understanding of
the unique production processes and problems faced by forest and
other natural resource managers. There are three unique features of
this book: The first is its organization. The material is organized
around two common economic models used in forest and natural
resources management decision making. The second is the use of case
studies from various disciplines: Outdoor and Commercial
Recreation, Wood Products Engineering, Forest Products, and
Forestry. The purpose of these case studies is to provide students
with applications of the concepts being discussed within the text.
The third is revisiting the question of how to use economic
information to make better business decisions at the end of each
chapter. This ties each chapter to the preceding ones and
reinforces the premise that a solid working knowledge of these
economic models and the information they contain is necessary for
making better business decisions. This textbook is an invaluable
source of clear and accessible information on forestry economics
and management, not only for economics students but also for
students of other disciplines and those already working in forestry
and natural resources.
This second edition of Reese and Van Impe's book has been
extensively revised to suit classroom use. New features include
homework problems, with solution aids provided by the student
version of the software, as well as new case studies and updates to
existing case studies that reflect modern methods of
characterizing soil properties. The thrust of the book is a
detailed presentation of methods of analysis for single piles and
groups of piles under lateral loading. The method makes use of
load-transfer functions that are based heavily on testing results
of full-scale, heavily instrumented piles under carefully
controlled lateral loading, coupled with the use of soil-structure
interaction mechanics. This method is validated by comparing the
results from the method of analysis with experimental results from
case studies of un-instrumented piles. The book specifically
addresses the analysis of piles of varying stiffness installed into
soils with a variety of characteristics, accounting for the axial
load at the top of the pile and for the rotational restraint of the
pile head, possibly nonlinear, offered by the connection to the
superstructure. The text provides example designs as well as the
design of pile foundations that support an offshore platform. The
book also includes references to a rich body of technical material,
including citations of hundreds of relevant publications. The user
may find the material on pile groups under lateral loading to be
particularly helpful. The method begins with the loading at the
foundation origin and makes use of nonlinear pile-head functions
for the lateral load, the axial load, and the moment, taking
pile-soil-pile interaction into account. For two-dimensional cases,
the rotation and displacement of the foundation origin is computed
to achieve equilibrium, and the resulting pile-head loading may be
computed. Results for different loadings can also be readily
calculated.
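The equilibrium computation described above can be illustrated with a minimal sketch, assuming a single lateral degree of freedom and an invented nonlinear pile-head reaction curve; the real method also couples axial load, moment, and pile-soil-pile interaction, none of which appear in this toy example.

```python
import math

# Hypothetical sketch (not the book's algorithm): find the lateral
# displacement of the foundation origin at which the sum of nonlinear
# pile-head reactions balances the applied lateral load. The softening
# law and all parameters below are invented for illustration.

def pile_reaction(y, k0=2000.0, y_ref=0.05):
    """Nonlinear pile-head lateral reaction (kN) at displacement y (m)."""
    return k0 * y_ref * math.tanh(y / y_ref)  # stiffness softens as y grows

applied_load = 150.0  # kN applied at the foundation origin
n_piles = 4

# Newton iteration on the single unknown displacement y
y = 0.01
for _ in range(50):
    residual = n_piles * pile_reaction(y) - applied_load
    dy = 1e-6  # forward-difference step for the numerical derivative
    slope = n_piles * (pile_reaction(y + dy) - pile_reaction(y)) / dy
    y -= residual / slope

print(f"equilibrium displacement: {y:.4f} m")
```

The same idea generalizes to the two-dimensional case in the text: there, both the rotation and the displacement of the foundation origin are unknowns, and equilibrium of lateral force, axial force, and moment is enforced simultaneously.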
Nanotechnology can be defined as the science of manipulating matter
at the nanometer scale in order to discover new properties and
possibly produce new products. For the past 30 years, a
considerable amount of scientific interest and R&D funding
devoted to nanotechnology has led to rapid developments in all
areas of science and engineering, including chemistry, materials,
energy, medicine, biotechnology, agriculture, food, electronic
devices, and consumer products. In the U.S. alone, the federal
government has spent more than $22 billion in nanotechnology
research since 2001. The global funding of nanotechnologies was
estimated to be about $7 billion in 2011 and has increased about
20% per year since then, according to various studies. Already some
products have appeared in the marketplace and more will certainly
come in the future. A possible concern is the health, safety, and
environmental impact of some of these products. The U.S. is
certainly investing heavily in nanotechnology. It started the
National Nanotechnology Initiative (NNI) about 16 years ago,
pulling together the efforts of 20 federal departments and
independent agencies. This book contains a wealth of information on
research, product development, commercialization, and regulatory
issues related to nanotechnology.