This easy-to-understand textbook presents a modern approach to
learning numerical methods (or scientific computing), with a unique
focus on the modeling and applications of the mathematical content.
Emphasis is placed on the need for, and methods of, scientific
computing for a range of different types of problems, supplying the
evidence and justification to motivate the reader. Practical
guidance on coding the methods is also provided, through
simple-to-follow examples using Python. Topics and features:
provides an accessible and applications-oriented approach,
supported by working Python code for many of the methods;
encourages both problem- and project-based learning through
extensive examples, exercises, and projects drawn from practical
applications; introduces the main concepts in modeling, Python
programming, number representation, and errors; explains the
essential details of numerical calculus and of linear and nonlinear
equations, including the multivariable Newton method; discusses
interpolation and the numerical solution of differential equations,
covering polynomial interpolation, splines, and the Euler,
Runge-Kutta, and shooting methods; presents largely self-contained
chapters, arranged in a logical order suitable for an introductory
course on scientific computing. Undergraduate students embarking on
a first course on numerical methods or scientific computing will
find this textbook to be an invaluable guide to the field, and to
the application of these methods across such varied disciplines as
computer science, engineering, mathematics, economics, the physical
sciences, and social science.
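As a taste of the kind of material the blurb describes, here is a minimal sketch (illustrative only, not the book's own code) of the forward Euler method for an initial-value problem, one of the methods listed above:

```python
import math

def euler(f, t0, y0, t_end, n):
    """Approximate y(t_end) for the initial-value problem
    y'(t) = f(t, y), y(t0) = y0, using n forward-Euler steps."""
    h = (t_end - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = y + h * f(t, y)  # follow the tangent line for one step
        t = t + h
    return y

# Model problem y' = y, y(0) = 1, whose exact solution is e^t.
approx = euler(lambda t, y: y, 0.0, 1.0, 1.0, 1000)
error = abs(approx - math.e)  # shrinks roughly like 1/n
```

Halving the step size roughly halves the error, which is the kind of observation such a course would ask students to verify numerically.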
This profound look at the chilling reality of the Cold War and its
arsenal of nuclear destruction draws parallels between tribal
behavior and international relations to demonstrate that societies
are not inherently aggressive, but are led into conflict when pride
or in-group pressures push people to fight. It offers valuable new
insights into how prejudices and stereotypes contribute to what may
seem an inexorable drift to war. Yet the authors conclude that war
is not inevitable, and they offer suggestions for ending the arms
race in the nuclear age. Based on original research, this is a
long-overdue contribution to the study of war and peace in our time
and a text for newly emerging courses on the subject.
This collection of papers stems originally from a conference on
Property Theory, Type Theory and Semantics held in Amherst on March
13-16, 1986. The conference brought together logicians,
philosophers, computer scientists and linguists who had been
working on these issues (often in isolation from one another). Our
intent was to boost debate and exchange of ideas on these
fundamental issues at a time of rapid change in semantics and
cognitive science. The papers published in this work have evolved
substantially since their original presentation at the conference.
Given their scope, we thought it convenient to divide the work into
two volumes. The first deals primarily with logical and
philosophical foundations, the second with more empirical semantic
issues. While there is a common set of issues tying the two volumes
together, they are both self-contained and can be read
independently of one another. Two of the papers in the present
collection (van Benthem in volume I and Chierchia in volume II)
were not actually read at the conference. They are nevertheless
included here for their direct relevance to the topics of the
volumes. Regrettably, some of the papers that were presented
(Feferman, Klein, and Plotkin) could not be included in the present
work due to timing problems. We nevertheless thank the authors for
their contribution in terms of ideas and participation in the
debate.
The conference had a group of invited discussants which included
Emmon Bach, Janet Fodor, Erhard Hinrichs, Angelika Kratzer, Fred
Landman, Richard Larson, Godehard Link, Chris Menzel, Uwe Mönnich,
and Carl Pollard. We thank them all (along with the other
participants) for their stimulating and lively presence.
This book presents an up-to-date overview of the characterization, risk assessment and remediation of mercury-contaminated sites. Many industrial activities, including the mining of gold, silver, and mercury itself, have caused mercury contamination of terrestrial and aquatic systems. Unlike other metals, which are generally not very volatile, mercury from contaminated sites can have a significant impact on remote ecosystems via the atmospheric pathway. Thus, mercury contamination is not only a local issue, but also has global dimensions. This book summarizes, for the first time, work from Europe, Russia and the American continent. Review chapters are supplemented by detailed, international case studies.
In this book, Liz Turner argues that survey methods have gained an
unwarranted and unhealthy level of dominance when it comes to
understanding how the public views the criminal justice system. The
focus on measuring public confidence in criminal justice by
researchers, politicians and criminal justice agencies has tended
to prioritise the production of quantitative representations of
general opinions, at the expense of more specific, qualitative or
deliberative approaches. This has occurred not due to any inherent
methodological superiority of survey-based approaches, but due to
the congruence of the survey-based, general measure of opinion with
the prevailing neoliberal political tendency to engage with
citizens as consumers. By identifying the historical conditions on
which contemporary knowledge claims rest, and tracing the political
power struggles out of which sprang the idea of public confidence
in criminal justice as a real and measurable object, Turner shows
that things could be otherwise. She also draws attention to the
ways in which survey researchers have asserted their dominance over
other approaches, suppressing convincing claims by advocates of
deliberative methods that a better politics of crime and justice is
possible. Ultimately, Turner concludes, researchers need to be more
upfront about their political objectives, and more alert to the
political responsibilities that go along with the making of
knowledge claims. Providing a provocative critique of the dominant
approaches to measuring public confidence, this timely study will
be of special interest to scholars of the criminal justice system,
research methods, and British politics.
Awarded the 2009 Olle Prize! The field of coordination polymer
research has undergone rapid expansion in recent years. No longer
are these materials the vaguely defined 'insoluble material' at the
bottom of your vessel that spell death for your reaction. They have
gone from 'polymeric rubbish' to 'materials of the future'. Great
leaps in the deliberate design of coordination polymers were made
in the 1990s. These were allied with similar advances in related
areas such as organic crystal engineering, metallosupramolecular
chemistry and X-ray diffraction. No longer did we assemble things
atom by atom. Whole molecules were used as building blocks and new
materials were made. This is the first book to provide a broad
overview of all the major facets of coordination polymer research
in one place. It combines chapters on nets and interpenetration
with wide-ranging surveys of transition metal and lanthanoid
coordination polymers and their properties. The aim is to provide a
flavour of each aspect whilst introducing the important concepts
and developments using carefully selected examples. After an
introduction, the text is split into three sections: Design (nets,
interpenetration, malleability); Analysis (transition metal
coordination polymers, lanthanoid coordination polymers,
organometallic networks, organic-inorganic hybrids); and
Application (magnetic properties, porosity, acentric and chiral
networks, reactive coordination polymers, other properties).
Written in the
style of a tutorial review, the book is suitable for both senior
specialists and new postgraduate students taking their first steps
in the field. It also provides an authoritative and detailed
reference source.
Within the past ten years, the discussion of the nature of folk
psychology and its role in explaining behavior and thought has
become central to the philosophy of mind. However, no comprehensive
account of the contemporary debate or collection of the works that
make up this debate has yet been available. Intending to fill this
gap, this volume begins with the crucial background for the
contemporary debate and proceeds with a broad range of responses to
and developments of these works -- from those who argue that "folk
theory" is a misnomer to those who regard folk theory as
legitimately explanatory and necessary for any adequate account of
human behavior. Intended for courses in the philosophy of mind,
psychology, and science, as well as anthropology and social
psychology, this anthology is also of great value in courses
focusing on folk models, eliminative materialism, explanation,
psychological theory, and -- in particular -- intentional
psychology. It is accessible to both graduate students and
upper-division undergraduate students of philosophy and psychology
as well as researchers. As an aid to students, a thorough
discussion of the field and the articles in the anthology is
provided in the introduction; as an aid to researchers, a complete
bibliography is also provided.
Someday computers will be artists. They'll be able to write amusing
and original stories, invent and play games of unsurpassed
complexity and inventiveness, tell jokes and suffer writer's block.
But these things will require computers that can both achieve
artistic goals and be creative. Both capabilities are far from
accomplished. This book presents a theory of creativity that
addresses some of the many hard problems which must be solved to
build a creative computer. It also presents an exploration of the
kinds of goals and plans needed to write simple short stories.
These theories have been implemented in a computer program called
MINSTREL which tells stories about King Arthur and his knights.
While far from being the silicon author of the future, MINSTREL
does illuminate many of the interesting and difficult issues
involved in constructing a creative computer. The results presented
here should be of interest to at least three different groups of
people. Artificial intelligence researchers should find this work
an interesting application of symbolic AI to the problems of
story-telling and creativity. Psychologists interested in
creativity and imagination should benefit from the attempt to build
a detailed, explicit model of the creative process. Finally,
authors and others interested in how people write should find
MINSTREL's model of the author-level writing process
thought-provoking.
Each week of this three-week meeting was a self-contained event,
although each had the same underlying theme: the effect of
parallel processing on numerical analysis. Each week provided the
opportunity for intensive study to broaden participants' research
interests or deepen their understanding of topics of which they
already had some knowledge. There was also the opportunity for
continuing individual research in the stimulating environment
created by the presence of several experts of international
stature. This volume contains lecture notes for most of the major
courses of lectures presented at the meeting; they cover topics in
parallel algorithms for large sparse linear systems and
optimization, an introductory survey of level-index arithmetic and
superconvergence in the finite element method.
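The first of the lecture topics, parallel algorithms for large sparse linear systems, can be hinted at with a minimal sketch (illustrative only, not drawn from the lectures themselves): the Jacobi iteration, whose component updates are mutually independent and therefore naturally parallel. A tiny dense system stands in here for a large sparse one:

```python
import numpy as np

def jacobi(A, b, iters=200):
    """Jacobi iteration for A x = b. Every component of the new
    iterate depends only on the previous iterate, so all components
    can be updated simultaneously -- a classic example of a
    naturally parallel linear solver."""
    D = np.diag(A)               # diagonal entries of A
    R = A - np.diagflat(D)       # off-diagonal remainder
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = (b - R @ x) / D      # each entry computed independently
    return x

# Small diagonally dominant system (convergence is guaranteed there);
# a real application would use a large sparse matrix instead.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = jacobi(A, b)
```

On a parallel machine each of the n component updates can be assigned to a separate processor, which is precisely the kind of restructuring of classical numerical analysis the meeting was concerned with.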
This practical, thorough, and concise pocketbook is the perfect
companion to the clinical skills needed for life on the wards. It
covers all the essential elements that lie at the heart of medical
practice in which students must prove their competence, and lays
the foundations needed for the rest of their medical career.
Part One covers history taking, examination and communication;
and Part Two provides an overview of key practical procedures and
diagnostic skills, all of which are typically examined via
Objective Structured Clinical Examinations (OSCEs) or other
clinical case format examinations. The coverage of examination
skills alongside practical procedures and explanations of typical
tests and investigations make this pocketbook invaluable for
students new to clinical medicine.
The authors are specialists in teaching clinical skills from
both a medical and surgical perspective, and are perfectly placed
to cover these cornerstones of medical practice.
Current understanding of neurological disease has been evolving
over the past 150 years. With the increasing and earlier
sub-specialization of neurology trainees, and their variable
exposure to higher academic study, there is little opportunity to
put this development into a historical context as a whole.
Understanding the 'evidence-base', or appreciating the lack of it
in some cases, is an important part of training but this is rarely
presented in a palatable, entertaining form. Part of the Landmark
Papers series, this book brings together the ten most important
papers for each sub-speciality within neurology, covering the full
range of major neurological conditions. Papers have been selected
by leading international experts, who not only summarize what each
paper showed, but place them into a wider context that makes a
coherent story of how their sub-speciality has developed.