Setting forth the state of the art, leading researchers present a
survey on the fast-developing field of Connectionist
Psycholinguistics: using connectionist or "neural" networks, which
are inspired by brain architecture, to model empirical data on
human language processing. Connectionist psycholinguistics has
already had a substantial impact on the study of a wide range of
aspects of language processing, ranging from inflectional
morphology, to word recognition, to parsing and language
production. Christiansen and Chater begin with an extended tutorial
overview of Connectionist Psycholinguistics, which is followed by
the latest research by leading figures in each area of research.
The book also focuses on the implications and prospects for
connectionist models of language, not just for psycholinguistics,
but also for computational and linguistic perspectives on natural
language. The interdisciplinary approach will be relevant for, and
accessible to, psychologists, cognitive scientists, linguists,
philosophers, and researchers in artificial intelligence.
Are people rational? This question was central to Greek thought,
and has been at the heart of psychology and philosophy for
millennia. This book provides a radical and controversial
reappraisal of conventional wisdom in the psychology of reasoning,
proposing that the Western conception of the mind as a logical
system is flawed at the very outset. It argues that cognition
should be understood in terms of probability theory, the calculus
of uncertain reasoning, rather than in terms of logic, the calculus
of certain reasoning.
This book brings together an influential sequence of papers that
argue for a radical re-conceptualisation of the psychology of
inference, and of cognitive science more generally. The papers
demonstrate that the thesis that logic provides the basis of human
inference is central to much cognitive science, although the
commitment to this view is often implicit. They then note that
almost all human inference is uncertain, whereas logic is the
calculus of certain inference. This mismatch means that logic is
not the appropriate model for human thought. Oaksford and Chater's
argument draws on research in computer science, artificial
intelligence and philosophy of science, in addition to experimental
psychology. The authors propose that probability theory, the
calculus of uncertain inference, provides a more appropriate model
for human thought. They show how a probabilistic account can
provide detailed explanations of experimental data on Wason's
selection task, which many have viewed as providing a paradigmatic
demonstration of human irrationality. Oaksford and Chater show that
people's behaviour appears irrational only from a logical point of
view, whereas it is entirely rational from a probabilistic
perspective. The shift to a probabilistic framework for human
inference has significant implications for the psychology of
reasoning, cognitive science more generally, and for our picture of
ourselves as rational agents.
The conditional, if...then, is probably the most important term in
natural language and forms the core of systems of logic and mental
representation. It occurs in all human languages and allows people
to express their knowledge of the causal or law-like structure of
the world and of others' behaviour, e.g., if you turn the key the
car starts, if John walks the dog he stops for a pint of beer; to
make promises, e.g., if you cook tonight, I'll wash up all week; to
regulate behaviour, e.g., if you are drinking beer, you must be
over 18 years of age; to suggest what would have happened had
things been different, e.g., if the match had been dry it would
have lit, among many other possible uses. The way in which the
conditional is modelled also determines the core of most logical
systems. Unsurprisingly, it is also the most researched expression
in the psychology of human reasoning.
Cognition and Conditionals is the first volume in over 20 years
(since On Conditionals, CUP, 1986) to bring together recent
developments in the cognitive science and psychology of conditional
reasoning. Over the last 10 to 15 years, research on conditionals
has come to dominate the psychology of reasoning, providing a rich
seam of results that have created new theoretical possibilities.
This book shows how these developments have led researchers to view
people's conditional reasoning behaviour as successful
probabilistic reasoning rather than as error-prone logical reasoning.
It shows how the multifarious, and apparently competing,
theoretical positions developed over the last 50 years in this area
- mental logics, mental models, heuristic approaches, dual process
theory, and probabilistic approaches - have responded to these
insights. Its organisation reflects the view that an integrative
approach is emerging that may need to exploit aspects of all these
theoretical positions to explain the rich and complex phenomenon of
reasoning with conditionals. It includes an introductory chapter
relating the development of the psychology of reasoning to
developments in the logic and semantics of the conditional. It also
includes chapters by many of the leading figures in this field.
Cognition and Conditionals will be a valuable resource for
cognitive scientists, psychologists, and philosophers interested in
how people actually reason with conditionals.
Are people rational? This question was central to Greek thought,
and has been at the heart of psychology, philosophy, rational
choice in social sciences, and probabilistic approaches to
artificial intelligence. This book provides a radical re-appraisal
of conventional wisdom in the psychology of reasoning.
For almost two and a half thousand years, the Western conception
of what it is to be a human being has been dominated by the idea
that the mind is the seat of reason - humans are, almost by
definition, the rational animal. From Aristotle to the present day,
rationality has been explained by comparison to systems of logic,
which distinguish valid (i.e., rationally justified) from invalid
arguments. Within psychology and cognitive science, such a logicist
conception of the mind was adopted wholeheartedly from Piaget
onwards. At the same time as the logicist program in cognition was
being constructed, other researchers found that people appeared
surprisingly and systematically illogical in some experiments.
Proposals within the logicist paradigm suggested that these were
mere performance errors, although in some reasoning tasks as few as
5% of people's responses were logically correct.
In this book a more radical suggestion for explaining these
puzzling aspects of human reasoning is put forward: the Western
conception of the mind as a logical system is flawed at the very
outset. The human mind is primarily concerned with practical action
in the face of a profoundly complex and uncertain world. Oaksford
and Chater argue that cognition should be understood in terms of
probability theory, the calculus of uncertain reasoning, rather
than in terms of logic, the calculus of certain reasoning. Thus, the
logical mind should be replaced by the probabilistic mind - people
may possess not logical rationality, but Bayesian rationality.
The rational analysis method, first proposed by John R. Anderson,
has been enormously influential in helping us understand high-level
cognitive processes.
The Probabilistic Mind is a follow-up to the influential and highly
cited 'Rational Models of Cognition' (OUP, 1998). It brings
together developments in understanding how, and how far, high-level
cognitive processes can be understood in rational terms, and
particularly using probabilistic Bayesian methods. It synthesizes
and evaluates the progress in the past decade, taking into account
developments in Bayesian statistics, statistical analysis of the
cognitive 'environment' and a variety of theoretical and
experimental lines of research. The scope of the book is broad,
covering important recent work in reasoning, decision making,
categorization, and memory. Including chapters from many of the
leading figures in this field,
The Probabilistic Mind will be valuable for psychologists and
philosophers interested in cognition.
A radical reinterpretation of how your mind works - and why it
could change your life.
'An astonishing achievement. Nick Chater has blown my mind' - Tim Harford
'A total assault on all lingering psychiatric and psychoanalytic
notions of mental depths ... Light the touchpaper and stand well
back' - New Scientist
We all like to
think we have a hidden inner life. Most of us assume that our
beliefs and desires arise from the murky depths of our minds, and,
if only we could work out how to access this mysterious world, we
could truly understand ourselves. For more than a century,
psychologists and psychiatrists have struggled to discover what
lies below our mental surface. In The Mind Is Flat, pre-eminent
behavioural scientist Nick Chater reveals that this entire
enterprise is utterly misguided. Drawing on startling new research
in neuroscience, behavioural psychology and perception, he shows
that we have no hidden depths to plumb, and unconscious thought is
a myth. Instead, we generate our ideas, motives and thoughts in the
moment. This revelation explains many of the quirks of human
behaviour - for example why our supposedly firm political beliefs,
personal preferences and even our romantic attractions are
routinely proven to be inconsistent and changeable. As the reader
discovers, through mind-bending visual examples and
counterintuitive experiments, we are all characters of our own
creation, constantly improvising our behaviour based on our past
experiences. And, as Chater shows us, recognising this can be
liberating.
This interdisciplinary new work explores one of the central
theoretical problems in linguistics: learnability. The authors,
from different backgrounds - linguistics, philosophy, computer
science, psychology, and cognitive science - explore the idea that
language acquisition proceeds through general purpose learning
mechanisms, an approach that is broadly empiricist both
methodologically and psychologically. For many years, the
empiricist approach has been taken to be unfeasible on practical
and theoretical grounds. In the book, the authors present a variety
of precisely specified mathematical and computational results that
show that empiricist approaches can form a viable solution to the
problem of language acquisition. The book assumes limited technical
background and explains the fundamental principles of probability,
grammatical description and learning theory in an accessible and
non-technical way. Different chapters address the problem of
language acquisition under different assumptions: examining the
methodology of linguistic analysis using simplicity-based criteria,
running computational experiments on real corpora, applying
theoretical results from probabilistic learning theory, and
investigating the computational problems involved in learning
richly structured grammars. Written by four researchers in the full
range of relevant
fields: linguistics (John Goldsmith), psychology (Nick Chater),
computer science (Alex Clark), and cognitive science (Amy Perfors),
the book sheds light on the central problems of learnability and
language, and traces their implications for key questions of
theoretical linguistics and the study of language acquisition.
Imitation is not the low-level, cognitively undemanding behavior it
is often assumed to be, but rather--along with language and the
ability to understand other minds--one of a trio of related
capacities that are fundamental to human mentality. In these
landmark volumes, leading researchers across a range of disciplines
provide a state-of-the-art view of imitation, integrating the
latest findings and theories with reviews of seminal work, and
revealing why imitation is a topic of such intense current
scientific interest. Perspectives are drawn from neuroscience and
brain imaging, animal and developmental psychology, primatology,
ethology, philosophy, anthropology, media studies, economics,
sociology, education, and law. These volumes provide a resource
that makes this research accessible across disciplines and
clarifies its importance for the social sciences and philosophy as
well as for the cognitive sciences. As a further aid to
cross-fertilization, each volume includes extensive
interdisciplinary commentary and discussion. The first volume
considers possible mechanisms of imitation, including discussion of
mirror systems, ideomotor and common coding theories, and the
possibility of "shared circuits" for control, imitation, and
simulation, and then takes up imitation in animals, with
illuminating comparisons to human imitation. The second volume
focuses first on the roles of imitation in human development and in
learning to understand the minds of others, and then on the broader
social and cultural roles and functions of imitation, including
discussions of meme theory and cultural evolution, and of the
pervasive imitative tendencies of normal adults and their relevance
for understanding the effects of the media on human behavior.
A work that reveals the profound links between the evolution,
acquisition, and processing of language, and proposes a new
integrative framework for the language sciences. Language is a
hallmark of the human species; the flexibility and unbounded
expressivity of our linguistic abilities is unique in the
biological world. In this book, Morten Christiansen and Nick Chater
argue that to understand this astonishing phenomenon, we must
consider how language is created: moment by moment, in the
generation and understanding of individual utterances; year by
year, as new language learners acquire language skills; and
generation by generation, as languages change, split, and fuse
through the processes of cultural evolution. Christiansen and
Chater propose a revolutionary new framework for understanding the
evolution, acquisition, and processing of language, offering an
integrated theory of how language creation is intertwined across
these multiple timescales. Christiansen and Chater argue that
mainstream generative approaches to language do not provide
compelling accounts of language evolution, acquisition, and
processing. Their own account draws on important developments from
across the language sciences, including statistical natural
language processing, learnability theory, computational modeling,
and psycholinguistic experiments with children and adults.
Christiansen and Chater also consider some of the major
implications of their theoretical approach for our understanding of
how language works, offering alternative accounts of specific
aspects of language, including the structure of the vocabulary, the
importance of experience in language processing, and the nature of
recursive linguistic structure.