This volume of new work by prominent phonologists goes to the heart of current debates in phonological and linguistic theory: should the explanation of phonological variety be constraint-based or rule-based, and, in the light of the resolution of this question, how does phonology in the mind interface with other components of the grammar? The book includes contributions from leading proponents of both sides of the argument and an extensive introduction setting out the history, nature, and more general linguistic implications of current phonological theory.
This book fills a long standing need for a basic introduction to Cognitive Grammar that is current, authoritative, comprehensive, and approachable. It presents a synthesis that draws together and refines the descriptive and theoretical notions developed in this framework over the course of three decades. In a unified manner, it accommodates both the conceptual and the social-interactive basis of linguistic structure, as well as the need for both functional explanation and explicit structural description. Starting with the fundamentals, essential aspects of the theory are systematically laid out with concrete illustrations and careful discussion of their rationale. Among the topics surveyed are conceptual semantics, grammatical classes, grammatical constructions, the lexicon-grammar continuum characterized as assemblies of symbolic structures (form-meaning pairings), and the usage-based account of productivity, restrictions, and well-formedness. The theory's central claim - that grammar is inherently meaningful - is thereby shown to be viable. The framework is further elucidated through application to nominal structure, clause structure, and complex sentences. These are examined in broad perspective, with exemplification from English and numerous other languages. In line with the theory's general principles, they are discussed not only in terms of their structural characterization, but also their conceptual value and functional motivation. Other matters explored include discourse, the temporal dimension of language structure, and what grammar reveals about cognitive processes and the construction of our mental world.
A recurrent issue in linguistic theory and psychology concerns the
cognitive status of memorized lists and their internal structure.
In morphological theory, the collections of inflected forms of a
given noun, verb, or adjective into inflectional paradigms are
thought to constitute one such type of list. This book focuses on
the question of which elements in a paradigm can stand in a
relation of partial or total phonological identity. Leading
scholars consider inflectional identity from a variety of
theoretical perspectives, with an emphasis on both case studies and
predictive theories of where syncretism and other "paradigmatic
pressures" will occur in natural language. The authors consider
phenomena such as allomorphy and syncretism while exploring
questions of underlying representations, the formal properties of
markedness, and the featural representation of conjugation and
declension classes. They do so from the perspective of contemporary
theories of morphology and phonology, including Distributed
Morphology and Optimality Theory, and in the context of a wide
range of languages, among them Amharic, Greek, Romanian, Russian,
Saami, and Yiddish. The subjects addressed in the book include the
role of featural decomposition of morphosyntactic features, the
status of paradigms as the unit of syncretism, asymmetric effects
in identity-dependence, and the selection of a base-of-derivation.
This is the first comprehensive account of the segmental phonology
of Hungarian in English. Part I introduces the general features of
the language. Part II examines its vowel and consonant systems, and
its phonotactics (syllable structure constraints, transsyllabic
constraints, and morpheme structure constraints). Part III
describes the phonological processes that vowels, consonants, and
syllables undergo and/or trigger. The authors provide a new
analysis of vowel harmony as well as discussions of vowel length
alternations, palatalization, voice assimilation, and processes
targeting nasals and liquids. The final chapters cover processes
conditioned by syllable structure, and briefly describe a selection
of surface phenomena.
This is the first exhaustive investigation of gradience in syntax,
conceived of as grammatical indeterminacy. It looks at gradience in
English word classes, phrases, clauses and constructions, and
examines how it may be defined and differentiated. Professor Aarts
addresses the tension between linguistic concepts and the
continuous phenomena they describe by testing and categorizing
grammatical vagueness and indeterminacy. He considers to what
extent gradience is a grammatical phenomenon or a by-product of
imperfect linguistic description, and makes a series of linked
proposals for its theoretical formalization.
Teleosemantics seeks to explain meaning and other intentional phenomena in terms of their function in the life of the species. This volume of new essays from an impressive line-up of well-known contributors offers a valuable summary of the current state of the teleosemantics debate.
Statistics for Applied Behavior Analysis Practitioners and Researchers provides practical and useful content for individuals who work directly with, or supervise those who work directly with, individuals with ASD. This book introduces core concepts and principles of modern statistical analysis that practitioners will need to deliver ABA services. The organization of the book works through the flow of behavior analytic service provision, aiming to help practitioners read through research, evaluate intervention options, incorporate statistics in their analysis of time-series intervention and assessment data, and effectively communicate assessment and intervention effects using statistics. As professionals who provide applied behavior analysis (ABA) services are required to use evidence-based practices and make data-based decisions regarding assessments and interventions, this book will help them take a modern, scientific approach to derive knowledge and make decisions based on statistical literacy.
How a computational framework can account for the successes and failures of human cognition. At the heart of human intelligence rests a fundamental puzzle: How are we incredibly smart and stupid at the same time? No existing machine can match the power and flexibility of human perception, language, and reasoning. Yet, we routinely commit errors that reveal the failures of our thought processes. What Makes Us Smart makes sense of this paradox by arguing that our cognitive errors are not haphazard. Rather, they are the inevitable consequences of a brain optimized for efficient inference and decision making within the constraints of time, energy, and memory; in other words, data and resource limitations. Framing human intelligence in terms of these constraints, Samuel Gershman shows how a deeper computational logic underpins the "stupid" errors of human cognition. Embarking on a journey across psychology, neuroscience, computer science, linguistics, and economics, Gershman presents unifying principles that govern human intelligence. First, inductive bias: any system that makes inferences based on limited data must constrain its hypotheses in some way before observing data. Second, approximation bias: any system that makes inferences and decisions with limited resources must make approximations. Applying these principles to a range of computational errors made by humans, Gershman demonstrates that intelligent systems designed to meet these constraints yield characteristically human errors. Examining how humans make intelligent and maladaptive decisions, What Makes Us Smart delves into the successes and failures of cognition.
From self-help to medication, therapy, and cognitive neuroscience, this book traces the uses and limits of psychology. Offering a systematic exploration of the ways in which psychology is used in contemporary society, it refines our understanding of the extent of the field. In addition to conceptual analysis of how science, truth, biology, mind, and meaning intersect and interact in the mind sciences, A Suspicious Science draws from history and anthropology to articulate an interdisciplinary multi-level form of psychology that may serve to orient the field. The book synthesizes debates in psychology and philosophy concerning methodology and the nature of explanation with debates about its practical context as a human science. Ultimately, it suggests psychology provides us myths and rituals that ground a particular sense of meaning and motivation in our lives. By aligning cultural, emotional, and philosophical uses of psychology, this book clarifies a synoptic, humanistic model of the mind within the human sciences.
In this important and pioneering book Frederick Newmeyer takes on the question of language variety. He considers why some language types are impossible and why some grammatical features are more common than others. The task of trying to explain typological variation among languages has been mainly undertaken by functionally-oriented linguists. Generative grammarians entering the field of typology in the 1980s put forward the idea that cross-linguistic differences could be explained by linguistic parameters within Universal Grammar, whose operation might vary from language to language. Unfortunately, this way of looking at variation turned out to be much less successful than had been hoped for. Professor Newmeyer's alternative to parameters combines leading ideas from functionalist and formalist approaches which in the past have been considered incompatible. He throws fresh light on language typology and variation, and provides new insights into the principles of Universal Grammar. The book is written in a clear, readable style and will be readily understood by anyone with a couple of years' study of linguistics. It will interest a wide range of scholars and students of language, including typologists, historical linguists, and theorists of every shade.
"The question for me is how can the human mind occur in the
physical universe. We now know that the world is governed by
physics. We now understand the way biology nestles comfortably
within that. The issue is how will the mind do that as
well."--Allen Newell, December 4, 1991, Carnegie Mellon University
The notions of 'function', 'feature' and 'functional feature' are associated with relatively new developments and insights in several areas of cognition. This book brings together different definitions, insights and research related to defining these notions from such diverse areas as language, perception, categorization and development. Each of the contributors in this book explicitly defines the notion of 'function', 'feature' or 'functional feature' within their own theoretical framework, presents research in which such a notion plays a pivotal role, and discusses the contribution of functional features in relation to their insights in a particular area of cognition. As such, this book not only presents new developments devoted to defining 'function', 'feature' and 'functional feature' in several sub-disciplines of cognitive science, but also offers a focused account of how these notions operate within the cognitive interface linking language and spatial representation. All book chapters are accessible for the interested novice, and offer the specialized researcher new empirical and theoretical insights into defining function, both with respect to the language and space interface and across cognition. The introduction to the book presents the reader with the main issues and viewpoints that are discussed in more detail in each of the book chapters.
This cutting-edge book offers a theoretical account of the evolution of multiple memory systems of the brain. The authors conceptualize these memory systems from both behavioural and neurobiological perspectives, guided by three related principles. First, that our understanding of a wide range of memory phenomena can be advanced by breaking down memory into multiple forms with different operating characteristics. Second, that different forms of memory representation are supported by distinct brain pathways with circuitry and neural coding properties. Third, that the contributions of different brain systems can be compared and contrasted by distinguishing between dedicated (or specific) and elaborate (or general) memory systems. A primary goal of this work is to relate the neurobiological properties of dedicated and elaborate systems to their neuropsychological counterparts, and in so doing, account for the phenomenology of memory, from conditioning to conscious recollection.
Adrian Johnston and Catherine Malabou defy theoretical humanities' deeply-entrenched resistance to engagements with the life sciences. Rather than treat biology and its branches as hopelessly reductive and politically suspect, they view recent advances in neurobiology and its adjacent scientific fields as providing crucial catalysts to a radical rethinking of subjectivity. Merging three distinct disciplines -- European philosophy from Descartes to the present, Freudian-Lacanian psychoanalysis, and affective neuroscience -- Johnston and Malabou triangulate the emotional life of affective subjects as conceptualized in philosophy and psychoanalysis with neuroscience. Their experiments yield different outcomes. Johnston finds psychoanalysis and neurobiology have the potential to enrich each other, though affective neuroscience demands a reconsideration of whether affects can be unconscious. Investigating this vexed issue has profound implications for theoretical and practical analysis, as well as philosophical understandings of the emotions. Malabou believes scientific explorations of the brain seriously problematize established notions of affective subjectivity in Continental philosophy and Freudian-Lacanian analysis. She confronts philosophy and psychoanalysis with something neither field has seriously considered: the concept of wonder and the cold, disturbing visage of those who have been affected by disease or injury, such that they are no longer affected emotionally. At stake in this exchange are some of philosophy's most important claims concerning the relationship between the subjective mind and the objective body, the structures and dynamics of the unconscious dimensions of mental life, the role emotion plays in making us human, and the functional differences between philosophy and science.
More than one third of the human brain is devoted to the processes of seeing - vision is after all the main way in which we gather information about the world. But human vision is a dynamic process during which the eyes continually sample the environment. Where most books on vision consider it as a passive activity, this book is unique in focusing on vision as an 'active' process. It goes beyond most accounts of vision where the focus is on seeing, to provide an integrated account of seeing AND looking. The book starts by pointing out the weaknesses in our traditional approaches to vision and the reason we need this new approach. It then gives a thorough description of basic details of the visual and oculomotor systems necessary to understand active vision. The book goes on to show how this approach can give a new perspective on visual attention, and how the approach has progressed in the areas of visual orienting, reading, visual search, scene perception and neuropsychology. Finally, the book summarises progress by showing how this approach sheds new light on the old problem of how we maintain perception of a stable visual world. Written by two leading vision scientists, this book will be valuable for vision researchers and psychology students, from undergraduate level upwards.
Leading philosophers and psychologists join forces to investigate a set of problems to do with agency and self-awareness, in eighteen specially written essays. In recent years there has been much psychological and neurological work purporting to show that consciousness and self-awareness play no role in causing actions, and indeed to demonstrate that free will is an illusion. The essays in this volume subject the assumptions that motivate such claims to sustained interdisciplinary scrutiny. The book will be compulsory reading for psychologists and philosophers working on action explanation, and for anyone interested in the relation between the brain sciences and consciousness.
It is a simple observation that children make mistakes when they learn a language. Yet, to the trained eye, these mistakes are far from random; in fact, they closely resemble perfectly grammatical utterances by adults - who speak other languages. This type of error analysis suggests a novel view of language learning: children are born with a fixed set of hypotheses about language - Chomsky's Universal Grammar - and these hypotheses compete to match the child's ambient language in a Darwinian fashion. The book presents evidence for this perspective from the study of children's words and grammar, and how language changes over time.
Reason and Nature investigates the norms of reason--the standards which contribute to determining whether beliefs, inferences, and actions are rational. Nine philosophers and two psychologists discuss what kinds of things these norms are, how they can be situated within the natural world, and what role they play in the psychological explanation of belief and action. Current work in the theory of rationality is subject to very diverse influences ranging from experimental and theoretical psychology, through philosophy of logic and language, to metaethics and the theory of practical reasoning; this range is well represented here.
A new speculative ontology of aesthetics. In Aesthesis and Perceptronium, Alexander Wilson presents a theory of materialist and posthumanist aesthetics founded on an original speculative ontology that addresses the interconnections of experience, cognition, organism, and matter. Entering the active fields of contemporary thought known as the new materialisms and realisms, Wilson argues for a rigorous redefining of the criteria that allow us to discriminate between those materials and objects where aesthesis (perception, cognition) takes place and those where it doesn't. Aesthesis and Perceptronium negotiates between indiscriminately pluralist views that attribute mentation to all things and eliminative views that deny the existence of mentation even in humans. By recasting aesthetic questions within the framework of "epistemaesthetics," which considers cognition and aesthetics as belonging to a single category that can neither be fully disentangled nor fully reduced to either of its terms, Wilson forges a theory of nonhuman experience that avoids this untenable dilemma. Through a novel consideration of the evolutionary origins of cognition and its extension in technological developments, the investigation culminates in a rigorous reevaluation of the status of matter, information, computation, causality, and time in terms of their logical and causal engagement with the activities of human and nonhuman agents.
What explains our ability to refer to the objects we perceive? John
Campbell argues that our capacity for reference is explained by our
capacity to attend selectively to the objects of which we are
aware; that this capacity for conscious attention to a perceived
object is what provides us with our knowledge of reference. When
someone makes a reference to a perceived object, your knowledge of
which thing they are talking about is constituted by your
consciously attending to the relevant object. Campbell articulates
the connections between these three concepts: reference, attention,
and consciousness. He looks at the metaphysical conception of the
environment demanded by such an account, and at the demands imposed
on our conception of consciousness by the point that consciousness
of objects is what explains our capacity to think about them. He
argues that empirical work on the binding problem can illuminate
our grasp of the way in which we have knowledge of reference,
supplied by conscious attention to the relevant object.
John Campbell investigates how consciousness of the world explains our ability to think about the world. So your ability to think about objects you can see depends on your capacity for conscious visual attention to those things. Reference and Consciousness illuminates classical problems about thought, reference, and experience by looking at the underlying psychological mechanisms on which conscious attention depends. It is an original and stimulating contribution to philosophy and to cognitive science.
The essence of religion was once widely thought to be a unique form of experience that could not be explained in neurological, psychological, or sociological terms. In recent decades scholars have questioned the privileging of the idea of religious experience in the study of religion, an approach that effectively isolated the study of religion from the social and natural sciences. "Religious Experience Reconsidered" lays out a framework for research into religious phenomena that reclaims experience as a central concept while bridging the divide between religious studies and the sciences. Ann Taves shifts the focus from "religious experience," conceived as a fixed and stable thing, to an examination of the processes by which people attribute meaning to their experiences. She proposes a new approach that unites the study of religion with fields as diverse as neuroscience, anthropology, sociology, and psychology to better understand how these processes are incorporated into the broader cultural formations we think of as religious or spiritual. Taves addresses a series of key questions: how can we set up studies without obscuring contestations over meaning and value? What is the relationship between experience and consciousness? How can research into consciousness help us access and interpret the experiences of others? Why do people individually or collectively explain their experiences in religious terms? How can we set up studies that allow us to compare experiences across times and cultures? "Religious Experience Reconsidered" demonstrates how methods from the sciences can be combined with those from the humanities to advance a naturalistic understanding of the experiences that people deem religious.
This book provides both a review of the literature and a theoretical framework for understanding the development of visual attention from infancy through early childhood. Taking a functional approach to the topic, the authors discuss the development of the selective and state-related aspects of attention, as well as the emergence of higher-level controls. They also explore the individual differences in these facets of attention, and consider the possible origins of early deficits in attention, which has obvious implications for children with developmental disorders such as attention-deficit hyperactive disorder. These findings will be invaluable to developmental, cognitive, and clinical psychologists and psychiatrists.