What function is played by concepts related to meaning and mereology in the cognitive sciences? How can one outline adequate simulation models for the continuous emergence of new categorization forms characterizing cognitive processes? To what extent is it possible to define new measures of meaningful complexity? How is it possible to describe synergetically the symbolic dynamics inherent in cognitive processes? What types of formal models should we use to describe the evolutionary aspects of cognition? These, the most difficult questions in cognitive science, are discussed here in 15 original essays written by distinguished scholars drawn from a variety of disciplines. The first part presents an up-to-date account of some current trends in the functional modeling of cognitive activities; synergetic models in particular are widely discussed. The second part focuses on recent advances in complexity theory, self-organizing networks, the theory of self-reference, and the theory of self-organization. The third part turns to some methods of formal analysis that can be used to delineate consistent simulation models of cognitive activities at the perceptual, linguistic and computational levels. The volume also highlights traditional philosophical concerns about realism, the cognitivism-connectionism debate, and the role of holistic assumptions. Readership: psychologists, cognitive scientists, theorists working in complexity theory, self-organization theory, synergetics, semantics of natural language and mereology, philosophers, epistemologists.
Many metaphors go beyond pointing to the existing similarities between two objects -- they create the similarities. Such metaphors, which have been relegated to the back seat in most cognitive science research, are the focus of attention in this study, which addresses the creation of similarity within an elaborately laid out interactive framework of cognition. Starting from the constructivist views of Nelson Goodman and Jean Piaget, this framework resolves an apparent paradox in interactionism: how can reality not have a mind-independent ontology and structure, but still manage to constrain the possible worlds a cognitive agent can create in it? A comprehensive theory of metaphor is proposed in this framework that explains how metaphors can create similarities, and why such metaphors are an invaluable asset to cognition. The framework is then applied to related issues of analogical reasoning, induction, and computational modeling of creative metaphors.
Considerable evidence indicates that religion is a motivational force in the lives of most of the world's population, and recent social and political events have placed religion center stage. Motivation is considered an essential component of any adequate answer to the question, 'Why religion?'. That question concerned early psychologists, such as Freud and James, but was relatively neglected with the ascendancy of behaviorism. It has since regained momentum as an important area of research and scholarship. Although motivational principles are implicit in many analytical treatments of religion, and some articles and book chapters discuss motivation and religion, this literature is widely dispersed and confined primarily to Judeo-Christian world views. This volume of the "Advances" series presents a systematic approach to the topic, as viewed through the lens of such contemporary theories of motivation as expectancy-value, self-determination, and achievement goal theory. An international group of scholars offers a comprehensive view of how such theories help to understand religiosity and its impact on human experiences and behavior. In addition, authors consider the implications of religious experiences and behavior for motivation theory. Separately, these contributions provide unique perspectives. Collectively, they represent the prominent theoretical approaches to motivation, include the world's dominant religions, and address a wide variety of significant issues related to this subject.
This book, by leading scholars, represents some of the main work in progress in biolinguistics. It offers fresh perspectives on language evolution and variation, new developments in theoretical linguistics, and insights on the relations between variation in language and variation in biology. The authors address the Darwinian questions on the origin and evolution of language from a minimalist perspective, and provide elegant solutions to the evolutionary gap between human language and communication in all other organisms. They consider language variation in the context of current biological approaches to species diversity - the 'evo-devo revolution' - which bring to light deep homologies between organisms. In dispensing with the classical notion of syntactic parameters, the authors argue that language variation, like biodiversity, is the result of experience and thus not a part of the language faculty in the narrow sense. They also examine the nature of this core language faculty, the primary categories with which it is concerned, the operations it performs, the syntactic constraints it poses on semantic interpretation and the role of phases in bridging the gap between brain and syntax. Written in language accessible to a wide audience, The Biolinguistic Enterprise will appeal to scholars and students of linguistics, cognitive science, biology, and natural language processing.
The conscious mind is life as we experience it; we see the world, feel our emotions and think our thoughts thanks to consciousness. This book provides an easy introduction to the foundations of consciousness: how can subjective consciousness be measured scientifically? What happens to the conscious mind and self when the brain gets injured? How does consciousness, our subjective self or soul, arise from the activities of the brain? Addressing the philosophical and historical roots of the problems alongside current scientific approaches to consciousness in psychology and neuroscience, Foundations of Consciousness examines key questions as well as delving deeper to look at altered and higher states of consciousness. Using student-friendly pedagogy throughout, the book discusses some of the most difficult to explain phenomena of consciousness, including dreaming, hypnosis, out-of-body experiences, and mystical experiences. Foundations of Consciousness provides an essential introduction to the scientific and philosophical approaches to consciousness for students in psychology, neuroscience, cognitive science, and philosophy. It will also appeal to those interested in the nature of the human soul, giving an insight into the motivation behind scientists' and philosophers' attempts to understand our place as conscious beings in the physical world.
This professional guide and reference examines the challenges of assessing security vulnerabilities in computing infrastructure. Various aspects of vulnerability assessment are covered in detail, including recent advancements in reducing the requirement for expert knowledge through novel applications of artificial intelligence. The work also offers a series of case studies on how to develop and perform vulnerability assessment techniques using state-of-the-art intelligent mechanisms. Topics and features: provides tutorial activities and thought-provoking questions in each chapter, together with numerous case studies; introduces the fundamentals of vulnerability assessment, and reviews the state of the art of research in this area; discusses vulnerability assessment frameworks, including frameworks for industrial control and cloud systems; examines a range of applications that make use of artificial intelligence to enhance the vulnerability assessment processes; presents visualisation techniques that can be used to assist the vulnerability assessment process. In addition to serving the needs of security practitioners and researchers, this accessible volume is also ideal for students and instructors seeking a primer on artificial intelligence for vulnerability assessment, or a supplementary text for courses on computer security, networking, and artificial intelligence.
This book surveys the entire field of learning and memory. It describes the major approaches to its study and looks at basic assumptions and philosophical underpinnings. Howard integrates work from quite different perspectives into a single framework, and describes peripheral areas not usually mentioned in mainstream books, such as prenatal learning, constraints on knowledge, nonconnectionist machine learning, intelligence and learning, and skills learning. He gives the reader a broad knowledge of what the field is all about, what its parts are and how they interrelate, its major principles and key applications. The primary contribution of this work is the integration of current thinking about learning with the literature and research on memory.
This is the first book-length work to integrate the insights of cognitive science fully into economics. It reviews a wide range of related work in both fields and proposes new approaches to choice theory, rationality, and interaction (equilibrium) that are consistent with the limited cognitive capacity of real human beings. While joining with neoclassical economics in supporting the validity of supply-and-demand theory where it is literally applicable, McCain challenges most neoclassical theory, especially monopoly, oligopoly, and general equilibrium theory and welfare economics. His work aims to further and unite recent notions of behavioral and social economics. This important work will be of interest to behavioral, social, and Keynesian economists, as well as other social scientists and philosophers interested in economic phenomena.
The findings of split-brain research and the mind's symbolic processes are combined to examine the implications for understanding subjective experience of the religious and the sacred.
The global prevalence of neurodevelopmental disorders is accelerating. Numbers of children affected by an autism spectrum disorder (ASD) in the United States have reached 1 in 88 -- 1 in 56 among boys -- and even more children have developed attention deficit/hyperactivity disorders (ADHD). The burden of these disorders to individuals and society overall is enormous; ASD alone costs the United States a staggering $130 billion, with ADHD costs reaching similar heights. Genetic causes of these neurodevelopmental disorders cannot account for such radically increased rates of incidence. The causes must also implicate environmental chemicals, many of which have been shown to disrupt normal thyroid function. In this book, Barbara Demeneix makes the case that thyroid hormone signaling bridges the environment and gene programs needed for brain development--and that environmental chemicals that disrupt normal thyroid function pose significant risks to the inherited intelligence and mental health of future generations. The first chapter provides an historical overview of documented cases in which environmental pollution has caused IQ loss across populations. The following chapters explain the physiology of thyroid hormone action, the importance of iodine and selenium for thyroid hormone signaling and brain development, and why thyroid hormone is such a sensitive target for environmental pollution. The final chapters discuss the role of gene-environment interactions in neurodevelopmental disorders and address what can and must be done by individuals, associations, and decision-makers to staunch these epidemics.
In Part I, the impact of an integro-differential operator on parity logic engines (PLEs) as a tool for scientific modeling from scratch is presented. Part II outlines the fuzzy structural modeling approach for building new linear and nonlinear dynamical causal forecasting systems in terms of fuzzy cognitive maps (FCMs). Part III introduces the new type of autogenetic algorithms (AGAs) to the field of evolutionary computing. Altogether, these PLEs, FCMs, and AGAs may serve as conceptual and computational power tools.
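The fuzzy cognitive maps (FCMs) described in Part II can be pictured as a signed weight matrix over concept nodes whose activations are iterated through a squashing function until the map settles. A minimal sketch of one common FCM update rule follows; the three concepts, the weights, and the logistic squashing choice are illustrative assumptions, not taken from the book:

```python
import math

def fcm_step(state, weights):
    """One synchronous update of a fuzzy cognitive map: each concept's
    next activation is a squashed weighted sum of the activations of
    the concepts with causal links pointing to it."""
    n = len(state)
    nxt = []
    for i in range(n):
        total = sum(weights[j][i] * state[j] for j in range(n))
        nxt.append(1.0 / (1.0 + math.exp(-total)))  # logistic squashing
    return nxt

# Three hypothetical concepts with made-up causal weights:
# weights[j][i] is the influence of concept j on concept i.
w = [[0.0,  0.6, -0.3],
     [0.4,  0.0,  0.5],
     [0.0, -0.2,  0.0]]

state = [0.5, 0.5, 0.5]
for _ in range(20):          # iterate until the map settles
    state = fcm_step(state, w)
print(state)                 # approximate fixed point of the map
```

With modest weights like these the iteration contracts to a fixed point; whether a real FCM settles, cycles, or behaves chaotically depends on the weight matrix, which is what makes such maps usable as causal forecasting systems.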
At the time of his death Hans Hörmann, then Professor of Psychology at the Ruhr University, Bochum, West Germany, was preparing an English language version of his Einführung in die Psycholinguistik. The goal of this book, in both the German and English editions, was to present in compact and readily accessible form the essentials of his approach to the psychology of language. Basing his work upon the materials treated at length and in depth in two previous comprehensive and more technical works, Psycholinguistics: An Introduction to Research and Theory and To Mean - To Understand, Hörmann had made a selection of themes and problems suitable for beginners and for those who wanted a convenient introduction to the specific framework within which Hörmann thought psycholinguistics was to be pursued. The result is a remarkably clear, succinct, and provocative account of central issues and options of the psychology of language, that broad and not strictly delimited part of psychology that takes as its domain the multiform conditions, processes, and structures involved in the acquisition, development, production, and grasp of linguistic meaning. Hörmann's approach is admittedly contentious and goes directly against a great deal of Anglo-American psycholinguistics. In particular, it radically devalues the relevance of certain types of theoretical linguistics, principally, though not exclusively, Chomskyan, for the psychology of language.
Extending the visionary early work of the late Marshall McLuhan, The Global Village, one of his last collaborative efforts, applies that vision to today's worldwide, integrated electronic network.
This book explores the meaning and import of neurophenomenology and the philosophy of enactive or embodied cognition for psychology. It introduces the psychologist to an experiential, non-reductive, holistic, theoretical, and practical framework that integrates the approaches of natural and human science to consciousness. In integrating phenomenology with cognitive science, neurophenomenology provides a bridge between the natural and human sciences that opens an interdisciplinary dialogue on the nature of awareness, the ontological primacy of experience, the perception of the observer, and the mind-brain relationship, which will shape the future of psychological theory, research, and practice.
Can psychoanalysis offer a new computer model? Can computer designers help psychoanalysts to understand their theory better? In contemporary publications the human psyche is often related to neural networks. Why? The wiring in computers can also be related to application software. But does this really make sense? Artificial intelligence has tried to implement functions of the human psyche. The achievements reached are remarkable; however, the goal of obtaining a functional model of the mental apparatus was not reached. Was the selected direction incorrect? The editors are convinced: yes, and they try to give answers here. If one accepts that the brain is an information processing system, then one also has to accept that computer theories can be applied to the brain's functions, the human mental apparatus. The contributors to this book - Solms, Panksepp, Sloman and many others, all experts in computer design, psychoanalysis and neurology - are united in one goal: finding synergy in their interdisciplinary fields.
At the beginning of the new millennium, and after a turbulent development process of almost fifty years, Cognitive Psychotherapy still does not seem to have reached full epistemological and applicative maturity. At a clinical level, however, Cognitive Psychotherapy may be considered one of the most valid and efficient instruments; it is supported by an enormous mass of research and experimental data covering numerous disorders, such as mood disturbances, with particular reference to depression, as well as anxiety, personality and eating disorders. Recently, several studies in the field of schizophrenia have also suggested an original cognitive approach to the therapy and rehabilitation of psychotic patients. Along with the classic approach of the Philadelphia School started by A. T. Beck, a number of further evolutions of the original cognitive paradigm have been taking place and are still under development. Among these, of particular importance are the relational and constructivist approaches. This book is a useful instrument for an extensive review of the varied landscape of contemporary Cognitive Psychotherapy. Starting from the introductory chapter, "Cognitive Psychotherapy toward a new millennium," by the Editors, the theoretical chapters of the first part of the book focus on the great issues of contemporary Cognitive Psychotherapy. The second part includes a series of chapters dealing with clinical applications. The third part covers almost all psychiatric disorders. This volume will be a highly useful contribution to critical reflection on the development of Cognitive Psychotherapy at the beginning of the new millennium.
"Cognitive Iconology" is a new theory of the relation of psychology to art. Instead of being an application of psychological principles, it is a methodologically aware account of psychology, art and the nature of explanation. Rather than fight over biology or culture, it shows how they must fit together. The term "cognitive iconology" is meant to mirror other disciplines like cognitive poetics and musicology but the fear that images must be somehow transparent to understanding is calmed by the stratified approach to explanation that is outlined. In the book, cognitive iconology is a theory of cognitive tendencies that contribute to but are not determinative of an artistic meaning. At the center of the book are three case studies: images depicted within images, basic corrections to architectural renderings in images, and murals and paintings seen from the side. In all cases, there is a primitive perceptual pull that contribute to but do not override larger cultural meaning. The book then moves beyond the confines of the image to behavior around the image, and then ends with the concluding question of why some images are harder to understand than others. "Cognitive Iconology" promises to be important because it moves beyond the turf battles typically fought in image studies. It argues for a sustainable practice of interpretation that can live with other disciplines.
This book reaches way beyond a description of principles, methods and techniques to provide an accessible technology for all. Nearly all the strategies can be used as adjuncts to conventional behaviourist and analytical approaches to therapy including NLP and Gestalt. As well as describing the art of RCT, the authors have provided the therapist with the means to get started, outlining the structures for the first few sessions and giving full scripts for analytical and non-analytical work with the client.
Cognitive science is a multidisciplinary science concerned with understanding and utilizing models of cognition. It has spawned a great deal of research on applications such as expert systems and intelligent tutoring systems, and has interacted closely with psychological research. However, it is generally accepted that it is difficult to apply cognitive-scientific models to medical training and practice. This book is based on a NATO Advanced Research Workshop held in Italy in 1991, the purpose of which was to examine the impact of models of cognition on medical training and practice and to outline future research programmes relating cognition and education, and in particular to consider the potential impact of cognitive science on medical training and practice. A major discovery presented in the book is that the research areas related to artificial intelligence, cognitive psychology, and medical decision making are considerably closer, both conceptually and theoretically, than many of the workshop participants originally thought.
There have been exciting new developments in the treatment of schizophrenia and related psychoses in recent decades. Clinical guidelines increasingly recommend that patients be offered evidence-based psychosocial treatments in addition to medications, as such interventions can produce greater improvements and may prevent relapses better compared with medications alone. In parallel with these recent advancements, an evolution in the way cognitive-behavioral therapies are being conceptualized and implemented has occurred due to the incorporation of novel strategies that promote psychological processes such as acceptance and mindfulness. While there are a variety of acceptance/mindfulness approaches being developed to address psychosis, there is not currently a dominant approach. In Incorporating Acceptance and Mindfulness into the Treatment of Psychosis, Brandon Gaudiano brings together the researchers and clinicians working at the cutting edge of acceptance/mindfulness therapies for psychosis to compare and contrast emerging approaches and discuss them within the context of the more traditional cognitive-behavioral interventions. The book includes a section that focuses on six distinct treatment models that incorporate acceptance and mindfulness strategies for psychosis and a section that provides a synthesis and analysis of acceptance/mindfulness approaches to psychosis. It concludes with recommendations for moving the research forward in a constructive and responsible way. This volume will be an important resource for researchers and clinicians interested in gaining a deeper understanding of mindfulness- and acceptance-based approaches and newer psychosocial treatments for severe mental illness.
The role of orthography in reading and writing is not a new topic of inquiry. For example, in 1970 Venezky made a seminal contribution with The Structure of English Orthography, in which he showed how both sequential redundancy (probable and permissible letter sequences) and rules of letter-sound correspondence contribute to orthographic structure. In 1980 Ehri introduced the concept of orthographic images, that is, the representation of written words in memory, and proposed that the image is created by an amalgamation of the word's orthographic and phonological properties. In 1981 Taylor described the evolution of orthographies in writing systems - from the earliest logographies for pictorial representation of ideas, to syllabaries for phonetic representation of sounds, to alphabets for phonemic representation of sounds. In 1985 Frith proposed a stage model for the role of orthographic knowledge in development of word recognition: initially, in the logographic stage, a few words can be recognized on the basis of partial spelling information; in the alphabetic stage words are recognized on the basis of grapheme-phoneme correspondence; in the orthographic stage spelling units are recognized automatically without phonological mediation. For an historical overview of research on visual processing of written language spanning the earliest records of writing to the early work in experimental psychology, see Venezky (1993).
The enactive approach replaces the classical computer metaphor of mind with emphasis on embodiment and social interaction as the sources of our goals and concerns. Researchers from a range of disciplines unite to address the challenge of how to account for the more uniquely human aspects of cognition, including the abstract and the nonsensical.
What holds together the various fields that are supposed to constitute the general intellectual discipline that people now call cognitive science? In this book, Erneling and Johnson identify two problems with defining this discipline. First, some theorists identify the common subject matter as the mind, but scientists and philosophers have not been able to agree on any single, satisfactory answer to the question of what the mind is. Second, those who speculate about the general characteristics that belong to cognitive science tend to assume that all the particular fields falling under the rubric--psychology, linguistics, biology, and so on--are of roughly equal value in their ability to shed light on the nature of mind. This book argues that all the cognitive science disciplines are not equally able to provide answers to ontological questions about the mind, but rather that only neurophysiology and cultural psychology are suited to answer these questions. However, since the cultural account of mind has long been ignored in favor of the neurophysiological account, Erneling and Johnson bring together contributions that focus especially on different versions of the cultural account of the mind.
Trends and Prospects in Metacognition presents a collection of chapters dealing principally with independent areas of empirical metacognition research. These research foci, such as animal metacognition, neuropsychology of metacognition, implicit learning, metacognitive experiences, metamemory, young children's metacognition, theory of mind, metacognitive knowledge, decision making, and interventions for the enhancement of metacognition, have all emerged as trends in the field of metacognition. Yet the resulting research has not converged, precluding an integration of concepts and findings. Presenting a new theoretical framework, Trends and Prospects in Metacognition extends the classical definitions offered by Flavell and Nelson to carry the prospect of more integrated work into the future. By opening the possibility to cross the boundaries posed by traditionally independent research areas, this volume provides a foundation for the integration of research paradigms and concepts and builds on the relationship between metacognition and consciousness, while integrating basic with applied research.
This book explains the foundation of approximate Bayesian computation (ABC), an approach to Bayesian inference that does not require the specification of a likelihood function. As a result, ABC can be used to estimate posterior distributions of parameters for simulation-based models. Simulation-based models are now very popular in cognitive science, as are Bayesian methods for performing parameter inference. As such, the recent developments of likelihood-free techniques are an important advancement for the field. Chapters discuss the philosophy of Bayesian inference as well as provide several algorithms for performing ABC. Chapters also apply some of the algorithms in a tutorial fashion, with one specific application to the Minerva 2 model. In addition, the book discusses several applications of ABC methodology to recent problems in cognitive science. Likelihood-Free Methods for Cognitive Science will be of interest to researchers and graduate students working in experimental, applied, and cognitive science.
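The core idea behind ABC, replacing likelihood evaluation with simulation and a tolerance on summary statistics, can be sketched in a few lines. The rejection variant below is the simplest of the algorithms the book covers; the toy Gaussian model, the flat prior, and the tolerance value are illustrative assumptions, not the book's Minerva 2 application:

```python
import random
import statistics

def abc_rejection(observed_stat, prior_sample, simulate, eps, n_draws=10000):
    """Rejection ABC: keep parameter draws whose simulated summary
    statistic lands within eps of the observed one. The accepted
    draws approximate samples from the posterior."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()               # draw a parameter from the prior
        sim = simulate(theta)                # simulate data, reduce to a summary
        if abs(sim - observed_stat) < eps:   # compare summaries, no likelihood
            accepted.append(theta)
    return accepted

# Toy example: infer the mean of a normal distribution with known sd = 1.
random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(100)]
obs = statistics.mean(data)

posterior = abc_rejection(
    observed_stat=obs,
    prior_sample=lambda: random.uniform(-5, 5),  # flat prior on the mean
    simulate=lambda mu: statistics.mean(
        random.gauss(mu, 1.0) for _ in range(100)
    ),
    eps=0.1,
)
# The accepted draws cluster around the observed sample mean.
```

Shrinking eps makes the approximation to the true posterior tighter at the cost of a lower acceptance rate, which is why the more sophisticated ABC algorithms the book discusses exist.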