This book is meant as part of the larger contemporary philosophical project of naturalizing logico-mathematical knowledge, and addresses the key question that motivates most of the work in this field: what in recent research in psychology and cognitive science is philosophically relevant to the nature of logico-mathematical knowledge? The question about this distinctive kind of knowledge is rooted in Plato's dialogues, and virtually all major philosophers have expressed interest in it. The essays in this collection tackle this important philosophical query from the perspective of the modern sciences of cognition, namely cognitive psychology and neuroscience. Naturalizing Logico-Mathematical Knowledge contributes to consolidating a new, emerging direction in the philosophy of mathematics which, while keeping the traditional concerns of this sub-discipline in sight, aims to engage with them in a scientifically informed manner. A further aim is to signal philosophers' willingness to enter into a fruitful dialogue with the community of cognitive scientists and psychologists by examining their methods and interpretive strategies.
Bayesian nets are widely used in artificial intelligence as a calculus for causal reasoning, enabling machines to make predictions, perform diagnoses, take decisions and even to discover causal relationships. But many philosophers have criticized and ultimately rejected the central assumption on which such work is based: the Causal Markov Condition. So should Bayesian nets be abandoned? What explains their success in artificial intelligence? This book argues that the Causal Markov Condition holds as a default rule: it often holds but may need to be repealed in the face of counterexamples. Thus, Bayesian nets are the right tool to use by default, but naively applying them can lead to problems. The book develops a systematic account of causal reasoning and shows how Bayesian nets can be coherently employed to automate the reasoning processes of an artificial agent. The resulting framework for causal reasoning involves not only new algorithms, but also new conceptual foundations. Probability and causality are treated as mental notions, part of an agent's belief state. Yet probability and causality are also objective: different agents with the same background knowledge ought to adopt the same or similar probabilistic and causal beliefs. This book, aimed at researchers and graduate students in computer science, mathematics and philosophy, provides a general introduction to these philosophical views as well as an exposition of the computational techniques that they motivate.
Inferentialism is a philosophical approach premised on the claim that an item of language (or thought) acquires meaning (or content) in virtue of being embedded in an intricate set of social practices normatively governed by inferential rules. Inferentialism found its paradigmatic formulation in Robert Brandom's landmark book Making It Explicit, and over the last two decades it has established itself as one of the leading research programs in the philosophy of language and the philosophy of logic. While Brandom's version of inferentialism has received wide attention in the philosophical literature, thinkers friendly to inferentialism have proposed and developed new lines of inquiry that merit wider recognition and critical appraisal. From Rules to Meaning brings together new essays that systematically develop, compare, assess and critically react to some of the most pertinent recent trends in inferentialism. The book's four thematic sections seek to apply inferentialism to a number of core issues, including the nature of meaning and content, reconstructing semantics, rule-oriented models and explanations of social practices, and inferentialism's historical influence and dialogue with other philosophical traditions. With contributions from a number of distinguished philosophers, including Robert Brandom and Jaroslav Peregrin, this volume is a major contribution to the philosophical literature on the foundations of logic and language.
In this book, Bryan Wesley Hall breaks new ground in Kant scholarship, exploring the gap in Kant's Critical philosophy in relation to his post-Critical work by turning to Kant's final, unpublished work, the so-called Opus Postumum. Although Kant considered this project to be the "keystone" of his philosophical efforts, it has been largely neglected by scholars. Hall argues that only by understanding the Opus Postumum can we fully comprehend both Kant's mature view and his Critical project. In letters from 1798, Kant claims to have discovered a "gap" in the Critical philosophy that requires effecting a "transition from the metaphysical foundations of natural science to physics"; unfortunately, Kant does not make clear exactly what this gap is or how the transition is supposed to fill it. To resolve these issues, Hall draws on the Opus Postumum, arguing that Kant's transition project can solve certain perennial problems with the Critical philosophy. This volume provides a powerful alternative to all current interpretations of the Opus Postumum, arguing that Kant's transition project is best seen as the post-Critical culmination of his Critical philosophy. Hall carefully examines the deep connections between the Opus Postumum and the view Kant develops in the Critique of Pure Reason, to suggest that properly understanding the post-Critical Kant will significantly revise our view of Kant's Critical period.
First published in 1974. Despite the tendency of contemporary analytic philosophy to place logic and mathematics in a central position, the author argues that it failed to appreciate or account for their rich content. Through discussions of such mathematical concepts as number, the continuum, set, proof and mechanical procedure, the author provides an introduction to the philosophy of mathematics and an internal criticism of the then-current academic philosophy. The material presented is also an illustration of a new, more general method of approach called substantial factualism, which, the author asserts, allows for the development of a more comprehensive philosophical position by not trivialising or distorting substantial facts of human knowledge.
This is the second of two volumes of essays in commemoration of Alan Turing, celebrating his intellectual legacy within the philosophy of mind and cognitive science. It focuses on the relationship between a scientific, computational image of the mind and a common-sense picture of the mind as an inner arena populated by concepts, beliefs, intentions, and qualia. Topics covered include the causal potency of folk-psychological states, the connectionist reconception of learning and concept formation, the understanding of the notion of computation itself, and the relation between philosophical and psychological theories of concepts.
This book is the first part of a comprehensive study of Wittgenstein's conception of language description. Describing language was no pastime for the philosopher. It was hard work, and it meant struggle. It made for a philosophy that required Wittgenstein's full attention and half his life. His approach had always been to work on himself, on how he saw things. The central claim of this book is that nothing will come of our exegetical efforts to see what Wittgenstein's later philosophy amounts to if his work on describing language is not given the place and concern it deserves. The book shows what his philosophy might begin to look like in the light of critical questions about his insistence on making do with descriptions, and these things only.
This book examines the progress to date in the many facets - conceptual, epistemological and methodological - of the field of legal semiotics. It reflects the fulfilment of the promise of legal semiotics when used to explore the law, its processes and interpretation. This study in legal semiotics brings together the theory, structure and practice of legal semiotics in an accessible style. The book introduces the concepts of legal semiotics and offers insight into contemporary and future directions in the semiotics of law. A theoretically and practically oriented synthesis of the historical, contemporary and most recent ideas pertaining to legal semiotics, the book will be of interest to scholars and researchers in law and social sciences, as well as those who are interested in the interdisciplinary dynamics of law and semiotics.
The papers in this volume address fundamental, and interrelated, philosophical issues concerning modality and identity, issues that have not only been pivotal to the development of analytic philosophy in the twentieth century, but remain a key focus of metaphysical debate in the twenty-first. How are we to understand the concepts of necessity and possibility? Is chance a basic ingredient of reality? How are we to make sense of claims about personal identity? Do numbers require distinctive identity criteria? Does the capacity to identify an object presuppose an ability to bring it under a sortal concept? Rather than presenting a single, partisan perspective, Identity and Modality enriches our understanding of identity and modality by bringing together papers written by leading researchers working in metaphysics, the philosophy of mind, the philosophy of science, and the philosophy of mathematics. The resulting variety of perspectives correspondingly reflects both the breadth and depth of contemporary theorizing about identity and modality, each paper addressing a particular issue and advancing our knowledge of the area. This volume will provide essential reading for graduate students in the subject and professional philosophers.
Identity and Discrimination, originally published in 1990 and the first book by respected philosopher Timothy Williamson, is now reissued and updated with the inclusion of significant new material. Williamson here proposes an original and rigorous theory linking identity, a relation central to metaphysics, and indiscriminability, a relation central to epistemology.
* Updated and reissued edition of Williamson's first publication, with the inclusion of significant new material
* Argues for an original cognitive account of the relation between identity and discrimination that has been influential in the philosophy of perception
* Pioneers the use of epistemic logic to solve puzzles about indiscriminability
* Develops the application of techniques from mathematical logic to understand issues about identity over time and across possible worlds
This book develops new techniques in formal epistemology and applies them to the challenge of Cartesian skepticism. It introduces two formats of epistemic evaluation that should be of interest to epistemologists and philosophers of science: the dual-component format, which evaluates a statement on the basis of its safety and informativeness, and the relative-divergence format, which evaluates a probabilistic model on the basis of its complexity and goodness of fit with data. Tomoji Shogenji shows that the former lends support to Cartesian skepticism, but the latter allows us to defeat Cartesian skepticism. Along the way, Shogenji addresses a number of related issues in epistemology and philosophy of science, including epistemic circularity, epistemic closure, and inductive skepticism.
Peirce's Speculative Grammar: Logic as Semiotics offers a comprehensive, philologically accurate, and exegetically ambitious developmental account of Peirce's theory of speculative grammar. The book traces the evolution of Peirce's grammatical writings from his early research on the classification of arguments in the 1860s up to the complex semiotic taxonomies elaborated in the first decade of the twentieth century. It will be of interest to academic specialists working on Peirce, the history of American philosophy and pragmatism, the philosophy of language, the history of logic, and semiotics.
This book seeks to work out which commitments are minimally sufficient to obtain an ontology of the natural world that matches all of today's well-established physical theories. We propose an ontology of the natural world that is defined only by two axioms: (1) There are distance relations that individuate simple objects, namely matter points. (2) The matter points are permanent, with the distances between them changing. Everything else comes in as a means to represent the change in the distance relations in a manner that is both as simple and as informative as possible. The book works this minimalist ontology out in philosophical as well as mathematical terms and shows how one can understand classical mechanics, quantum field theory and relativistic physics on the basis of this ontology. Along the way, we seek to achieve four subsidiary aims: (a) to make a case for a holistic individuation of the basic objects (ontic structural realism); (b) to work out a new version of Humeanism, dubbed Super-Humeanism, that does without natural properties; (c) to set out an ontology of quantum physics that is an alternative to quantum state realism and that avoids any ontological dualism of particles and fields; (d) to vindicate a relationalist ontology based on point objects also in the domain of relativistic physics.
A Logical Introduction to Probability and Induction is a textbook on the mathematics of the probability calculus and its applications in philosophy. On the mathematical side, the textbook introduces those parts of logic and set theory that are needed for a precise formulation of the probability calculus. On the philosophical side, the main focus is on the problem of induction and its reception in epistemology and the philosophy of science. Particular emphasis is placed on the means-end approach to the justification of inductive inference rules. In addition, the book discusses the major interpretations of probability. These are philosophical accounts of the nature of probability that interpret the mathematical structure of the probability calculus. Besides the classical and logical interpretation, they include the interpretation of probability as chance, degree of belief, and relative frequency. The Bayesian interpretation of probability as degree of belief locates probability in a subject's mind. It raises the question of why a subject's degrees of belief ought to obey the probability calculus. In contrast to this, chance and relative frequency belong to the external world. While chance is postulated by theory, relative frequencies can be observed empirically. A Logical Introduction to Probability and Induction aims to equip students with the ability to successfully carry out arguments. It begins with elementary deductive logic and uses it as the basis for the material on probability and induction. Throughout the textbook, results are carefully proved using the inference rules introduced at the beginning, and students are asked to solve problems in the form of 50 exercises. An instructor's manual contains the solutions to these exercises as well as suggested exam questions. The book does not presuppose any background in mathematics, although sections 10.3-10.9 on statistics are technically sophisticated and optional. The textbook is suitable for lower level undergraduate courses in philosophy and logic.
This comprehensive account of the concept and practices of deduction is the first to bring together perspectives from philosophy, history, psychology and cognitive science, and mathematical practice. Catarina Dutilh Novaes draws on all of these perspectives to argue for an overarching conceptualization of deduction as a dialogical practice: deduction has dialogical roots, and these dialogical roots are still largely present both in theories and in practices of deduction. Dutilh Novaes' account also highlights the deeply human and in fact social nature of deduction, as embedded in actual human practices; as such, it presents a highly innovative account of deduction. The book will be of interest to a wide range of readers, from advanced students to senior scholars, and from philosophers to mathematicians and cognitive scientists.
We are happy to present to the reader the first book of our Applied Logic Series. Walton's book on the fallacies of ambiguity is firmly at the heart of practical reasoning, an important part of applied logic. There is an increasing interest in artificial intelligence, philosophy, psychology, software engineering and linguistics, in the analysis and possible mechanisation of human practical reasoning. Continuing the ancient quest that began with Aristotle, computer scientists, logicians, philosophers and linguists are vigorously seeking to deepen our understanding of human reasoning and argumentation. Significant communities of researchers are actively engaged in developing new approaches to logic and argumentation, which are better suited to the urgent needs of today's applications. The author of this book has, over many years, made significant contributions to the detailed analysis of practical reasoning case studies, thus providing solid foundations for new and more applicable formal logical systems. We welcome Doug Walton's new book to our series.
This book is dedicated to Dov Gabbay, one of the most outstanding and most productive researchers in the area of logic, language and reasoning. He has exerted a profound influence in the major fields of logic, linguistics and computer science. Most of the chapters included, therefore, build on his work and present results or summarize areas where Dov has made major contributions. In particular, his work on Labelled Deductive Systems is addressed in most of the contributions. The chapters on computational linguistics address logical and deductive aspects of linguistic problems. The papers by van Benthem, Lambek and Moortgat investigate categorial considerations and the use of labels within the "parsing as deduction" approach. Analyses of particular linguistic problems are given in the remaining papers by Kamp, Kempson, Moravcsik, König and Reyle. They address the logic of generalized quantifiers, the treatment of cross-over phenomena and temporal/aspectual interpretation, as well as the applicability of underspecified deduction in linguistic formalisms. The more logic-oriented chapters address philosophical and proof-theoretic problems and give algorithmic solutions for most of them. The spectrum ranges from K. Segerberg's contribution, which brings together the two traditions of epistemic and doxastic logics of belief, to M. Finger and M. Reynolds' chapter on two-dimensional executable logics with applications to temporal databases. The book demonstrates that a relatively small number of basic techniques and ideas, in particular the idea of labelled deductive systems, can be successfully applied in many different areas.
Timothy Williamson is one of the most influential living philosophers working in the areas of logic and metaphysics. His work in these areas has been particularly important in shaping debates about metaphysical modality, which is the topic of his recent provocative and closely argued book Modal Logic as Metaphysics (2013). This book comprises ten essays by metaphysicians and logicians responding to Williamson's work on metaphysical modality, as well as replies by Williamson to each essay. In addition, it contains an original essay by Williamson, 'Modal science,' concerning the role of modal claims in natural science. This book was originally published as a special issue of the Canadian Journal of Philosophy.
The problem of truth and the liar paradox is one of the most extensively discussed problems in philosophy. The liar paradox can be avoided by adopting a so-called theory of partial truth instead of a classical theory of truth. Theories of partial truth, however, cannot solve the so-called strengthened liar paradox: the problem that many semantic statements about the strengthened liar cannot be true in a theory of partial truth. If such semantic statements were true in the theory, another paradox would emerge. To proponents of contextual accounts, which assume that the concept of truth is context-dependent, the strengthened liar paradox is the core of the liar problem. This book provides an overview of current contextual approaches to the strengthened liar paradox. For this purpose, the author investigates formal theories of truth that result from formal reconstructions of such contextual approaches.
Philosophers have warned of the perils of a life spent without reflection, but what constitutes reflective inquiry - and why it's necessary in our lives - can be an elusive concept. Synthesizing ideas from minds as diverse as John Dewey and Paulo Freire, the Handbook of Reflection and Reflective Inquiry presents reflective thought in its most vital aspects, not as a fanciful or nostalgic exercise, but as a powerful means of seeing familiar events anew, encouraging critical thinking and crucial insight, teaching and learning. In its opening pages, two seasoned educators, Maxine Greene and Lee Shulman, discuss reflective inquiry as a form of active attention (Thoreau's "wide-awakeness"), an act of consciousness, and a process by which people can understand themselves, their work (particularly in the form of life projects), and others. Building on this foundation, the Handbook analyzes, through the work of 40 internationally oriented authors:
- Definitional issues concerning reflection, what it is and is not;
- Worldwide social and moral conditions contributing to the growing interest in reflective inquiry in professional education;
- Reflection as promoted across professional educational domains, including K-12 education, teacher education, occupational therapy, and the law;
- Methods of facilitating and scaffolding reflective engagement;
- Current pedagogical and research practices in reflection;
- Approaches to assessing reflective inquiry.
Educators across the professions as well as adult educators, counselors and psychologists, and curriculum developers concerned with adult learning will find the Handbook of Reflection and Reflective Inquiry an invaluable teaching tool for challenging times.
Metamathematics and the Philosophical Tradition is the first work to explore in such historical depth the relationship between fundamental philosophical quandaries regarding self-reference and meta-mathematical notions of consistency and incompleteness. Using the insights of twentieth-century logicians from Gödel through Hilbert and their successors, this volume revisits the writings of Aristotle, the ancient skeptics, Anselm, and Enlightenment and seventeenth- and eighteenth-century philosophers Leibniz, Berkeley, Hume, Pascal, Descartes, and Kant to identify ways in which these writings both encode and evade problems of a priori definition and self-reference. The final chapters critique and extend more recent insights of late twentieth-century logicians and quantum physicists, and offer new applications of the completeness theorem as a means of exploring "metatheoretical ascent" and the limitations of scientific certainty. Broadly syncretic in range, Metamathematics and the Philosophical Tradition addresses central and recurring problems within epistemology. The volume's elegant, condensed writing style renders accessible its wealth of citations and allusions from varied traditions and in several languages. Its arguments will be of special interest to historians and philosophers of science and mathematics, particularly scholars of classical skepticism, the Enlightenment, Kant, ethics, and mathematical logic.
Is reality logical and is logic real? What is the origin of logical intuitions? What is the role of logical structures in the operations of an intelligent mind and in communication? Is the function of logical structure regulative or constitutive or both in concept formation? This volume provides analyses of the logic-reality relationship from different approaches and perspectives. The point of convergence lies in the exploration of the connections between reality - social, natural or ideal - and logical structures employed in describing or discovering it. Moreover, the book connects logical theory with more concrete issues of rationality, normativity and understanding, thus pointing to a wide range of potential applications. The papers collected in this volume address cutting-edge topics in contemporary discussions amongst specialists. Some essays focus on the role of indispensability considerations in the justification of logical competence, and the wide range of challenges within the philosophy of mathematics. Others present advances in dynamic logical analysis, such as the extension of game semantics to the non-logical part of the vocabulary and the development of models of contractive speech acts.
This collection presents the first sustained examination of the nature and status of the idea of principles in early modern thought. Principles are almost ubiquitous in the seventeenth and eighteenth centuries: the term appears in famous book titles, such as Newton's Principia; the notion plays a central role in the thought of many leading philosophers, as in Leibniz's Principle of Sufficient Reason; and many of the great discoveries of the period, such as the Law of Gravitational Attraction, were described as principles. Ranging from mathematics and law to chemistry, from natural and moral philosophy to natural theology, and covering some of the leading thinkers of the period, this volume presents ten compelling new essays that illustrate the centrality and importance of the idea of principles in early modern thought. It contains chapters by leading scholars in the field, including the Leibniz scholar Daniel Garber and the historian of chemistry William R. Newman, as well as exciting, emerging scholars, such as the Newton scholar Kirsten Walsh and a leading expert on experimental philosophy, Alberto Vanzo. The Idea of Principles in Early Modern Thought: Interdisciplinary Perspectives charts the terrain of one of the period's central concepts for the first time, and opens up new lines for further research.
The founder of both American pragmatism and semiotics, Charles Sanders Peirce (1839-1914) is widely regarded as an enormously important and pioneering theorist. In this book, scholars from around the world examine the nature and significance of Peirce's work on perception, iconicity, and diagrammatic thinking. Abjuring any strict dichotomy between presentational and representational mental activity, Peirce's theories transform the Aristotelian, Humean, and Kantian paradigms that continue to hold sway today and, in so doing, forge a new path for understanding the centrality of visual thinking in science, education, art, and communication. The essays in this collection cover a wide range of issues related to Peirce's theories, including the perception of generality; the legacy of ideas being copies of impressions; imagination and its contribution to knowledge; logical graphs, diagrams, and the question of whether their iconicity distinguishes them from other sorts of symbolic notation; how images and diagrams contribute to scientific discovery and make it possible to perceive formal relations; and the importance and danger of using diagrams to convey scientific ideas. This book is a key resource for scholars interested in Peirce's philosophy and its relation to contemporary issues in mathematics, philosophy of mind, philosophy of perception, semiotics, logic, visual thinking, and cognitive science.