Quine's set theory, New Foundations, has often been treated as an anomaly in the history and philosophy of set theory. In this book, Sean Morris shows that it is in fact well-motivated, emerging in a natural way from the early development of set theory. Morris introduces and explores the notion of set theory as explication: the view that there is no single correct axiomatization of set theory, but rather that the various axiomatizations all serve to explicate the notion of set and are judged largely according to pragmatic criteria. Morris also brings out the important interplay between New Foundations, Quine's philosophy of set theory, and his philosophy more generally. We see that his early technical work in logic foreshadows his later famed naturalism, with his philosophy of set theory playing a crucial role in his primary philosophical project of clarifying our conceptual scheme and specifically its logical and mathematical components.
What do the rules of logic say about the meanings of the symbols they govern? In this book, James W. Garson examines the inferential behaviour of logical connectives (such as 'and', 'or', 'not' and 'if ... then'), whose behaviour is defined by strict rules, and proves definitive results concerning exactly what those rules express about connective truth conditions. He explores the ways in which, depending on circumstances, a system of rules may provide no interpretation of a connective at all, or the interpretation we ordinarily expect for it, or an unfamiliar or novel interpretation. He also shows how the novel interpretations thus generated may be used to help analyse philosophical problems such as vagueness and the open future. His book will be valuable for graduates and specialists in logic, philosophy of logic, and philosophy of language.
This book articulates and defends Fregean realism, a theory of properties based on Frege's insight that properties are not objects, but rather the satisfaction conditions of predicates. Robert Trueman argues that this approach is the key not only to dissolving a host of longstanding metaphysical puzzles, such as Bradley's Regress and the Problem of Universals, but also to understanding the relationship between states of affairs, propositions, and the truth conditions of sentences. Fregean realism, Trueman suggests, ultimately leads to a version of the identity theory of truth, the theory that true propositions are identical to obtaining states of affairs. In other words, the identity theory collapses the gap between mind and world. This book will be of interest to anyone working in logic, metaphysics, the philosophy of language or the philosophy of mind.
This comprehensive account of the concept and practices of deduction is the first to bring together perspectives from philosophy, history, psychology and cognitive science, and mathematical practice. Catarina Dutilh Novaes draws on all of these perspectives to argue for an overarching conceptualization of deduction as a dialogical practice: deduction has dialogical roots, and these dialogical roots are still largely present both in theories and in practices of deduction. Dutilh Novaes' account also highlights the deeply human and in fact social nature of deduction, as embedded in actual human practices; as such, it presents a highly innovative account of deduction. The book will be of interest to a wide range of readers, from advanced students to senior scholars, and from philosophers to mathematicians and cognitive scientists.
Is mathematics 'entangled' with its various formalisations? Or are the central concepts of mathematics largely insensitive to formalisation, or 'formalism free'? What is the semantic point of view and how is it implemented in foundational practice? Does a given semantic framework always have an implicit syntax? Inspired by what she calls the 'natural language moves' of Gödel and Tarski, Juliette Kennedy considers what roles the concepts of 'entanglement' and 'formalism freeness' play in a range of logical settings, from computability and set theory to model theory and second-order logic, to logicality, developing an entirely original philosophy of mathematics along the way. The treatment is historically, logically and set-theoretically rich, and topics such as naturalism and foundations receive their due, but now with a new twist.
Fred Stoutland was a major figure in the philosophy of action and philosophy of language. This collection brings together essays on truth, language, action and mind and thus provides an important summary of many key themes in Stoutland's own work, as well as offering valuable perspectives on key issues in contemporary philosophy.
The perception of what he calls 'aspects' preoccupied Wittgenstein and gave him considerable trouble in his final years. The Wittgensteinian aspect defies any number of traditional philosophical dichotomies: the aspect is neither subjective (inner, metaphysically private) nor objective; it presents perceivable unity and sense that are (arguably) not (yet) conceptual; it is 'subject to the will', but at the same time is normally taken to be genuinely revelatory of the object perceived under it. This Element begins with a grammatical and phenomenological characterization of Wittgensteinian 'aspects'. It then challenges two widespread ideas: that aspects are to be identified with concepts; and that aspect perception has a continuous version that is characteristic of (normal) human perception. It concludes by proposing that aspect perception brings to light the distinction between the world as perceived and the world as objectively construed, and the role we play in the constitution of the former.
In this book we deal with combinations of concepts defining individuals in the Talmud. Consider, for example, Yom Kippur and Shabbat. Each concept has its own body of laws, and reality forces us to combine them when they occur on the same day. This is a case of "Identity Merging." As the combined body of laws may be inconsistent, we need a belief revision mechanism to reconcile the conflicting norms. The Talmud offers three options: (1) take the union of the two sets of rules, side by side; (2) resolve the conflicts using further meta-level Talmudic principles (which are new and of value to present-day Artificial Intelligence); (3) regard the combined concept as a new entity with its own Halachic norms and create new norms for it out of the existing ones. This book offers a clear and precise logical model showing how the Talmud deals with these options.
W. V. Quine was one of the most influential figures of twentieth-century American analytic philosophy. Although he wrote predominantly in English, in Brazil in 1942 he gave a series of lectures on logic and its philosophy in Portuguese, subsequently published as the book O Sentido da Nova Logica. The book has never before been fully translated into English, and this volume is the first to make its content accessible to Anglophone philosophers. Quine would go on to develop revolutionary ideas about semantic holism and ontology, and this book provides a snapshot of his views on logic and language at a pivotal stage of his intellectual development. The volume also includes an essay on logic that Quine likewise published in Portuguese, together with an extensive historical-philosophical essay by Frederique Janssen-Lauret. The valuable and previously neglected works first translated in this volume will be essential for scholars of twentieth-century philosophy.
Numbers and other mathematical objects are exceptional in having no locations in space or time and no causes or effects in the physical world. This makes it difficult to account for the possibility of mathematical knowledge, leading many philosophers to embrace nominalism, the doctrine that there are no abstract entities, and to embark on ambitious projects for interpreting mathematics so as to preserve the subject while eliminating its objects. A Subject With No Object cuts through a host of technicalities that have obscured previous discussions of these projects, and presents clear, concise accounts, with minimal prerequisites, of a dozen strategies for nominalistic interpretation of mathematics, thus equipping the reader to evaluate each and to compare different ones. The authors also offer critical discussion, rare in the literature, of the aims and claims of nominalistic interpretation, suggesting that it is significant in a very different way from that usually assumed.
Mathematics plays a central role in much of contemporary science, but philosophers have struggled to understand what this role is or how significant it might be for mathematics and science. In this book Christopher Pincock tackles this perennial question in a new way by asking how mathematics contributes to the success of our best scientific representations. In the first part of the book this question is posed and sharpened using a proposal for how we can determine the content of a scientific representation. Several different sorts of contributions from mathematics are then articulated. Pincock argues that each contribution can be understood as broadly epistemic, so that what mathematics ultimately contributes to science is best connected with our scientific knowledge.
A relative change occurs when some item changes a relation. This Element examines how Plato, Aristotle, the Stoics and Sextus Empiricus approached relative change. Relative change is puzzling because the following three propositions each seem true but cannot be true together: (1) no relative changes are intrinsic changes; (2) only intrinsic changes are proper changes; (3) some relative changes are proper changes. Plato's Theaetetus and Phaedo both address relative change. I argue that these dialogues assume relative changes to be intrinsic changes, thus denying (1). Aristotle responds differently, denying (3), that relative change is proper change. The Stoics claimed that some non-intrinsic changes are changes, denying (2). Finally, I discuss Sextus' argument that relative change shows that there are no relatives at all.
Ideas about relativity underlie much ancient Greek philosophy, from Protagorean relativism, to Plato's theory of Forms, Aristotle's category scheme, and relational logic. In Ancient Relativity Matthew Duncombe explores how ancient philosophers, particularly Plato, Aristotle, the Stoics, and Sextus Empiricus, understood the phenomenon and how their theories of relativity affected, and were affected by, their broader philosophical outlooks. He argues that ancient philosophers shared a close-knit family of views referred to as 'constitutive relativity', whereby a relative is not simply linked by a relation but is constituted by it. Plato exploits this view in some key arguments concerning the Forms and the partition of the soul. Aristotle adopts the constitutive view in his discussions of relativity in Categories 7 and the Topics and retains it in Metaphysics Delta 15. Duncombe goes on to examine the role relativity plays in Stoic philosophy, especially Stoic physics and metaphysics, and the way Sextus Empiricus thinks about relativity, which does not appeal to the nature of relatives but rather to how we conceive of things as correlative.
Our beliefs come in degrees. I'm 70% confident it will rain tomorrow, and 0.001% sure my lottery ticket will win. What's more, we think these degrees of belief should abide by certain principles if they are to be rational. For instance, you shouldn't believe that a person's taller than 6ft more strongly than you believe that they're taller than 5ft, since the former entails the latter. In Dutch Book arguments, we try to establish the principles of rationality for degrees of belief by appealing to their role in guiding decisions. In particular, we show that degrees of belief that don't satisfy the principles will always guide action in some way that is bad or undesirable. In this Element, we present Dutch Book arguments for the principles of Probabilism, Conditionalization, and the Reflection Principle, among others, and we formulate and consider the most serious objections to them.
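The monotonicity example above can be made concrete with a small sketch of a Dutch Book. The specific credence values (0.5 and 0.3) and the $1 bet stakes are illustrative assumptions, not from the book; the point is only that any agent whose credence in "taller than 6ft" exceeds their credence in the weaker claim "taller than 5ft" accepts a set of bets on which they lose money in every possible state of the world.

```python
# Hypothetical credences violating monotonicity: the agent is MORE confident
# in "taller than 6ft" than in the logically weaker "taller than 5ft".
cred_over_6ft = 0.5
cred_over_5ft = 0.3

# The bookie sells the agent a $1 bet on "over 6ft" at its credence-fair
# price, and buys from the agent a $1 bet on "over 5ft" at its price.
# Agent's upfront cash flow: pays 0.5, receives 0.3.
upfront = -cred_over_6ft + cred_over_5ft

# The three possible states of the world, as (over_6ft, over_5ft);
# (True, False) is impossible, since over 6ft entails over 5ft.
states = [(True, True), (False, True), (False, False)]

for over_6, over_5 in states:
    # Agent's bet payoffs: wins $1 if over 6ft, owes $1 if over 5ft.
    payoff = (1 if over_6 else 0) - (1 if over_5 else 0)
    net = upfront + payoff
    print(f"over_6ft={over_6}, over_5ft={over_5}: agent nets {net:+.1f}")
    assert net < 0  # the agent loses in every state: a Dutch Book
```

The agent nets -0.2, -1.2 and -0.2 across the three states, a guaranteed loss; restoring monotonicity (credence in the weaker claim at least as high as in the stronger) removes the vulnerability.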
This collection of essays examines logic and its philosophy. The author investigates the nature of logic not only by describing its properties but also by showing philosophical applications of logical concepts and structures. He evaluates what logic is and analyzes, among other aspects, the relations of logic and language, and the status of identity, bivalence, proof, truth, constructivism, and metamathematics. With examples concerning the application of logic to philosophy, he also covers semantic loops, epistemic discourse, normative discourse, paradoxes, properties of truth, and truth-making, as well as theology, being and logical determinism. The author concludes with a philosophical reflection on nothingness and its modelling.
Wittgenstein's 'middle period' is often seen as a transitional phase connecting his better-known early and later philosophies. The fifteen essays in this volume focus both on the distinctive character of his teaching and writing in the 1930s, and on its pivotal importance for an understanding of his philosophy as a whole. They offer wide-ranging perspectives on the central issue of how best to identify changes and continuities in his philosophy during those years, as well as on particular topics in the philosophy of mind, religion, ethics, aesthetics, and the philosophy of mathematics. The volume will be valuable for all who are interested in this formative period of Wittgenstein's development.
This book returns to the discussion in volume 1 on analogy and induction, and analyses their substance. The first part distinguishes between two kinds of logic: one based on the union of common features, the other based on the synthesis of different features. The second part proposes a formal scheme for the synthesis of concepts. The third part analyses various mechanisms for kidushin and kinyan, which form a mathematical group.
A comprehensive collection of historical readings in the philosophy of mathematics and a selection of influential contemporary work, this much-needed introduction reveals the rich history of the subject. An Historical Introduction to the Philosophy of Mathematics: A Reader brings together an impressive collection of primary sources from ancient and modern philosophy. Arranged chronologically and featuring introductory overviews explaining technical terms, this accessible reader is easy to follow and unrivaled in its historical scope. With selections from key thinkers such as Plato, Aristotle, Descartes, Hume and Kant, it connects the major ideas of the ancients with contemporary thinkers. A selection of recent texts from philosophers including Quine, Putnam, Field and Maddy, offering insights into the current state of the discipline, clearly illustrates the development of the subject. Presenting historical background essential to understanding contemporary trends and a survey of recent work, An Historical Introduction to the Philosophy of Mathematics: A Reader is required reading for undergraduates and graduate students studying the philosophy of mathematics and an invaluable source book for working researchers.
According to Bayesian epistemology, rational learning from experience is consistent learning; that is, learning should incorporate new information consistently into one's old system of beliefs. Simon M. Huttegger argues that this core idea can be transferred to situations where the learner's informational inputs are much more limited than Bayesianism assumes, thereby significantly expanding the reach of a Bayesian type of epistemology. What results from this is a unified account of probabilistic learning in the tradition of Richard Jeffrey's 'radical probabilism'. Along the way, Huttegger addresses a number of debates in epistemology and the philosophy of science, including the status of prior probabilities, whether Bayes' rule is the only legitimate form of learning from experience, and whether rational agents can have sustained disagreements. His book will be of interest to students and scholars of epistemology, of game and decision theory, and of cognitive, economic, and computer sciences.
This book analyses and defends the deflationist claim that there is nothing deep about our notion of truth. According to this view, truth is a 'light' and innocent concept, devoid of any essence which could be revealed by scientific inquiry. Cezary Cieslinski considers this claim in light of recent formal results on axiomatic truth theories, which are crucial for understanding and evaluating the philosophical thesis of the innocence of truth. Providing an up-to-date discussion and original perspectives on this central and controversial issue, his book will be important for those with a background in logic who are interested in formal truth theories and in current philosophical debates about the deflationary conception of truth.
While probabilistic logics in principle might be applied to solve a range of problems, in practice they are rarely applied - perhaps because they seem disparate, complicated, and computationally intractable. This programmatic book argues that several approaches to probabilistic logic fit into a simple unifying framework in which logically complex evidence is used to associate probability intervals or probabilities with sentences. Specifically, Part I shows that there is a natural way to present a question posed in probabilistic logic, and that various inferential procedures provide semantics for that question, while Part II shows that there is the potential to develop computationally feasible methods to mesh with this framework. The book is intended for researchers in philosophy, logic, computer science and statistics. A familiarity with mathematical concepts and notation is presumed, but no advanced knowledge of logic or probability theory is required.
This book introduces the theory of graded consequence (GCT) and its mathematical formulation. It also compares the notion of graded consequence with other notions of consequence in fuzzy logics, and discusses possible applications of the theory in approximate reasoning and decision-support systems. One of the book's main emphases is that GCT maintains the distinction between the three levels of language of a logic, namely the object language, the metalanguage and the metametalanguage, and thus avoids violating the principle of use and mention; it also shows, gathering evidence from existing fuzzy logics, that the problem of category mistake may arise when the distinction between levels is not maintained.
Model theory begins with an audacious idea: to consider statements about mathematical structures as mathematical objects of study in their own right. While inherently important as a tool of mathematical logic, it also enjoys connections to and applications in diverse branches of mathematics, including algebra, number theory and analysis. Despite this, traditional introductions to model theory assume a graduate-level background of the reader. In this innovative textbook, Jonathan Kirby brings model theory to an undergraduate audience. The highlights of basic model theory are illustrated through examples from specific structures familiar from undergraduate mathematics, paying particular attention to definable sets throughout. With numerous exercises of varying difficulty, this is an accessible introduction to model theory and its place in mathematics.