Is mathematics 'entangled' with its various formalisations? Or are the central concepts of mathematics largely insensitive to formalisation, or 'formalism free'? What is the semantic point of view and how is it implemented in foundational practice? Does a given semantic framework always have an implicit syntax? Inspired by what she calls the 'natural language moves' of Goedel and Tarski, Juliette Kennedy considers what roles the concepts of 'entanglement' and 'formalism freeness' play in a range of logical settings, from computability and set theory to model theory and second order logic, to logicality, developing an entirely original philosophy of mathematics along the way. The treatment is historically, logically and set-theoretically rich, and topics such as naturalism and foundations receive their due, but now with a new twist.
Fred Stoutland was a major figure in the philosophy of action and philosophy of language. This collection brings together essays on truth, language, action and mind and thus provides an important summary of many key themes in Stoutland's own work, as well as offering valuable perspectives on key issues in contemporary philosophy.
The perception of what he calls 'aspects' preoccupied Wittgenstein and gave him considerable trouble in his final years. The Wittgensteinian aspect defies any number of traditional philosophical dichotomies: the aspect is neither subjective (inner, metaphysically private) nor objective; it presents perceivable unity and sense that are (arguably) not (yet) conceptual; it is 'subject to the will', but at the same time is normally taken to be genuinely revelatory of the object perceived under it. This Element begins with a grammatical and phenomenological characterization of Wittgensteinian 'aspects'. It then challenges two widespread ideas: that aspects are to be identified with concepts; and that aspect perception has a continuous version that is characteristic of (normal) human perception. It concludes by proposing that aspect perception brings to light the distinction between the world as perceived and the world as objectively construed, and the role we play in the constitution of the former.
In this book we deal with combinations of concepts defining individuals in the Talmud. Consider for example Yom Kippur and Shabbat. Each concept has its own body of laws. Reality forces us to combine them when they occur on the same day. This is a case of "Identity Merging." As the combined body of laws may be inconsistent, we need a belief revision mechanism to reconcile the conflicting norms. The Talmud offers three options: 1. Take the union of the two sets of rules, side by side. 2. Resolve the conflicts using further meta-level Talmudic principles (which are new and of value to present-day Artificial Intelligence). 3. Regard the combined concept as a new entity with its own Halachic norms, and create new norms for it out of the existing ones. This book offers a clear and precise logical model showing how the Talmud deals with these options.
W. V. Quine was one of the most influential figures of twentieth-century American analytic philosophy. Although he wrote predominantly in English, in Brazil in 1942 he gave a series of lectures on logic and its philosophy in Portuguese, subsequently published as the book O Sentido da Nova Logica. The book has never before been fully translated into English, and this volume is the first to make its content accessible to Anglophone philosophers. Quine would go on to develop revolutionary ideas about semantic holism and ontology, and this book provides a snapshot of his views on logic and language at a pivotal stage of his intellectual development. The volume also includes an essay on logic which Quine also published in Portuguese, together with an extensive historical-philosophical essay by Frederique Janssen-Lauret. The valuable and previously neglected works first translated in this volume will be essential for scholars of twentieth-century philosophy.
The proceedings of the Los Angeles Caltech-UCLA 'Cabal Seminar' were originally published in the 1970s and 1980s. Large Cardinals, Determinacy and Other Topics is the final volume in a series of four books collecting the seminal papers from the original volumes together with extensive unpublished material, new papers on related topics and discussion of research developments since the publication of the original volumes. This final volume contains Parts VII and VIII of the series. Part VII focuses on 'Extensions of AD, models with choice', while Part VIII ('Other topics') collects material important to the Cabal that does not fit neatly into one of its main themes. These four volumes will be a necessary part of the book collection of every set theorist.
Numbers and other mathematical objects are exceptional in having no locations in space or time and no causes or effects in the physical world. This makes it difficult to account for the possibility of mathematical knowledge, leading many philosophers to embrace nominalism, the doctrine that there are no abstract entities, and to embark on ambitious projects for interpreting mathematics so as to preserve the subject while eliminating its objects. A Subject With No Object cuts through a host of technicalities that have obscured previous discussions of these projects, and presents clear, concise accounts, with minimal prerequisites, of a dozen strategies for nominalistic interpretation of mathematics, thus equipping the reader to evaluate each and to compare different ones. The authors also offer critical discussion, rare in the literature, of the aims and claims of nominalistic interpretation, suggesting that it is significant in a very different way from that usually assumed.
Mathematics plays a central role in much of contemporary science, but philosophers have struggled to understand what this role is or how significant it might be for mathematics and science. In this book Christopher Pincock tackles this perennial question in a new way by asking how mathematics contributes to the success of our best scientific representations. In the first part of the book this question is posed and sharpened using a proposal for how we can determine the content of a scientific representation. Several different sorts of contributions from mathematics are then articulated. Pincock argues that each contribution can be understood as broadly epistemic, so that what mathematics ultimately contributes to science is best connected with our scientific knowledge.
A relative change occurs when some item changes a relation. This Element examines how Plato, Aristotle, the Stoics and Sextus Empiricus approached relative change. Relative change is puzzling because the following three propositions each seem true but cannot be true together: (1) No relative changes are intrinsic changes; (2) Only intrinsic changes are proper changes; (3) Some relative changes are proper changes. Plato's Theaetetus and Phaedo both address relative change. I argue that these dialogues assume relative changes to be intrinsic changes, so denying (1). Aristotle responds differently, by denying (3) that relative change is proper change. The Stoics claimed that some non-intrinsic changes are changes (denying (2)). Finally, I discuss Sextus' argument that relative change shows that there are no relatives at all.
Ideas about relativity underlie much ancient Greek philosophy, from Protagorean relativism, to Plato's theory of Forms, Aristotle's category scheme, and relational logic. In Ancient Relativity Matthew Duncombe explores how ancient philosophers, particularly Plato, Aristotle, the Stoics, and Sextus Empiricus, understood the phenomenon and how their theories of relativity affected, and were affected by, their broader philosophical outlooks. He argues that ancient philosophers shared a close-knit family of views referred to as 'constitutive relativity', whereby a relative is not simply linked by a relation but is constituted by it. Plato exploits this view in some key arguments concerning the Forms and the partition of the soul. Aristotle adopts the constitutive view in his discussions of relativity in Categories 7 and the Topics and retains it in Metaphysics Delta 15. Duncombe goes on to examine the role relativity plays in Stoic philosophy, especially Stoic physics and metaphysics, and the way Sextus Empiricus thinks about relativity, which does not appeal to the nature of relatives but rather to how we conceive of things as correlative.
This book clarifies the idea of critical thinking by investigating the 'critical' practices of academics across a range of disciplines. Drawing on key theorists - Wittgenstein, Geertz, Williams, Halliday - and using a 'textographic' approach, the book explores how the concept of critical thinking is understood by academics and also how it is constructed discursively in the texts and practices they employ in their teaching. Critical thinking is one of the most widely discussed concepts in debates on university learning. For many, the idea of teaching students to be critical thinkers characterizes more than anything else the overriding purpose of 'higher education'. But whilst there is general agreement about its importance as an educational ideal, there is surprisingly little agreement about what the concept means exactly. Also at issue is how and what students need to be taught in order to be properly critical in their field. This searching monograph seeks answers to these important questions.
Our beliefs come in degrees. I'm 70% confident it will rain tomorrow, and 0.001% sure my lottery ticket will win. What's more, we think these degrees of belief should abide by certain principles if they are to be rational. For instance, you shouldn't believe that a person's taller than 6ft more strongly than you believe that they're taller than 5ft, since the former entails the latter. In Dutch Book arguments, we try to establish the principles of rationality for degrees of belief by appealing to their role in guiding decisions. In particular, we show that degrees of belief that don't satisfy the principles will always guide action in some way that is bad or undesirable. In this Element, we present Dutch Book arguments for the principles of Probabilism, Conditionalization, and the Reflection Principle, among others, and we formulate and consider the most serious objections to them.
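The entailment point in the blurb above can be made concrete. The following is a minimal sketch, not taken from the Element itself: it assumes illustrative credences of 0.6 in "taller than 6ft" (A) and 0.5 in "taller than 5ft" (B), which violate monotonicity since A entails B, and shows a simple pair of bets, priced at those credences, on which the agent loses money in every possible world.

```python
# Illustrative Dutch Book against credences violating monotonicity.
# A = "taller than 6ft", B = "taller than 5ft"; A entails B,
# yet this agent's credence in A exceeds their credence in B.
cr_A = 0.6
cr_B = 0.5

def net_payoff(A: bool, B: bool) -> float:
    """Agent buys a $1 bet on A at its fair price cr_A,
    and sells a $1 bet on B at its fair price cr_B."""
    buy_A = (1.0 if A else 0.0) - cr_A   # pays cr_A up front, collects $1 if A
    sell_B = cr_B - (1.0 if B else 0.0)  # collects cr_B up front, pays $1 if B
    return buy_A + sell_B

# Possible worlds: since A entails B, the world (A=True, B=False) cannot occur.
worlds = [(True, True), (False, True), (False, False)]
payoffs = {w: net_payoff(*w) for w in worlds}
print(payoffs)  # the agent's net payoff is negative in every possible world
```

Both bets are individually fair by the agent's own lights, yet jointly they guarantee a loss; this is the sense in which the blurb says irrational credences "will always guide action in some way that is bad or undesirable".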
Wittgenstein's 'middle period' is often seen as a transitional phase connecting his better-known early and later philosophies. The fifteen essays in this volume focus both on the distinctive character of his teaching and writing in the 1930s, and on its pivotal importance for an understanding of his philosophy as a whole. They offer wide-ranging perspectives on the central issue of how best to identify changes and continuities in his philosophy during those years, as well as on particular topics in the philosophy of mind, religion, ethics, aesthetics, and the philosophy of mathematics. The volume will be valuable for all who are interested in this formative period of Wittgenstein's development.
This book returns to the discussion in volume 1 on analogy and induction, and analyses their substance. The first part distinguishes between two kinds of logic: one based on the union of common features, the other based on the synthesis of different features. In the second part of the book we propose a formal scheme for the synthesis of concepts. The third part analyses various mechanisms for kidushin and kinyan, which form a mathematical group.
Luciano Floridi presents an innovative approach to philosophy, conceived as conceptual design. He explores how we make, transform, refine, and improve the objects of our knowledge. His starting point is that reality provides the data, to be understood as constraining affordances, and we transform them into information, like semantic engines. Such transformation or repurposing is not equivalent to portraying, or picturing, or photographing, or photocopying anything. It is more like cooking: the dish does not represent the ingredients, it uses them to make something else out of them, yet the reality of the dish and its properties hugely depend on the reality and the properties of the ingredients. Models are not representations understood as pictures, but interpretations understood as data elaborations, of systems. Thus, he articulates and defends the thesis that knowledge is design and philosophy is the ultimate form of conceptual design. Although entirely independent of Floridi's previous books, The Philosophy of Information (OUP 2011) and The Ethics of Information (OUP 2013), The Logic of Information both complements the existing volumes and presents new work on the foundations of the philosophy of information.
A comprehensive collection of historical readings in the philosophy of mathematics and a selection of influential contemporary work, this much-needed introduction reveals the rich history of the subject. An Historical Introduction to the Philosophy of Mathematics: A Reader brings together an impressive collection of primary sources from ancient and modern philosophy. Arranged chronologically and featuring introductory overviews explaining technical terms, this accessible reader is easy-to-follow and unrivaled in its historical scope. With selections from key thinkers such as Plato, Aristotle, Descartes, Hume and Kant, it connects the major ideas of the ancients with contemporary thinkers. A selection of recent texts from philosophers including Quine, Putnam, Field and Maddy offering insights into the current state of the discipline clearly illustrates the development of the subject. Presenting historical background essential to understanding contemporary trends and a survey of recent work, An Historical Introduction to the Philosophy of Mathematics: A Reader is required reading for undergraduates and graduate students studying the philosophy of mathematics and an invaluable source book for working researchers.
According to Bayesian epistemology, rational learning from experience is consistent learning; that is, learning should incorporate new information consistently into one's old system of beliefs. Simon M. Huttegger argues that this core idea can be transferred to situations where the learner's informational inputs are much more limited than Bayesianism assumes, thereby significantly expanding the reach of a Bayesian type of epistemology. What results from this is a unified account of probabilistic learning in the tradition of Richard Jeffrey's 'radical probabilism'. Along the way, Huttegger addresses a number of debates in epistemology and the philosophy of science, including the status of prior probabilities, whether Bayes' rule is the only legitimate form of learning from experience, and whether rational agents can have sustained disagreements. His book will be of interest to students and scholars of epistemology, of game and decision theory, and of cognitive, economic, and computer sciences.
This book analyses and defends the deflationist claim that there is nothing deep about our notion of truth. According to this view, truth is a 'light' and innocent concept, devoid of any essence which could be revealed by scientific inquiry. Cezary Cieslinski considers this claim in light of recent formal results on axiomatic truth theories, which are crucial for understanding and evaluating the philosophical thesis of the innocence of truth. Providing an up-to-date discussion and original perspectives on this central and controversial issue, his book will be important for those with a background in logic who are interested in formal truth theories and in current philosophical debates about the deflationary conception of truth.
While probabilistic logics in principle might be applied to solve a range of problems, in practice they are rarely applied - perhaps because they seem disparate, complicated, and computationally intractable. This programmatic book argues that several approaches to probabilistic logic fit into a simple unifying framework in which logically complex evidence is used to associate probability intervals or probabilities with sentences. Specifically, Part I shows that there is a natural way to present a question posed in probabilistic logic, and that various inferential procedures provide semantics for that question, while Part II shows that there is the potential to develop computationally feasible methods to mesh with this framework. The book is intended for researchers in philosophy, logic, computer science and statistics. A familiarity with mathematical concepts and notation is presumed, but no advanced knowledge of logic or probability theory is required.
This book introduces the theory of graded consequence (GCT) and its mathematical formulation. It also compares the notion of graded consequence with other notions of consequence in fuzzy logics, and discusses possible applications of the theory in approximate reasoning and decision-support systems. One of the main points this book emphasizes is that GCT maintains the distinction between the three different levels of language in a logic, namely object language, metalanguage and metametalanguage, and thus avoids violating the principle of use and mention; it also shows, gathering evidence from existing fuzzy logics, that the problem of category mistake may arise when the distinction between levels is not maintained.
From academic writing to personal and public discourse, the need for good arguments and better ways of arguing is greater than ever before. This timely fifth edition of A Rulebook for Arguments sharpens an already-classic text, adding updated examples and a new chapter on public debates that provides rules for the etiquette and ethics of sound public dialogue as well as clear and sound thinking in general.
Model theory begins with an audacious idea: to consider statements about mathematical structures as mathematical objects of study in their own right. While inherently important as a tool of mathematical logic, it also enjoys connections to and applications in diverse branches of mathematics, including algebra, number theory and analysis. Despite this, traditional introductions to model theory assume a graduate-level background of the reader. In this innovative textbook, Jonathan Kirby brings model theory to an undergraduate audience. The highlights of basic model theory are illustrated through examples from specific structures familiar from undergraduate mathematics, paying particular attention to definable sets throughout. With numerous exercises of varying difficulty, this is an accessible introduction to model theory and its place in mathematics.
Peter Adamson and Jonardon Ganeri present a lively introduction to one of the world's richest intellectual traditions: the philosophy of classical India. They begin with the earliest extant literature, the Vedas, and the explanatory works that these inspired, known as Upanisads. They also discuss other famous texts of classical Vedic culture, especially the Mahabharata and its most notable section, the Bhagavad-Gita, alongside the rise of Buddhism and Jainism. In this opening section, Adamson and Ganeri emphasize the way that philosophy was practiced as a form of life in search of liberation from suffering. Next, the pair move on to the explosion of philosophical speculation devoted to foundational texts called 'sutras,' discussing such traditions as the logical and epistemological Nyaya school, the monism of Advaita Vedanta, and the spiritual discipline of Yoga. In the final section of the book, they chart further developments within Buddhism, highlighting Nagarjuna's radical critique of 'non-dependent' concepts and the no-self philosophy of mind found in authors like Dignaga, and within Jainism, focusing especially on its 'standpoint' epistemology. Unlike other introductions that cover the main schools and positions in classical Indian philosophy, Adamson and Ganeri's lively guide also pays attention to philosophical themes such as non-violence, political authority, and the status of women, while considering textual traditions typically left out of overviews of Indian thought, like the Carvaka school, Tantra, and aesthetic theory as well. Adamson and Ganeri conclude by focusing on the much-debated question of whether Indian philosophy may have influenced ancient Greek philosophy and, from there, evaluate the impact that this area of philosophy had on later Western thought.