This volume provides a comprehensive collection of classic and contemporary readings in the philosophy of logic. The selections include some of the most important, technically exact, yet lucid and readable expositions of central concepts and controversies in philosophical logic. Areas of coverage include classical logic; truth; propositions and meaning; quantifiers and quantificational theory; validity, inference and entailment; and modality, intensionality and propositional attitude. Since the articles are written from a variety of perspectives, the reader is able to critically assess the background and current trends in philosophical applications of symbolic logic. The volume also examines the limitations of classical and nonstandard logics. The book complements "Philosophy of Mathematics: An Anthology" and "A Companion to Philosophical Logic," also edited by Dale Jacquette.
Forms of Truth and the Unity of Knowledge addresses a philosophical subject, the nature of truth and knowledge, but treats it in a way that draws on insights beyond the usual confines of modern philosophy. This ambitious collection includes contributions from established scholars in philosophy, theology, mathematics, chemistry, biology, psychology, literary criticism, history, and architecture. It represents an attempt to integrate the insights of these disciplines and to help them probe their own basic presuppositions and methods. The essays in Forms of Truth and the Unity of Knowledge are collected into five parts: the first dealing with the division of knowledge into multiple disciplines in Western intellectual history; the second with the foundational disciplines of epistemology, logic, and mathematics; the third with explanation in the natural sciences; the fourth with truth and understanding in disciplines of the humanities; and the fifth with art and theology. Contributors: Vittorio Hoesle, Keith Lehrer, Robert Hanna, Laurent Lafforgue, Thomas Nowak, Francisco J. Ayala, Zygmunt Pizlo, Osborne Wiggins, Allan Gibbard, Carsten Dutt, Aviezer Tucker, Nicola Di Cosmo, Michael Lykoudis, and Celia Deane-Drummond.
Every thoughtful person must ask, "What do I know?" The two most explosive fields, religion and politics, are notably filled with strident and conflicting claims. Close analysis in clear language reveals that no one knows what he or she is talking about. Because of the challenge of unexamined assumptions, of unclear cause-and-effect relationships, and of the rarity of reliable sources, a person who wants to be open-minded cannot avoid adopting skepticism as the least embarrassing philosophy. Some discoveries made in this book:
* Reason appears to prove nothing
* Intuition is probably a delusion
* Facts are slippery
* Religious people yearn for suicide
* Why socialism cannot work
* Where conservatives screwed up badly (as they admit)
* The equation STAR+2R+R3=GPS explains the cultural history of the world
* Shakespeare was a skeptic
* Dante's curious insight into love
* Passing the Magic Johnson test
* Tom DeLay does not realize that relativism is as American as apple pie
* Hamlet, who never existed, is more real than you or I
Here is a sample observation: "People believe in God because the Bible tells all about him, and they believe in the Bible because God wrote or inspired it. This is a classic case of the Fallacy of Circular Reasoning."
One hundred years ago, Russell and Whitehead published their epoch-making Principia Mathematica (PM), which was initially conceived as the second volume of Russell's Principles of Mathematics (PoM) that had appeared ten years before. No other works can be credited with having had such an impact on the development of logic and on philosophy in the twentieth century. However, until now, scholars have focused only on the first parts of the books, that is, on Russell's and Whitehead's theory of logic, set theory and arithmetic.
Is critical argumentation an effective way to overcome disagreement? And does the exchange of arguments bring opponents in a controversy closer to the truth? This study provides a new perspective on these pivotal questions. By means of multi-agent simulations, it investigates the truth and consensus-conduciveness of controversial debates. The book brings together research in formal epistemology and argumentation theory. Aside from its consequences for discursive practice, the work may have important implications for philosophy of science and the way we construe scientific rationality as well.
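The study's central device, multi-agent simulation of argumentative exchange, lends itself to a small illustration. The sketch below is not the book's framework: the single disputed claim, the random pairing of speaker and hearer, and the persuasiveness parameter are all simplifying assumptions made for this example.

```python
import random
import statistics

# Toy model of a controversial debate (illustrative only, not the book's framework):
# each agent holds a credence in a single disputed claim whose "true" value is
# fixed at 1.0. In every round a random speaker presents an argument and a
# random hearer moves part of the way toward the speaker's position.

TRUTH = 1.0           # assumed objective truth value of the disputed claim
N_AGENTS = 20
N_ROUNDS = 500
PERSUASIVENESS = 0.3  # how far a hearer moves toward a speaker per exchange

def run_debate(seed: int = 0) -> tuple[float, float]:
    rng = random.Random(seed)
    credences = [rng.random() for _ in range(N_AGENTS)]
    for _ in range(N_ROUNDS):
        speaker, hearer = rng.sample(range(N_AGENTS), 2)
        credences[hearer] += PERSUASIVENESS * (credences[speaker] - credences[hearer])
    consensus = statistics.pstdev(credences)              # 0 means full agreement
    truth_gap = abs(statistics.mean(credences) - TRUTH)   # distance from the truth
    return consensus, truth_gap

if __name__ == "__main__":
    consensus, truth_gap = run_debate()
    print(f"spread after debate: {consensus:.3f}, distance from truth: {truth_gap:.3f}")
```

In this toy model repeated exchange drives the spread of opinions toward zero while leaving the group average, and hence the distance from the truth, roughly where it started, which is precisely the kind of gap between consensus-conduciveness and truth-conduciveness the book examines with far richer dialectical structures.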
This monograph examines truth in fiction by applying the techniques of a naturalized logic of human cognitive practices. The author structures his project around two focal questions. What would it take to write a book about truth in literary discourse with reasonable promise of getting it right? What would it take to write a book about truth in fiction as true to the facts of lived literary experience as objectivity allows? It is argued that the most semantically distinctive feature of the sentences of fiction is that they are unambiguously true and false together. It is true that Sherlock Holmes lived at 221B Baker Street and also concurrently false that he did. A second distinctive feature of fiction is that the reader at large knows of this inconsistency and isn't in the least cognitively molested by it. Why, it is asked, would this be so? What would explain it? Two answers are developed. According to the no-contradiction thesis, the semantically tangled sentences of fiction are indeed logically inconsistent but not logically contradictory. According to the no-bother thesis, if the inconsistencies of fiction were contradictory, a properly contrived logic for the rational management of inconsistency would explain why readers at large are not thrown off cognitive stride by their embrace of those contradictions. As developed here, the account of fiction suggests the presence of an underlying three- or four-valued dialetheic logic. The author shows this to be a mistaken impression. There are only two truth-values in his logic of fiction. The naturalized logic of Truth in Fiction jettisons some of the standard assumptions and analytical tools of contemporary philosophy, chiefly because the neurotypical linguistic and cognitive behaviour of humanity at large is at variance with them. Using the resources of a causal response epistemology in tandem with the naturalized logic, the theory produced here is data-driven, empirically sensitive, and open to a circumspect collaboration with the empirical sciences of language and cognition.
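The claim that a sentence of fiction can be unambiguously true and false together, while the logic retains only the two classical truth-values, can be pictured with a small Logic-of-Paradox-style evaluation in which each sentence receives a non-empty subset of {T, F}. This is only an illustrative sketch, not the naturalized logic developed in the monograph; the example sentences and the connective rules are the standard LP ones.

```python
# Illustrative LP-style ("glutty") evaluation: each atomic sentence gets a
# non-empty subset of the two classical values {T, F}. A "glut" such as
# {T, F} is both true and false, yet no third truth-value is introduced.
# This is a sketch of the general idea, not the logic developed in the book.

T, F = "T", "F"

# Hypothetical valuation for two sentences about the Holmes stories.
valuation = {
    "Holmes lived at 221B Baker Street": {T, F},   # true in the story, false of London
    "Holmes was a dragon": {F},                    # simply false
}

def neg(v: set) -> set:
    """Negation swaps truth and falsity; a glut stays a glut."""
    out = set()
    if F in v:
        out.add(T)
    if T in v:
        out.add(F)
    return out

def conj(v1: set, v2: set) -> set:
    """A conjunction is true if both conjuncts are true, false if either is false."""
    out = set()
    if T in v1 and T in v2:
        out.add(T)
    if F in v1 or F in v2:
        out.add(F)
    return out

s = "Holmes lived at 221B Baker Street"
glut = conj(valuation[s], neg(valuation[s]))   # "p and not-p" about a fiction
print(glut)   # contains both 'T' and 'F': true and false together, with only two values
```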
How science can convey a profound sense of wonder, connectedness, and optimism about the human condition.
Truth Through Proof defends an anti-platonist philosophy of mathematics derived from game formalism. Classic formalists claimed implausibly that mathematical utterances are truth-valueless moves in a game. Alan Weir aims to develop a more satisfactory successor to game formalism utilising a widely accepted, broadly neo-Fregean framework, in which the proposition expressed by an utterance is a function of both sense and background circumstance. This framework allows for sentences whose truth-conditions are not representational, which are made true or false by conditions residing in the circumstances of utterances but not transparently in the sense.
This book presents the current state of the art regarding the application of logical tools to the problems of theory and practice of lawmaking. It shows how contemporary logic may be useful in the analysis of legislation, legislative drafting and legal reasoning in different contexts of lawmaking. Elaborations of the process of lawmaking have variously emphasised its political, social or economic aspects. Yet despite strong interest in logical analyses of law, questions remain about the role of logical tools in lawmaking. This volume attempts to bridge that gap, or at least to narrow it, drawing together some important research problems, and some possible solutions, as seen through the work of leading contemporary academics. The volume encompasses 20 chapters written by authors from 16 countries and presents diversified views on the understanding of logic (from strict mathematical approaches to informal, argumentative ones) and differentiated choices concerning the aspects of lawmaking taken into account. The book presents a broad set of perspectives, insights and results in the emerging field of research devoted to the logical analysis of the creation of law. How does logic inform lawmaking? Are legal systems consistent and complete? How can legal rules be represented by means of formal calculi and visualization techniques? Does the structure of statutes or of legal systems resemble the structure of deductive systems? What are the logical relations between the basic concepts of jurisprudence that constitute the system of law? How are theories of legal interpretation relevant to the process of legislation? How might the statutory text be analysed by means of contemporary computer programs? These and other questions, ranging from the theoretical to the immediately practical, are addressed in this definitive collection.
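One of the questions listed above, how legal rules might be represented by means of formal calculi, can be given a miniature illustration. The rule format, the predicates and the facts below are invented for this sketch and are not drawn from the volume; real legislative drafting involves far richer (for example deontic or defeasible) calculi.

```python
# Miniature illustration of representing statutory rules as formal
# "conditions -> conclusion" pairs and deriving legal consequences by
# naive forward chaining. The rules and facts are invented for this sketch.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    conditions: frozenset   # all conditions must hold for the rule to fire
    conclusion: str

rules = [
    Rule(frozenset({"person(x)", "age_over_18(x)"}), "may_vote(x)"),
    Rule(frozenset({"may_vote(x)", "registered(x)"}), "must_be_listed(x)"),
]

facts = {"person(x)", "age_over_18(x)", "registered(x)"}

def derive(facts: set, rules: list) -> set:
    """Apply rules until no new conclusions follow (a naive fixed point)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            if rule.conditions <= derived and rule.conclusion not in derived:
                derived.add(rule.conclusion)
                changed = True
    return derived

print(derive(facts, rules) - facts)   # {'may_vote(x)', 'must_be_listed(x)'}
```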
In a fragment entitled Elementa Nova Matheseos Universalis (1683?) Leibniz writes "the mathesis [...] shall deliver the method through which things that are conceivable can be exactly determined"; in another fragment he takes the mathesis to be "the science of all things that are conceivable." Leibniz considers all mathematical disciplines as branches of the mathesis and conceives the mathesis as a general science of forms applicable not only to magnitudes but to every object that exists in our imagination, i.e. that is possible at least in principle. As a general science of forms the mathesis investigates possible relations between "arbitrary objects" ("objets quelconques"). It is an abstract theory of combinations and relations among objects whatsoever. In 1810 the mathematician and philosopher Bernard Bolzano published a booklet entitled Contributions to a Better-Grounded Presentation of Mathematics. There is, according to him, a certain objective connection among the truths that are germane to a certain homogeneous field of objects: some truths are the "reasons" ("Gründe") of others, and the latter are "consequences" ("Folgen") of the former. The reason-consequence relation seems to be the counterpart of causality at the level of a relation between true propositions. A rigorous proof is characterized in this context as a proof that shows the reason of the proposition that is to be proven. Requirements imposed on rigorous proofs seem to anticipate normalization results in current proof theory. The contributors to Mathesis Universalis, Computability and Proof, leading experts in the fields of computer science, mathematics, logic and philosophy, show the evolution of these and related ideas, exploring topics in proof theory, computability theory, intuitionistic logic, constructivism and reverse mathematics, delving deeply into a contextual examination of the relationship between mathematical rigor and demands for simplification.
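The remark that Bolzano's requirements on rigorous proofs anticipate normalization can be made concrete with the stock proof-theoretic example of a detour, an introduction immediately followed by the corresponding elimination, and its removal; this is the standard textbook illustration rather than an example taken from the volume.

```latex
% A detour in natural deduction: conjunction introduction immediately
% followed by conjunction elimination. Normalization deletes it, so the
% normal proof of A uses only material that genuinely grounds A, close in
% spirit to Bolzano's demand that a rigorous proof exhibit the "reason"
% of what it proves.
\[
\dfrac{\dfrac{\dfrac{\mathcal{D}_1}{A}\qquad\dfrac{\mathcal{D}_2}{B}}{A \wedge B}\;(\wedge\text{I})}{A}\;(\wedge\text{E})
\quad\rightsquigarrow\quad
\dfrac{\mathcal{D}_1}{A}
\]
```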
This book contains 20 essays tracing the work of David Zarefsky, a leading North American scholar of argumentation from a rhetorical perspective. The essays cohere around four general themes: objectives for studying argumentation rhetorically, approaches to the rhetorical study of argumentation, patterns and schemes of rhetorical argumentation, and case studies illustrating the potential of studying argumentation rhetorically. These articles are drawn from across Zarefsky's 45-year career. Many of these articles originally appeared in publications that are difficult to access today, and this collection brings the reader up to date on the topic. Zarefsky's scholarship focuses on the role of language in political argumentation, the ways in which argumentation creates public knowledge and belief, the influence of framing and context on what is said and understood, the deployment of particular patterns and schemes of argumentation in public reasoning, and the influence of debate on politics and governance. All these topics are addressed in this book. Each of the conceptual essays includes a brief application to specific cases, and five extended case studies are also presented in this volume. The case studies cover different themes: two explore famous political debates, the third focuses on presidential rhetoric across the course of United States history, the fourth on the arguments for liberalism at a time of political polarization, and the fifth on the contemporary effort to engage the United States with the Muslim world. This book is of interest to scholars in the fields of philosophy, logic, law, philosophy of law, and legal history. The range of topics and concepts addressed, the interplay of concepts and cases, and the unifying perspective of rhetorical argumentation make this book a valuable read for students of argumentative practice, whether rhetorical or otherwise.
This book provides a critical examination of how the choice of what to believe is represented in the standard model of belief change. In particular the use of possible worlds and infinite remainders as objects of choice is critically examined. Descriptors are introduced as a versatile tool for expressing the success conditions of belief change, addressing both local and global descriptor revision. The book presents dynamic descriptors such as Ramsey descriptors that convey how an agent's beliefs tend to be changed in response to different inputs. It also explores sentential revision and demonstrates how local and global operations of revision by a sentence can be derived as a special case of descriptor revision. Lastly, the book examines revocation, a generalization of contraction in which a specified sentence is removed in a process that may possibly also involve the addition of some new information to the belief set.
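The notion of a descriptor as a success condition can be given a toy illustration: a descriptor such as "believe p" or "do not believe q" picks out which candidate belief sets count as successful outcomes, and revision selects among them. The sketch below is a deliberately naive finite model, not Hansson's formal construction; the candidate sets, the selection rule and the lack of logical closure are all simplifications made for the example.

```python
# Toy illustration of descriptor revision over a finite repertoire of belief sets.
# A descriptor is a success condition on belief sets, e.g. "believe p" or
# "do not believe q"; revision selects, among the candidates that satisfy the
# descriptor, one that differs least from the current beliefs.
# This is a simplified sketch, not the book's formal framework.

def believes(belief_set: frozenset, sentence: str) -> bool:
    return sentence in belief_set   # naive: no logical closure

# Descriptors as predicates on belief sets.
def B(sentence):        # success condition: the sentence is believed
    return lambda bs: believes(bs, sentence)

def not_B(sentence):    # success condition: the sentence is not believed
    return lambda bs: not believes(bs, sentence)

def revise(current: frozenset, descriptor, candidates: list) -> frozenset:
    """Pick the admissible candidate closest to the current beliefs."""
    admissible = [bs for bs in candidates if descriptor(bs)]
    if not admissible:
        return current   # no successful outcome available: keep the old beliefs
    return min(admissible, key=lambda bs: len(bs ^ current))  # symmetric difference

candidates = [frozenset(s) for s in ({"p", "q"}, {"p"}, {"q"}, set())]
current = frozenset({"p", "q"})

print(revise(current, not_B("q"), candidates))   # frozenset({'p'})
print(revise(current, B("p"), candidates))       # frozenset({'p', 'q'}), unchanged
```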
This volume documents the 17th Münster Lectures in Philosophy with Susan Haack, the prominent contemporary philosopher. It contains an original, programmatic article by Haack on her overall philosophical approach, entitled 'The Fragmentation of Philosophy, the Road to Reintegration'. In addition, the volume includes seven papers on various aspects of Haack's philosophical work as well as her replies to the papers. Susan Haack has deeply influenced many of the debates in contemporary philosophy. In her vivid and accessible way, she has made ground-breaking contributions covering a wide range of topics, from logic, metaphysics and epistemology to pragmatism and the philosophy of science and law. In her work, Haack has always been very sensitive in detecting subtle differences. The distinctions she has introduced reveal what lies at the core of philosophical controversies, and show the problems that exist with established views. In order to resolve these problems, Haack has developed some 'middle-course approaches'. One example of this is her famous 'Foundherentism', a theory of justification that includes elements from both the rival theories of Foundationalism and Coherentism. Haack herself has offered the best description of her work, calling herself a 'passionate moderate'.
This collection of papers, published in honour of Hector J. Levesque on the occasion of his 60th birthday, addresses a number of core areas in the field of knowledge representation and reasoning. In a broad sense, the book is about knowledge and belief, tractable reasoning, and reasoning about action and change. More specifically, the book contains contributions to Description Logics, the expressiveness of knowledge representation languages, limited forms of inference, satisfiability (SAT), the logical foundations of BDI architectures, only-knowing, belief revision, planning, causation, the situation calculus, the action language Golog, and cognitive robotics.
In 1945 Alonzo Church issued a pair of referee reports in which he anonymously conveyed to Frederic Fitch a surprising proof showing that wherever there is (empirical) ignorance there is also logically unknowable truth. Fitch published this and a generalization of the result in 1963. Ever since, philosophers have been attempting to understand the significance and address the counter-intuitiveness of this, the so-called paradox of knowability.
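The proof the blurb alludes to is short enough to state. In the standard reconstruction (using a knowledge operator K and the possibility operator), the knowability principle together with the existence of an unknown truth collapses into the claim that every truth is already known; this is the textbook form of the argument rather than the text of Church's original referee reports.

```latex
% Textbook reconstruction of the Church-Fitch argument (a standard presentation,
% not the original referee reports).
\begin{align*}
&\text{(KP)}   && \forall p\,\bigl(p \rightarrow \Diamond K p\bigr) && \text{every truth is knowable}\\
&\text{(NonO)} && q \wedge \neg K q && \text{some truth $q$ is in fact unknown}\\
&\text{(1)}    && \Diamond K(q \wedge \neg K q) && \text{from KP with } p := q \wedge \neg K q\\
&\text{(2)}    && K(q \wedge \neg K q) \vdash K q \wedge \neg K q \vdash \bot && \text{$K$ distributes over $\wedge$ and is factive}\\
&\text{(3)}    && \neg \Diamond K(q \wedge \neg K q) && \text{what entails $\bot$ cannot possibly be known}\\
&\text{(4)}    && \forall p\,(p \rightarrow K p) && \text{(1) and (3) clash, so no truth can be unknown}
\end{align*}
```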
Logical form has always been a prime concern for philosophers belonging to the analytic tradition. For at least one century, the study of logical form has been widely adopted as a method of investigation, relying on its capacity to reveal the structure of thoughts or the constitution of facts. This book focuses on the very idea of logical form, which is directly relevant to any principled reflection on that method. Its central thesis is that there is no such thing as a correct answer to the question of what logical form is: two significantly different notions of logical form are needed to fulfill two major theoretical roles that pertain respectively to logic and to semantics. This thesis has a negative and a positive side. The negative side is that a deeply rooted presumption about logical form turns out to be overly optimistic: there is no unique notion of logical form that can play both roles. The positive side is that the distinction between two notions of logical form, once properly spelled out, sheds light on some fundamental issues concerning the relation between logic and language.
This meticulous critical assessment of the ground-breaking work of philosopher Stanisław Leśniewski focuses exclusively on primary texts and explores the full range of output by one of the master logicians of the Lvov-Warsaw school. The author's nuanced survey eschews secondary commentary, analyzing Leśniewski's core philosophical views and evaluating the formulations that were to have such a profound influence on the evolution of mathematical logic. One of the undisputed leaders of the cohort of brilliant logicians that congregated in Poland in the early twentieth century, Leśniewski was a guide and mentor to a generation of celebrated analytical philosophers (Alfred Tarski was his PhD student). His primary achievement was a system of foundational mathematical logic intended as an alternative to the Principia Mathematica of Alfred North Whitehead and Bertrand Russell. Its three strands, 'protothetic', 'ontology', and 'mereology', are detailed in discrete sections of this volume, alongside a wealth of other chapters grouped to provide the fullest possible coverage of Leśniewski's academic output. With material on his early philosophical views, his contributions to set theory and his work on nominalism and higher-order quantification, this book offers a uniquely expansive critical commentary on one of analytical philosophy's great pioneers.
This volume presents recent advances in philosophical logic with chapters focusing on non-classical logics, including paraconsistent logics, substructural logics, modal logics of agency and other modal logics. The authors cover themes such as the knowability paradox, tableaux and sequent calculi, natural deduction, definite descriptions, identity, truth, dialetheism and possible worlds semantics. The developments presented here focus on challenging problems in the specification of fundamental philosophical notions, as well as presenting new techniques and tools, thereby contributing to the development of the field. Each chapter contains a bibliography, to assist the reader in making connections in the specific areas covered. Thus this work provides both a starting point for further investigations into philosophical logic and an update on advances, techniques and applications in a dynamic field. The chapters originate from papers presented during the "Trends in Logic XI" conference at the Ruhr University Bochum, June 2012.