This volume contains a collection of research papers centered around the concept of quantifier. Recently this concept has become the central point of research in logic. It is one of the important logical concepts whose exact domain and applications have so far been insufficiently explored, especially in the area of inferential and semantic properties of languages. It should thus remain the central point of research in the future. Moreover, during the last twenty years generalized quantifiers and logical techniques based on them have proved their utility in various applications. The example of natural language semantics has been particularly striking. For a long time it was believed that elementary logic, also called first-order logic, was an adequate theory of the logical forms of natural language sentences. Recently it has been accepted that the semantics of many natural language constructions cannot be properly represented in elementary logic. It has turned out, however, that they can be described by means of generalized quantifiers. As far as computational applications of logic are concerned, particularly interesting are semantics restricted to finite models. Under this restriction elementary logic loses several of its advantages, such as axiomatizability and compactness. And for various purposes we can use equally well some semantically richer languages, of which generalized quantifiers offer the most universal methods of describing extensions of elementary logic. Moreover, we can look at generalized quantifiers as an explication of some specific mathematical concepts, e.g.
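The blurb's point, that some generalized quantifiers escape first-order logic yet remain trivially computable on finite models, can be illustrated with a small sketch of our own (not from the volume). In the Lindstrom style a quantifier is a relation between subsets of the domain; `most` and the variable names below are illustrative assumptions.

```python
# Illustrative sketch (not from the volume): the generalized quantifier
# "most", the standard example of a quantifier that is not first-order
# definable but is easy to evaluate on a finite model.

def most(domain, A, B):
    """'Most A are B' holds iff |A intersect B| > |A minus B|.

    The domain parameter is kept only to mirror the official
    definition of a quantifier as a relation on subsets of a domain.
    """
    return len(A & B) > len(A - B)

domain = set(range(10))
evens = {x for x in domain if x % 2 == 0}   # {0, 2, 4, 6, 8}
small = {x for x in domain if x < 8}        # {0, ..., 7}

print(most(domain, evens, small))  # True: four evens are small, only one is not
```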
Alonzo Church was undeniably one of the intellectual giants of the twentieth century. These articles are dedicated to his memory and illustrate the tremendous importance his ideas have had in logic, mathematics, computer science and philosophy. Discussions of some of these various contributions have appeared in The Bulletin of Symbolic Logic, and the interested reader is invited to seek details there. Here we just try to give some general sense of the scope, depth, and value of his work. Church is perhaps best known for the theorem, appropriately called "Church's Theorem", that there is no decision procedure for the logical validity of formulas of first-order logic. A decision procedure for that part of logic would have come near to fulfilling Leibniz's dream of a calculus that could be mechanically used to settle logical disputes. It was not to be. It could not be. What Church proved precisely is that there is no lambda-definable function that can in every case provide the right answer, 'yes' or 'no', to the question of whether or not any arbitrarily given formula is valid.
All humans can interpret sentences of their native language quickly and without effort. Working from the perspective of generative grammar, the contributors investigate three mental mechanisms widely assumed to underlie this ability: compositional semantics, implicature computation and presupposition computation. This volume brings together experts from semantics and pragmatics to advance the study of the interconnections between these three mechanisms. The contributions develop new insights into important empirical phenomena, for example approximation, free choice, accommodation, and exhaustivity effects.
"Logic," one of the central words in Western intellectual history, comprehends in its meaning such diverse things as the Aristotelian syllogistic, the scholastic art of disputation, the transcendental logic of the Kantian critique, the dialectical logic of Hegel, and the mathematical logic of the Principia Mathematica of Whitehead and Russell. The term "Formal Logic," following Kant, is generally used to distinguish formal logical reasonings, precisely as formal, from the remaining universal truths based on reason. (Cf. SCHOLZ, 1931.) A textbook example of a formal-logical inference, which from "Some men are philosophers" and "All philosophers are wise" concludes that "Some men are wise," is called formal because the validity of this inference depends only on the form of the given sentences - in particular, it does not depend on the truth or falsity of these sentences. (On the dependence of logic on natural language, English for example, compare Sections 1 and 8.) The form of a sentence like "Some men are philosophers" is that which remains preserved when the given predicates, here "men" and "philosophers," are replaced by arbitrary ones. The form itself can thus be represented by replacing the given predicates by variables. Variables are signs devoid of meaning, which may serve merely to indicate the place where meaningful constants (here the predicates) are to be inserted. As variables we shall use - as did Aristotle - letters, say P, Q and R, as variables for predicates.
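The form-dependence described here can be made concrete with a short brute-force check, a sketch of ours rather than anything from the text: the pattern "Some P are Q; all Q are R; therefore some P are R" stays valid under every substitution of predicate extensions over a small finite domain. The names `subsets` and `valid_form` are illustrative assumptions.

```python
# Hedged sketch: exhaustively verify that the syllogistic form holds
# for every interpretation of the predicate variables P, Q, R as
# subsets of a small finite domain.
from itertools import product

def subsets(domain):
    """Yield every subset of the (finite) domain."""
    domain = list(domain)
    for bits in product([False, True], repeat=len(domain)):
        yield {x for x, b in zip(domain, bits) if b}

def valid_form(domain):
    """True iff no interpretation makes the premises true and the conclusion false."""
    for P, Q, R in product(subsets(domain), repeat=3):
        some_PQ = bool(P & Q)   # "Some P are Q"
        all_QR = Q <= R         # "All Q are R"
        some_PR = bool(P & R)   # "Some P are R"
        if some_PQ and all_QR and not some_PR:
            return False        # counterexample found
    return True

print(valid_form({0, 1, 2}))  # True: the form is valid whatever P, Q, R mean
```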
The forms and scope of logic rest on assumptions of how language and reasoning connect to experience. In this volume an analysis of meaning and truth provides a foundation for studying modern propositional and predicate logics. Chapters on propositional logic, parsing propositions, and meaning, truth and reference give a basis for criteria that can be used to judge formalizations of ordinary language arguments. Over 120 worked examples of formalizations of propositions and arguments illustrate the scope and limitations of modern logic, as analyzed in chapters on identity, quantifiers, descriptive names, functions, and second-order logic. The chapter on second-order logic illustrates how different conceptions of predicates and propositions do not lead to a common basis for quantification over predicates, as they do for quantification over things. Notable for its clarity of presentation, and supplemented by many exercises, this volume is suitable for philosophers, linguists, mathematicians, and computer scientists who wish to better understand the tools they use in formalizing reasoning.
This collection of essays is dedicated to 'Joe' Karel Lambert. The contributors are all personally attached to Joe in some way or other, but they are definitely not the only ones. Whatever excuses there are - and there are some - the editors apologize to whomever they have neglected. But even so the collection displays how influential Karel Lambert has been, personally and through his teaching and his writings. The display is in alphabetical order, with one exception: Bas van Fraassen, being about the earliest student of Karel Lambert, opens the collection with some reminiscences. Naturally, one of the focal points of this volume is free logic: Lambert's logical thinking freed of ontological assumptions. Free logic is intimately connected with description theory. Bas van Fraassen gives a survey of the development of the area, and Charles Daniels points to difficulties with definite descriptions in modal contexts and stories. Peter Woodruff addresses the relation between free logic and supervaluation semantics, presenting a novel condition which recovers desirable metatheoretic properties for free logic under that semantics. Terence Parsons shows how free logic can be utilized in interpreting sentences as purporting to denote events (true ones succeed and false ones fail) and how this helps to understand natural language.
Quantifiers: Logics, Models and Computation is the first concentrated effort to give a systematic presentation of the main research results on the subject since the modern concept was formulated in the late '50s and early '60s. The majority of the papers are in the nature of a handbook. All of them are self-contained, at various levels of difficulty. The Introduction surveys the main ideas and problems encountered in the logical investigation of quantifiers. The Prologue, written by Per Lindstrom, presents the early history of the concept of generalised quantifiers. The volume then continues with a series of papers surveying various research areas, particularly those that are of current interest. Together they provide introductions to the subject from the points of view of mathematics, linguistics, and theoretical computer science. The present volume has been prepared in parallel with Quantifiers: Logics, Models and Computation, Volume Two: Contributions, which contains a collection of research papers on the subject in areas that are too fresh to be summarised. The two volumes are complementary. For logicians, mathematicians, philosophers, linguists and computer scientists. Suitable as a text for advanced undergraduate and graduate specialised courses in logic.
It is with great pleasure that we are presenting to the community the second edition of this extraordinary handbook. It has been over 15 years since the publication of the first edition and there have been great changes in the landscape of philosophical logic since then. The first edition has proved invaluable to generations of students and researchers in formal philosophy and language, as well as to consumers of logic in many applied areas. The main logic article in the Encyclopaedia Britannica 1999 described the first edition as 'the best starting point for exploring any of the topics in logic'. We are confident that the second edition will prove to be just as good. The first edition was the second handbook published for the logic community. It followed the North Holland one-volume Handbook of Mathematical Logic, published in 1977, edited by the late Jon Barwise. The four-volume Handbook of Philosophical Logic, published 1983-1989, came at a fortunate temporal junction in the evolution of logic. This was the time when logic was gaining ground in computer science and artificial intelligence circles. These areas were under increasing commercial pressure to provide devices which help and/or replace the human in his daily activity. This pressure required the use of logic in the modelling of human activity and organisation on the one hand and to provide the theoretical basis for the computer program constructs on the other.
Philosophy, Psychology, and Psychologism presents a remarkable diversity of contemporary opinions on the prospects of addressing philosophical topics from a psychological perspective. It considers the history and philosophical merits of psychologism, and looks systematically at psychologism in phenomenology, cognitive science, epistemology, logic, philosophy of language, philosophical semantics, and artificial intelligence. It juxtaposes many different philosophical standpoints, each supported by rigorous philosophical argument.
Mathematics depends on proofs, and proofs must begin somewhere, from some fundamental assumptions. For nearly a century, the axioms of set theory have played this role, so the question of how these axioms are properly judged takes on a central importance. Approaching the question from a broadly naturalistic or second-philosophical point of view, Defending the Axioms isolates the appropriate methods for such evaluations and investigates the ontological and epistemological backdrop that makes them appropriate. In the end, a new account of the objectivity of mathematics emerges, one refreshingly free of metaphysical commitments.
We who live in this post-modern late twentieth century culture are still children of dualism. For a variety of rather complex reasons we continue to split apart and treat as radical opposites body and spirit, medicine and religion, sacred and secular, private and public, love and justice, men and women. Though this is still our strong tendency, we are beginning to discover both the futility and the harm of such dualistic splitting. Peoples of many ancient cultures might smile at the belatedness of our discovery concerning the commonalities of medicine and religion. A cursory glance back at ancient Egypt, Samaria, Babylonia, Persia, Greece, and Rome would disclose a common thread - the close union of religion and medicine. Both were centrally concerned with healing, health, and wholeness. The person was understood as a unity of body, mind, and spirit. The priest and the physician frequently were combined in the same individual. One of the important contributions of this significant volume of essays is the sustained attack upon dualism. From a variety of vantage points, virtually all of the authors unmask the varied manifestations of dualism in religion and medicine, urging a more holistic approach. Since the editor has provided an excellent summary of each article, I shall not attempt to comment on specific contributions. Rather, I wish to highlight three broad themes which I find notable for theological ethics.
This book presents a collection of contributions ranging from related logics to applied paraconsistency. All of them are dedicated to Jair Minoro Abe, on the occasion of his sixtieth birthday. He is one of the experts in paraconsistent engineering, who developed the so-called annotated logics. The book includes important contributions on foundations and applications of paraconsistent logics in connection with engineering, mathematical logic, philosophical logic, computer science, physics, economics, and biology. It will be of interest to students and researchers working on engineering and logic.
Modal Logic is a branch of logic with applications in many related disciplines such as computer science, philosophy, linguistics and artificial intelligence. Over the last twenty years, in all of these neighbouring fields, modal systems have been developed that we call multi-dimensional. (Our definition of multi-dimensionality in modal logic is a technical one: we call a modal formalism multi-dimensional if, in its intended semantics, the universe of a model consists of states that are tuples over some more basic set.) This book treats such multi-dimensional modal logics in a uniform way, linking their mathematical theory to the research tradition in algebraic logic. We will define and discuss a number of systems in detail, focusing on such aspects as expressiveness, definability, axiomatics, decidability and interpolation. Although the book will be mathematical in spirit, we take care to give motivations from the disciplines mentioned earlier on.
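The parenthetical definition above, that a formalism is multi-dimensional when the states of its intended models are tuples over a more basic set, can be sketched in a few lines of our own (not the book's formalism): a two-dimensional product frame in which one diamond moves along the first coordinate and another along the second. All names (`R1`, `R2`, `diamond`) are illustrative assumptions.

```python
# Hedged sketch of a two-dimensional modal model: the universe consists
# of pairs over a basic set, and each modality acts on one coordinate.
from itertools import product

basic = {0, 1, 2}
states = set(product(basic, repeat=2))  # universe = tuples over the basic set

def R1(s, t):
    """Accessibility in dimension 1: change the first coordinate only."""
    return s[1] == t[1] and s[0] != t[0]

def R2(s, t):
    """Accessibility in dimension 2: change the second coordinate only."""
    return s[0] == t[0] and s[1] != t[1]

def diamond(R, extension):
    """States satisfying <>phi, given the set of states where phi holds."""
    return {s for s in states if any(R(s, t) for t in extension)}

phi = {(0, 0)}                    # phi true only at the pair (0, 0)
print(sorted(diamond(R1, phi)))   # [(1, 0), (2, 0)]
print(sorted(diamond(R2, phi)))   # [(0, 1), (0, 2)]
```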
The analytic/synthetic distinction looks simple. It is a distinction between two different kinds of sentence. Synthetic sentences are true in part because of the way the world is, and in part because of what they mean. Analytic sentences - like "all bachelors are unmarried" and "triangles have three sides" - are different. They are true in virtue of meaning, so no matter what the world is like, as long as the sentence means what it does, it will be true.
The modern discussion of the concept of truthlikeness was started in 1960. In his influential Word and Object, W. V. O. Quine argued that Charles Peirce's definition of truth as the limit of inquiry is faulty for the reason that the notion 'nearer than' is only "defined for numbers and not for theories." In his contribution to the 1960 International Congress for Logic, Methodology, and Philosophy of Science at Stanford, Karl Popper defended the opposite view by defining a comparative notion of verisimilitude for theories. The concept of verisimilitude was originally introduced by the Ancient sceptics to moderate their radical thesis of the inaccessibility of truth. But soon verisimilitudo, indicating likeness to the truth, was confused with probabilitas, which expresses an opinionative attitude weaker than full certainty. The idea of truthlikeness fell into disrepute also as a result of the careless, often confused and metaphysically loaded way in which many philosophers used - and still use - such concepts as 'degree of truth', 'approximate truth', 'partial truth', and 'approach to the truth'. Popper's great achievement was his insight that the criticism against truthlikeness - by those who urge that it is meaningless to speak about 'closeness to truth' - is based more on prejudice than argument.
This is an outline of a coherence theory of law. Its basic ideas are: reasonable support and weighing of reasons. All the rest is commentary. These words at the beginning of the preface of this book perfectly indicate what On Law and Reason is about. It is a theory about the nature of the law which emphasises the role of reason in the law and which refuses to limit the role of reason to the application of deductive logic. In 1989, when the first edition of On Law and Reason appeared, this book was groundbreaking for several reasons. It provided a rationalistic theory of the law in the language of analytic philosophy and based on a thorough understanding of the results, including technical ones, of analytic philosophy. That was not an obvious combination at the time of the book's first appearance and still is not. The result is an analytical rigor that is usually associated with positivist theories of the law, combined with a philosophical position that is not natural law in a strict sense, but which shares with it the emphasis on the role of reason in determining what the law is. If only for this rare combination, On Law and Reason still deserves careful study. On Law and Reason also foreshadowed and influenced a development in the field of Legal Logic that would take place in the nineties of the 20th century, namely the development of non-monotonic ('defeasible') logics for the analysis of legal reasoning. In the new Introduction to this second edition, this aspect is explored in some more detail.
We all engage in the process of reasoning, but we don't always pay attention to whether we are doing it well. This book offers the opportunity to practise reasoning in a clear-headed and critical way, with the aims of developing an awareness of the importance of reasoning well and of improving the reader's skill in analyzing and evaluating arguments. In this third edition, Anne Thomson has updated and revised the book to include fresh and topical examples which will guide students through the processes of critical reasoning in a clear and engaging way. In addition, two new chapters on evaluating the credibility of evidence and decision making and dilemmas will fully equip students to reason well. By the end of the book students should be able to:
Aristotle's "On Interpretation", a centrepiece of his logic, studies the relationship between conflicting pairs of statements. The first eight chapters, studied here, explain what statements are; they start from their basic components, the words, and work up to the character of opposed affirmations and negations. The 15,000 pages of the ancient Greek commentators on Aristotle, written mainly between 200 and 500 AD, constitute the largest corpus of extant Greek philosophical writing not translated into English or other European languages. This new series of translations, planned in 60 volumes, fills an important gap in the history of European thought.
Propositional Logics presents the history, philosophy, and mathematics of the major systems of propositional logic. Classical logic, modal logics, many-valued logics, intuitionism, paraconsistent logics, and dependent implication are examined in separate chapters. Each begins with a motivation in the originators' own terms, followed by the standard formal semantics, syntax, and completeness theorem. The chapters on the various logics are largely self-contained so that the book can be used as a reference. An appendix summarizes the formal semantics and axiomatizations of the logics. The view that unifies the exposition is that propositional logics comprise a spectrum. As the aspect of propositions under consideration varies, the logic varies. Each logic is shown to fall naturally within a general framework for semantics. A theory of translations between logics is presented that allows for further comparisons, and necessary conditions are given for a translation to preserve meaning. For this third edition the material has been re-organized to make the text easier to study, and a new section on paraconsistent logics with simple semantics has been added which challenges standard views on the nature of consequence relations. The text includes worked examples and hundreds of exercises, from routine to open problems, making the book with its clear and careful exposition ideal for courses or individual study.
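The book's unifying view, that as the aspect of propositions under consideration varies the logic varies, can be hinted at with a toy comparison of our own (not the book's formalism): classical two-valued evaluation next to a strong-Kleene three-valued one, where the same schema behaves differently. The names `classical_tautology` and `kleene_implies` are illustrative assumptions.

```python
# Hedged sketch: the same formula p -> p under two different semantics.
from itertools import product

def classical_tautology(f, variables):
    """True iff f comes out true under every two-valued assignment."""
    return all(f(dict(zip(variables, vals)))
               for vals in product([False, True], repeat=len(variables)))

# Classically, p -> p (as material implication) is a tautology.
print(classical_tautology(lambda v: (not v["p"]) or v["p"], ["p"]))  # True

def kleene_implies(a, b):
    """Strong Kleene implication over the values {0, 0.5, 1}."""
    return max(1 - a, b)

# Under strong Kleene semantics p -> p no longer always takes the
# value 1: at p = 0.5 it takes 0.5.
print(min(kleene_implies(a, a) for a in (0, 0.5, 1)))  # 0.5
```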
Labelled deduction is an approach to providing frameworks for presenting and using different logics in a uniform and natural way, by enriching the language of a logic with additional information of a semantic or proof-theoretical nature. Labelled deduction systems often possess attractive properties, such as modularity in the way that families of related logics are presented, parameterised proofs of metatheoretic properties, and ease of mechanisability. It is thus not surprising that labelled deduction has been applied to problems in computer science, AI, mathematical logic, cognitive science, philosophy and computational linguistics - for example, formalizing and reasoning about dynamic 'state-oriented' properties such as knowledge, belief, time, space, and resources.
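The idea of enriching formulas with semantic labels can be sketched with a toy rule of our own devising (not a system from the book): formulas carry world labels, and a Box-elimination rule consults both labelled formulas and label relations. The representation and the name `box_elim` are illustrative assumptions.

```python
# Hedged sketch of a labelled rule for a modal logic:
#   from  w : Box(phi)  and the label assertion  R(w, v),
#   conclude  v : phi.
# Labelled formulas are pairs (label, formula); Box(phi) is ("Box", phi).

def box_elim(labelled, rel):
    """One pass of the rule: w:Box(phi), R(w,v)  =>  v:phi."""
    out = set(labelled)
    for (w, f) in labelled:
        if isinstance(f, tuple) and f[0] == "Box":
            for (w2, v) in rel:
                if w2 == w:
                    out.add((v, f[1]))
    return out

premises = {("w", ("Box", "p"))}
relations = {("w", "v"), ("v", "u")}
print(sorted(box_elim(premises, relations)))
```

Note how the rule mentions the label relation explicitly; that is the sense in which the semantics is imported into the proof system.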
Doing Worlds with Words throws light on the problem of meaning as the meeting point of linguistics, logic and philosophy, and critically assesses the possibilities and limitations of elucidating the nature of meaning by means of formal logic, model theory and model-theoretical semantics. The main thrust of the book is to show that it is misguided to understand model theory metaphysically and so to try to base formal semantics on something like formal metaphysics; rather, the book states that model theory and similar tools of the analysis of language should be understood as capturing the semantically relevant, especially inferential, structure of language. From this vantage point, the reader gains a new light on many of the traditional concepts and problems of logic and philosophy of language, such as meaning, reference, truth and the nature of formal logic.
Hilbert's Program was founded on a concern for the phenomenon of paradox in mathematics. To Hilbert, the paradoxes, which are at once both absurd and irresistible, revealed a deep philosophical truth: namely, that there is a discrepancy between the laws according to which the mind of homo mathematicus works, and the laws governing objective mathematical fact. Mathematical epistemology is, therefore, to be seen as a struggle between a mind that naturally works in one way and a reality that works in another. Knowledge occurs when the two cooperate. Conceived in this way, there are two basic alternatives for mathematical epistemology: a skeptical position which maintains either that mind and reality seldom or never come to agreement, or that we have no very reliable way of telling when they do; and a non-skeptical position which holds that there is significant agreement between mind and reality, and that their potential discrepancies can be detected, avoided, and thus kept in check. Of these two, Hilbert clearly embraced the latter, and proposed a program designed to vindicate the epistemological riches represented by our natural, if non-literal, ways of thinking. Brouwer, on the other hand, opted for a position closer (in Hilbert's opinion) to that of the skeptic. Having decided that epistemological purity could come only through sacrifice, he turned his back on his classical heritage to accept a higher calling.
In this book the authors present new results on interpolation for nonmonotonic logics, abstract (function) independence, the Talmudic Kal Vachomer rule, and an equational solution of contrary-to-duty obligations. The chapter on formal construction is the conceptual core of the book, where the authors combine the ideas of several types of nonmonotonic logics and their analysis of 'natural' concepts into a formal logic, a special preferential construction that combines formal clarity with the intuitive advantages of Reiter defaults, defeasible inheritance, theory revision, and epistemic considerations. It is suitable for researchers in the area of computer science and mathematical logic.