Welcome to Loot.co.za!
Books > Humanities > Philosophy > Topics in philosophy > Logic
Scare Tactics, the first book on the subject, provides a theory of the structure of reasoning used in fear and threat appeal argumentation. Such arguments come under the heading of the argumentum ad baculum, the 'argument to the stick/club', traditionally treated as a fallacy in the logic textbooks. The new dialectical theory is based on case studies of many interesting examples of the use of these arguments in advertising, public relations, politics, international negotiations, and everyday argumentation on all kinds of subjects. Many of these arguments are amusing, once you see the clever tactic used; others are scary. Some of the arguments appear to be quite reasonable, while others are highly suspicious, or even outrageously fraudulent. In addition to the examples taken from logic textbooks, other cases treated come from a variety of sources, including political debates, legal arguments, and arguments from media sources, like magazine articles and television ads. The purpose of this book is to explain how such arguments work as devices of persuasion, and to develop a method for analyzing and evaluating their reasonable and fallacious uses in particular cases. The book shows how such arguments share a common structure, revealing several distinctive forms of argument nested within each other. Based on its account of this cognitive structure, the new dialectical theory presents methods for identifying, analyzing, and evaluating these arguments, as they are used in specific cases. The book is a scholarly contribution to argumentation theory. It is written in an accessible style, and uses many colorful and provocative examples of fear and threat appeal arguments that are suitable for classroom discussions. The matters treated will be of interest to professionals and students in law, critical thinking, advertising, speech communication, informal logic, cognitive science, rhetoric, and media studies.
Gottlob Frege (1848-1925) is considered the father of modern logic and one of the founding figures of analytic philosophy. He was first and foremost a mathematician, but his major works also made important contributions to the philosophy of language. Frege's writings are difficult and deal with technical, abstract concepts. The Routledge Philosophy Guidebook to Frege On Sense and Reference helps the student to get to grips with Frege's thought, and introduces and assesses:
Ideal for those coming to Frege for the first time, and containing fresh insights for anyone interested in his philosophy, this Guidebook is essential reading for all students of philosophy of language, philosophical logic and the history of analytic philosophy.
Alonzo Church was undeniably one of the intellectual giants of the twentieth century. These articles are dedicated to his memory and illustrate the tremendous importance his ideas have had in logic, mathematics, computer science and philosophy. Discussions of some of these various contributions have appeared in The Bulletin of Symbolic Logic, and the interested reader is invited to seek details there. Here we just try to give some general sense of the scope, depth, and value of his work. Church is perhaps best known for the theorem, appropriately called "Church's Theorem", that there is no decision procedure for the logical validity of formulas of first-order logic. A decision procedure for that part of logic would have come near to fulfilling Leibniz's dream of a calculus that could be mechanically used to settle logical disputes. It was not to be. It could not be. What Church proved precisely is that there is no lambda-definable function that can in every case provide the right answer, 'yes' or 'no', to the question of whether or not any arbitrarily given formula is valid.
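In modern notation, the result described above can be sketched as follows; the symbols are illustrative, not drawn from the volume itself:

```latex
% Church's Theorem (sketch): validity in first-order logic is undecidable.
% There is no effectively computable (equivalently, lambda-definable) decision
% function D on the set Fml of first-order formulas:
\[
  \nexists\, D \ \text{computable},\;
  D : \mathrm{Fml} \to \{\mathrm{yes}, \mathrm{no}\},\;
  \text{such that}\;
  \forall \varphi \in \mathrm{Fml}:\;
  D(\varphi) = \mathrm{yes} \iff\ {\models \varphi}
\]
% where \models \varphi means that \varphi is logically valid.
```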
This volume contains a collection of research papers centered around the concept of quantifier. Recently this concept has become the central point of research in logic. It is one of the important logical concepts whose exact domain and applications have so far been insufficiently explored, especially in the area of inferential and semantic properties of languages. It should thus remain a central point of research in the future. Moreover, during the last twenty years generalized quantifiers and logical techniques based on them have proved their utility in various applications. The example of natural language semantics has been particularly striking. For a long time it has been believed that elementary logic, also called first-order logic, was an adequate theory of the logical forms of natural language sentences. Recently it has been accepted that the semantics of many natural language constructions cannot be properly represented in elementary logic. It has turned out, however, that they can be described by means of generalized quantifiers. As far as computational applications of logic are concerned, particularly interesting are semantics restricted to finite models. Under this restriction elementary logic loses several of its advantages, such as axiomatizability and compactness. And for various purposes we can use equally well some semantically richer languages, of which generalized quantifiers offer the most universal methods of describing extensions of elementary logic. Moreover, we can look at generalized quantifiers as an explication of some specific mathematical concepts, e.g.
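A standard illustration of the point about natural language (not taken from the volume itself) is the quantifier "most", which cannot be expressed in elementary logic but has a simple definition as a generalized quantifier:

```latex
% "Most A's are B's" as a generalized quantifier over a model with universe M,
% where A, B \subseteq M are the extensions of the two predicates:
\[
  \mathrm{Most}\,x\,\bigl(A(x),\, B(x)\bigr)
  \quad\text{is true}\quad
  \iff\quad |A \cap B| > |A \setminus B|
\]
% The truth condition is a comparison of cardinalities, which no single
% first-order sentence in the vocabulary \{A, B\} can express.
```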
"Logic," one of the central words in Western intellectual history, compre hends in its meaning such diverse things as the Aristotelian syllogistic, the scholastic art of disputation, the transcendental logic of the Kantian critique, the dialectical logic of Hegel, and the mathematical logic of the Principia Mathematica of Whitehead and Russell. The term "Formal Logic," following Kant is generally used to distinguish formal logical reasonings, precisely as formal, from the remaining universal truths based on reason. (Cf. SCHOLZ, 1931). A text-book example of a formal-logical inference which from "Some men are philosophers" and "All philosophers are wise" concludes that "Some men are wise" is called formal, because the validity of this inference depends only on the form ofthe given sentences -in particular it does not depend on the truth or falsity of these sentences. (On the dependence of logic on natural language, English, for example, compare Section 1 and 8). The form of a sentence like "Some men are philosophers," is that which remains preserved when the given predicates, here "men" and "philosophers" are replaced by arbitrary ones. The form itself can thus be represented by replacing the given predicates by variables. Variables are signs devoid of meaning, which may serve merely to indicate the place where meaningful constants (here the predicates) are to be inserted. As variables we shall use - as did Aristotle - letters, say P, Q and R, as variables for predicates."
All humans can interpret sentences of their native language quickly and without effort. Working from the perspective of generative grammar, the contributors investigate three mental mechanisms widely assumed to underlie this ability: compositional semantics, implicature computation and presupposition computation. This volume brings together experts from semantics and pragmatics to advance the study of the interconnections between these three mechanisms. The contributions develop new insights into important empirical phenomena, for example approximation, free choice, accommodation, and exhaustivity effects.
This collection of essays is dedicated to 'Joe' Karel Lambert. The contributors are all personally attached to Joe in some way or other, but they are definitely not the only ones. Whatever excuses there are - there are some - the editors apologize to whomever they have neglected. But even so the collection displays how influential Karel Lambert has been, personally and through his teaching and his writings. The display is in alphabetical order - with one exception: Bas van Fraassen, being about the earliest student of Karel Lambert, opens the collection with some reminiscences. Naturally, one of the focal points of this volume is Lambert's logical thinking and (or: freed of) ontological thinking. Free logic is intimately connected with description theory. Bas van Fraassen gives a survey of the development of the area, and Charles Daniels points to difficulties with definite descriptions in modal contexts and stories. Peter Woodruff addresses the relation between free logic and supervaluation semantics, presenting a novel condition which recovers desirable metatheoretic properties for free logic under that semantics. Terence Parsons shows how free logic can be utilized in interpreting sentences as purporting to denote events (true ones succeed and false ones fail) and how this helps to understand natural language.
It is with great pleasure that we are presenting to the community the second edition of this extraordinary handbook. It has been over 15 years since the publication of the first edition and there have been great changes in the landscape of philosophical logic since then. The first edition has proved invaluable to generations of students and researchers in formal philosophy and language, as well as to consumers of logic in many applied areas. The main logic article in the Encyclopaedia Britannica 1999 described the first edition as 'the best starting point for exploring any of the topics in logic'. We are confident that the second edition will prove to be just as good. The first edition was the second handbook published for the logic community. It followed the North Holland one-volume Handbook of Mathematical Logic, published in 1977, edited by the late Jon Barwise. The four-volume Handbook of Philosophical Logic, published 1983-1989, came at a fortunate temporal junction in the evolution of logic. This was the time when logic was gaining ground in computer science and artificial intelligence circles. These areas were under increasing commercial pressure to provide devices which help and/or replace the human in his daily activity. This pressure required the use of logic in the modelling of human activity and organisation on the one hand, and to provide the theoretical basis for the computer program constructs on the other.
Quantifiers: Logics, Models and Computation is the first concentrated effort to give a systematic presentation of the main research results on the subject since the modern concept was formulated in the late '50s and early '60s. The majority of the papers are in the nature of a handbook. All of them are self-contained, at various levels of difficulty. The Introduction surveys the main ideas and problems encountered in the logical investigation of quantifiers. The Prologue, written by Per Lindstrom, presents the early history of the concept of generalised quantifiers. The volume then continues with a series of papers surveying various research areas, particularly those that are of current interest. Together they provide introductions to the subject from the points of view of mathematics, linguistics, and theoretical computer science. The present volume has been prepared in parallel with Quantifiers: Logics, Models and Computation, Volume Two: Contributions, which contains a collection of research papers on the subject in areas that are too fresh to be summarised. The two volumes are complementary. For logicians, mathematicians, philosophers, linguists and computer scientists. Suitable as a text for advanced undergraduate and graduate specialised courses in logic.
Philosophy, Psychology, and Psychologism presents a remarkable diversity of contemporary opinions on the prospects of addressing philosophical topics from a psychological perspective. It considers the history and philosophical merits of psychologism, and looks systematically at psychologism in phenomenology, cognitive science, epistemology, logic, philosophy of language, philosophical semantics, and artificial intelligence. It juxtaposes many different philosophical standpoints, each supported by rigorous philosophical argument.
We who live in this post-modern late twentieth century culture are still children of dualism. For a variety of rather complex reasons we continue to split apart and treat as radical opposites body and spirit, medicine and religion, sacred and secular, private and public, love and justice, men and women. Though this is still our strong tendency, we are beginning to discover both the futility and the harm of such dualistic splitting. Peoples of many ancient cultures might smile at the belatedness of our discovery concerning the commonalities of medicine and religion. A cursory glance back at ancient Egypt, Samaria, Babylonia, Persia, Greece, and Rome would disclose a common thread - the close union of religion and medicine. Both were centrally concerned with healing, health, and wholeness. The person was understood as a unity of body, mind, and spirit. The priest and the physician frequently were combined in the same individual. One of the important contributions of this significant volume of essays is the sustained attack upon dualism. From a variety of vantage points, virtually all of the authors unmask the varied manifestations of dualism in religion and medicine, urging a more holistic approach. Since the editor has provided an excellent summary of each article, I shall not attempt to comment on specific contributions. Rather, I wish to highlight three broad themes which I find notable for theological ethics.
Mathematics depends on proofs, and proofs must begin somewhere, from some fundamental assumptions. For nearly a century, the axioms of set theory have played this role, so the question of how these axioms are properly judged takes on a central importance. Approaching the question from a broadly naturalistic or second-philosophical point of view, Defending the Axioms isolates the appropriate methods for such evaluations and investigates the ontological and epistemological backdrop that makes them appropriate. In the end, a new account of the objectivity of mathematics emerges, one refreshingly free of metaphysical commitments.
This book presents a collection of contributions ranging from the foundations of paraconsistent logics to applied paraconsistency. All of them are dedicated to Jair Minoro Abe, on the occasion of his sixtieth birthday. He is one of the experts in paraconsistent engineering, and developed the so-called annotated logics. The book includes important contributions on foundations and applications of paraconsistent logics in connection with engineering, mathematical logic, philosophical logic, computer science, physics, economics, and biology. It will be of interest to students and researchers working on engineering and logic.
Modal Logic is a branch of logic with applications in many related disciplines such as computer science, philosophy, linguistics and artificial intelligence. Over the last twenty years, in all of these neighbouring fields, modal systems have been developed that we call multi-dimensional. (Our definition of multi-dimensionality in modal logic is a technical one: we call a modal formalism multi-dimensional if, in its intended semantics, the universe of a model consists of states that are tuples over some more basic set.) This book treats such multi-dimensional modal logics in a uniform way, linking their mathematical theory to the research tradition in algebraic logic. We will define and discuss a number of systems in detail, focusing on such aspects as expressiveness, definability, axiomatics, decidability and interpolation. Although the book will be mathematical in spirit, we take care to give motivations from the disciplines mentioned earlier on.
'This is an outline of a coherence theory of law. Its basic ideas are: reasonable support and weighing of reasons. All the rest is commentary.' These words at the beginning of the preface of this book perfectly indicate what On Law and Reason is about. It is a theory about the nature of the law which emphasises the role of reason in the law and which refuses to limit the role of reason to the application of deductive logic. In 1989, when the first edition of On Law and Reason appeared, this book was groundbreaking for several reasons. It provided a rationalistic theory of the law in the language of analytic philosophy and based on a thorough understanding of the results, including technical ones, of analytic philosophy. That was not an obvious combination at the time of the book's first appearance, and still is not. The result is an analytical rigor that is usually associated with positivist theories of the law, combined with a philosophical position that is not natural law in a strict sense, but which shares with it the emphasis on the role of reason in determining what the law is. If only for this rare combination, On Law and Reason still deserves careful study. On Law and Reason also foreshadowed and influenced a development in the field of legal logic that would take place in the nineties of the 20th century, namely the development of non-monotonic ('defeasible') logics for the analysis of legal reasoning. In the new Introduction to this second edition, this aspect is explored in some more detail.
This anthology of original essays has been nearly two and one-half years in the making, and reflects the generous effort of many persons. To begin with, we thank the contributors to the volume, who not only cooperated with regard to their own works, but who also provided valuable advice concerning the overall volume. One of the contributors was outstanding in his assistance and warrants special mention: we thank Professor Michel Meyer for his encouragement, counsel, and dedication in seeing this project to completion. We would also like to thank Professor Jaakko Hintikka for his encouragement and Mrs. Kuipers of Reidel for her patience and understanding along the way. A project such as this could never have been completed without the unique assistance of members of the Department of Communication, Ohio State University: Ms. Kimberly Pasi and Mr. Charles Mawhirter. Also, special thanks are due to our graduate research assistant Ms. Susan Jasko, for her proofreading and bibliographic work. The pressures of developing a Festschrift are considerable and could not have been met without the cooperation and enthusiasm of Mrs. Perelman, especially in allowing us to publish Professor Perelman's address to Ohio State University as our introduction.
The analytic/synthetic distinction looks simple. It is a distinction between two different kinds of sentence. Synthetic sentences are true in part because of the way the world is, and in part because of what they mean. Analytic sentences - like 'all bachelors are unmarried' and 'triangles have three sides' - are different. They are true in virtue of meaning, so no matter what the world is like, as long as the sentence means what it does, it will be true.
The modern discussion on the concept of truthlikeness was started in 1960. In his influential Word and Object, W. V. O. Quine argued that Charles Peirce's definition of truth as the limit of inquiry is faulty for the reason that the notion 'nearer than' is only "defined for numbers and not for theories." In his contribution to the 1960 International Congress for Logic, Methodology, and Philosophy of Science at Stanford, Karl Popper defended the opposite view by defining a comparative notion of verisimilitude for theories. The concept of verisimilitude was originally introduced by the Ancient sceptics to moderate their radical thesis of the inaccessibility of truth. But soon verisimilitudo, indicating likeness to the truth, was confused with probabilitas, which expresses an opinionative attitude weaker than full certainty. The idea of truthlikeness fell into disrepute also as a result of the careless, often confused and metaphysically loaded way in which many philosophers used - and still use - such concepts as 'degree of truth', 'approximate truth', 'partial truth', and 'approach to the truth'. Popper's great achievement was his insight that the criticism against truthlikeness - by those who urge that it is meaningless to speak about 'closeness to truth' - is based more on prejudice than on argument.
"The questions this book deals with deserve close investigation. The book offers a rare opportunity to think about the limitations of our standard logic and semantic analysis from an overall perspective." - Notre Dame Philosophical Reviews In this challenging and provocative book, Dale Jacquette argues that contemporary philosophy labours under a number of historically inherited delusions about the nature of logic and the philosophical significance of certain formal properties of specific types of logical constructions. Exposing some of the key misconceptions about formal symbolic logic and its relation to thought, language and the world, Jacquette clears the ground of some very well-entrenched philosophical doctrines about the nature of logic, including some of the most fundamental seldom-questioned parts of elementary propositional and predicate-quantificational logic. Having presented difficulties for conventional ways of thinking about truth functionality, the metaphysics of reference and predication, the role of a concept of truth in a theory of meaning, among others, Jacquette proceeds to reshape the network of ideas about traditional logic that philosophy has acquired along with modern logic itself. In so doing Jacquette is able to offer a new perspective on a number of existing problems in logic and philosophy of logic.
We all engage in the process of reasoning, but we don't always pay attention to whether we are doing it well. This book offers the opportunity to practise reasoning in a clear-headed and critical way, with the aims of developing an awareness of the importance of reasoning well and of improving the reader's skill in analyzing and evaluating arguments. In this third edition, Anne Thomson has updated and revised the book to include fresh and topical examples which will guide students through the processes of critical reasoning in a clear and engaging way. In addition, two new chapters on evaluating the credibility of evidence and decision making and dilemmas will fully equip students to reason well. By the end of the book students should be able to:
Aristotle's "On Interpretation", a centrepiece of his logic, studies the relationship between conflicting pairs of statements. The first eight chapters, studied here, explain what statements are; they start from their basic components, the words, and work up to the character of opposed affirmations and negations. The 15,000 pages of the ancient Greek commentators on Aristotle, written mainly between 200 and 500 AD, constitute the largest corpus of extant Greek philosophical writing not translated into English or other European languages. This new series of translations, planned in 60 volumes, fills an important gap in the history of European thought.
Doing Worlds with Words throws light on the problem of meaning as the meeting point of linguistics, logic and philosophy, and critically assesses the possibilities and limitations of elucidating the nature of meaning by means of formal logic, model theory and model-theoretical semantics. The main thrust of the book is to show that it is misguided to understand model theory metaphysically, and so to try to base formal semantics on something like formal metaphysics; rather, the book argues that model theory and similar tools of the analysis of language should be understood as capturing the semantically relevant, especially inferential, structure of language. From this vantage point, the reader gains a new perspective on many of the traditional concepts and problems of logic and philosophy of language, such as meaning, reference, truth and the nature of formal logic.
In this book the authors present new results on interpolation for nonmonotonic logics, abstract (function) independence, the Talmudic Kal Vachomer rule, and an equational solution of contrary-to-duty obligations. The chapter on formal construction is the conceptual core of the book, where the authors combine the ideas of several types of nonmonotonic logics and their analysis of 'natural' concepts into a formal logic, a special preferential construction that combines formal clarity with the intuitive advantages of Reiter defaults, defeasible inheritance, theory revision, and epistemic considerations. It is suitable for researchers in the area of computer science and mathematical logic.
Labelled deduction is an approach to providing frameworks for presenting and using different logics in a uniform and natural way, by enriching the language of a logic with additional information of a semantic or proof-theoretical nature. Labelled deduction systems often possess attractive properties, such as modularity in the way that families of related logics are presented, parameterised proofs of metatheoretic properties, and ease of mechanisability. It is thus not surprising that labelled deduction has been applied to problems in computer science, AI, mathematical logic, cognitive science, philosophy and computational linguistics - for example, formalizing and reasoning about dynamic 'state-oriented' properties such as knowledge, belief, time, space, and resources.