This collection presents the first sustained examination of the nature and status of the idea of principles in early modern thought. Principles are almost ubiquitous in the seventeenth and eighteenth centuries: the term appears in famous book titles, such as Newton's Principia; the notion plays a central role in the thought of many leading philosophers, as in Leibniz's Principle of Sufficient Reason; and many of the great discoveries of the period, such as the Law of Gravitational Attraction, were described as principles. Ranging from mathematics and law to chemistry, from natural and moral philosophy to natural theology, and covering some of the leading thinkers of the period, this volume presents ten compelling new essays that illustrate the centrality and importance of the idea of principles in early modern thought. It contains chapters by leading scholars in the field, including the Leibniz scholar Daniel Garber and the historian of chemistry William R. Newman, as well as exciting, emerging scholars, such as the Newton scholar Kirsten Walsh and a leading expert on experimental philosophy, Alberto Vanzo. The Idea of Principles in Early Modern Thought: Interdisciplinary Perspectives charts the terrain of one of the period's central concepts for the first time, and opens up new lines for further research.
The founder of both American pragmatism and semiotics, Charles Sanders Peirce (1839-1914) is widely regarded as an enormously important and pioneering theorist. In this book, scholars from around the world examine the nature and significance of Peirce's work on perception, iconicity, and diagrammatic thinking. Abjuring any strict dichotomy between presentational and representational mental activity, Peirce's theories transform the Aristotelian, Humean, and Kantian paradigms that continue to hold sway today and, in so doing, forge a new path for understanding the centrality of visual thinking in science, education, art, and communication. The essays in this collection cover a wide range of issues related to Peirce's theories, including the perception of generality; the legacy of ideas being copies of impressions; imagination and its contribution to knowledge; logical graphs, diagrams, and the question of whether their iconicity distinguishes them from other sorts of symbolic notation; how images and diagrams contribute to scientific discovery and make it possible to perceive formal relations; and the importance and danger of using diagrams to convey scientific ideas. This book is a key resource for scholars interested in Peirce's philosophy and its relation to contemporary issues in mathematics, philosophy of mind, philosophy of perception, semiotics, logic, visual thinking, and cognitive science.
This publication introduces how two different countries promote high-quality learning with technology in very different educational systems. The book opens with inspiring scenarios of how new technological tools and services can be used to promote students' learning in schools and higher education, enhance collaboration in educational communities and support teachers' professional development. The publication focuses on three major themes: students as knowledge and art creators in playful learning systems; personalized learning supported by mobiles and intelligent tutoring systems with games and new web-based tools identifying learning difficulties; and technology in digitalized learning environments. The book is based on systematic research work in universities.
We are happy to present the first volume of the Handbook of Defeasible Reasoning and Uncertainty Management Systems. Uncertainty pervades the real world and must therefore be addressed by every system that attempts to represent reality. The representation of uncertainty is a major concern of philosophers, logicians, artificial intelligence researchers and computer scientists, psychologists, statisticians, economists and engineers. The present Handbook volumes provide frontline coverage of this area. This Handbook was produced in the style of previous handbook series like the Handbook of Philosophical Logic, the Handbook of Logic in Computer Science, and the Handbook of Logic in Artificial Intelligence and Logic Programming, and can be seen as a companion to them in covering the wide applications of logic and reasoning. We hope it will answer the need for adequate representations of uncertainty. This Handbook series grew out of the ESPRIT Basic Research Project DRUMS II, whose acronym is made out of the Handbook series title. The project was financially supported by the European Union and regrouped 20 major European research teams working in the general domain of uncertainty. As a fringe benefit of the DRUMS project, the research community was able to create this Handbook series, relying on the DRUMS participants as the core of the authors for the Handbook, together with external international experts.
Were the most serious philosophers of the millennium 200 A.D. to 1200 A.D. just confused mystics? This book shows otherwise. John Martin rehabilitates Neoplatonism, founded by Plotinus and brought into Christianity by St. Augustine. The Neoplatonists devise ranking predicates like good, excellent, perfect to divide the Chain of Being, and use the predicate intensifier hyper so that it becomes a valid logical argument to reason from God is not (merely) good to God is hyper-good. In this way the relational facts underlying reality find expression in Aristotle's subject-predicate statements, and the Platonic tradition proves able to subsume Aristotle's logic while at the same time rejecting his metaphysics. In the Middle Ages, when Aristotle's larger philosophy was recovered and joined again to the Neoplatonic tradition, which was never lost, Neoplatonic logic lived alongside Aristotle's metaphysics in a sometimes confusing and unsettled way. Showing Neoplatonism to be significantly richer in its logical and philosophical ideas than it is usually given credit for, this book will be of interest not just to historians of logic, but to philosophers, logicians, linguists, and theologians.
This book offers a comprehensive account of logic that addresses fundamental issues concerning the nature and foundations of the discipline. The authors claim that these foundations can be established not only without the need for strong metaphysical assumptions, but also without hypostasizing logical forms as specific entities. They present a systematic argument that the primary subject matter of logic is our linguistic interaction rather than our private reasoning, and that it is thus misleading to see logic as revealing "the laws of thought". In this sense, fundamental logical laws are implicit to our "language games" and are thus more similar to social norms than to the laws of nature. Peregrin and Svoboda also show that logical theories, despite the fact that they rely on rules implicit to our actual linguistic practice, firm up these rules and make them explicit. By carefully scrutinizing the project of logical analysis, the authors demonstrate that logical rules can be best seen as products of the so-called reflective equilibrium. They suggest that we can profit from viewing languages as "inferential landscapes" and logicians as "geographers" who map them and try to pave safe routes through them. This book is an essential resource for scholars and researchers engaged with the foundations of logical theories and the philosophy of language.
In this book, Lorraine Besser-Jones develops a eudaimonistic virtue ethics based on a psychological account of human nature. While her project maintains the fundamental features of the eudaimonistic virtue ethical framework-virtue, character, and well-being-she constructs these concepts from an empirical basis, drawing support from the psychological fields of self-determination and self-regulation theory. Besser-Jones's resulting account of "eudaimonic ethics" presents a compelling normative theory and offers insight into what is involved in being a virtuous person and "acting well." This original contribution to contemporary ethics and moral psychology puts forward a provocative hypothesis of what an empirically-based moral theory would look like.
Moral Inferences is the first volume to thoroughly explore the relationship between morality and reasoning. Drawing on the expertise of world-leading researchers, this text provides ground-breaking insight into the importance of studying these distinct fields together. The volume integrates the latest research into morality with current theories in reasoning to consider the prominent role reasoning plays in everyday moral judgements. Featuring contributions on topics such as moral arguments, causal models, and dual process theory, this text provides new perspectives on previous studies, encouraging researchers to adopt a more integrated approach in the future. Moral Inferences will be essential reading for students and researchers of moral psychology, specifically those interested in reasoning, rationality and decision-making.
Epistemic Principles: A Primer of the Theory of Knowledge presents a compact account of the basic principles of the theory of knowledge. In doing this, Nicholas Rescher aims to fill the current gap in contemporary philosophical theory of knowledge with a comprehensive analysis of epistemological fundamentals. The book is not a mere inventory of such rules and principles, but rather interweaves them into a continuous exposition of basic issues. Written at a user-friendly and accessible level, Epistemic Principles is an essential addition for both advanced undergraduate and graduate courses in epistemology.
It is with great pleasure that we are presenting to the community the second edition of this extraordinary handbook. It has been over 15 years since the publication of the first edition and there have been great changes in the landscape of philosophical logic since then. The first edition has proved invaluable to generations of students and researchers in formal philosophy and language, as well as to consumers of logic in many applied areas. The main logic article in the Encyclopaedia Britannica 1999 described the first edition as 'the best starting point for exploring any of the topics in logic'. We are confident that the second edition will prove to be just as good. The first edition was the second handbook published for the logic community. It followed the North Holland one-volume Handbook of Mathematical Logic, published in 1977, edited by the late Jon Barwise. The four-volume Handbook of Philosophical Logic, published 1983-1989, came at a fortunate temporal junction in the evolution of logic. This was the time when logic was gaining ground in computer science and artificial intelligence circles. These areas were under increasing commercial pressure to provide devices which help and/or replace the human in his daily activity. This pressure required the use of logic in the modelling of human activity and organisation on the one hand, and in providing the theoretical basis for computer program constructs on the other.
First Published in 2004. Routledge is an imprint of Taylor & Francis, an informa company.
This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion with the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes' set-theoretical predicates and the "da Costa and Chuaqui" approach. This book engages with important discussions of the nature of scientific theories and will be a useful resource for researchers and upper-level students working in philosophy of science.
Originally published in 1973, this book shows that methods developed for the semantics of systems of formal logic can be successfully applied to problems about the semantics of natural languages; and, moreover, that such methods can take account of features of natural language which have often been thought incapable of formal treatment, such as vagueness, context dependence and metaphorical meaning. Parts 1 and 2 set out a class of formal languages and their semantics. Parts 3 and 4 show that these formal languages are rich enough to be used in the precise description of natural languages. Appendices describe some of the concepts discussed in the text.
From Concept to Objectivity uncovers the nature and authority of conceptual determination by critically thinking through neglected arguments in Hegel's Science of Logic pivotal for understanding reason and its role in philosophy. Winfield clarifies the logical problems of presuppositionlessness and determinacy that prepare the way for conceiving the concept, examines how universality, particularity, and individuality are determined, investigates how judgment and syllogism are exhaustively differentiated, and, on that basis, explores how objectivity can be categorized without casting thought in irrevocable opposition to reality. Winfield's book will be of interest to readers of Hegel as well as anyone wondering how thought can be objective.
This book advances a reading of Wittgenstein's Tractatus that moves beyond the main interpretative options of the New Wittgenstein debate. It covers Wittgenstein's approach to language and logic, as well as other areas unduly neglected in the literature, such as his treatment of metaphysics, the natural sciences and value. Tejedor re-contextualises Wittgenstein's thinking in these areas, plotting its evolution in his diaries, correspondence and pre-Tractatus texts, and developing a fuller picture of its intellectual background. This broadening of the angle of view is central to the interpretative strategy of her book: only by looking at the Tractatus in this richer light can we address the fundamental questions posed by the New Wittgenstein debate - questions concerning the method of the Tractatus, its approach to nonsense and the continuity in Wittgenstein's philosophy. Wittgenstein's early work remains insightful, thought-inspiring and relevant to contemporary philosophy of language and science, metaphysics and ethics. Tejedor's ground-breaking work ultimately conveys a surprisingly positive message concerning the power for ethical transformation that philosophy can have, when it is understood as an activity aimed at increasing conceptual clarification and awareness.
Bounded Thinking offers a new account of the virtues of limitation management: intellectual virtues of adapting to the fact that we cannot solve many problems that we can easily describe. Adam Morton argues that we do give one another guidance on managing our limitations, but that this has to be in terms of virtues rather than rules, and in terms of success (knowledge and accomplishment) rather than rationality. He establishes a taxonomy of intellectual virtues, which includes 'paradoxical virtues' that sound like vices, such as the virtue of ignoring evidence and the virtue of not thinking too hard. There are also virtues of not planning ahead, in that some forms of such planning require present knowledge of one's future knowledge that is arguably impossible. A person's best response to many problems depends not on the most rationally promising solution but on the most likely route to success, given the profile of intellectual virtues that the person has and lacks. Morton illustrates his argument with discussions of several paradoxes and conundra. He closes the book with a discussion of intelligence and rationality, and argues that both have very limited usefulness in the evaluation of who will make progress on which problems.
Originally published in 1966, On the Syllogism and Other Logical Writings assembles for the first time the five celebrated memoirs of Augustus De Morgan on the syllogism. These are collected together with the more condensed accounts of his researches given in his Syllabus of a Proposed System of Logic and an article on Logic contributed to the English Cyclopaedia. De Morgan was among the most distinguished of nineteenth-century British mathematicians but is chiefly remembered today as one of the founders of modern mathematical logic. His writings on this subject have been little read, however, since, apart from his Formal Logic, they lie buried for the most part in inaccessible periodicals. De Morgan's own later amendments are inserted in the text, and the editorial introduction gives a summary of the whole and traces in some detail the course of the once-famous feud with Sir William Hamilton of Edinburgh.
First published in 1989. Routledge is an imprint of Taylor & Francis, an informa company.
This volume identifies and analyses English words and expressions that are crucial for an adequate reconstruction of argumentative discourse. It provides the analyst of argumentative discussions and texts with a systematic set of instruments for giving a well-founded analysis which results in an analytic overview of the elements that are relevant for the evaluation of the argumentation. In the book a systematic connection is made between linguistic insights into the characteristics of argumentative discourse and insights from argumentation theory into the resolution of differences of opinion by means of argumentation.
The twentieth century witnessed the birth of analytic philosophy. This volume covers some of its key movements and philosophers, including Frege and Wittgenstein's Tractatus.