First published in 1985, Practical Inferences describes how practical inferences are used. Starting with relatively simple inference patterns exhibited in everyday prudential decisions, the author extends a basic structural framework to the more complex inferences used in assessing probabilities, and finally to moral inferences. In this way what have been regarded as disparate activities are shown to exhibit fundamental similarities. The author argues that at all levels of decision-making the practical inferences used contain at least one premise expressing the desires or preferences of the agent. This is in opposition to the dominant view in Western philosophy that desires must be regulated or evaluated by means of principles of conduct discovered by rational procedures. By examining the premises implied by holders of this view, the author shows that they are inadequate bases for justifying practical decisions. This book will be of interest to students of philosophy, logic and mathematics.
This book is an updated and revised edition of Fundamentals of Legal Argumentation published in 1999. It discusses new developments that have taken place in the past 15 years in research of legal argumentation, legal justification and legal interpretation, as well as the implications of these new developments for the theory of legal argumentation. Almost every chapter has been revised and updated, and the chapters include discussions of recent studies, major additions on topical issues, new perspectives, and new developments in several theoretical areas. Examples of these additions are discussions of recent developments in such areas as Habermas' theory, MacCormick's theory, Alexy's theory, Artificial Intelligence and law, and the pragma-dialectical theory of legal argumentation. Furthermore, it provides an extensive and systematic overview of approaches and studies of legal argumentation in the context of legal justification in various legal systems and countries that have been important for the development of research on legal argumentation. The book contains a discussion of influential theories that conceive the law and legal justification as argumentative activity. From different disciplinary and theoretical angles it addresses such topics as the institutional characteristics of the law and the relation between general standards for moral discussions and legal standards such as the Rule of Law. It discusses patterns of legal justification in the context of different types of problems in the application of the law and it describes rules for rational legal discussions. The combination of the sound basis of the first edition and the discussions of new developments make this new edition an up-to-date and comprehensive survey of the various theoretical influences which have informed the study of legal argumentation. It discusses salient backgrounds to this field as well as major approaches and trends in the contemporary research.
It surveys the relevant theoretical factors both from various continental law traditions and common law countries.
Our preferences determine how we act and think, but exactly what the mechanics are and how they work is a central cause of concern in many disciplines. This book uses techniques from modern logics of information flow and action to develop a unified new theory of what preference is and how it changes. The theory emphasizes reasons for preference, as well as its entanglement with our beliefs. Moreover, the book provides dynamic logical systems which describe the explicit triggers driving preference change, including new information, suggestions, and commands. In sum, the book creates new bridges between many fields, from philosophy and computer science to economics, linguistics, and psychology. For the experienced scholar access to a large body of recent literature is provided and the novice gets a thorough introduction to the action and techniques of dynamic logic.
Walter Benjamin is one of the most important figures of modern culture. The authors focus within this book on Benjamin as a philosopher, or rather as a critic of modernism entangled in tradition (mainly Jewish), but also as a writer. Philosophical and philological readings are accompanied by essays presenting the complex biography of Benjamin and numerous, often unexpected, parallels which indicate traces of his reflections in works of other artists. In consequence, "The Arcades Project", which can be described as Benjamin's opus vitae, is not only a picturesque history of Parisian arcades of the mid-19th century. It is also a polyphonic text, composed of quotations, commentaries and footnotes, a discussion of the sense of history and the literary work of art that surprises with its meandering quality.
This book seeks to work out which commitments are minimally sufficient to obtain an ontology of the natural world that matches all of today's well-established physical theories. We propose an ontology of the natural world that is defined only by two axioms: (1) There are distance relations that individuate simple objects, namely matter points. (2) The matter points are permanent, with the distances between them changing. Everything else comes in as a means to represent the change in the distance relations in a manner that is both as simple and as informative as possible. The book works this minimalist ontology out in philosophical as well as mathematical terms and shows how one can understand classical mechanics, quantum field theory and relativistic physics on the basis of this ontology. Along the way, we seek to achieve four subsidiary aims: (a) to make a case for a holistic individuation of the basic objects (ontic structural realism); (b) to work out a new version of Humeanism, dubbed Super-Humeanism, that does without natural properties; (c) to set out an ontology of quantum physics that is an alternative to quantum state realism and that avoids any ontological dualism of particles and fields; (d) to vindicate a relationalist ontology based on point objects also in the domain of relativistic physics.
This book develops new techniques in formal epistemology and applies them to the challenge of Cartesian skepticism. It introduces two formats of epistemic evaluation that should be of interest to epistemologists and philosophers of science: the dual-component format, which evaluates a statement on the basis of its safety and informativeness, and the relative-divergence format, which evaluates a probabilistic model on the basis of its complexity and goodness of fit with data. Tomoji Shogenji shows that the former lends support to Cartesian skepticism, but the latter allows us to defeat Cartesian skepticism. Along the way, Shogenji addresses a number of related issues in epistemology and philosophy of science, including epistemic circularity, epistemic closure, and inductive skepticism.
Peirce's Speculative Grammar: Logic as Semiotics offers a comprehensive, philologically accurate, and exegetically ambitious developmental account of Peirce's theory of speculative grammar. The book traces the evolution of Peirce's grammatical writings from his early research on the classification of arguments in the 1860s up to the complex semiotic taxonomies elaborated in the first decade of the twentieth century. It will be of interest to academic specialists working on Peirce, the history of American philosophy and pragmatism, the philosophy of language, the history of logic, and semiotics.
Bayesian nets are widely used in artificial intelligence as a calculus for causal reasoning, enabling machines to make predictions, perform diagnoses, take decisions and even to discover causal relationships. But many philosophers have criticized and ultimately rejected the central assumption on which such work is based: the causal Markov Condition. So should Bayesian nets be abandoned? What explains their success in artificial intelligence? This book argues that the Causal Markov Condition holds as a default rule: it often holds but may need to be repealed in the face of counterexamples. Thus, Bayesian nets are the right tool to use by default, but naively applying them can lead to problems. The book develops a systematic account of causal reasoning and shows how Bayesian nets can be coherently employed to automate the reasoning processes of an artificial agent. The resulting framework for causal reasoning involves not only new algorithms, but also new conceptual foundations. Probability and causality are treated as mental notions - part of an agent's belief state. Yet probability and causality are also objective - different agents with the same background knowledge ought to adopt the same or similar probabilistic and causal beliefs. This book, aimed at researchers and graduate students in computer science, mathematics and philosophy, provides a general introduction to these philosophical views as well as an exposition of the computational techniques that they motivate.
This is the second of two volumes of essays in commemoration of Alan Turing, celebrating his intellectual legacy within the philosophy of mind and cognitive science. It focuses on the relationship between a scientific, computational image of the mind and a common-sense picture of the mind as an inner arena populated by concepts, beliefs, intentions, and qualia. Topics covered include the causal potency of folk-psychological states, the connectionist reconception of learning and concept formation, the understanding of the notion of computation itself, and the relation between philosophical and psychological theories of concepts.
This book is the first part of a comprehensive study of Wittgenstein's conception of language description. Describing language was no pastime occupation for the philosopher. It was hard work and it meant struggle. It made for a philosophy that required Wittgenstein's full attention and half his life. His approach had always been working on himself, on how he saw things. The central claim of this book is that nothing will come of our exegetical efforts to see what Wittgenstein's later philosophy amounts to if his work on describing language is not given the place and concern it deserves. The book shows what his philosophy might begin to look like in the light of critical questions around his interest to see the end of the day with descriptions, and these things only.
This book examines the progress to date in the many facets - conceptual, epistemological and methodological - of the field of legal semiotics. It reflects the fulfilment of the promise of legal semiotics when used to explore the law, its processes and interpretation. This study in Legal Semiotics brings together the theory, structure and practice of legal semiotics in an accessible style. The book introduces the concepts of legal semiotics and offers insight into contemporary and future directions which the semiotics of law is going to take. A theoretically and practically oriented synthesis of the historical, contemporary and most recent ideas pertaining to legal semiotics, the book will be of interest to scholars and researchers in law and social sciences, as well as those who are interested in the interdisciplinary dynamics of law and semiotics.
The papers in this volume address fundamental, and interrelated, philosophical issues concerning modality and identity, issues that have not only been pivotal to the development of analytic philosophy in the twentieth century, but remain a key focus of metaphysical debate in the twenty-first. How are we to understand the concepts of necessity and possibility? Is chance a basic ingredient of reality? How are we to make sense of claims about personal identity? Do numbers require distinctive identity criteria? Does the capacity to identify an object presuppose an ability to bring it under a sortal concept? Rather than presenting a single, partisan perspective, Identity and Modality enriches our understanding of identity and modality by bringing together papers written by leading researchers working in metaphysics, the philosophy of mind, the philosophy of science, and the philosophy of mathematics. The resulting variety of perspectives correspondingly reflects both the breadth and depth of contemporary theorizing about identity and modality, each paper addressing a particular issue and advancing our knowledge of the area. This volume will provide essential reading for graduate students in the subject and professional philosophers.
The founder of both American pragmatism and semiotics, Charles Sanders Peirce (1839-1914) is widely regarded as an enormously important and pioneering theorist. In this book, scholars from around the world examine the nature and significance of Peirce's work on perception, iconicity, and diagrammatic thinking. Abjuring any strict dichotomy between presentational and representational mental activity, Peirce's theories transform the Aristotelian, Humean, and Kantian paradigms that continue to hold sway today and, in so doing, forge a new path for understanding the centrality of visual thinking in science, education, art, and communication. The essays in this collection cover a wide range of issues related to Peirce's theories, including the perception of generality; the legacy of ideas being copies of impressions; imagination and its contribution to knowledge; logical graphs, diagrams, and the question of whether their iconicity distinguishes them from other sorts of symbolic notation; how images and diagrams contribute to scientific discovery and make it possible to perceive formal relations; and the importance and danger of using diagrams to convey scientific ideas. This book is a key resource for scholars interested in Peirce's philosophy and its relation to contemporary issues in mathematics, philosophy of mind, philosophy of perception, semiotics, logic, visual thinking, and cognitive science.
Timothy Williamson is one of the most influential living philosophers working in the areas of logic and metaphysics. His work in these areas has been particularly influential in shaping debates about metaphysical modality, which is the topic of his recent provocative and closely-argued book Modal Logic as Metaphysics (2013). This book comprises ten essays by metaphysicians and logicians responding to Williamson's work on metaphysical modality, as well as replies by Williamson to each essay. In addition, it contains an original essay by Williamson, 'Modal science,' concerning the role of modal claims in natural science. This book was originally published as a special issue of the Canadian Journal of Philosophy.
This book offers a comprehensive account of logic that addresses fundamental issues concerning the nature and foundations of the discipline. The authors claim that these foundations can be established not only without the need for strong metaphysical assumptions, but also without hypostasizing logical forms as specific entities. They present a systematic argument that the primary subject matter of logic is our linguistic interaction rather than our private reasoning, and that it is thus misleading to see logic as revealing "the laws of thought". In this sense, fundamental logical laws are implicit to our "language games" and are thus more similar to social norms than to the laws of nature. Peregrin and Svoboda also show that logical theories, despite the fact that they rely on rules implicit to our actual linguistic practice, firm up these rules and make them explicit. By carefully scrutinizing the project of logical analysis, the authors demonstrate that logical rules can be best seen as products of the so-called reflective equilibrium. They suggest that we can profit from viewing languages as "inferential landscapes" and logicians as "geographers" who map them and try to pave safe routes through them. This book is an essential resource for scholars and researchers engaged with the foundations of logical theories and the philosophy of language.
A Logical Introduction to Probability and Induction is a textbook on the mathematics of the probability calculus and its applications in philosophy. On the mathematical side, the textbook introduces those parts of logic and set theory that are needed for a precise formulation of the probability calculus. On the philosophical side, the main focus is on the problem of induction and its reception in epistemology and the philosophy of science. Particular emphasis is placed on the means-end approach to the justification of inductive inference rules. In addition, the book discusses the major interpretations of probability. These are philosophical accounts of the nature of probability that interpret the mathematical structure of the probability calculus. Besides the classical and logical interpretation, they include the interpretation of probability as chance, degree of belief, and relative frequency. The Bayesian interpretation of probability as degree of belief locates probability in a subject's mind. It raises the question why her degrees of belief ought to obey the probability calculus. In contrast to this, chance and relative frequency belong to the external world. While chance is postulated by theory, relative frequencies can be observed empirically. A Logical Introduction to Probability and Induction aims to equip students with the ability to successfully carry out arguments. It begins with elementary deductive logic and uses it as a basis for the material on probability and induction. Throughout the textbook results are carefully proved using the inference rules introduced at the beginning, and students are asked to solve problems in the form of 50 exercises. An instructor's manual contains the solutions to these exercises as well as suggested exam questions. The book does not presuppose any background in mathematics, although sections 10.3-10.9 on statistics are technically sophisticated and optional.
The textbook is suitable for lower level undergraduate courses in philosophy and logic.
Moral Inferences is the first volume to thoroughly explore the relationship between morality and reasoning. Drawing on the expertise of world-leading researchers, this text provides ground-breaking insight into the importance of studying these distinct fields together. The volume integrates the latest research into morality with current theories in reasoning to consider the prominent role reasoning plays in everyday moral judgements. Featuring contributions on topics such as moral arguments, causal models, and dual process theory, this text provides new perspectives on previous studies, encouraging researchers to adopt a more integrated approach in the future. Moral Inferences will be essential reading for students and researchers of moral psychology, specifically those interested in reasoning, rationality and decision-making.
We are happy to present to the reader the first book of our Applied Logic Series. Walton's book on the fallacies of ambiguity is firmly at the heart of practical reasoning, an important part of applied logic. There is an increasing interest in artificial intelligence, philosophy, psychology, software engineering and linguistics, in the analysis and possible mechanisation of human practical reasoning. Continuing the ancient quest that began with Aristotle, computer scientists, logicians, philosophers and linguists are vigorously seeking to deepen our understanding of human reasoning and argumentation. Significant communities of researchers are actively engaged in developing new approaches to logic and argumentation, which are better suited to the urgent needs of today's applications. The author of this book has, over many years, made significant contributions to the detailed analysis of practical reasoning case studies, thus providing solid foundations for new and more applicable formal logical systems. We welcome Doug Walton's new book to our series.
This collection presents the first sustained examination of the nature and status of the idea of principles in early modern thought. Principles are almost ubiquitous in the seventeenth and eighteenth centuries: the term appears in famous book titles, such as Newton's Principia; the notion plays a central role in the thought of many leading philosophers, such as Leibniz's Principle of Sufficient Reason; and many of the great discoveries of the period, such as the Law of Gravitational Attraction, were described as principles. Ranging from mathematics and law to chemistry, from natural and moral philosophy to natural theology, and covering some of the leading thinkers of the period, this volume presents ten compelling new essays that illustrate the centrality and importance of the idea of principles in early modern thought. It contains chapters by leading scholars in the field, including the Leibniz scholar Daniel Garber and the historian of chemistry William R. Newman, as well as exciting, emerging scholars, such as the Newton scholar Kirsten Walsh and a leading expert on experimental philosophy, Alberto Vanzo. The Idea of Principles in Early Modern Thought: Interdisciplinary Perspectives charts the terrain of one of the period's central concepts for the first time, and opens up new lines for further research.
Were the most serious philosophers of the millennium 200 A.D. to 1200 A.D. just confused mystics? This book shows otherwise. John Martin rehabilitates Neoplatonism, founded by Plotinus and brought into Christianity by St. Augustine. The Neoplatonists devise ranking predicates like good, excellent, perfect to divide the Chain of Being, and use the predicate intensifier hyper so that it becomes a valid logical argument to reason from God is not (merely) good to God is hyper-good. In this way the relational facts underlying reality find expression in Aristotle's subject-predicate statements, and the Platonic tradition proves able to subsume Aristotle's logic while at the same time rejecting his metaphysics. In the Middle Ages, when Aristotle's larger philosophy was recovered and joined again to the Neoplatonic tradition which was never lost, Neoplatonic logic lived alongside Aristotle's metaphysics in a sometimes confusing and unsettled way. Showing Neoplatonism to be significantly richer in its logical and philosophical ideas than it is usually given credit for, this book will be of interest not just to historians of logic, but to philosophers, logicians, linguists, and theologians.
This book is dedicated to Dov Gabbay, one of the most outstanding and most productive researchers in the area of logic, language and reasoning. He has exerted a profound influence in the major fields of logic, linguistics and computer science. Most of the chapters included, therefore, build on his work and present results or summarize areas where Dov has made major contributions. In particular his work on Labelled Deductive Systems is addressed in most of the contributions. The chapters on computational linguistics address logical and deductive aspects of linguistic problems. The papers by van Benthem, Lambek and Moortgat investigate categorial considerations and the use of labels within the "parsing as deduction" approach. Analyses of particular linguistic problems are given in the remaining papers by Kamp, Kempson, Moravcsik, Konig and Reyle. They address the logic of generalized quantifiers, the treatment of cross-over phenomena and temporal/aspectual interpretation, as well as the applicability of underspecified deduction in linguistic formalisms. The more logic-oriented chapters address philosophical and proof-theoretic problems and give algorithmic solutions for most of them. The spectrum ranges from K. Segerberg's contribution, which brings together the two traditions of epistemic and doxastic logics of belief, to M. Finger and M. Reynolds' chapter on two-dimensional executable logics with applications to temporal databases. The book demonstrates that a relatively small number of basic techniques and ideas, in particular the idea of labelled deductive systems, can be successfully applied in many different areas.
In this book, Lorraine Besser-Jones develops a eudaimonistic virtue ethics based on a psychological account of human nature. While her project maintains the fundamental features of the eudaimonistic virtue ethical framework-virtue, character, and well-being-she constructs these concepts from an empirical basis, drawing support from the psychological fields of self-determination and self-regulation theory. Besser-Jones's resulting account of "eudaimonic ethics" presents a compelling normative theory and offers insight into what is involved in being a virtuous person and "acting well." This original contribution to contemporary ethics and moral psychology puts forward a provocative hypothesis of what an empirically-based moral theory would look like.
First Published in 2004. Routledge is an imprint of Taylor & Francis, an informa company.
Philosophers have warned of the perils of a life spent without reflection, but what constitutes reflective inquiry - and why it's necessary in our lives - can be an elusive concept. Synthesizing ideas from minds as diverse as John Dewey and Paulo Freire, the Handbook of Reflection and Reflective Inquiry presents reflective thought in its most vital aspects, not as a fanciful or nostalgic exercise, but as a powerful means of seeing familiar events anew, encouraging critical thinking and crucial insight, teaching and learning. In its opening pages, two seasoned educators, Maxine Greene and Lee Shulman, discuss reflective inquiry as a form of active attention (Thoreau's "wide-awakeness"), an act of consciousness, and a process by which people can understand themselves, their work (particularly in the form of life projects), and others. Building on this foundation, the Handbook analyzes, through the work of 40 internationally oriented authors:
- Definitional issues concerning reflection, what it is and is not;
- Worldwide social and moral conditions contributing to the growing interest in reflective inquiry in professional education;
- Reflection as promoted across professional educational domains, including K-12 education, teacher education, occupational therapy, and the law;
- Methods of facilitating and scaffolding reflective engagement;
- Current pedagogical and research practices in reflection;
- Approaches to assessing reflective inquiry.
Educators across the professions as well as adult educators, counselors and psychologists, and curriculum developers concerned with adult learning will find the Handbook of Reflection and Reflective Inquiry an invaluable teaching tool for challenging times.