Carnap, Quine, and Putnam held that in our pursuit of truth we can do no better than to start in the middle, relying on already-established beliefs and inferences and applying our best methods for re-evaluating particular beliefs and inferences and arriving at new ones. In this collection of essays, Gary Ebbs interprets these thinkers' methodological views in the light of their own philosophical commitments, and in the process refutes some widespread misunderstandings of their views, reveals the real strengths of their arguments, and exposes a number of problems that they face. To solve these problems, in many of the essays Ebbs also develops new philosophical approaches, including new theories of logical truth, language use, reference and truth, truth by convention, realism, trans-theoretical terms, agreement and disagreement, radical belief revision, and contextually a priori statements. His essays will be valuable for a wide range of readers in analytic philosophy.
This book contains a selection of papers from the workshop Women in the History of Analytic Philosophy held in October 2019 in Tilburg, the Netherlands. It is the first volume devoted to the role of women in early analytic philosophy. It discusses the ideas of ten female philosophers and covers a period of over a hundred years, beginning with the contribution to the Significs Movement by Victoria, Lady Welby in the second half of the nineteenth century, and ending with Ruth Barcan Marcus's celebrated version of quantified modal logic after the Second World War. The book makes clear that women contributed substantially to the development of analytic philosophy in all areas of philosophy, from logic, epistemology, and philosophy of science, to ethics, metaphysics, and philosophy of language. It illustrates that although women's voices were no different from men's as regards their scope and versatility, they had a much harder time being heard. The book is aimed at historians of philosophy and scholars in gender studies.
Ludwig Wittgenstein (1889-1951) is one of the most important and influential philosophers in modern times, but he is also one of the least accessible. In this volume, leading experts chart the development of his work and clarify the connections between its different stages. The essays, which are both expository and original, address central themes in Wittgenstein's writing on a wide range of topics, particularly his thinking about the mind, language, logic, and mathematics. The contributors illuminate the character of the whole body of work by focusing on key topics: the style of the philosophy, the conception of grammar contained in it, rule-following, convention, logical necessity, the self, and what Wittgenstein called, in a famous phrase, 'forms of life'. This revised edition includes a new introduction, five new essays - on Tractarian ethics, Wittgenstein's development, aspects, the mind, and time and history - and a fully updated comprehensive bibliography.
Peter Unger's provocative new book poses a serious challenge to contemporary analytic philosophy, arguing that, to its detriment, it devotes most of its energy to "empty ideas." In the mid-twentieth century, philosophers generally agreed that, by contrast with science, philosophy should offer no substantial thoughts about the general nature of concrete reality. Leading philosophers were concerned with little more than the semantics of ordinary words. For example: our word "perceives" differs from our word "believes" in that the first is used more strictly than the second. While someone may be correct in saying "I believe there's a table before me" whether or not there is a table before her, she will be correct in saying "I perceive there's a table before me" only if there is a table there. Though just a parochial idea, whether or not it is correct makes a difference to how things are with concrete reality; in Unger's terms, it is a concretely substantial idea. Alongside each such parochial substantial idea there is an analytic or conceptual thought, as with the thought that someone may believe there is a table before her whether or not there is one, but will perceive there is a table before her only if there is a table there. Empty of import as to how things are with concrete reality, such thoughts are what Unger calls concretely empty ideas. It is widely assumed that, since about 1970, things have changed, thanks to the advent of such thoughts as the content externalism championed by Hilary Putnam and Donald Davidson, various essentialist thoughts offered by Saul Kripke, and so on. Against that assumption, Unger argues that, with hardly any exceptions aside from David Lewis's theory of a plurality of concrete worlds, all of these recent offerings are concretely empty ideas. Unger maintains that, except when it offers parochial ideas, mainstream philosophy still offers hardly anything beyond concretely empty ideas.
Many philosophers believe they can gain knowledge about the world from the comfort of their armchairs, simply by reflecting on the nature of things. But how can the mind arrive at substantive knowledge of the world without seeking its input? Michael Strevens proposes an original defense of the armchair pursuit of philosophical knowledge, focusing on "the method of cases," in which judgments about category membership (Does this count as causation? Does that count as the right action to take?) are used to test philosophical hypotheses about such matters as causality, moral responsibility, and beauty. Strevens argues that the method of cases is capable of producing reliable, substantial knowledge. His strategy is to compare concepts of philosophical things to concepts of natural kinds, such as water. Philosophical concepts, like natural kind concepts, do not contain the answers to philosophers' questions; armchair philosophy therefore cannot be conceptual analysis. But just as natural kind concepts provide a viable starting point for exploring the nature of the material world, so philosophical concepts are capable of launching and sustaining fruitful inquiry into philosophical matters, using the method of cases. Agonizing about unusual "edge cases," Strevens shows, can play a leading role in such discoveries. Thinking Off Your Feet seeks to reshape current debates about the nature of philosophical thinking and the methodological implications of experimental philosophy, to make significant contributions to the cognitive science of concepts, and to restore philosophy to its traditional position as an essential part of the human quest for knowledge.
First published in 1982, Ellery Eells' original work on rational decision making had extensive implications for probability theorists, economists, statisticians and psychologists concerned with decision making and the employment of Bayesian principles. His analysis of the philosophical and psychological significance of Bayesian decision theories, causal decision theories and Newcomb's paradox continues to be influential in philosophy of science. His book is now revived for a new generation of readers and presented in a fresh twenty-first-century series livery, including a specially commissioned preface written by Brian Skyrms, illuminating its continuing importance and relevance to philosophical enquiry.
During the course of the twentieth century, analytic philosophy developed into the dominant philosophical tradition in the English-speaking world. In the last two decades, it has become increasingly influential in the rest of the world, from continental Europe to Latin America and Asia. At the same time there has been deepening interest in the origins and history of analytic philosophy, as analytic philosophers examine the foundations of their tradition and question many of the assumptions of their predecessors. This has led to greater historical self-consciousness among analytic philosophers and more scholarly work on the historical contexts in which analytic philosophy developed. This historical turn in analytic philosophy has been gathering pace since the 1990s, and the present volume is the most comprehensive collection of essays to date on the history of analytic philosophy. It contains state-of-the-art contributions from many of the leading scholars in the field, all of the contributions specially commissioned. The introductory essays discuss the nature and historiography of analytic philosophy, accompanied by a detailed chronology and bibliography. Part One elucidates the origins of analytic philosophy, with special emphasis on the work of Frege, Russell, Moore, and Wittgenstein. Part Two explains the development of analytic philosophy, from Oxford realism and logical positivism to the most recent work in analytic philosophy, and includes essays on ethics, aesthetics, and political philosophy as well as on the areas usually seen as central to analytic philosophy, such as philosophy of language and mind. Part Three explores certain key themes in the history of analytic philosophy.
The idea that mathematics is reducible to logic has a long history, but it was Frege who gave logicism an articulation and defense that transformed it into a distinctive philosophical thesis with a profound influence on the development of philosophy in the twentieth century. This volume of classic, revised and newly written essays by William Demopoulos examines logicism's principal legacy for philosophy: its elaboration of notions of analysis and reconstruction. The essays reflect on the deployment of these ideas by the principal figures in the history of the subject - Frege, Russell, Ramsey and Carnap - and in doing so illuminate current concerns about the nature of mathematical and theoretical knowledge. Issues addressed include the nature of arithmetical knowledge in the light of Frege's theorem; the status of realism about the theoretical entities of physics; and the proper interpretation of empirical theories that postulate abstract structural constraints.
In this volume of essays, Howard Wettstein explores the foundations of religious commitment. His orientation is broadly naturalistic, but not in the mode of reductionism or eliminativism. This collection explores questions of broad religious interest, but does so through a focus on the author's religious tradition, Judaism. Among the issues explored are the nature and role of awe, ritual, doctrine, and religious experience; the distinction between belief and faith; problems of evil and suffering, with special attention to the Book of Job and to the Akedah, the biblical story of the binding of Isaac; and the virtue of forgiveness. One of the book's highlights is its literary (as opposed to philosophical) approach to theology that at the same time makes room for philosophical exploration of religion. Another is Wettstein's rejection of the usual picture that sees religious life as sitting atop a distinctive metaphysical foundation, one that stands in need of epistemological justification.
John Rawls is widely regarded as one of the most influential philosophers of the twentieth century, and his work has permanently shaped the nature and terms of moral and political philosophy, deploying a robust and specialized vocabulary that reaches beyond philosophy to political science, economics, sociology, and law. This volume is a complete and accessible guide to Rawls's vocabulary, with over 200 alphabetical encyclopaedic entries written by the world's leading Rawls scholars. From 'basic structure' to 'burdened society', from 'Sidgwick' to 'strains of commitment', and from 'Nash point' to 'natural duties', the volume covers the entirety of Rawls's central ideas and terminology, with illuminating detail and careful cross-referencing. It will be an essential resource for students and scholars of Rawls, as well as for other readers in political philosophy, ethics, political science, sociology, international relations and law.
This book addresses controversies concerning the epistemological foundations of data science: Is it a genuine science? Or is data science merely some inferior practice that can at best contribute to the scientific enterprise, but cannot stand on its own? The author proposes a coherent conceptual framework with which these questions can be rigorously addressed. Readers will discover a defense of inductivism and consideration of the arguments against it: an epistemology of data science more or less by definition has to be inductivist, given that data science starts with the data. As an alternative to enumerative approaches, the author endorses Federica Russo's recent call for a variational rationale in inductive methodology. Chapters then address some of the key concepts of an inductivist methodology including causation, probability and analogy, before outlining an inductivist framework. The inductivist framework is shown to be adequate and useful for an analysis of the epistemological foundations of data science. The author points out that many aspects of the variational rationale are present in algorithms commonly used in data science. Introductions to algorithms and brief case studies of successful data science such as machine translation are included. Data science is located with reference to several crucial distinctions regarding different kinds of scientific practices, including between exploratory and theory-driven experimentation, and between phenomenological and theoretical science. Computer scientists, philosophers and data scientists of various disciplines will find this philosophical perspective and conceptual framework of great interest, especially as a starting point for further in-depth analysis of algorithms used in data science.
Willard Van Orman Quine's work revolutionized the fields of epistemology, semantics and ontology. At the heart of his philosophy are several interconnected doctrines: his rejection of conventionalism and of the linguistic doctrine of logical and mathematical truth, his rejection of the analytic/synthetic distinction, his thesis of the indeterminacy of translation and his thesis of the inscrutability of reference. In this book Edward Becker sets out to interpret and explain these doctrines. He offers detailed analyses of the relevant texts, discusses Quine's views on meaning, reference and knowledge, and shows how Quine's views developed over the years. He also proposes a new version of the linguistic doctrine of logical truth, and a new way of rehabilitating analyticity. His rich exploration of Quine's thought will interest all those seeking to understand and evaluate the work of one of the most important philosophers of the second half of the twentieth century.
The claim that contemporary analytic philosophers rely extensively on intuitions as evidence is almost universally accepted in current meta-philosophical debates, and it figures prominently in our self-understanding as analytic philosophers. No matter what area you happen to work in and what views you happen to hold in those areas, you are likely to think that philosophizing requires constructing cases and making intuitive judgments about those cases. This assumption also underlies the entire experimental philosophy movement: only if philosophers rely on intuitions as evidence are data about non-philosophers' intuitions of any interest to us. Our alleged reliance on the intuitive makes many philosophers who don't work on meta-philosophy concerned about their own discipline: they are unsure what intuitions are and whether they can carry the evidential weight we allegedly assign to them. The goal of this book is to argue that this concern is unwarranted since the claim is false: it is not true that philosophers rely extensively (or even a little bit) on intuitions as evidence. At worst, analytic philosophers are guilty of engaging in somewhat irresponsible use of 'intuition'-vocabulary. While this irresponsibility has had little effect on first-order philosophy, it has fundamentally misled meta-philosophers: it has encouraged meta-philosophical pseudo-problems and misleading pictures of what philosophy is.
Key Thinkers in Linguistics and the Philosophy of Language is a unique and accessible reference guide to the work of figures who have played an important role in the development of ideas about language. It includes eighty entries on individual thinkers in the Western tradition, ranging from antiquity to the present day, chosen because of their impact on the description or theory of language. Each entry explains the main ideas of the thinker, outlining their development and assessing their significance and influence. Brief biographical details place the subject in his or her cultural and historical context. No prior knowledge of either linguistics or philosophy is assumed; each entry concludes with suggestions for further reading of both primary texts and secondary sources, encouraging readers to find out more about the particular key thinker and the impact of his or her ideas. Thinkers included range from Plato and Aristotle, through Berkeley, Leibniz, Kant, Russell, Wittgenstein, and Austin, to Sacks, Kristeva, and Chomsky. Features: the only single-volume reference resource to bring together linguistics and the philosophy of language; extensively cross-referenced entries, allowing readers to trace influences, developments and debates both in contemporary thinking and across time; accessibly written for use at all levels, including undergraduate, postgraduate, academic and other general readers in the fields of linguistics and the philosophy of language.
The Rules of Thought develops a rationalist theory of mental content while defending a traditional epistemology of philosophy. Jonathan Jenkins Ichikawa and Benjamin W. Jarvis contend that a capacity for pure rational thought is fundamental to mental content itself and underwrites our quotidian reasoning and extraordinary philosophical engagement alike. Part I of the book develops a Fregean theory of mental content, according to which rational relations between propositions play a central role in individuating contents; the theory is designed to be sensitive not only to Frege's puzzle and other data that have motivated rationalist conceptions of content, but also to considerations in the philosophy of mind and language that have motivated neo-Russellian views. Part II articulates a theory of the a priori, and shows that, given the framework of Part I, it is very plausible that much philosophical work of interest is genuinely a priori. Notably, it is no part of the picture developed that intuitions have an important role to play, either in mental content, or in the epistemology of the a priori; Part III defends this departure from rationalist orthodoxy.
Nicholas J. J. Smith argues that an adequate account of vagueness must involve degrees of truth. The basic idea of degrees of truth is that while some sentences are true and some are false, others possess intermediate truth values: they are truer than the false sentences, but not as true as the true ones. This idea is immediately appealing in the context of vagueness, yet it has fallen on hard times in the philosophical literature, with existing degree-theoretic treatments of vagueness facing apparently insuperable objections. Smith seeks to turn the tide in favour of a degree-theoretic treatment of vagueness, by motivating and defending the basic idea that truth can come in degrees. He argues that no theory of vagueness that does not countenance degrees of truth can be correct, and develops a new degree-theoretic treatment of vagueness, fuzzy plurivaluationism, that solves the problems plaguing earlier degree theories.
An in-depth history of the linguistic turn in analytic philosophy, from a leading philosopher of language. This is the second of five volumes of a definitive history of analytic philosophy from the invention of modern logic in 1879 to the end of the twentieth century. Scott Soames, a leading philosopher of language and historian of analytic philosophy, provides the fullest and most detailed account of the analytic tradition yet published, one that is unmatched in its chronological range, topics covered, and depth of treatment. Focusing on the major milestones and distinguishing them from detours, Soames gives a seminal account of where the analytic tradition has been and where it appears to be heading. Volume 2 provides an intensive account of the new vision in analytical philosophy initiated by Ludwig Wittgenstein's Tractatus Logico-Philosophicus, its assimilation by the Vienna Circle of Moritz Schlick and Rudolf Carnap, and the subsequent flowering of logical empiricism. With this "linguistic turn," philosophical analysis became philosophy itself, and the discipline's stated aim was transformed from advancing philosophical theories to formalizing, systematizing, and unifying science. In addition to exploring the successes and failures of philosophers who pursued this vision, the book describes how the philosophically minded logicians Kurt Gödel, Alfred Tarski, Alonzo Church, and Alan Turing discovered the scope and limits of logic and developed the mathematical theory of computation that ushered in the digital era. The book's account of this pivotal period closes with a searching examination of the struggle to preserve ethical normativity in a scientific age.
The Things We Do and Why We Do Them argues against the common assumption that there is one thing called 'action' which all reason-giving explanations of action are geared towards. Sandis shows why all theories concerned with identifying the nature of our 'real' reasons for action fail from the outset.
This book has two objectives: to be a contribution to the understanding of Frege's theory of truth - especially a defence of his notorious critique of the correspondence theory - and to be an introduction to the practice of interpreting philosophical texts.
Paul Horwich develops an interpretation of Ludwig Wittgenstein's later writings that differs in substantial respects from what can already be found in the literature. He argues that it is Wittgenstein's radically anti-theoretical metaphilosophy, and not (as assumed by most other commentators) his identification of the meaning of a word with its use, that lies at the foundation of his discussions of specific issues concerning language, the mind, mathematics, knowledge, art, and religion. Thus Horwich's first aim is to give a clear account of Wittgenstein's hyper-deflationist view of what philosophy is, how it should be conducted, and what it might achieve. His second aim is to defend this view against a variety of objections: that is, to display its virtues, not merely as an accurate reading of Wittgenstein, but as the correct conception of philosophy itself. And the third aim is to examine the application of this view to a variety of topics, primarily language and experience. A further distinctive feature of this approach is its presupposition that Wittgenstein's ideas may be formulated with precision and that solid arguments may be found on their behalf. These guiding assumptions (the centrality of Wittgenstein's metaphilosophy, and its susceptibility to rigorous articulation and rational support) are admittedly controversial, but they are vindicated not just textually but by the power and plausibility of the philosophy that results from them.
Hilary Putnam's ever-evolving philosophical oeuvre has been called "the history of recent philosophy in outline": an intellectual achievement, nearly seventy years in the making, that has shaped disciplinary fields from epistemology to ethics, metaphysics to the philosophy of physics, and the philosophy of mathematics to the philosophy of mind. Naturalism, Realism, and Normativity offers new avenues into the thought of one of the most influential minds in contemporary analytic philosophy. The essays collected here cover a range of interconnected topics including naturalism, commonsense and scientific realism, ethics, perception, language and linguistics, and skepticism. Aptly illustrating Putnam's willingness to revisit and revise past arguments, they contain important new insights and freshly illuminate formulations that will be familiar to students of his work: his rejection of the idea that an absolute conception of the world is obtainable; his criticism of a nihilistic view of ethics that claims to be scientifically based; his pathbreaking distinction between sensations and apperceptions; and his use of externalist semantics to invalidate certain forms of skepticism. Above all, Naturalism, Realism, and Normativity reflects Putnam's thinking on how to articulate a theory of naturalism which acknowledges that normative phenomena form an ineluctable part of human experience, thereby reconciling scientific and humanistic views of the world that have long appeared incompatible.
In this excellent book, Sébastien Gandon focuses mainly on Russell's two major texts, Principia Mathematica and The Principles of Mathematics, meticulously unpicking the details of these texts and bringing a new interpretation of both the mathematical and the philosophical content. Winner of the Bertrand Russell Society Book Award 2013.
The first book in English to offer a systematic survey of Bolzano's philosophical logic and theory of knowledge, this volume presents a reconstruction of Bolzano's views on a series of key issues: the analysis of meaning, generality, analyticity, logical consequence, mathematical demonstration, and knowledge by virtue of meaning.
F. H. Bradley (1846-1924) was the foremost philosopher of the British Idealist school, which came to prominence in the second half of the nineteenth century. Bradley, who was a life fellow of Merton College, Oxford, was influenced by Hegel, and also reacted against utilitarianism. He was recognised during his lifetime as one of the greatest intellectuals of his generation and was the first philosopher to receive the Order of Merit, in 1924. His work is considered to have been important to the formation of analytic philosophy. In metaphysics, he rejected pluralism and realism, and believed that English philosophy needed to deal systematically with first principles. This work, first published in 1893, is divided into two parts: 'Appearance' deals with exposing the contradictions that Bradley believed are hidden in our everyday conceptions of the world; and in 'Reality', he builds his positive account of reality and considers possible objections to it.