As the foundation of our rationality, logic has traditionally been
considered fixed, stable and constant. This conception of the
discipline has recently been challenged by the plurality of logics,
and in this book Pavel Arazim extends the debate to offer a new
view of logic as dynamic and without a definite, specific shape.
The Problem of Plurality of Logics examines the origins of our
standard view of logic alongside Kant's theories, the holistic
view, the issue of logic's pragmatic significance and Robert
Brandom's logical expressivism. Arazim then draws on
proof-theoretical approaches to present a convincing argument for a
dynamic version of logical inferentialism, which opens space for a
new freedom to modify our own logic. He explores the scope,
possibilities and limits of this freedom in order to highlight the
future paths logic could take, as a motivation for further
research. Marking a departure from logical monism and also from the
recent doctrine of logical pluralism in its various forms, this
book addresses current debates concerning the expressive role of
logic and contributes to a lively area of discussion in analytic
philosophy.
Throughout his career, Keith Hossack has made outstanding
contributions to the theory of knowledge, metaphysics and the
philosophy of mathematics. This collection of previously
unpublished papers begins with a focus on Hossack's conception of
the nature of knowledge, his metaphysics of facts and his account
of the relations between knowledge, agents and facts. Attention
moves to Hossack's philosophy of mind and the nature of
consciousness, before turning to the notion of necessity and its
interaction with a priori knowledge. Hossack's views on the nature
of proof, logical truth, conditionals and generality are discussed
in depth. In the final chapters, questions about the identity of
mathematical objects and our knowledge of them take centre stage,
together with questions about the necessity and generality of
mathematical and logical truths. Knowledge, Number and Reality
represents some of the most vibrant discussions taking place in
analytic philosophy today.
For centuries, philosophers have addressed the ontological question
of whether God exists. Most recently, philosophers have begun to
explore the axiological question of what value impact, if any,
God's existence has (or would have) on our world. This book brings
together four prestigious philosophers, Michael Almeida, Travis
Dumsday, Perry Hendricks and Graham Oppy, to present different
views on the axiological question about God. Each contributor
expresses a position on axiology, which is then met with responses
from the remaining contributors. This structure makes for genuine
discussion and developed exploration of the key issues at stake,
and shows that the axiological question is more complicated than it
first appears. Chapters explore a range of relevant issues,
including the relationship between Judeo-Christian theism and
non-naturalist alternatives such as pantheism, polytheism, and
animism/panpsychism. Further chapters consider the attitudes and
emotions of atheists within the theism conversation, and develop
and evaluate the best arguments for doxastic pro-theism and
doxastic anti-theism. Of interest to those working on philosophy of
religion, theism and ethics, this book presents lively accounts of
an important topic in an exciting and collaborative way, offered by
renowned experts in this area.
During the first quarter of the twentieth century, the French
philosopher Henri Bergson became an international celebrity,
profoundly influencing contemporary intellectual and artistic
currents. While Bergsonism was fashionable, L. Susan Stebbing,
Bertrand Russell, Moritz Schlick, and Rudolf Carnap launched
different critical attacks against some of Bergson's views. This
book examines this series of critical responses to Bergsonism early
in the history of analytic philosophy. Analytic criticisms of
Bergsonism were influenced by William James, who saw Bergson as an
'anti-intellectualist' ally of American Pragmatism, and Max
Scheler, who saw him as a prophet of Lebensphilosophie. Some of the
main analytic objections to Bergson are answered in the work of
Karin Costelloe-Stephen. Analytic anti-Bergsonism accompanied the
earlier refutations of idealism by Russell and Moore, and later
influenced the Vienna Circle's critique of metaphysics. It
eventually contributed to the formation of the view that 'analytic'
philosophy is divided from its 'continental' counterpart.
This edited volume examines the relationship between collective
intentionality and inferential theories of meaning. The book
consists of three main sections. The first part contains essays
demonstrating how researchers working on inferentialism and
collective intentionality can learn from one another. The essays in
the second part examine the dimensions along which philosophical
and empirical research on human reasoning and collective
intentionality can benefit from more cross-pollination. The final
part consists of essays that offer a closer examination of themes
from inferentialism and collective intentionality that arise in the
work of Wilfrid Sellars. Groups, Norms and Practices provides a
template for continuing an interdisciplinary program in philosophy
and the sciences that aims to deepen our understanding of human
rationality, language use, and sociality.
This book demonstrates for the first time how the work of Ludwig
Wittgenstein can transform 4E Cognitive Science. In particular, it
shows how insights from Wittgenstein can empower those within 4E to
reject the long-held view that our minds must involve
representations inside our heads. The book begins by showing how
proponents of 4E are divided amongst themselves. Proponents of
Extended Mind insist that internal representations are always
needed to explain the human mind. However, proponents of Enacted
Mind reject this claim. Using insights from Ludwig Wittgenstein,
the book introduces and defends a new theoretical framework called
Structural Enacted or Extended Mind (STEEM). STEEM brings together
Enacted Mind and Extended Mind in a way that rejects all talk of
internal representations. STEEM thus highlights the
anti-representationalist credentials of 4E and so demonstrates how
4E can herald a new beginning when it comes to thinking about the
mind.
Bernard Lonergan (1904-84) is acknowledged as one of the most
significant philosopher-theologians of the 20th century. Lonergan,
Meaning and Method in many ways complements Andrew Beards' previous
book on Lonergan, Insight and Analysis (Bloomsbury, 2010). Andrew
Beards applies Lonergan's thought, bringing it into critical
dialogue with other contemporary philosophical
interlocutors, principally from the analytical tradition. He also
introduces themes and arguments from the continental tradition, as
well as offering interpretative analysis of some central notions in
Lonergan's thought that are of interest to all who wish to
understand the importance of Lonergan's work for philosophy and
Christian theology. Three of the chapters focus upon areas of
fruitful exchange and debate between Lonergan's thought and the
work of three major figures in current analytical philosophy: Nancy
Cartwright, Timothy Williamson and Scott Soames. The discussion
also ranges across such topics as meaning theory, metaphilosophy,
epistemology, philosophy of science and aesthetics.
This edited collection provides the first comprehensive volume on
A. J. Ayer's 1936 masterpiece, Language, Truth and Logic. With
eleven original chapters the volume reconsiders the historical and
philosophical significance of Ayer's work, examining its place in
the history of analytic philosophy and its subsequent legacy.
Making use of pioneering research in logical empiricism, the
contributors explore a wide variety of topics, from ethics, values
and religion, to truth, epistemology and philosophy of language.
Among the questions discussed are: How did Ayer preserve or distort
the views and conceptions of logical empiricists? How are Ayer's
arguments different from the ones he aimed at reconstructing? And
which aspects of the book were responsible for its immense impact?
The volume expertly places Language, Truth and Logic in the
intellectual and socio-cultural history of twentieth-century
philosophical thought, providing both introductory and contextual
chapters, as well as specific explorations of a variety of topics
covering the main themes of the book. Providing important insights
of both historical and contemporary significance, this collection
is an essential resource for scholars interested in the legacy of
the Vienna Circle and its effect on ethics and philosophy of mind.
A Critical Introduction to Fictionalism provides a clear and
comprehensive understanding of an important alternative to realism.
Drawing on questions from ethics, the philosophy of religion, art,
mathematics, logic and science, this is a complete exploration of
how fictionalism contrasts with other non-realist doctrines and
motivates influential fictionalist treatments across a range of
philosophical issues. Defending and criticizing influential as well
as emerging fictionalist approaches, this accessible overview
discusses physical objects, universals, God, moral properties,
numbers and other fictional entities. Where possible, it draws
general lessons about the conditions under which a fictionalist
treatment of a class of items is plausible. Distinguishing
fictionalism from other views about the existence of items, it
explains the central features of this key metaphysical topic.
Featuring a historical survey, definitions of key terms,
characterisations of important subdivisions, objections and
problems for fictionalism, and contemporary fictionalist treatments
of several issues, A Critical Introduction to Fictionalism is a
valuable resource for students of metaphysics as well as students
of philosophical methodology. It is the only book of its kind.
This book repairs and revives the Theory of Knowledge research
program of Russell's Principia era. Chapter 1, 'Introduction and
Overview', explains the program's agenda. Inspired by the
non-Fregean logicism of Principia Mathematica, it endorses the
revolution within mathematics, presenting it as a study of
relations. The synthetic a priori logic of Principia is the essence
of philosophy considered as a science which exposes the dogmatisms
about abstract particulars and metaphysical necessities that create
prisons that fetter the mind. Incipient in The Problems of
Philosophy, the program's acquaintance epistemology embraced a
multiple-relation theory of belief. It reached an impasse in 1913,
after being retrofitted with abstract particular logical
forms to address problems of direction and compositionality. With
its acquaintance epistemology in limbo, Scientific Method in
Philosophy became the sequel to Problems. Chapter 2 explains why
Russell came to feel intellectually dishonest; Wittgenstein's demand
that logic exclude nonsense belief played no role. The 1919 neutral
monist era ensued, but Russell found no epistemology for the logic
essential to philosophy. To repair the program, Chapters 4-6 resolve
the impasse; to revive it, Chapters 3 and 7 vigorously defend the facts about
Principia. Studies of modality and entailment are viable while
Principia remains a universal logic above the civil wars of the
metaphysicians.
How ought you to evaluate your options if you're uncertain about
what's fundamentally valuable? A prominent response is Expected
Value Maximisation (EVM): the view that, under axiological
uncertainty, an option is better than another if and only if it has
the greater expected value across axiologies. But the expected
value of an option depends on quantitative probability and value
facts, and in particular on value comparisons across axiologies. We
need to explain what it is for such facts to hold. Also, EVM is by
no means self-evident; we need an argument to show that it is
true. This book introduces an axiomatic approach to address these
worries. It provides an explication of what EVM means by use of
representation theorems: intertheoretic comparisons can be
understood in terms of facts about which options are better than
which, and mutatis mutandis for intratheoretic comparisons and
axiological probabilities. And it provides a systematic argument to
the effect that EVM is true: the theory can be vindicated through
simple axioms. The result is a formally cogent and philosophically
compelling extension of standard decision theory, and an original
take on the problem of axiological or normative uncertainty.
This book aims to explain the decline of the later Wittgensteinian
tradition in analytic philosophy during the second half of the
twentieth century. Throughout the 1950s, Oxford was the center of
analytic philosophy and Wittgenstein - the later Wittgenstein - the
most influential contemporary thinker within that philosophical
tradition. Wittgenstein's methods and ideas were widely accepted,
with everything seeming to point to the Wittgensteinian paradigm
having a similar impact on the philosophical scenes of all English
speaking countries. However, this was not to be the case. By the
1980s, Wittgenstein, though still important, was considered a
somewhat marginal thinker. What occurred within the history of
analytic philosophy to produce such a decline? This book expertly
traces the early reception of Wittgenstein in the United States,
the shift in the humanities to a tradition rooted in the natural
sciences, and the economic crisis of the mid-1970s, to reveal the
factors that contributed to the eventual hostility towards the
later Wittgensteinian tradition.