This work investigates crucial aspects of Kant's epistemology and ethics in relation to Kierkegaard's thinking, taking up the challenge of developing a systematic reconstruction of both philosophers' positions. Kant forms a matrix for the interpretation of Kierkegaard, and considerable space is devoted to the exposition of Kant at the various points at which contact with Kierkegaard's thought is to be demonstrated. The burden of the argument is that Kierkegaard, in his account of the stages, is much closer to Kant than the texts initially reveal. It then becomes possible to arrive at a proper grasp of Kierkegaard's final position by seeing just how radically the stage of Christian faith (Religiousness B) departs from Kant.
This volume seeks to further the use of formal methods in clarifying one of the central problems of philosophy: that of our free human agency and its place in our indeterministic world. It celebrates the important contributions made in this area by Nuel Belnap, the American logician and philosopher. Philosophically, indeterminism and free action can seem far apart, but in Belnap's work they are intimately linked. This book explores their philosophical interconnectedness through a selection of original research papers that build on Belnap's logical and philosophical work. Some contributions take the form of critical discussions of Belnap's published work, some develop points made in his publications in new directions, and others provide additional insights on the topics of indeterminism and free action. In Nuel Belnap's work on indeterminism and free action, three formal frameworks figure prominently: the simple branching histories framework known as "branching time"; its relativistic spatio-temporal extension, branching space-times; and the "seeing to it that" (stit) logic of agency. As those frameworks provide the formal background for the contributed papers, the volume introduction gives an overview of the current state of their development. It also introduces case-intensional first-order logic (CIFOL), a general intensional logic offering resources for a first-order extension of the aforementioned frameworks and a recent research focus of Belnap's. The volume also contains an extended biographical interview with Nuel Belnap.
The "Blackwell Guide to Philosophy of Language" is a collection of
twenty new essays in a cutting-edge and wide-ranging field.
Archives are considered to be collections of administrative, legal, commercial and other records, or the actual places where such records are kept. They have become ubiquitous in the modern world, yet they emerged not much later than the invention of writing itself. Following Foucault, who first used the word "archive" in a metaphorical sense as "the general system of the formation and transformation of statements" in his "Archaeology of Knowledge" (1969), postmodern theorists have tried to exploit the potential of this concept and initiated the "archival turn". In recent years, however, archives have attracted the attention of anthropologists and historians of various stripes, who regard them as historical objects and "ground" them again in real institutions. The papers in this volume explore the complex topic of the archive in a historical, systematic and comparative context, viewing it within the broader setting of manuscript cultures by addressing questions such as how, by whom and for what purpose archival records were produced, and whether they differ from literary manuscripts in terms of materials, formats, and producers (scribes).
Was David Hume radically sceptical about our attempts to understand the world, or was he merely approaching philosophical problems from a scientific perspective? Most philosophers today believe that Hume's outlook was more scientific than radically sceptical and that his scepticism was more limited than previously supposed. If these philosophers are correct, then Hume's approach to philosophy mirrors that of many contemporary philosophers. This similarity suggests that we should try to understand Hume not as a historical relic but as a partner in a continuing philosophical dialogue. When we look closely at Hume's thoughts about human understanding, we find that his scepticism emerges very insistently in the context of his scientific approach. This book tries to come to terms with Hume's scepticism in a way that sheds light on contemporary philosophy and its relationship to science.
In "Cognitive Integration: Attacking The Bounds of Cognition"
Richard Menary argues that the real pay-off from
extended-mind-style arguments is not a new form of externalism in
the philosophy of mind, but a view in which the 'internal' and
'external' aspects of cognition are integrated into a whole.
How do ordinary objects persist through time and across possible worlds? How do they manage to have their temporal and modal properties? These are the questions addressed in this book, which is a "guided tour of theories of persistence." The book is divided into two parts. In the first, the two traditional accounts of persistence through time (endurantism and perdurantism) are combined with presentism and eternalism to yield four different views, and their variants. The resulting views are then examined in turn, in order to see which combinations are appealing and which are not. It is argued that the 'worm view' variant of eternalist perdurantism is superior to the other alternatives. In the second part of the book, the same strategy is applied to combinations of views about persistence across possible worlds (trans-world identity, counterpart theory, modal perdurants) and views about the nature of worlds, mainly modal realism and abstractionism. Not only the traditional and well-known views but also some more original ones are examined, and their pros and cons are carefully weighed. Here again, it is argued that perdurance is the best strategy available.
Hermeneutic philosophies of social science offer an approach to the philosophy of social science that foregrounds the human subject, attends to history, and includes a methodological reflection on the notion of reflection itself, including the intrusions of distortion and prejudice. They offer an explicit orientation to, and concern with, the subject of the human and social sciences. The hermeneutic philosophies of social science represented in the present collection of essays draw inspiration from the work of Gadamer and Paul Ricoeur, as well as from Michel de Certeau and Michel Foucault, among others. Special attention is given to Wilhelm Dilthey, to the broader phenomenological traditions of Edmund Husserl and Martin Heidegger, and to the history of philosophy in Plato and Descartes. The volume is indispensable reading for students and scholars interested in epistemology, philosophy of science, social studies of knowledge, and social studies of technology.
The present volume contains a selection of papers presented at the Fifth Conference on Collective Intentionality held at the University of Helsinki August 31 to September 2, 2006 and two additional contributions. The common aim of the papers is to explore the structure of shared intentional attitudes, and to explain how they underlie the social, cultural and institutional world. The contributions to this volume explore the phenomenology of sharedness, the concept of sharedness, and also various aspects of the structure of collective intentionality in general, and of the intricate relations between sharedness and normativity in particular. Concepts of Sharedness shows how rich and lively the philosophical research focused on the analysis of collective intentionality has become, and will provide further inspiration for future work in this rapidly evolving field.
This volume covers a wide range of topics in the most recent debates in the philosophy of mathematics, and is dedicated to how semantic, epistemological, ontological and logical issues interact in the attempt to give a satisfactory picture of mathematical knowledge. The essays collected here explore the semantic and epistemic problems raised by different kinds of mathematical objects, by their characterization in terms of axiomatic theories, and by the objectivity of both pure and applied mathematics. They investigate controversial aspects of contemporary theories such as neo-logicist abstractionism, structuralism, or multiversism about sets, by discussing different conceptions of mathematical realism and rival relativistic views on the mathematical universe. They consider fundamental philosophical notions such as set, cardinal number, truth, ground, finiteness and infinity, examining how their informal conceptions can best be captured in formal theories. The philosophy of mathematics is an extremely lively field of inquiry, with extensive reaches into disciplines such as logic and philosophy of logic, semantics, ontology, epistemology, and the cognitive sciences, as well as the history and philosophy of mathematics and science. By bringing together well-known scholars and younger researchers, the essays in this collection - prompted by the meetings of the Italian Network for the Philosophy of Mathematics (FilMat) - show how much valuable research is currently being pursued in this area, and how many roads ahead are still open for promising solutions to long-standing philosophical concerns.
This book explains how gossip contributes to knowledge. Karen Adkins marshals scholarship and case studies spanning centuries and disciplines to show that although gossip is a constant activity in human history, it has rarely been studied as a source of knowledge. People gossip for many reasons, but most often out of desire to make sense of the world while lacking access to better options for obtaining knowledge. This volume explores how, when our access to knowledge is blocked, gossip becomes a viable path to knowledge attainment, one that involves the asking of questions, the exchange of ideas, and the challenging of preconceived notions.
Value, Reality, and Desire is an extended argument for a robust realism about value. The robust realist affirms the following distinctive theses. There are genuine claims about value which are true or false - there are facts about value. These value-facts are mind-independent - they are not reducible to desires or other mental states, or indeed to any non-mental facts of a non-evaluative kind. And these genuine, mind-independent, irreducible value-facts are causally efficacious. Values, quite literally, affect us. These are not particularly fashionable theses, and taken as a whole they go somewhat against the grain of quite a lot of recent work in the metaphysics of value. Further, against the received view, Oddie argues that we can have knowledge of values by experiential acquaintance, that there are experiences of value which can be both veridical and appropriately responsive to the values themselves. Finally, these value-experiences are not the products of some exotic and implausible faculty of 'intuition'. Rather, they are perfectly mundane and familiar mental states - namely, desires. This view explains how values can be 'intrinsically motivating', without falling foul of the widely accepted 'queerness' objection. There are, of course, other objections to each of the realist's claims. In showing how and why these objections fail, Oddie introduces a wealth of interesting and original insights about issues of wider interest - including the nature of properties, reduction, supervenience, and causation. The result is a novel and interesting account which illuminates what would otherwise be deeply puzzling features of value and desire and the connections between them.
Jonathan Knowles argues against theories that seek to provide specific norms for the formation of belief on the basis of empirical sources: the project of naturalized epistemology. He argues that such norms are either not genuinely normative for belief, or are not required for optimal belief formation. An exhaustive classification of such theories is motivated and each variety is discussed in turn. He distinguishes naturalized epistemology from the less committal idea of naturalism, which provides a sense in which we can achieve epistemic normativity without norms.
How do cognitive neuroscientists explain phenomena like memory or language processing? This book examines the different kinds of experiments and manipulative research strategies involved in understanding and eventually explaining such phenomena. Against this background, it evaluates contemporary accounts of scientific explanation, specifically the mechanistic and interventionist accounts, and finds them to be crucially incomplete. Moreover, mechanisms and interventions cannot actually be combined in the way usually done in the literature. This book offers solutions to both of these problems based on insights from experimental practice. It defends a new reading of the interventionist account, highlights the importance of non-interventionist studies for scientific inquiry, and supplies a taxonomy of experiments that makes it easy to see how the gaps in contemporary accounts of scientific explanation can be filled. The book concludes that a truly empirically adequate philosophy of science must take into account a much wider range of experimental research than has been done to date. With the taxonomy provided, this book serves as a stepping-stone into a new era of philosophy of science, for cognitive neuroscience and beyond.
Relevant to, and drawing from, a range of disciplines, the chapters in this collection show the diversity, and applicability, of research in Bayesian argumentation. Together, they form a challenge to philosophers versed in both the use and criticism of Bayesian models who have largely overlooked their potential in argumentation. Selected from contributions to a multidisciplinary workshop on the topic held in Sweden in 2010, the authors count linguists and social psychologists among their number, in addition to philosophers. They analyze material that includes real-life court cases, experimental research results, and the insights gained from computer models. The volume provides, for the first time, a formal measure of subjective argument strength and argument force, robust enough to allow advocates of opposing sides of an argument to agree on the relative strengths of their supporting reasoning. With papers from leading figures such as Michael Oaksford and Ulrike Hahn, the book comprises recent research conducted at the frontiers of Bayesian argumentation and provides a multitude of examples in which these formal tools can be applied to informal argument. It signals new and impending developments in philosophy, which has seen Bayesian models deployed in formal epistemology and philosophy of science, but has yet to explore the full potential of Bayesian models as a framework in argumentation. In doing so, this revealing anthology looks destined to become a standard teaching text in years to come.
Rebirth and the Stream of Life explores the diversity as well as the ethical and religious significance of rebirth beliefs, focusing especially on Hindu and Buddhist traditions but also discussing indigenous religions and ancient Greek thought. Utilizing resources from religious studies, anthropology and theology, it exemplifies an expanded conception of philosophy of religion, one that takes lived experience seriously rather than treating religious beliefs in isolation from their place in believers' lives. Drawing upon his expertise in interdisciplinary work and Wittgenstein-influenced approaches, Mikel Burley examines several interrelated phenomena, including purported past-life memories, the relationship between metaphysics and ethics, efforts to 'demythologize' rebirth, and moral critiques of the doctrine of karma. This range of topics, with rebirth as a unifying theme, makes the book of value to anyone interested in philosophy, the study of religions, and what it means to believe that we undergo multiple lives.
Over the last few decades information and communication technology has come to play an increasingly prominent role in our dealings with other people. Computers, in particular, have made available a host of new ways of interacting, which we have increasingly made use of. In the wake of this development a number of ethical questions have been raised and debated. Ethics in Cyberspace focuses on the consequences for ethical agency of mediating interaction by means of computers, seeking to clarify how the conditions of certain kinds of interaction in cyberspace (for example, in chat-rooms and virtual worlds) differ from the conditions of interaction face-to-face and how these differences may come to affect the behaviour of interacting agents in terms of ethics.
Epistemology, or theory of knowledge, has always been one of the most important - if not the most important - fields of philosophy. New arguments are constantly brought to bear on old views, new variants are marshalled to revive ancient stands, and new concepts and distinctions increase the sophistication of epistemological theories. There are a great many excellent textbooks, monographs, and anthologies of articles in epistemology. Similarly, there are useful philosophical dictionaries which contain a great number of relatively short entries, and general philosophical handbooks which also touch on epistemological issues. This volume of 27 essays grew out of the interest in a handbook devoted entirely to the historical roots and systematic development of the theory of knowledge. It is not intended to compete with but to supplement the already existing literature. It aims at giving beginners, more advanced students, and professionals in epistemology and other areas of philosophy an overview of the central problems and solutions of epistemology. The essays are self-contained and still often rather extensive discussions of the chosen aspects of knowledge. The contributions presuppose very little familiarity with previous literature, and only a few of them require the mastery of even elementary logical notation. This, we hope, makes the volume accessible also to the philosophically interested wider audience. The contributors were asked to provide substantial, up-to-date, self-contained and balanced surveys of the various subareas and more specific topics of epistemology, with reference to the literature.
This book examines the dependence of transhumanist arguments on the credibility of the narratives of meaning in which they are embedded. By taking the key ideas from transhumanist philosophy - the desirability of human self-design and immortality, the elimination of all suffering and the expansion of human autonomy - Michael Hauskeller explores these narratives and the understanding of human nature that informs them. Particular attention is paid to the theory of transhumanism as a form of utopia, stories of human nature, the increasing integration of the radical human enhancement project into the cultural mainstream, and the drive to upgrade from flesh to machine.
Does scepticism threaten our common sense picture of the world? Does it really undermine our deep-rooted certainties? Answers to these questions are offered through a comparative study of the epistemological work of two key figures in the history of analytic philosophy, G. E. Moore and Ludwig Wittgenstein.
"Could there have been nothing?" is the first book-length study of
metaphysical nihilism - the claim that there could have been no
concrete objects. It critically analyses the debate around nihilism
and related questions about the metaphysics of possible worlds,
concrete objects and ontological dependence.
Stewart Shapiro's aim in Vagueness in Context is to develop both a philosophical and a formal, model-theoretic account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary with such contextual factors as the comparison class and paradigm cases. A person can be tall with respect to male accountants and not tall (even short) with respect to professional basketball players. The main feature of Shapiro's account is that the extensions (and anti-extensions) of vague terms also vary in the course of a conversation, even after the external contextual features, such as the comparison class, are fixed. A central thesis is that in some cases, a competent speaker of the language can go either way in the borderline area of a vague predicate without sinning against the meaning of the words and the non-linguistic facts. Shapiro calls this open texture, borrowing the term from Friedrich Waismann. The formal model theory has a similar structure to the supervaluationist approach, employing the notion of a sharpening of a base interpretation. In line with the philosophical account, however, the notion of super-truth does not play a central role in the development of validity. The ultimate goal of the technical aspects of the work is to delimit a plausible notion of logical consequence, and to explore what happens with the sorites paradox. Later chapters deal with what passes for higher-order vagueness - vagueness in the notions of 'determinacy' and 'borderline' - and with vague singular terms, or objects. In each case, the philosophical picture is developed by extending and modifying the original account. This is followed with modifications to the model theory and the central meta-theorems. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak. But vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other. Vagueness is also due to the kinds of beings we are. There is no need to blame the phenomenon on any one of those aspects.
The volume "Conceptions of Knowledge" collects current essays on contemporary epistemology and philosophy of science. The essays are primarily concerned with pragmatic and contextual extensions of analytic epistemology but also deal with traditional questions like the nature of knowledge and skepticism. The topics include the connection between "knowing that" and "knowing how," the relevance of epistemic abilities, the embedding of knowledge ascriptions in context and contrast classes, the interpretation of skeptical doubt, and the various forms of knowledge.
How can one think about the same thing twice without knowing that
it's the same thing? How can one think about nothing at all (for
example Pegasus, the mythical flying horse)? Is thinking about
oneself special? One could mistake one's car for someone else's,
but it seems one could not mistake one's own headache for someone
else's. Why not?
Contextualism has become one of the leading paradigms in contemporary epistemology. According to this view, there is no context-independent standard of knowledge, and as a result, all knowledge ascriptions are context-sensitive. Contextualists contend that this analysis allows us to resolve some major epistemological problems, such as skeptical paradoxes and the lottery paradox, and that it helps us explain various other linguistic data about knowledge ascriptions. The apparent ease with which contextualism seems to solve numerous epistemological quandaries has inspired the burgeoning interest in it. This comprehensive anthology collects twenty original essays and critical commentaries on different aspects of contextualism, written by leading philosophers on the topic. The editors' introduction sketches the historical development of the contextualist movement and provides a survey and analysis of its arguments and major positions. The papers explore, inter alia, the central problems and prospects of semantic (or conversational) contextualism and its main alternative approaches, such as inferential (or issue) contextualism, epistemic contextualism, and virtue contextualism. They also investigate the connections between contextualism and epistemic particularism, and between contextualism and stability accounts of knowledge. Contributors include: Antonia Barke, Peter Baumann, Elke Brendel, Stewart Cohen, Wayne Davis, Fred Dretske, Mylan Engel, Jr., Gerhard Ernst, Verena Gottschling, John Greco, Thomas Grundmann, Frank Hofmann, Christoph Jäger, Nikola Kompa, Dirk Koppelberg, Mark Lance, Margaret Little, Lydia Mechtenberg, Hans Rott, Bruce Russell, Gilbert Scharifi, and Michael Williams.