Value, Reality, and Desire is an extended argument for a robust realism about value. The robust realist affirms the following distinctive theses. There are genuine claims about value which are true or false - there are facts about value. These value-facts are mind-independent - they are not reducible to desires or other mental states, or indeed to any non-mental facts of a non-evaluative kind. And these genuine, mind-independent, irreducible value-facts are causally efficacious. Values, quite literally, affect us. These are not particularly fashionable theses, and taken as a whole they go somewhat against the grain of quite a lot of recent work in the metaphysics of value. Further, against the received view, Oddie argues that we can have knowledge of values by experiential acquaintance, that there are experiences of value which can be both veridical and appropriately responsive to the values themselves. Finally, these value-experiences are not the products of some exotic and implausible faculty of 'intuition'. Rather, they are perfectly mundane and familiar mental states - namely, desires. This view explains how values can be 'intrinsically motivating', without falling foul of the widely accepted 'queerness' objection. There are, of course, other objections to each of the realist's claims. In showing how and why these objections fail, Oddie introduces a wealth of interesting and original insights about issues of wider interest - including the nature of properties, reduction, supervenience, and causation. The result is a novel and interesting account which illuminates what would otherwise be deeply puzzling features of value and desire and the connections between them.
How is medical knowledge made? New methods for research and clinical care have reshaped the practices of medical knowledge production over the last forty years. Consensus conferences, evidence-based medicine, translational medicine, and narrative medicine are among the most prominent new methods. Making Medical Knowledge explores their origins and aims, their epistemic strengths, and their epistemic weaknesses. Miriam Solomon argues that the familiar dichotomy between the art and the science of medicine is not adequate for understanding this plurality of methods. The book begins by tracing the development of medical consensus conferences, from their beginning at the United States' National Institutes of Health in 1977, to their widespread adoption in national and international contexts. It discusses consensus conferences as social epistemic institutions designed to embody democracy and achieve objectivity. Evidence-based medicine, which developed next, ranks expert consensus at the bottom of the evidence hierarchy, thus challenging the authority of consensus conferences. Evidence-based medicine has transformed both medical research and clinical medicine in many positive ways, but it has also been accused of creating an intellectual hegemony that has marginalized crucial stages of scientific research, particularly scientific discovery. Translational medicine is understood as a response to the shortfalls of both consensus conferences and evidence-based medicine. Narrative medicine is the most prominent recent development in the medical humanities. Its central claim is that attention to narrative is essential for patient care. Solomon argues that the differences between narrative medicine and the other methods have been exaggerated, and offers a pluralistic account of how all the methods interact and sometimes conflict. The result is a set of practical and theoretical suggestions for how to improve medical knowledge and understand medical controversies.
Jonathan Knowles argues against theories that seek to provide specific norms for the formation of belief on the basis of empirical sources: the project of naturalized epistemology. He argues that such norms are either not genuinely normative for belief, or are not required for optimal belief formation. An exhaustive classification of such theories is motivated and each variety is discussed in turn. He distinguishes naturalized epistemology from the less committal idea of naturalism, which provides a sense in which we can achieve epistemic normativity without norms.
Does scepticism threaten our common sense picture of the world? Does it really undermine our deep-rooted certainties? Answers to these questions are offered through a comparative study of the epistemological work of two key figures in the history of analytic philosophy, G. E. Moore and Ludwig Wittgenstein.
"Could there have been nothing?" is the first book-length study of
metaphysical nihilism - the claim that there could have been no
concrete objects. It critically analyses the debate around nihilism
and related questions about the metaphysics of possible worlds,
concrete objects and ontological dependence.
The present volume contains a selection of papers presented at the Fifth Conference on Collective Intentionality, held at the University of Helsinki from August 31 to September 2, 2006, together with two additional contributions. The common aim of the papers is to explore the structure of shared intentional attitudes and to explain how they underlie the social, cultural and institutional world. The contributions to this volume explore the phenomenology of sharedness, the concept of sharedness, and various aspects of the structure of collective intentionality in general, and of the intricate relations between sharedness and normativity in particular. Concepts of Sharedness shows how rich and lively the philosophical research focused on the analysis of collective intentionality has become, and will provide further inspiration for future work in this rapidly evolving field.
+ Clearly exposes the most frequent calumnies made against science
+ Shows how dogmatic religion, the financial interests of certain industries, and opportunistic politicians sometimes work in concert to undermine the public's trust in science
+ Acknowledges that science's most mistaken critics are often skilled communicators, and that effectively defending science requires an equally skilled defense
+ Shows that while the "Science Wars" of the 1990s have abated, their effects on some of the methodologies in higher education and on the larger population continue
+ Examines three case studies to clearly illustrate how reliable scientific knowledge is secured:
  • Eratosthenes' discovery of the circumference of the earth
  • Louis Pasteur's development of anthrax and rabies vaccines
  • The rapid emergence of scientific consensus regarding continental drift
This lucid, informal, and very accessible history of Western thought takes the unique approach of interpreting skepticism--i.e., doubts about knowledge claims and the criteria for making such claims--as an important stimulus for the development of philosophy. The authors argue that practically every great thinker from the time of the Greeks to the present has produced theories designed to forestall or refute skepticism: from Plato to Moore and Wittgenstein. The influence of and responses to such 20th-century skeptics as Russell and Derrida are also discussed critically.
Epistemology or theory of knowledge has always been one of the most important - if not the most important - fields of philosophy. New arguments are constantly brought to bear on old views, new variants are marshalled to revive ancient stands, and new concepts and distinctions increase the sophistication of epistemological theories. There are a great many excellent textbooks and monographs, as well as anthologies of articles, in epistemology. Similarly, there are useful philosophical dictionaries which contain a great number of relatively short entries, and general philosophical handbooks which also touch on epistemological issues. This volume of 27 essays grew out of the interest in seeing a handbook devoted entirely to the historical roots and systematic development of the theory of knowledge. It is not intended to compete with but to supplement the already existing literature. It aims at giving both beginners and more advanced students, as well as professionals in epistemology and other areas of philosophy, an overview of the central problems and solutions of epistemology. The essays are self-contained and still often rather extensive discussions of the chosen aspects of knowledge. The contributions presuppose very little familiarity with previous literature, and only a few of them require the mastery of even elementary logical notation. This, we hope, makes the volume accessible also to the philosophically interested wider audience. The contributors were asked to provide substantial, up-to-date, self-contained and balanced surveys of the various subareas and more specific topics of epistemology, with references to the literature.
Based on his experiences with the infamous Dreyfus case, this powerful last novel by Emile Zola about the scapegoating of a Jewish schoolteacher is a chilling depiction of anti-Semitism fully embedded in European society and an eerie presentiment of the Holocaust that would sweep across the Continent only forty years later. But this is not the whole story, for Zola also brilliantly demonstrates how truth, though suppressed for a generation, slowly but inexorably comes to light through the dedication and perseverance of a few humble defenders, who remain unswerving in their demand for justice.
Contextualism has become one of the leading paradigms in contemporary epistemology. According to this view, there is no context-independent standard of knowledge, and as a result, all knowledge ascriptions are context-sensitive. Contextualists contend that this analysis allows us to resolve some major epistemological problems such as skeptical paradoxes and the lottery paradox, and that it helps us explain various other linguistic data about knowledge ascriptions. The apparent ease with which contextualism seems to solve numerous epistemological quandaries has inspired the burgeoning interest in it. This comprehensive anthology collects twenty original essays and critical commentaries on different aspects of contextualism, written by leading philosophers on the topic. The editors' introduction sketches the historical development of the contextualist movement and provides a survey and analysis of its arguments and major positions. The papers explore, inter alia, the central problems and prospects of semantic (or conversational) contextualism and its main alternative approaches such as inferential (or issue) contextualism, epistemic contextualism, and virtue contextualism. They also investigate the connections between contextualism and epistemic particularism, and between contextualism and stability accounts of knowledge. Contributors include: Antonia Barke, Peter Baumann, Elke Brendel, Stewart Cohen, Wayne Davis, Fred Dretske, Mylan Engel, Jr., Gerhard Ernst, Verena Gottschling, John Greco, Thomas Grundmann, Frank Hofmann, Christoph Jäger, Nikola Kompa, Dirk Koppelberg, Mark Lance, Margaret Little, Lydia Mechtenberg, Hans Rott, Bruce Russell, Gilbert Scharifi, and Michael Williams.
The human body is not simply a physical, anatomical, or physiological object; in an important sense we are our bodies. In this collection, Sartre scholars and others engage with the French philosopher Jean-Paul Sartre's brilliant but neglected descriptions of our experience of human bodies. The authors bring a wide variety of perspectives to bear on Sartre's thinking about the body, and, alongside in-depth scholarly historical and critical investigations, bring him into dialogue with feminists, sociologists, psychologists and historians, addressing such questions as: Why have philosophers found it so difficult to conceptualize the relationship between consciousness and the body? Do men and women experience their own bodies in fundamentally different ways? What is pain? What is sexual desire? What is it like to live as a racially marked body in a racist society? How do society and culture shape our bodies, and can we re-shape them?
Stewart Shapiro's aim in Vagueness in Context is to develop both a philosophical and a formal, model-theoretic account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary with such contextual factors as the comparison class and paradigm cases. A person can be tall with respect to male accountants and not tall (even short) with respect to professional basketball players. The main feature of Shapiro's account is that the extensions (and anti-extensions) of vague terms also vary in the course of a conversation, even after the external contextual features, such as the comparison class, are fixed. A central thesis is that in some cases, a competent speaker of the language can go either way in the borderline area of a vague predicate without sinning against the meaning of the words and the non-linguistic facts. Shapiro calls this open texture, borrowing the term from Friedrich Waismann. The formal model theory has a similar structure to the supervaluationist approach, employing the notion of a sharpening of a base interpretation. In line with the philosophical account, however, the notion of super-truth does not play a central role in the development of validity. The ultimate goal of the technical aspects of the work is to delimit a plausible notion of logical consequence, and to explore what happens with the sorites paradox. Later chapters deal with what passes for higher-order vagueness - vagueness in the notions of 'determinacy' and 'borderline' - and with vague singular terms, or objects. In each case, the philosophical picture is developed by extending and modifying the original account. This is followed by modifications to the model theory and the central meta-theorems. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak. But vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other. Vagueness is also due to the kinds of beings we are. There is no need to blame the phenomenon on any one of those aspects.
This book is multi- and interdisciplinary in both scope and content. It draws upon philosophy, the neurosciences, psychology, computer science, and engineering in efforts to resolve fundamental issues about the nature of immediate awareness. Approximately the first half of the book is addressed to historical approaches to the question whether or not there is such a thing as immediate awareness, and if so, what it might be. This involves reviewing arguments that one way or another have been offered as answers to the question or ways of avoiding it. It also includes detailed discussions of some complex questions about the part immediate awareness plays in our over-all natural intelligence. The second half of the book addresses intricate and complex issues involved in the computability of immediate awareness as it is found in simple, ordinary things human beings know how to do, as well as in some highly extraordinary things some know how to do. Over the past 2,500 years, human culture has discovered, created, and built very powerful tools for recognizing, classifying, and utilizing patterns found in the natural world. The most powerful of those tools is mathematics, the language of nature. The natural phenomenon of human knowing, of natural intelligence generally, is a very richly textured set of patterns that are highly complex, dynamic, self-organizing, and adaptive.
Rebirth and the Stream of Life explores the diversity as well as the ethical and religious significance of rebirth beliefs, focusing especially on Hindu and Buddhist traditions but also discussing indigenous religions and ancient Greek thought. Utilizing resources from religious studies, anthropology and theology, an expanded conception of philosophy of religion is exemplified, which takes seriously lived experience rather than treating religious beliefs in isolation from their place in believers' lives. Drawing upon his expertise in interdisciplinary working and Wittgenstein-influenced approaches, Mikel Burley examines several interrelated phenomena, including purported past-life memories, the relationship between metaphysics and ethics, efforts to 'demythologize' rebirth, and moral critiques of the doctrine of karma. This range of topics, with rebirth as a unifying theme, makes the book of value to anyone interested in philosophy, the study of religions, and what it means to believe that we undergo multiple lives.
Archives are considered to be collections of administrative, legal, commercial and other records, or the actual places where they are located. They have become ubiquitous in the modern world, but emerged not much later than the invention of writing. Following Foucault, who first used the word archive in a metaphorical sense as "the general system of the formation and transformation of statements" in his "Archaeology of Knowledge" (1969), postmodern theorists have tried to exploit the potential of this concept and initiated the "archival turn". In recent years, however, archives have attracted the attention of anthropologists and historians of different denominations, who regard them as historical objects and "ground" them again in real institutions. The papers in this volume explore the complex topic of the archive in a historical, systematic and comparative context, view it in the broader context of manuscript cultures, and address questions such as how, by whom, and for what purpose archival records were produced, and whether they differ from literary manuscripts with regard to materials, formats, and producers (scribes).
How can one think about the same thing twice without knowing that it's the same thing? How can one think about nothing at all (for example Pegasus, the mythical flying horse)? Is thinking about oneself special? One could mistake one's car for someone else's, but it seems one could not mistake one's own headache for someone else's. Why not?
Hermeneutic philosophies of social science offer an approach to the philosophy of social science that foregrounds the human subject, attends to history, and includes a methodological reflection on the notion of reflection itself, including the intrusions of distortion and prejudice. They offer an explicit orientation to, and concern with, the subject of the human and social sciences. The hermeneutic philosophies of social science represented in the present collection of essays draw inspiration from Gadamer's work as well as from Paul Ricoeur, in addition to Michel de Certeau and Michel Foucault, among others. Special attention is given to Wilhelm Dilthey, in addition to the broader phenomenological traditions of Edmund Husserl and Martin Heidegger, as well as to the history of philosophy in Plato and Descartes. The volume is indispensable reading for students and scholars interested in epistemology, philosophy of science, social studies of knowledge, and social studies of technology.
The volume "Conceptions of Knowledge" collects current essays on contemporary epistemology and philosophy of science. The essays are primarily concerned with pragmatic and contextual extensions of analytic epistemology but also deal with traditional questions like the nature of knowledge and skepticism. The topics include the connection between "knowing that" and "knowing how," the relevance of epistemic abilities, the embedding of knowledge ascriptions in context and contrast classes, the interpretation of skeptical doubt, and the various forms of knowledge.
The philosophy of the humanistic sciences has been a blind spot in analytic philosophy. This book argues that by adopting an appropriate pragmatic analysis of explanation and interpretation, it is possible to show that the scientific practice of the humanistic sciences can be understood along lines similar to the scientific practice of the natural and social sciences.
This book uniquely illustrates the key concepts and issues involved, with recent examples drawn from empirical research, highlighting the practical relevance of difficult theoretical and philosophical concepts to the way in which we think and talk about knowledge both in an everyday and in an academic/sociological context.
One of the most interesting debates in moral philosophy revolves around the significance of empirical moral psychology for moral philosophy. Genealogical arguments that rely on empirical findings about the origins of moral beliefs, so-called debunking arguments, take center stage in this debate. Looking at debunking arguments based on evidence from evolutionary moral psychology, experimental ethics and neuroscience, this book explores what ethicists can learn from the science of morality, and what they cannot. Among other things, the book offers a new take on the deontology/utilitarianism debate, discusses the usefulness of experiments in ethics, investigates whether morality should be thought of as a problem-solving device, shows how debunking arguments can tell us something about the structure of philosophical debate, and argues that debunking arguments lead to both moral and prudential skepticism. Presenting a new picture of the relationship between empirical moral psychology and moral philosophy, this book is essential reading for moral philosophers and moral psychologists alike.
How do cognitive neuroscientists explain phenomena like memory or language processing? This book examines the different kinds of experiments and manipulative research strategies involved in understanding and eventually explaining such phenomena. Against this background, it evaluates contemporary accounts of scientific explanation, specifically the mechanistic and interventionist accounts, and finds them to be crucially incomplete. Moreover, mechanisms and interventions cannot actually be combined in the way usually done in the literature. This book offers solutions to both of these problems based on insights from experimental practice. It defends a new reading of the interventionist account, highlights the importance of non-interventionist studies for scientific inquiry, and supplies a taxonomy of experiments that makes it easy to see how the gaps in contemporary accounts of scientific explanation can be filled. The book concludes that a truly empirically adequate philosophy of science must take into account a much wider range of experimental research than has been done to date. With the taxonomy provided, this book serves as a stepping-stone into a new era of philosophy of science - for cognitive neuroscience and beyond.
According to a long tradition, questions about the nature of knowledge are to be answered by analyzing it as a species of true belief. In light of the apparent failure of this approach, knowledge first philosophy takes knowledge as the starting point in epistemology. Knowledge First? offers the first overview of this approach.
This book highlights the legacy of the Lvov-Warsaw School in broadly understood contemporary philosophy of language. Fundamental methodological issues and important topics in syntax, semantics and pragmatics (such as modern Categorial Grammar, theories of truth, game-theoretical semantics, and argumentation theory) are tracked down to their origins in the Lvov-Warsaw School, and - the other way round - modern renderings of the ideas expressed by Kazimierz Ajdukiewicz, Tadeusz Kotarbiński, Stanisław Leśniewski, Jan Łukasiewicz, Alfred Tarski, Kazimierz Twardowski, and other members of the School are presented. The contributors include philosophers, logicians, formal linguists and other specialists from France, Italy, Poland, and Spain.