The author argues that it is not obvious what it means for our beliefs and assertions to be "truth-directed," and that we need to weaken our ordinary notion of a belief if we are to deal with radical scepticism without surrendering to idealism. Topics examined also include whether there could be alien conceptual schemes and what might happen to us if we abandoned genuine belief in favour of mere pragmatic acceptance. A radically new "ecological" model of knowledge is defended.
This volume is the first systematic and thorough attempt to investigate the relation and the possible applications of mereology to contemporary science. It gathers contributions from leading scholars in the field and covers a wide range of scientific theories and practices, including physics, mathematics, chemistry, biology, computer science and engineering. Throughout the volume, a variety of foundational issues are investigated from both the formal and the empirical point of view. The first section looks at the topic as it applies to physics. It addresses questions of persistence and composition within quantum and relativistic physics and concludes by scrutinizing the possibility of capturing the continuity of motion, as described by our best physical theories, within gunky spacetimes. The second part tackles mathematics and shows how to provide a foundation for point-free geometry of space by switching to fuzzy logic. The relation between mereological sums and set-theoretic suprema is investigated, and issues about different mereological perspectives, such as classical and natural mereology, are thoroughly discussed. The third section looks at the natural sciences. Several questions from biology, medicine and chemistry are investigated. From the perspective of biology, there is an attempt to provide axioms for inferring statements about parthood between two biological entities from statements about their spatial relation. From the perspective of chemistry, it is argued that classical mereological frameworks are not adequate to capture the practices of chemistry, in that they consider neither temporal nor modal parameters. The final part turns to computer science and engineering. A new formal mereological framework in which an indeterminate relation of parthood is taken as a primitive notion is constructed and then applied to a wide variety of disciplines, from robotics to knowledge engineering. A formal framework for discrete mereotopology and its applications is developed, and finally the importance of mereology for the relatively new science of domain engineering is discussed.
In the course of conversation, we exert implicit pressures on both ourselves and others. These forms of conversational pressure are many and far from uniform, so much so that it is unclear whether they constitute a single cohesive class. In this book Sanford C. Goldberg explores the source, nature, and scope of the normative expectations we have of one another as we engage in conversation that are generated by the performance of speech acts themselves. In doing so he examines two fundamental types of expectation - epistemic and interpersonal. It is through normative expectations of these types that we aim to hold one another to standards of proper conversational conduct. This line of argument is pursued in connection with such topics as the normative significance of acts of address, the epistemic costs of politeness, the bearing of epistemic injustice on the epistemology of testimony, the normative pressure friendship exerts on belief, the nature of epistemic trust, the significance of conversational silence, and the various evils of silencing. By approaching these matters in terms of the normative expectations to which conversational participants are entitled, Goldberg aims to offer a unified account of the various pressures that are exerted in the course of a speech exchange.
Good decisions account for risks. For example, the risk of an accident while driving in the rain makes a reasonable driver decide to slow down. While risk is a large topic in theoretical disciplines such as economics and psychology, as well as in practical disciplines such as medicine and finance, philosophy has a unique contribution to make in developing a normative theory of risk that states what risk is, and to what extent our responses to it are rational. Weirich here develops a philosophical theory of the rationality of responses to risk. He first distinguishes two types of risk: first, a chance of a bad event, and second, an act's risk in relation to its possible outcomes. He argues that this distinction has normative significance in the sense that one's attitudes towards these types of risks - and how one acts on them - are governed by different general principles of rationality. Consequently, a comprehensive account of risk must not only characterize rational responses to risk but also explain why these responses are rational. Weirich explains how, for a rational ideal agent, the expected utilities of the acts available in a decision problem explain the agent's preferences among the acts. As a result, maximizing expected utility is just following preferences among the acts. His view takes an act's expected utility, not just as a feature of a representation of preferences among acts, but also as a factor in the explanation of preferences among acts. The book's precise formulation of general standards of rationality for attitudes and for acts, and its rigorous argumentation for these standards, make it philosophical; but while mainly of interest to philosophers, its broader arguments will contribute to the conceptual foundations of studies of risk in all disciplines that study it.
While much has been written on Descartes' theory of mind and ideas, no systematic study of his theory of sensory representation and misrepresentation is currently available in the literature. Descartes and the Puzzle of Sensory Misrepresentation is an ambitious attempt to fill this gap. It argues against the established view that Cartesian sensations are mere qualia by defending the view that they are representational; it offers a descriptivist-causal account of their representationality that is critical of, and differs from, all other extant accounts (such as, for example, causal, teleofunctional and purely internalist accounts); and it has the advantage of providing an adequate solution to the problem of sensory misrepresentation within Descartes' internalist theory of ideas. In sum, the book offers a novel account of the representationality of Cartesian sensations; provides a panoramic overview, and critical assessment, of the scholarly literature on this issue; and places Descartes' theory of sensation in the central position it deserves among the philosophical and scientific investigations of the workings of the human mind.
A Critical Introduction to the Metaphysics of Modality examines the eight main contemporary theories of possibility that lie behind this central metaphysical topic. Covering modal skepticism, modal expressivism, modalism, modal realism, ersatzism, modal fictionalism, modal agnosticism, and the new modal actualism, this comprehensive introduction to modality places contemporary debates in an historical context. Beginning with a historical overview, Andrea Borghini discusses Parmenides and Zeno; looks at how central Medieval authors such as Aquinas and Buridan prepared the ground for the Early Modern radical views of Leibniz, Spinoza, and Hume; and discusses advancements in semantics in the latter half of the twentieth century that resulted in the rise of modal metaphysics, the branch characterizing the past few decades of philosophical reflection. Framing the debate according to three main perspectives - logical, epistemic, and metaphysical - Borghini provides the basic concepts and terms required to discuss modality. With suggestions of further reading and end-of-chapter study questions, A Critical Introduction to the Metaphysics of Modality is an up-to-date resource for students working in contemporary metaphysics seeking a better understanding of this crucial topic.
This book consolidates and extends the authors' work on the connection between iconicity and abductive inference. It emphasizes a pragmatic, experimental and fallibilist view of knowledge without sacrificing formal rigor. Within this context, the book focuses particularly on scientific knowledge and its prevalent use of mathematics. To find an answer to the question "What kind of experimental activity is the scientific employment of mathematics?" the book addresses the problems involved in formalizing abductive cognition. For this, it implements the concept and method of iconicity, modeling this theoretical framework mathematically through category theory and topoi. Peirce's concept of iconic signs is treated in depth, and it is shown how Peirce's diagrammatic logical notation of Existential Graphs makes use of iconicity and how important features of this iconicity are representable within category theory. Alain Badiou's set-theoretical model of truth procedures and his relational sheaf-based theory of phenomenology are then integrated within the Peircean logical context. Finally, the book opens the path towards a more naturalist interpretation of the abductive models developed in Peirce and Badiou through an analysis of several recent attempts to reformulate quantum mechanics with categorical methods. Overall, the book offers a comprehensive and rigorous overview of past approaches to iconic semiotics and abduction, and it encompasses new extensions of these methods towards an innovative naturalist interpretation of abductive reasoning.
Over the last two decades, the field of artificial intelligence has experienced a separation into two schools that hold opposite opinions on how uncertainty should be treated. This separation is the result of a debate that began at the end of the 1960s, when AI first faced the problem of building machines required to make decisions and act in the real world. This debate witnessed the contraposition between the mainstream school, which relied on probability for handling uncertainty, and an alternative school, which criticized the adequacy of probability in AI applications and developed alternative formalisms. The debate has focused on the technical aspects of the criticisms raised against probability while neglecting an important element of contrast. This element is of an epistemological nature, and is therefore exquisitely philosophical. In this book, the historical context in which the debate on probability developed is presented and the key components of the technical criticisms therein are illustrated. By referring to the original texts, the epistemological element that has been neglected in the debate is analyzed in detail. Through a philosophical analysis of this epistemological element it is argued that it is metaphysical in Popper's sense. It is shown that this element can be neither tested nor disproved on the basis of experience and is therefore extra-scientific. It is established that a philosophical analysis is now compelling in order both to solve the problematic division that characterizes the uncertainty field and to secure the foundations of the field itself.
Disgust is a state of high alert. It acutely says "no" to a variety of phenomena that seemingly threaten the integrity of the self, if not its very existence. A counterpart to the feelings of appetite, desire, and love, it allows at the same time for an acting out of hidden impulses and libidinal drives. In Disgust, Winfried Menninghaus provides a comprehensive account of the significance of this forceful emotion in modern Western thought, ranging from a consideration of the role of disgust as both a cognitive and moral organon in Kant and Nietzsche to recent debates on "Abject Art."
In the recent educational research literature, it has been asserted that ethnic or cultural groups have their own distinctive epistemologies, and that these have been given short shrift by the dominant social group. Educational research, then, is pursued within a framework that embodies assumptions about knowledge and knowledge production that reflect the interests and historical traditions of this dominant group. In such arguments, however, some relevant philosophical issues remain unresolved, such as what claims about culturally distinctive epistemologies mean, precisely, and how they relate to traditional epistemological distinctions between beliefs and knowledge. Furthermore, can these ways of establishing knowledge stand up to critical scrutiny? This volume marshals a variety of resources to pursue such open questions in a lively and accessible way: a critical literature review, analyses from philosophers of education who have different positions on the key issues, a roundtable discussion, and interactions between the two editors, who sometimes disagree. It also employs the work of prominent feminist epistemologists who have investigated parallel issues with sophistication. This volume does not settle the question of culturally distinctive epistemologies, but teases out the various philosophical, sociological and political aspects of the issue so that the debate can continue with greater clarity.
Concepts based on full-blown collective intentionality (aboutness), viz., we-mode intentionality, are central for understanding and explaining the social world. The book systematically studies social groups, acting in them as a group member, collective commitment, group intentions, beliefs, and actions, especially authority-based group attitudes and actions. There are also chapters on cooperation, social institutions, cultural evolution, and group responsibility.
Proposes that philosophy is the proper cure for neurosis. John Russon's Human Experience draws on central concepts of contemporary European philosophy to develop a novel analysis of the human psyche. Beginning with a study of the nature of perception, embodiment, and memory, Russon investigates the formation of personality through family and social experience. He focuses on the importance of the feedback we receive from others regarding our fundamental worth as persons, and on the way this interpersonal process embeds meaning into our most basic bodily practices: eating, sleeping, sex, and so on. Russon concludes with an original interpretation of neurosis as the habits of bodily practice developed in family interactions that have become the foundation for developed interpersonal life, and proposes a theory of psychological therapy as the development of philosophical insight that responds to these neurotic compulsions.
In "Cognitive Integration: Attacking The Bounds of Cognition", Richard Menary argues that the real pay-off from extended-mind-style arguments is not a new form of externalism in the philosophy of mind, but a view in which the 'internal' and 'external' aspects of cognition are integrated into a whole.
Recent philosophy and history of science has seen a surge of interest in the role of concepts in scientific research. Scholars working in this new field focus on scientific concepts, rather than theories, as units of analysis, and on the ways in which concepts are formed and used rather than on what they represent. They analyze what has traditionally been called the context of discovery, rather than (or in addition to) the context of justification. And they examine the dynamics of research rather than the status of the finished research results. This volume provides detailed case studies and general analyses to address questions raised by these points, such as:
- Can concepts be clearly distinguished from the sets of beliefs we have about their referents?
- What sense, if any, can be made of the separation between concepts and theories?
- Can we distinguish between empirical and theoretical concepts?
- Are there interesting similarities and differences between the role of concepts in the empirical sciences and in mathematics?
- What underlying notion of investigative practice could be drawn on to explicate the role of concepts in such practice?
- From a philosophical point of view, is the distinction between discovery and justification a helpful frame of reference for inquiring into the dynamics of research?
- From a historiographical point of view, does a focus on concepts face the danger of falling back into an old-fashioned history of ideas?
Relational "(e)pistemologies" redefines epistemology in a non-transcendent manner and reclaims the traditional epistemological concerns of standards and criteria for warranting arguments and determining truth and falsity. These concerns must be reclaimed in order to make them visible and accountable, as well as pragmatically useful, on socially constructed grounds - not transcendental grounds. Thayer-Bacon's book offers analysis and critique as well as redescription. She presents a pragmatist social feminist view, a relational perspective of knowing embedded within a discussion of many other relational views - personal, social and holistic, ecological, and scientific - which emphasize connections. Thayer-Bacon describes each of these forms of relationality, and she points to key scholars whose work highlights a certain relational form. She concludes with a discussion of the educational implications relational (e)pistemological theories have for education.
Memory and Identity in the Learned World offers a detailed and varied account of community formation in the early modern world of learning and science. The book traces how collective identity, institutional memory and modes of remembrance helped to shape learned and scientific communities. The case studies in this book analyse how learned communities and individuals presented and represented themselves, for example in letters, biographies, histories, journals, opera omnia, monuments, academic travels and memorials. By bringing together the perspectives of historians of literature, scholarship, universities, science, and art, this volume studies knowledge communities by looking at the centrality of collective identity and memory in their formations and reformations. Contributors: Lieke van Deinsen, Karl Enenkel, Constance Hardesty, Paul Hulsenboom, Dirk van Miert, Alan Moss, Richard Kirwan, Koen Scholten, Floris Solleveld, and Esther M. Villegas de la Torre.
This work investigates crucial aspects of Kant's epistemology and ethics in relation to Kierkegaard's thinking. The challenge is taken up of developing a systematic reconstruction of Kant's and Kierkegaard's position. Kant forms a matrix for the interpretation of Kierkegaard, and considerable space is devoted to the exposition of Kant at those various points at which contact with Kierkegaard's thought is to be demonstrated. The burden of the argument is that Kierkegaard in his account of the stages is much closer to Kant than the texts initially reveal. It is possible, then, to arrive at a proper grasp of Kierkegaard's final position by seeing just how radically the stage of Christian faith (Religiousness B) departs from Kant.
It is often thought that consciousness has a qualitative dimension that cannot be tracked by science. Recently, however, some philosophers have argued that this worry stems not from an elusive feature of the mind, but from the special nature of the concepts used to describe conscious states. Marc Champagne draws on the neglected branch of philosophy of signs or semiotics to develop a new take on this strategy. The term "semiotics" was introduced by John Locke in the modern period - its etymology is ancient Greek, and its theoretical underpinnings are medieval. Charles Sanders Peirce made major advances in semiotics, so he can act as a pipeline for these forgotten ideas. Most philosophers know Peirce as the founder of American pragmatism, but few know that he also coined the term "qualia," which is meant to capture the intrinsic feel of an experience. Since pragmatic verification and qualia are now seen as conflicting commitments, Champagne endeavors to understand how Peirce could (or thought he could) have it both ways. The key, he suggests, is to understand how humans can insert distinctions between features that are always bound. Recent attempts to take qualities seriously have resulted in versions of panpsychism, but Champagne outlines a more plausible way to achieve this. So, while semiotics has until now been the least known branch of philosophy ending in -ics, his book shows how a better understanding of that branch can move one of the liveliest debates in philosophy forward.
Even though important developments within 20th and 21st century philosophy have widened the scope of epistemology, this has not yet resulted in a systematic meta-epistemological debate about epistemology's aims, methods, and criteria of success. Ideas such as the methodology of reflective equilibrium, the proposal to "naturalize" epistemology, constructivist impulses fuelling the "sociology of scientific knowledge", pragmatist calls for taking into account the practical point of epistemic evaluations, as well as feminist criticism of the abstract and individualist assumptions built into traditional epistemology are widely discussed, but they have not typically resulted in the call for, let alone the construction of, a suitable meta-epistemological framework. This book motivates and elaborates such a new meta-epistemology. It provides a pragmatist, social and functionalist account of epistemic states that offers the conceptual space for revised or even replaced epistemic concepts. This is what it means to "refurbish epistemology": The book assesses conceptual tools in relation to epistemology's functionally defined conceptual space, responsive to both intra-epistemic considerations and political and moral values.
On the one hand, the concept of truth is a major research subject in analytic philosophy. On the other hand, mathematical logicians have developed sophisticated logical theories of truth and the paradoxes. Recent developments in logical theories of the semantical paradoxes are highly relevant for philosophical research on the notion of truth. And conversely, philosophical guidance is necessary for the development of logical theories of truth and the paradoxes. From this perspective, this volume intends to reflect and promote deeper interaction and collaboration between philosophers and logicians investigating the concept of truth than has existed so far. Aside from an extended introductory overview of recent work in the theory of truth, the volume consists of articles by leading philosophers and logicians on subjects and debates that are situated on the interface between logical and philosophical theories of truth. The volume is intended for graduate students in philosophy and in logic who want an introduction to contemporary research in this area, as well as for professional philosophers and logicians.
Hilary Whitehall Putnam was one of the leading philosophers of the second half of the 20th century. A student of Rudolf Carnap and Hans Reichenbach, he went on to become a major figure in North American analytic philosophy, making significant contributions not only to the philosophy of mind, language, mathematics, and physics but also to the disciplines of logic, number theory, and computer science. He passed away on March 13, 2016. The present volume is a memorial to his extraordinary intellectual contributions, honoring him as a philosopher, a thinker, and a public intellectual. It features essays by an international team of leading philosophers, covering all aspects of Hilary Putnam's philosophy, from his work in ethics and the history of philosophy to his contributions to the philosophy of science, logic, and mathematics. Each essay is an original contribution. "Hilary Putnam is one of the most distinguished philosophers of the modern era, and just speaking personally, one of the smartest and most impressive thinkers I have ever been privileged to know - as a good friend for 70 years. The fine essays collected here are a fitting tribute to a most remarkable figure." Noam Chomsky, Institute Professor Emeritus, Massachusetts Institute of Technology. "In Engaging Putnam excellent philosophers engage the writings and ideas of Hilary Putnam, one of the most productive and influential philosophers of the last century. Putnam stands out because of the combination of brilliance and a firm grasp of reality he brought to a very broad range of issues: logic and the philosophy of mathematics, free will, skepticism, realism, internalism and externalism, and a lot more. Along with this he offered penetrating insights about other great philosophers, from Aristotle to Wittgenstein. All great philosophers make us think. With many, we try to figure out the strange things they say. With Putnam, we are made to think about clearly explained examples and arguments that get to the heart of the issues he confronts. This book is a wonderful contribution to the continuation of Putnam-inspired thinking." John Perry, Emeritus Professor of Philosophy, Stanford University
How do ordinary objects persist through time and across possible worlds? How do they manage to have their temporal and modal properties? These are the questions addressed in this book, which is a "guided tour of theories of persistence." The book is divided into two parts. In the first, the two traditional accounts of persistence through time (endurantism and perdurantism) are combined with presentism and eternalism to yield four different views, and their variants. The resulting views are then examined in turn, in order to see which combinations are appealing and which are not. It is argued that the 'worm view' variant of eternalist perdurantism is superior to the other alternatives. In the second part of the book, the same strategy is applied to the combinations of views about persistence across possible worlds (trans-world identity, counterpart theory, modal perdurants) and views about the nature of worlds, mainly modal realism and abstractionism. Not only all the traditional and well-known views, but also some more original ones, are examined, and their pros and cons are carefully weighed. Here again, it is argued that perdurance seems to be the best strategy available.
How is medical knowledge made? New methods for research and clinical care have reshaped the practices of medical knowledge production over the last forty years. Consensus conferences, evidence-based medicine, translational medicine, and narrative medicine are among the most prominent new methods. Making Medical Knowledge explores their origins and aims, their epistemic strengths, and their epistemic weaknesses. Miriam Solomon argues that the familiar dichotomy between the art and the science of medicine is not adequate for understanding this plurality of methods. The book begins by tracing the development of medical consensus conferences, from their beginning at the United States' National Institutes of Health in 1977 to their widespread adoption in national and international contexts. It discusses consensus conferences as social epistemic institutions designed to embody democracy and achieve objectivity. Evidence-based medicine, which developed next, ranks expert consensus at the bottom of the evidence hierarchy, thus challenging the authority of consensus conferences. Evidence-based medicine has transformed both medical research and clinical medicine in many positive ways, but it has also been accused of creating an intellectual hegemony that has marginalized crucial stages of scientific research, particularly scientific discovery. Translational medicine is understood as a response to the shortfalls of both consensus conferences and evidence-based medicine. Narrative medicine is the most prominent recent development in the medical humanities. Its central claim is that attention to narrative is essential for patient care. Solomon argues that the differences between narrative medicine and the other methods have been exaggerated, and offers a pluralistic account of how all the methods interact and sometimes conflict. The result is a set of practical and theoretical suggestions for improving medical knowledge and understanding medical controversies.
Value, Reality, and Desire is an extended argument for a robust realism about value. The robust realist affirms the following distinctive theses. There are genuine claims about value which are true or false - there are facts about value. These value-facts are mind-independent - they are not reducible to desires or other mental states, or indeed to any non-mental facts of a non-evaluative kind. And these genuine, mind-independent, irreducible value-facts are causally efficacious. Values, quite literally, affect us. These are not particularly fashionable theses, and taken as a whole they go somewhat against the grain of quite a lot of recent work in the metaphysics of value. Further, against the received view, Oddie argues that we can have knowledge of values by experiential acquaintance, that there are experiences of value which can be both veridical and appropriately responsive to the values themselves. Finally, these value-experiences are not the products of some exotic and implausible faculty of 'intuition'. Rather, they are perfectly mundane and familiar mental states - namely, desires. This view explains how values can be 'intrinsically motivating', without falling foul of the widely accepted 'queerness' objection. There are, of course, other objections to each of the realist's claims. In showing how and why these objections fail, Oddie introduces a wealth of interesting and original insights about issues of wider interest - including the nature of properties, reduction, supervenience, and causation. The result is a novel and interesting account which illuminates what would otherwise be deeply puzzling features of value and desire and the connections between them.