I am very grateful to Kluwer Academic Publishers for the opportunity to republish these articles about knowledge and language. The Introduction to the volume has been written by James Logue, and I need to pay a very sincerely intended tribute to the care and professionalism which he has devoted to every feature of its production. My thanks are also due to Matthew McGrattan for his technical assistance in scanning the articles onto disk and formatting them. L. Jonathan Cohen. Publisher's Note: Thanks are due to the following publishers for permission to reproduce the articles in this volume. Paper 1, On the project of a universal character: Oxford University Press. Paper 2, On a concept of a degree of grammaticalness: Logique et Analyse. Paper 3, The semantics of metaphor: Cambridge University Press. Paper 4, Can the logic of indirect discourse be formalised?: The Association for Symbolic Logic. Paper 5, Some remarks on Grice's views about the logical particles of natural language: Kluwer Academic Publishers. Paper 6, Can the conversationalist hypothesis be defended?: Kluwer Academic Publishers. Paper 7, How is conceptual innovation possible?: Kluwer Academic Publishers. Paper 8, Should natural language definitions be insulated from, or interactive with, one another in sentence composition?: Kluwer Academic Publishers. Paper 9, A problem about truth-functional semantics: Basil Blackwell Publisher Ltd. Paper 10, The individuation of proper names: Oxford University Press. Paper 11, Some comments on third world epistemology: Oxford University Press. Paper 12, Guessing: The Aristotelian Society.
Taking a fresh approach to Byron, this book argues that he should be understood as a poet whose major works develop a carefully reasoned philosophy. Situating him with reference to the thought of the period, it argues for Byron as an active thinker, whose final philosophical stance - reader-centred scepticism - has extensive practical implications.
One of the central areas of concern in late twentieth-century philosophy is the debate between Realism and anti-Realism. But the precise nature of the issues that form the focus of the debate remains controversial. In Realism and Explanatory Priority a new way of viewing the debate is developed. The primary focus is not on the notions of existence, truth or reference, but rather on independence. A notion of independence is developed using concepts derived from the theory of explanation. It is argued that this approach enables us to clarify the exact nature of the empirical evidence that would be required to establish Realism in any area. The author defends a restricted form of Realism, which he calls Nomic Structuralism. The book will be suitable for professional philosophers of language, science and metaphysics, and their graduate students.
The distinguished scholar of ancient philosophy J.L. Ackrill here presents the best of his essays on Plato and Aristotle from the past forty years. He brings philosophical acuity and philological expertise to a range of texts and topics in ancient thought - from ethics and logic to epistemology and metaphysics - which continue to be widely discussed today.
The ten essays in this collection were written to celebrate the 50th anniversary of the lectures which became Wilfrid Sellars's Empiricism and the Philosophy of Mind, one of the crowning achievements of 20th-century analytic philosophy. Both appreciative and critical of Sellars's accomplishment, they engage with his treatment of crucial issues in metaphysics and epistemology. The topics include the standing of empiricism, Sellars's complex treatment of perception, his dissatisfaction with both foundationalist and coherentist epistemologies, his commitment to realism, and the status of the normative (the "logical space of reasons" and the "manifest image"). The volume shows how vibrant Sellarsian philosophy remains in the 21st century.
The Force of Reason and the Logic of Force investigates the concept of force through various 'episodes' in the history of philosophy. The author argues that force arises on the basis of the distinction of reality and mere appearance. The book looks at figures who reduce force to something other than itself as well as figures who develop a 'logic of force' that allows them to trace the operation of force without such a reduction. MARKET 1: Postgraduate students studying history of philosophy, medieval philosophy and continental philosophy, epistemology and theory of knowledge
Contemporary interest in realism and naturalism, emerging under the banner of speculative or new realism, has prompted continentally-trained philosophers to consider a number of texts from the canon of analytic philosophy. The philosophy of Wilfrid Sellars, in particular, has proven remarkably able to offer a contemporary re-formulation of traditional "continental" concerns that is amenable to realist and rationalist considerations, and serves as an accessible entry point into the Anglo-American tradition for continental philosophers. With the aim of appraising this fertile theoretical convergence, this volume brings together experts of both analytic and continental philosophy to discuss the legacy of Kantianism in contemporary philosophy. The individual essays explore the ways in which Sellars can be put into dialogue with the widely influential work of Quentin Meillassoux, explaining how-even though their methods, language, and proximal influences are widely different-their philosophical stances can be compared thanks to their shared Kantian heritage and interest in the problem of realism. This book will be appeal to students and scholars who are interested in Sellars, Meillassoux, contemporary realist movements in continental philosophy, and the analytic-continental debate in contemporary philosophy.
Several of the basic ideas of current language theory are subjected to critical scrutiny and found wanting, including the concept of scope, the hegemony of generative syntax, the Frege-Russell claim that verbs like 'is' are ambiguous, and the assumptions underlying the so-called New Theory of Reference. In their stead, new constructive ideas are proposed.
Metaphor is one of the most frequently evoked but at the same time most poorly understood concepts in philosophy and literary theory. In recent years, several interesting approaches to metaphor have been presented or outlined. In this volume, authors of some of the most important new approaches re-present their views or illustrate them by means of applications, thus allowing the reader to survey some of the prominent ongoing developments in this field. These authors include Robert Fogelin, Susan Haack, Jaakko Hintikka (with Gabriel Sandu), Bipin Indurkhya and Eva Kittay (with Eric Steinhart). Their stance is in the main constructive rather than critical; but frequent comparisons of different views further facilitate the reader's overview. In the other contributions, metaphor is related to the problems of visual representation (Noel Carroll), to the open class test (Avishai Margalit and Naomi Goldblum) as well as to Wittgenstein's idea of 'a way of life' (E.M. Zemach).
In this book, Aladdin M. Yaqub describes a simple conception of truth and shows that it yields a semantical theory that accommodates the whole range of our seemingly conflicting intuitions about truth. Yaqub's conception takes the Tarskian biconditionals (such as "The sentence 'Johannes loved Clara' is true if and only if Johannes loved Clara") as correctly and completely defining the notion of truth. The semantical theory, which is called the revision theory, that emerges from this conception paints a metaphysical picture of truth as a property whose applicability is given by a revision process rather than by a fixed extension. The main advantage of this revision process is its ability to explain why truth seems in many cases almost redundant, in others substantial, and yet in others paradoxical (as in the famous Liar). Yaqub offers a comprehensive defense of the revision theory of truth by developing consistent and adequate formal semantics for languages in which all sorts of problematic sentences (Liar and company) can be constructed. He also gives a detailed critical exposition of the proposals of Herzberger, Gupta, and Belnap. Yaqub concludes by introducing a logic of truth that further demonstrates the adequacy of the revision theory. The Liar Speaks the Truth starts with a basic and intuitive understanding of the notion of truth and ends with a complex logic of truth. The book will interest students of logic, truth theory, formal semantics, and philosophy of language.
Particle physics studies highly complex processes which cannot be directly observed. Scientific realism claims that we are nevertheless warranted in believing that these processes really occur and that the objects involved in them really exist. This book defends a version of scientific realism, called causal realism, in the context of particle physics. The first part of the book introduces the central theses and arguments in the recent philosophical debate on scientific realism and discusses entity realism, which is the most important precursor of causal realism. It also argues against the view that the very debate on scientific realism is not worth pursuing at all. In the second part, causal realism is developed and the key distinction between two kinds of warrant for scientific claims is clarified. This distinction proves its usefulness in a case study analyzing the discovery of the neutrino. It is also shown to be effective against an influential kind of pessimism, according to which even our best present theories are likely to be replaced some day by radically distinct alternatives. The final part discusses some specific challenges posed to realism by quantum physics, such as non-locality, delayed choice and the absence of particles in relativistic quantum theories.
This monograph is unique in its kind, giving as it does an independent and self-contained introduction to the eight prominent verisimilitude proposals that make up the verisimilitude literature after the breakdown of Popper's definition in 1974. The author brings them together by comparing the ways in which they order propositional formulae. Using this method, he shows that the distinction of content and likeness definitions partitions the entire field of investigation. In addition, it is shown that the weak content definitions can be strengthened by incorporating considerations of similarity between possible worlds. The resulting refined verisimilitude definition has many desirable properties. For instance, it is the first qualitative proposal that evades the problem of truth-value dependence. In addition, in chapter five the often discussed and misunderstood problem of "language dependency" is solved. The book will be of interest to those working in the fields of logic, epistemology, philosophy of science, and (computational) linguistics.
This book is a discussion of some of the major philosophical problems centering around the topic of sense perception and the foundations of human knowledge. It begins with a characterization of our common sense understanding of the role of the senses in the acquisition of belief, and it argues that scientific accounts of the processes of perception undermine salient parts of this understanding. The naive point of view of direct realism cannot be sustained in the light of a scientifically instructed understanding of perception. This critique of direct realism points to the correctness of the representative theory of perception characteristic of such early modern philosophers as Descartes and Locke, and it also endorses the subjective turn that they defended. It argues that these positions do not require introducing sense data into the picture, and thus it avoids the intractable problems that the sense datum philosophy introduces. In addition, several versions of cognitive accounts of sense perception are criticized with the result that it is unnecessary to characterize sensory processes in intentional terms. The book then turns to a leading question introduced into modern philosophy by Descartes and Locke, the question of the accuracy of the information delivered by the senses to our faculty of belief. In particular, how accurate are our representations of the secondary qualities? The case of color is considered in detail.
The Nature of Normativity presents a complete theory about the nature of normative thought - that is, the sort of thought that is concerned with what ought to be the case, or what we ought to do or think. Ralph Wedgwood defends a kind of realism about the normative, according to which normative truths or facts are genuinely part of reality.
David J. Chalmers constructs a highly ambitious and original picture of the world, from a few basic elements. He develops and extends Rudolf Carnap's attempt to do the same in Der Logische Aufbau Der Welt (1928). Carnap gave a blueprint for describing the entire world using a limited vocabulary, so that all truths about the world could be derived from that description - but his Aufbau is often seen as a noble failure. In Constructing the World, Chalmers argues that something like the Aufbau project can succeed. With the right vocabulary and the right derivation relation, we can indeed construct the world.
Philosophy of Probability provides a comprehensive introduction to theoretical issues that occupy a central position in disciplines ranging from philosophy of mind and epistemology to cognitive science, decision theory and artificial intelligence. Some contributions shed new light on the standard conceptions of probability (Bayesianism, logical and computational theories); others offer detailed analyses of two important topics in the field of cognitive science: the meaning and the representation of (partial) belief, and the management of uncertainty. The authors of this well-balanced account are philosophers as well as computer scientists (among them, L.J. Cohen, D. Miller, P. Gardenfors, J. Vickers, D. Dubois and H. Prade). This multidisciplinary approach to probability is designed to illuminate the intricacies of the problems in the domain of cognitive inquiry. No one interested in epistemology or artificial intelligence will want to miss it.
This volume in the Critical Theory and Contemporary Society series explores the arguments between critical theory and epistemology in the twentieth and twenty-first centuries. Focusing on the first and second generations of critical theorists and Luhmann's systems theory, the book examines how each approaches epistemology. It opens by looking at twentieth-century epistemology, particularly the concept of lifeworld (Lebenswelt). It then moves on to discuss structuralism, poststructuralism, critical realism, the epistemological problematics of Foucault's writings and the dialectics of systems theory. The aim is to explore whether social and political interests still form a concrete focal point of concern for epistemology and the sciences.
Intuitionism is one of the main foundations for mathematics proposed in the twentieth century and its views on logic have also notably become important with the development of theoretical computer science. This book reviews and completes the historical account of intuitionism. It also presents recent philosophical work on intuitionism and gives examples of new technical advances and applications. It brings together 21 contributions from today's leading authors on intuitionism.
Expressionism, Deleuze's philosophical commentary on Spinoza, is a critically important work because its conclusions provide the foundations for Deleuze's later metaphysical speculations on the nature of power, the body, difference and singularities. Deleuze and Spinoza is the first book to examine Deleuze's philosophical assessment of Spinoza and appraise his arguments concerning the Absolute, the philosophy of mind, epistemology and moral and political philosophy. The author respects and disagrees with Deleuze the philosopher and suggests that his arguments not only lead to eliminativism and a Hobbesian politics but that they also cast a mystifying spell.
Contemporary discussions in metaphysics, epistemology and philosophy of mind are dominated by the presupposition of naturalism. Arguing against this established convention, Jim Slagle offers a thorough defence of Alvin Plantinga's Evolutionary Argument against Naturalism (EAAN) and in doing so, reveals how it shows that evolution and naturalism are incompatible. Charting the development of Plantinga's argument, Slagle asserts that the probability of our cognitive faculties reliably producing true beliefs is low if ontological naturalism is true, and therefore all other beliefs produced by these faculties, including naturalism itself, are self-defeating. He critiques other well-known epistemological approaches, including those of Descartes and Quine, and deftly counters the many objections against the EAAN to conclude that metaphysical naturalism should be rejected on the grounds of self-defeat. By situating Plantinga's argument within a wider context and showing that science and evolution cannot entail naturalism, Slagle renders this most common metaphysical view irrational. As such, the book advocates an important reconsideration of contemporary thought at the intersection of philosophy, science and religion.
Divided into two parts, the first concentrates on the logical properties of propositions, their relation to facts and sentences, and the parallel objects of commands and questions. The second part examines theories of intentionality and discusses the relationship between different theories of naming and different accounts of belief.
The scope of this study is both ambitious and modest. One of its ambitions is to reintegrate Hegel's theory of knowledge into mainstream epistemology. Hegel's views were formed in consideration of Classical Skepticism and Modern epistemology, and he frequently presupposes great familiarity with other views and the difficulties they face. Setting Hegel's discussion in the context of both traditional and contemporary epistemology is therefore necessary for correctly interpreting his issues, arguments, and views. Accordingly, this is an issues-oriented study. I analyze Hegel's problematic and method by placing them in the context of Sextus Empiricus, Descartes, Kant, Carnap, and William Alston. I discuss Carnap, rather than a Modern empiricist such as Locke or Hume, for several reasons. One is that Hegel himself refutes a fundamental presupposition of Modern empiricism, the doctrine of "knowledge by acquaintance," in the first chapter of the Phenomenology, a chapter that cannot be reconstructed within the bounds of this study.
You may like...
- Modern Perspectives in Lattice QCD… - Laurent Lellouch, Rainer Sommer, … (Hardcover, R2,259)
- From Classical to Quantum Fields - Laurent Baulieu, John Iliopoulos, … (Hardcover, R3,673)
- Deformed Spacetime - Geometrizing… - Fabio Cardone, Roberto Mignani (Hardcover, R4,291)
- Deep Inelastic Scattering - Robin Devenish, Amanda Cooper-Sarkar (Paperback, R1,896)
- Introduction To Quantum Groups - Masud Chaichian, Andrei Demichev (Hardcover, R1,843)
- On The Role Of Division, Jordan And… - Feza Gursey, Chia-Hsiung Tze (Hardcover, R4,160)