In this book, internationally recognized experts in philosophy of science, computer science, and modeling and simulation contribute to the discussion of how ontology, epistemology, and teleology can enable the next generation of intelligent modeling and simulation applications. It is well understood that a simulation can provide the technical means to display the behavior of a system over time, including following observed trends to predict possible future states, but how reliable and trustworthy are such predictions? The questions of what we can know (ontology), how we gain new knowledge (epistemology), and what we do with this knowledge (teleology) are therefore illuminated from these very different perspectives, as each expert uses a different facet to look at these challenges. The result of bringing these perspectives into one book is a challenging compendium that makes room for a spectrum of questions: from general philosophical ones, such as whether we can use modeling and simulation and other computational means at all to discover new knowledge, down to computational methods for improving semantic interoperability between systems or for applying recent insights from service-oriented approaches to support distributed artificial intelligence. As such, this book has been compiled as an entry point to new domains for students, scholars, and practitioners, and to spark their curiosity to learn more and fully address the topics of ontology, epistemology, and teleology from philosophical, computational, and conceptual viewpoints.
Truths are determined not by what we believe, but by the way the world is. Or so realists about truth believe. Philosophers call such theories correspondence theories of truth. Truthmaking theory, which now has many adherents among contemporary philosophers, is a recent development of a realist theory of truth, and in this book, first published in 2004, D. M. Armstrong offers the first full-length study of this theory. He examines its applications to different sorts of truth, including contingent truths, modal truths, truths about the past and the future, and mathematical truths. In a clear, even-handed and non-technical discussion he makes a compelling case for truthmaking and its importance in philosophy. His book marks a significant contribution to the debate and will be of interest to a wide range of readers working in analytical philosophy.
The idea of a final end of human conduct, the highest good, plays an important role in Kant's philosophy. Unlike his predecessors, Kant defines the highest good as a combination of two heterogeneous elements, namely virtue and happiness. This conception lies at the centre of some of the most influential Kantian doctrines, such as his famous "moral argument" for the rationality of faith, his conception of the unity of reason, and his views concerning the final end of nature as well as the historical progress of mankind. To be sure, the different treatments of the highest good in Kant's work have led to a great deal of discussion among his readers. Besides Kant's arguments for moral faith, recent debate has focused on the place of the highest good within Kant's moral theory, on the antinomy of pure practical reason, and on the idea of the primacy of practical reason. This collection of new essays attempts to re-evaluate Kant's doctrine of the highest good and to determine its relevance for contemporary philosophy.
This multi-disciplinary, multi-jurisdictional collection offers the first ever full-scale analysis of legal fictions. Its focus is on fictions in legal practice, examining and evaluating their roles in a variety of different areas of practice (e.g. in Tort Law, Criminal Law and Intellectual Property Law) and in different times and places (e.g. in Roman Law, Rabbinic Law and the Common Law). The collection approaches the topic in part through the discussion of certain key classical statements by theorists including Jeremy Bentham, Alf Ross, Hans Vaihinger, Hans Kelsen and Lon Fuller. The collection opens with the first-ever translation into English of Kelsen's review of Vaihinger's As If. The 17 chapters are divided into four parts: 1) a discussion of the principal theories of fictions, as above, with a focus on Kelsen, Bentham, Fuller and classical pragmatism; 2) a discussion of the relationship between fictions and language; 3) a theoretical and historical examination and evaluation of fictions in the common law; and 4) an account of fictions in different practice areas and in different legal cultures. The collection will be of interest to theorists and historians of legal reasoning, as well as scholars and practitioners of the law more generally, in both common and civil law traditions.
Logic and Philosophy of Mathematics in the Early Husserl focuses on the first ten years of Edmund Husserl's work, from the publication of his Philosophy of Arithmetic (1891) to that of his Logical Investigations (1900/01), and aims to precisely locate his early work in the fields of logic, philosophy of logic and philosophy of mathematics. Unlike most phenomenologists, the author refrains from reading Husserl's early work as a more or less immature sketch of claims consolidated only in his later phenomenology, and unlike the majority of historians of logic she emphasizes the systematic strength and the originality of Husserl's logico-mathematical work. The book attempts to reconstruct the discussion between Husserl and those philosophers and mathematicians who contributed to new developments in logic, such as Leibniz, Bolzano, the logical algebraists (especially Boole and Schröder), Frege, and Hilbert and his school. It presents both a comprehensive critical examination of some of the major works produced by Husserl and his antagonists in the last decade of the 19th century and a formal reconstruction of many texts from Husserl's Nachlass that have not yet been the object of systematic scrutiny. This volume will be of particular interest to researchers working in the history, and in the philosophy, of logic and mathematics, and more generally, to analytical philosophers and phenomenologists with a background in standard logic.
In a letter to Martin Buber of 25 October 1925, FR [Franz Rosenzweig] wrote that since 1920 his real "literary development" had taken place in translation. Like every self-characterization, this statement is certainly only partly accurate. What is essential, however, is the literary fact that FR's translations constitute a development with distinguishable stylistic periods. At the beginning of this development stand the translations of several prayer cycles, which were first used in his own newly established household. To the second phase belong the renderings of hymns by medieval poets which were taken into the Jewish liturgy, as it were, as secondary literature. The path led on to the selection of poems by Jehuda Halevi and then to the rendering of Scripture into German. Even in the liturgical phase FR did not make things easy for himself. The translations had to correspond not only to the Hebrew or Aramaic literal sense; they also had to be singable to the melodies familiar to German Jewry. A scholarly exact or "literal" translation could not arise under these conditions. The three words with which most blessings of the Hebrew liturgy begin, for example, baruch ata adonai, would be rendered literally as something like "Gelobt Du mein Herr" ["Praised (be) You, my Lord"]. But there only five German syllables stood against the seven Hebrew ones, which would have made them unsingable. FR's formulation, "Lob nun ja Lob dir o Gott" ["Praise now, yes, praise to you, O God"], may be taken as characteristic of the beginnings of his translations.
The generation of meaning is the most fundamental process of the mind. It underlies all major mental functions, such as intelligence, memory, perception, and communication. Not surprisingly, it has been one of the most difficult processes to understand and represent in a model of human cognition. Dr. Christine Hardy introduces two fundamental concepts to address the complexity and richness of meaning. First, she discusses Semantic Constellations, which constitute the basic transversal network organization of mental and neural processes. Second, she addresses a highly dynamic connective process that underlies conscious thought and constantly gives birth to novel emergents or meanings. Taken together, Hardy asserts, the mind's network architecture and connective dynamics allow for self-organization, generativity, and creativity. They can also account for some of the most interesting facets of mental processes, in particular, nonlinear shifts and "breakthroughs" such as intuition, insights, and shifts in states of consciousness. This connective dynamic does not just take place within the mind. Rather, it involves a continuously evolving person-environment interaction: meaning is injected into the environment, and then retrojected, somewhat modified, back into the psyche. This means that, simultaneously, we are both perceiving reality and subtly influencing the very reality we perceive: objects, events, and other individuals. The way in which we think and feel, both individually and collectively, interacts with the physical world and directly shapes the society in which we live. The very same connective dynamic, Hardy shows, is the foundation for those rare yet striking transpersonal experiences known as synchronicity and psychic phenomena. We live in a world in which we interact with reality at a very fundamental level. Hardy's work is a major analysis for scholars and researchers in the cognitive sciences, psychology, and parapsychology.
In A Theodicy of Hell Charles Seymour tackles one of the most difficult problems facing the western theistic tradition: to show the consonance between eternal punishment and the goodness of God. Medieval theology attempted to resolve the dilemma by arguing that any sin, no matter how slight, merits unending torment. Contemporary thinkers, on the other hand, tend to eliminate the retributive element from hell entirely. Combining historical breadth with detailed argumentation, the author develops a novel understanding of hell which avoids the extremes of both its traditional and modern rivals. He then surveys the battery of objections ranged against the possibility of eternal punishment and shows how his 'freedom view of hell' can withstand the attack. The work will be of particular importance for those interested in philosophy of religion and theology, including academics, students, seminarians, clergy, and anyone else with a personal desire to come to terms with this perennially challenging doctrine.
This monograph articulates and defends a theory-based epistemology of modality (TEM). According to TEM, someone justifiably believes an interesting modal claim if and only if (a) she justifiably believes a theory according to which that claim is true, (b) she believes that claim on the basis of that theory, and (c) she has no defeaters for her belief in that claim. The book has two parts. In the first, the author motivates TEM, sets out the view in detail, and defends it against a number of objections. In the second, the author considers whether TEM is worth accepting. To argue that it is, the author sets out criteria for choosing between modal epistemologies, concluding that TEM has a number of important virtues. However, the author also concedes that TEM is cautious: it probably implies that we are not justified in believing some interesting modal claims that we might take ourselves to be justified in believing. This raises a question about TEM's relationship to Peter van Inwagen's modal skepticism, which the author explores in detail. As it turns out, TEM offers a better route to modal skepticism than the one that van Inwagen provides. But rather than being a liability, the author argues that this is a further advantage of the view. Moreover, he argues that other popular modal epistemologies do not fare better: they cannot easily secure more extensive modal justification than TEM. The book concludes by clarifying TEM's relationship to the other modal epistemologies on offer, contending that TEM need not be a rival to those views, but can instead be a supplement to them.
Is it merely a matter of taste or convention to consider something right or wrong? Or can we find good reasons for our values and judgements that are independent of culture and tradition? The problem is as old as philosophy itself; and after more than two millennia of scholarly debate, there seems no end to the controversy. But Christian Illies suggests that powerful new forms of transcendental argument (a philosophical tool known since antiquity) may offer a long-sought cornerstone for morality.
This book on the philosophy of science is the second of four volumes by Richard Ned Lebow in this book series. It not only provides a useful overview of this broad topic, but also offers deeper insight into specific topics such as causation, epistemology and methods, and especially counterfactual analysis.
Simon Susen examines the impact of the 'postmodern turn' on the contemporary social sciences. On the basis of an innovative five-dimensional approach, this study provides a systematic, comprehensive, and critical account of the legacy of the 'postmodern turn', notably in terms of its continuing relevance in the twenty-first century.
This book offers a challenge to certain epistemic features of belief, resulting in a unified and coherent picture of the epistemology of belief. The author examines current ideas in a number of areas, beginning with the truth-directed nature of belief in the context of the so-called 'Moore's paradoxes'. He then investigates the sensitivity of beliefs to evidence by exploring how sensory experiences can confer justifications on the beliefs they give rise to, and provides an account of the basing relation problem. The consequences of these arguments are carefully considered, particularly the issues involving the problem of easy knowledge and warrant transmission. Finally, he focuses on the purported fallibility of beliefs and our knowledge of their contents, arguing that the fallible/infallible distinction is best understood in terms of externalist/internalist conceptions of knowledge, and that the thesis of content externalism does not threaten the privileged character of self-knowledge.
This combination of historiography and theory offers the growing Anglophone readership interested in the ideas of Gilbert Simondon a thorough and unprecedented survey of the French philosopher's entire oeuvre. The publication, which breaks new ground in its thoroughness and breadth of analysis, systematically traces the interconnections between Simondon's philosophy of science and technology on the one hand, and his political philosophy on the other. The author sets Simondon's ideas in the context of the epistemology of the late 1950s and the 1960s in France, the milieu that shaped a generation of key French thinkers such as Deleuze, Foucault and Derrida. This volume explores Simondon's sources, which were as eclectic as they were influential: from the philosophy of Bergson to the cybernetics of Wiener, from the phenomenology of Merleau-Ponty to the epistemology of Canguilhem, and from Bachelard's philosophy of science to the positivist sociology and anthropology of luminaries such as Durkheim and Leroi-Gourhan. It also tackles aspects of Simondon's philosophy that relate to Heidegger and Ellul in their concern with the ontological relationship between technology and society and discusses key scholars of Simondon such as Barthelemy, Combes, Stiegler, and Virno, as well as the work of contemporary protagonists in the philosophical debate on the relevance of technique. The author's intimate knowledge of Simondon's language allows him to resolve many of the semantic errors and misinterpretations that have plagued reactions to Simondon's many philosophical neologisms, often drawn from his scientific studies.
The book expresses the conviction that the art of creating tools (Greek techne) changes its character together with the change of civilization epochs and co-determines such changes. This does not mean that the tools typical of a civilization epoch determine it completely, but they change our way of perceiving and interpreting the world. There may have been many such epochs in the history of human civilization (many more than the three waves of agricultural, industrial and information civilization). This is expressed by the book's title, Technen, where n denotes a subsequent civilization epoch. During the last fifty years we have observed a decomposition of the old episteme (understood as a way of creating and interpreting knowledge characteristic of a given civilization epoch) of modernism, which was the episteme typical of industrial civilization. Today, the world is understood differently by the representatives of three different cultural spheres: the strict and natural sciences; the human and social sciences (especially the part inclined towards postmodernism); and the technical sciences, which have a different episteme even from that of the strict and natural sciences. Thus, we observe today not two cultures, but three different episteme. The book consists of four parts. The first contains basic epistemological observations, the second is devoted to selected elements of the recent history of information technologies, the third contains more detailed epistemological and general discussions, and the fourth specifies conclusions. The book is written from the cognitive perspective of the technical sciences, with a full awareness, and discussion, of its differences from the cognitive perspective of the strict sciences or the human and social sciences. The main thesis of the book is that the informational revolution will probably lead to the formation of a new episteme.
The book includes discussions of many issues related to this general perspective, such as: what is technology proper; what is intuition from the perspective of technology and of evolutionary naturalism; what are the reasons for, and how large are, the delays between a fundamental invention and its broad social utilization; what is the fundamental logical error (relying on paradoxes that are not real, only apparent) of the tradition of sceptical philosophy; what are the rational foundations of, and examples of, the emergence of order out of chaos; and whether civilization development based on two positive feedbacks between science, technology and the market might lead inevitably to the self-destruction of human civilization.
Although influential in his own day, Karl Leonhard Reinhold's contribution to late 18th and early 19th century thought has long been overshadowed by the towering presence of Immanuel Kant, the thinker whose ideas he helped to interpret and disseminate. Today, however, a more nuanced understanding of Reinhold's contribution to post-Kantian thought is emerging. Apart from his exposition of Kant's critical philosophy, which played a significant role in the development of German idealism, Reinhold's role in the intellectual movement of Enlightenment and his contributions to early linguistic philosophy are now receiving scholarly attention. In the English-speaking world, where few translations of his work have been attempted, Reinhold has mostly been overlooked. This imbalance is corrected in the present work: the first translation into English of Reinhold's major work of philosophy, the New Theory of the Human Capacity for Representation (1789). The translators provide an overview of the main currents of thought which informed Reinhold's philosophical project, as well as notes on his reading of Kant and other important thinkers of Reinhold's day. A glossary of key terms, a bibliography of scholarly work on Reinhold and suggestions for further reading are also included.
The question of how to turn the principles implicitly governing the concept of truth into an explicit definition (or explication) of the concept hence coalesced with the question of how to get a finite grip on the infinity of T-sentences. Tarski's famous and ingenious move was to introduce a new concept, satisfaction, which could, on the one hand, be recursively defined, and which, on the other hand, straightforwardly yielded an explication of truth. A surprising 'by-product' of Tarski's effort to bring truth under control was the breathtaking finding that truth is, in a precisely defined sense, ineffable: no non-trivial language can contain a truth-predicate adequate for that very language. This implied that truth (and consequently the semantic concepts to which truth appeared to be reducible) proved to be strangely 'language-dependent': we can have a concept of truth-in-L for any language L, but we cannot have a concept of truth applicable to every language. In a sense this means, as Quine (1969, p. 68) put it, that truth belongs to "transcendental metaphysics," and Tarski's 'scientific' investigations seem to lead us back towards a surprising proximity to some more traditional philosophical views on truth.

3. TARSKI'S THEORY AS A PARADIGM

So far Tarski himself. Subsequent philosophers then had to find out what his considerations of the concept of truth really mean and what their consequences are; and this now seems to be an almost interminable task.
Metadecisions: Rehabilitating Epistemology constitutes an epistemological inquiry into the foundations of knowledge of a scientific discipline. This text warns contemporary scientific disciplines that neglecting epistemological issues threatens the viability of their pronouncements and designs. It shows that the processes by which complex artefacts are created require a pluralistic approach to artefact design. It argues that viable solutions to fundamental problems in each discipline require cooperation, creativity and respect for contributions from all walks of life, all levels of logic and all standards of rigor, be they in the natural sciences, the social sciences, engineering sciences, management, the law or political sciences. Ten cases spanning subjects like Doctor-Assisted Suicides (DASs), Advising Women on the Risks of Mammograms, a Deregulation Crusade, the Crash of TWA Flight 800, the Control of the World Wide Web, and the Creation of the US Department of Homeland Security, among others, are used to illustrate the application of the metasystem framework to increase knowledge and meaning of fundamental problems. The design of any human activity requires the intervention of several inquiring systems, where the manager, the engineer, the scientist, the lawyer, the epistemologist, the ethicist and even the artist contribute to shape how real-world problems are formulated, how decisions/metadecisions to solve problems are taken, and finally, how actions are implemented.
This thought-provoking book initiates a dialogue among scholars in rhetoric and hermeneutics in many areas of the humanities. Twenty leading thinkers explore the ways these two powerful disciplines inform each other and influence a wide variety of intellectual fields. Walter Jost and Michael J. Hyde organize pivotal topics in rhetoric and hermeneutics with originality and coherence, dividing their book into four sections: Locating the Disciplines; Inventions and Applications; Arguments and Narratives; and Civic Discourse and Critical Theory. Contributors to this volume include Hans-Georg Gadamer (one of whose pieces is here translated into English for the first time), Paul Ricoeur, Gerald L. Bruns, Charles Altieri, Richard E. Palmer, Calvin O. Schrag, Victoria Kahn, Eugene Garver, Michael Leff, Nancy S. Streuver, Wendy Olmsted, David Tracy, Donald G. Marshall, Allen Scult, Rita Copeland, William Rehg, and Steven Mailloux. For readers across the humanities, the book demonstrates the usefulness of rhetorical and hermeneutic approaches in literary, philosophical, legal, religious, and political thinking. With its stimulating new perspectives on the revival and interrelation of both rhetoric and hermeneutics, this collection is sure to serve as a benchmark for years to come.
Converging evidence from disciplines including sociobiology, evolutionary psychology and human biology forces us to adopt a new idea of what it means to be a human. As cherished concepts such as free will, naive realism, humans as creation's crowning glory fall and our moral roots in ape group dynamics become clearer, we have to take leave of many concepts that have been central to defining our humanness. What emerges is a new human, the homo novus, a human being without illusions. Leading authors from many different fields explore these issues by addressing a range of illusions and providing evidence for the need, despite considerable reluctance, to relinquish some of our most cherished ideas about ourselves.
Why the tools of philosophy offer a powerful antidote to today's epidemic of irrationality. There is an epidemic of bad thinking in the world today. An alarming number of people are embracing crazy, even dangerous ideas. They believe that vaccinations cause autism. They reject the scientific consensus on climate change as a "hoax." And they blame the spread of COVID-19 on the 5G network or a Chinese cabal. Worse, bad thinking drives bad acting: it even inspired a mob to storm the U.S. Capitol. In this book, Steven Nadler and Lawrence Shapiro argue that the best antidote for bad thinking is the wisdom, insights, and practical skills of philosophy. When Bad Thinking Happens to Good People provides an engaging tour through the basic principles of logic, argument, evidence, and probability that can make all of us more reasonable and responsible citizens. It shows how we can more readily spot and avoid flawed arguments and unreliable information; determine whether evidence supports or contradicts an idea; distinguish between merely believing something and knowing it; and much more. In doing so, the book reveals how epistemology, which addresses the nature of belief and knowledge, and ethics, the study of moral principles that should govern our behavior, can reduce bad thinking. Moreover, the book shows why philosophy's millennia-old advice about how to lead a good, rational, and examined life is essential for escaping our current predicament. In a world in which irrationality has exploded to deadly effect, When Bad Thinking Happens to Good People is a timely and essential guide for a return to reason.
This work addresses scientism and relativism, two false philosophies that divorce science from culture in general and from tradition in particular. It helps break the isolation of science from the rest of culture by promoting popular science and reasonable history of science. It provides examples of the value of science to culture, discussions of items of the general culture, practical strategies and tools, and case studies. It is for practising professionals, political scientists and science policy students and administrators.
Sanford Goldberg investigates the role that others play in our attempts to acquire knowledge of the world. Two main forms of this reliance are examined: testimony cases, where a subject aims to acquire knowledge through accepting what another tells her; and cases involving "coverage," where a subject aims to acquire knowledge of something by reasoning that if things were not so she would have heard about it by now. Goldberg argues that these cases challenge some cherished assumptions in epistemology. Testimony cases challenge the assumption, prominent in reliabilist epistemology, that the processes through which beliefs are formed never extend beyond the boundaries of the individual believer. And both sorts of case challenge the idea that, insofar as knowledge is a cognitive achievement, it is an achievement that belongs to the knowing subject herself. Goldberg uses results of this sort to question the broadly individualistic orthodoxy within reliabilist epistemology, and to explore what a non-orthodox reliabilist epistemology would look like. The resulting theory is a social-reliabilist epistemology: one that results from the application of reliabilist criteria to situations in which belief-fixation involves epistemic reliance on others. Sanford Goldberg presents an important contribution both to the reliability literature in general epistemology and to the social epistemology of testimony and related topics.
In The Rise of Neoliberal Philosophy: Human Capital, Profitable Knowledge, and the Love of Wisdom, Brandon Absher argues that the neoliberal transformation of higher education has resulted in a paradigm shift in philosophy in the United States, leading to the rise of neoliberal philosophy. Neoliberal philosophy seeks to attract investment by demonstrating that it can produce optimal return. Further, philosophers in the neoliberal paradigm internalize and reproduce the values of the prevailing social order in their work, reorienting philosophical desire toward the production of attractive commodities. The aim of philosophy in the neoliberal university, Absher shows, has become the production of human capital and profitable knowledge.
This book examines the utopian dimension of contemporary social and political thought. Arguing for a utopian optic for the human sciences, el-Ojeili claims that major transformations of the utopian constellation have occurred since the end of the twentieth century. Following a survey of major utopian shifts in the modern period, el-Ojeili focuses on three spaces within today's utopian constellation. At the liberal centre, we see a splintering effect, particularly after the global financial crisis of 2008: a contingent neo-liberalism, a neo-Keynesian turn, and a liberalism of fear. At the far-Right margin, we see the consolidation of post-fascism, a combination of "the future in the past", elements of the post-modern present, and appeals to a novel future. Finally, at the far-Left, a new communism has emerged, with novel positions on resistance, maps of power, and a contemporary variant of the Left's artistic critique. The Utopian Constellation will be of interest to scholars and students across the human sciences with an interest in utopian studies, ideological and discourse analysis, the sociology of knowledge, and the study of political culture.