A selection of papers presented at the international conference
'Applied Logic: Logic at Work', held in Amsterdam in December 1992.
Nowadays, the term 'applied logic' has a very wide meaning, as
numerous applications of logical methods in computer science,
formal linguistics and other fields testify. Such applications are
by no means restricted to the use of known logical techniques: at
its best, applied logic involves a back-and-forth dialogue between
logical theory and the problem domain. The papers focus on the
application of logic to the study of natural language, in syntax,
semantics and pragmatics, and the effect of these studies on the
development of logic. In the last decade, the dynamic nature of
natural language has been the most interesting challenge for
logicians. Dynamic semantics is here applied to new topics, the
dynamic approach is extended to syntax, and several methodological
issues in dynamic semantics are systematically investigated. Other
methodological issues in the formal studies of natural language are
discussed, such as the need for types, modal operators and other
logical operators in the formal framework. Further articles address
the scope of these methodological issues from other perspectives
ranging from cognition to computation. The volume presents papers
that are interesting for graduate students and researchers in the
field of logic, philosophy of language, formal semantics and
pragmatics, and computational linguistics.
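For readers unfamiliar with the dynamic turn described above, the clauses below sketch one standard system of dynamic semantics, Groenendijk and Stokhof's Dynamic Predicate Logic, chosen purely as an illustration (the volume's contributors may work with different systems). A formula denotes a relation between input and output assignments, so an existential quantifier can bind material in later conjuncts.

```latex
% A sketch of Dynamic Predicate Logic (DPL); \llbracket/\rrbracket need the
% stmaryrd package. A formula denotes a relation between an input assignment
% g and an output assignment h; k[x]g means that k differs from g at most in
% the value it assigns to x.
\[
\begin{aligned}
\llbracket P(t_1,\dots,t_n)\rrbracket &= \{\langle g,h\rangle \mid h=g \text{ and } \langle\llbracket t_1\rrbracket_h,\dots,\llbracket t_n\rrbracket_h\rangle \in I(P)\}\\
\llbracket \varphi \wedge \psi \rrbracket &= \llbracket \varphi \rrbracket \circ \llbracket \psi \rrbracket \quad\text{(conjunction is relational composition)}\\
\llbracket \exists x\,\varphi \rrbracket &= \{\langle g,h\rangle \mid \text{there is } k \text{ with } k[x]g \text{ and } \langle k,h\rangle \in \llbracket \varphi \rrbracket\}\\
\llbracket \neg\varphi \rrbracket &= \{\langle g,h\rangle \mid h=g \text{ and there is no } k \text{ with } \langle h,k\rangle \in \llbracket \varphi \rrbracket\}
\end{aligned}
\]
```

On this semantics, 'A man walks in. He whistles' can be rendered as ∃x(man(x) ∧ walks-in(x)) ∧ whistles(x), with the final occurrence of x still dynamically bound; binding effects of this kind are what the dynamic approach is designed to capture.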
This three volume collection gathers together responses to Weber's sociology in the period 1920-1945. Bryan Turner provides an extensive analysis of the reception of Weber.
Husserl's "Logical Investigations" is designed to help students
and specialists work their way through Husserl's expansive text by
bringing together in a single volume six self-contained, expository
yet critical essays, each the work of an international expert on
Husserl's thought and each devoted to a separate Logical
Investigation.
It is with great pleasure that we are presenting to the community
the second edition of this extraordinary handbook. It has been over
15 years since the publication of the first edition and there have
been great changes in the landscape of philosophical logic since
then. The first edition has proved invaluable to generations of
students and researchers in formal philosophy and language, as well
as to consumers of logic in many applied areas. The main logic
article in the Encyclopaedia Britannica 1999 has described the
first edition as 'the best starting point for exploring any of the
topics in logic'. We are confident that the second edition will
prove to be just as good. The first edition was the second
handbook published for the logic community. It followed the North
Holland one-volume Handbook of Mathematical Logic, published in
1977, edited by the late Jon Barwise. The four-volume Handbook of
Philosophical Logic, published 1983-1989, came at a fortunate
temporal junction in the evolution of logic. This was the time when
logic was gaining ground in computer science and artificial
intelligence circles. These areas were under increasing commercial
pressure to provide devices which help and/or replace the human in
his daily activity. This pressure required the use of logic in the
modelling of human activity and organisation on the one hand, and
to provide the theoretical basis for the computer program
constructs on the other.
Relevant to philosophy, law, management, and artificial
intelligence, these papers explore the applicability of
nonmonotonic or defeasible logic to normative reasoning. The
resulting systems purport to solve well-known deontic paradoxes and
to provide a better treatment than classical deontic logic does of
prima facie obligation, conditional obligation, and priorities of
normative principles.
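As an illustration of the kind of deontic paradox at issue, here is Chisholm's contrary-to-duty scenario, the standard example from the literature (not necessarily one treated in these papers), which renders four seemingly consistent statements jointly inconsistent in standard deontic logic (SDL).

```latex
% Chisholm's contrary-to-duty paradox in standard deontic logic (SDL).
% O is the obligation operator; h = "Jones helps his neighbours",
% t = "Jones tells them he is coming".
\[
(1)\ O h \qquad (2)\ O(h \rightarrow t) \qquad (3)\ \neg h \rightarrow O\neg t \qquad (4)\ \neg h
\]
% From (1) and (2), the SDL principle O(h -> t) -> (O h -> O t) yields O t;
% from (3) and (4), modus ponens yields O ~t. Together with the SDL theorem
% ~(O t & O ~t), the intuitively consistent premises become inconsistent.
```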
This book defines a logical system called the Protocol-theoretic
Logic of Epistemic Norms (PLEN), develops PLEN into a formal
framework for representing and reasoning about epistemic norms, and
shows that PLEN is theoretically interesting and useful with
regard to the aims of such a framework. In order to motivate the
project, the author defends an account of epistemic norms called
epistemic proceduralism. The core of this view is the idea that, in
virtue of their indispensable, regulative role in cognitive life,
epistemic norms are closely intertwined with procedural rules that
restrict epistemic actions, procedures, and processes. The
resulting organizing principle of the book is that epistemic norms
are protocols for epistemic planning and control. The core of the
book is developing PLEN, which is essentially a novel variant of
propositional dynamic logic (PDL) distinguished by more or less
elaborate revisions of PDL's syntax and semantics. The syntax
encodes the procedural content of epistemic norms by means of the
well-known protocol or program constructions of dynamic and
epistemic logics. It then provides a novel language of operators on
protocols, including a range of unique protocol equivalence
relations, syntactic operations on protocols, and various
procedural relations among protocols in addition to the standard
dynamic (modal) operators of PDL. The semantics of the system then
interprets protocol expressions and expressions embedding protocols
over a class of directed multigraph-like structures rather than the
standard labeled transition systems or modal frames. The intent of
the system is to better represent epistemic dynamics, build a logic
of protocols atop it, and then show that the resulting logic of
protocols is useful as a logical framework for epistemic norms. The
resulting theory of epistemic norms centers on notions of norm
equivalence derived from theories of process equivalence familiar
from the study of dynamic and modal logics. The canonical account
of protocol equivalence in PLEN turns out to possess a number of
interesting formal features, including satisfaction of important
conditions on hyperintensional equivalence, a matter of recently
recognized importance in the logic of norms, generally. To show
that the system is interesting and useful as a framework for
representing and reasoning about epistemic norms, the author
applies the logical system to the analysis of epistemic deontic
operators, and, partly on the basis of this, establishes
representation theorems linking protocols to the action-guiding
content of epistemic norms. The protocol-theoretic logic of
epistemic norms is then shown to almost immediately validate the
main principles of epistemic proceduralism.
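Since PLEN is presented as a variant of propositional dynamic logic, a brief reminder of standard PDL may help orient the reader; the grammar and truth clause below are textbook PDL, not the revised protocol syntax or the multigraph semantics of PLEN itself.

```latex
% Standard PDL, of which PLEN is described as a variant.
% Programs pi and formulas phi are defined by mutual recursion:
%   pi  ::= a | pi1 ; pi2 | pi1 u pi2 | pi* | phi?
%   phi ::= p | ~phi | phi1 & phi2 | [pi]phi
% Over a labelled transition system, each program denotes a binary relation
% on states: R_{pi1;pi2} is relational composition, R_{pi1 u pi2} is union,
% R_{pi*} is reflexive-transitive closure, and R_{phi?} is the identity
% restricted to states satisfying phi. The box modality is then read:
\[
M,w \models [\pi]\varphi \iff \text{for every } v \text{ such that } (w,v)\in R_\pi,\ M,v \models \varphi .
\]
```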
Perspectives on Time deals with the problem of time from different
perspectives such as logic, physics and philosophy. It contains 18
previously unpublished papers, written by philosophers from various
European countries, as well as a substantial introduction on the
history and the current state of the respective fields. The
prominent issues which are addressed in this book concern the
direction of time, the reality of tenses, the objectivity of
becoming, the existence in time, and the logical structures of
reasoning about time. The papers have been written based on
different approaches, partly depending on whether the authors
subscribe to an A-theory or a B-theory of time. Audience: Owing to
the broad variety of approaches, the book contains important
contributions for philosophers, philosophers of science and
logicians, as well as for scientists working in the fields of
language and AI.
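By way of background, and not as a summary of any particular contribution, the basic Priorean tense operators that much of this literature presupposes can be given their usual truth conditions over a flow of time (T, <), as in the sketch below.

```latex
% Priorean tense logic over a flow of time (T, <):
% P = "it was the case that",   F = "it will be the case that",
% H = "it has always been that", G = "it is always going to be that".
\[
\begin{aligned}
t \models P\varphi &\iff \exists s\,(s<t \text{ and } s\models\varphi) &\qquad
t \models F\varphi &\iff \exists s\,(t<s \text{ and } s\models\varphi)\\
t \models H\varphi &\iff \forall s\,(s<t \Rightarrow s\models\varphi) &\qquad
t \models G\varphi &\iff \forall s\,(t<s \Rightarrow s\models\varphi)
\end{aligned}
\]
% Roughly, A-theoretic readings take the tense operators as primitive,
% whereas B-theoretic readings take the earlier-later relation < as basic.
```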
Knocking on Heaven's Door is the oldest human dream, and it still
seems unrealized. Religious discourse does show the road, but it
demands blind faith in return. In this book logicians try to
hear Heaven's Call and to analyze religious discourse. As a result,
the notion of religious logic as a part of philosophical logic is
introduced. Its tasks are (1) to construct consistent logical
systems formalizing religious reasoning that at first sight seems
inconsistent (this research is carried out within the limits of modal
logic, paraconsistent logic and many-valued logic), (2) to carry
out an illocutionary analysis of religious discourse (this research
is carried out within the framework of illocutionary logic), and (3) to
formalize Ancient and Medieval logical theories used in the
theology of an appropriate religion (they could be studied within
the limits of unconventional logics, such as non-monotonic logics,
non-well-founded logics, etc.).
These two volumes contain all of my articles published between 1956
and 1975 which might be of interest to readers in the
English-speaking world. The first three essays in Vol. 1 deal with
historical themes. In each case I have attempted a rational
reconstruction which, as far as possible, meets contemporary
standards of exactness. In The Problem of Universals Then and Now
some ideas of W.V. Quine and N. Goodman are used to create a modern
sketch of the history of the debate on universals beginning with
Plato and ending with Hao Wang's System: E. The second article
concerns Kant's Philosophy of Science. By analyzing his position
vis-a-vis I. Newton, Christian Wolff, and D. Hume, it is shown that
for Kant the very notion of empirical knowledge was beset with a
fundamental logical difficulty. In his metaphysics of experience
Kant offered a solution differing from all prior as well as
subsequent attempts aimed at the problem of establishing a
scientific theory. The last of the three historical papers utilizes
some concepts of modern logic to give a precise account of
Wittgenstein's so-called Picture Theory of Meaning. E. Stenius'
interpretation of this theory is taken as an intuitive starting
point while an intensional variant of Tarski's concept of a
relational system furnishes a technical instrument. The concepts of
model world and of logical space, together with those of
homomorphism and isomorphism between model worlds and between
logical spaces, form the conceptual basis of the reconstruction.
Using both Father Kevin Wall's eidetic matrix of "the relational
unity of being" and Edith Stein's remarkable synoptic view of
intentionality in both Aquinas and Husserl, this book uncovers
purely logical ground for a subalternate eidetic science called
"convergent phenomenology," itself located at the inmost depths of
Husserlian phenomenology. Convergent phenomenology emerges as a
distinctively new discipline dealing with relation-like objectivity
as opposed to the thing-like objectivity of traditional
phenomenology. This has grand implications for the way we as humans
conceive of God and being. The book thus benefits theologians,
logicians, and phenomenologists by revealing the constitutive
interrelationality of transcendental logic in an utterly new light
as already flowering forth into formal ontology itself. What
emerges is a rich conception of divinity and humanity.
The tableau methodology, invented in the 1950s by Beth and
Hintikka and later perfected by Smullyan and Fitting, is today one
of the most popular proof-theoretical methodologies: firstly because
it is a very intuitive tool, and secondly because it appears to
bring together the proof-theoretical and the semantical
approaches to the presentation of a logical system. The increasing
demand for improved tableau methods for various logics is mainly
prompted by extensive applications of logic in computer science,
artificial intelligence and logic programming, as well as its use
as a means of conceptual analysis in mathematics, philosophy,
linguistics and in the social sciences. In the last few years the
renewed interest in the method of analytic tableaux has generated a
plethora of new results, in classical as well as non-classical
logics. On the one hand, recent advances in tableau-based theorem
proving have drawn attention to tableaux as a powerful deduction
method for classical first-order logic, in particular for
non-clausal formulas accommodating equality. On the other hand,
there is a growing need for a diversity of non-classical logics
which can serve various applications, and for algorithmic
presentations of these logics in a unifying framework which can
support (or suggest) a meaningful semantic interpretation. From
this point of view, the methodology of analytic tableaux seems to
be most suitable. Therefore, renewed research activity is being
devoted to investigating tableau systems for intuitionistic, modal,
temporal and many-valued logics, as well as for new families of
logics, such as non-monotonic and substructural logics. The results
require systematisation. This Handbook is the first to provide such
a systematisation of this expanding field. It contains several
chapters on the use of tableaux methods in classical logic, but
also contains extensive discussions on: the uses of the methodology
in intuitionistic logic, modal and temporal logics, substructural
logics, and nonmonotonic and many-valued logics; the implementation
of semantic tableaux; and a bibliography on analytic tableau theorem
proving. The result is a solid reference work to be used by
students and researchers in Computer Science, Artificial
Intelligence, Mathematics, Philosophy, Cognitive Sciences, Legal
Studies, Linguistics, Engineering and all the areas, whether
theoretical or applied, in which the algorithmic aspects of logical
deduction play a role.
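To fix ideas, here is a minimal sketch of the analytic tableau method the Handbook systematises, restricted to classical propositional logic and written in Python purely for illustration; the formula encoding and the function names are assumptions of this sketch, not taken from the Handbook. A branch closes when it contains an atom and its negation; a formula is valid when every branch of the tableau for its negation closes.

```python
# A minimal sketch of the analytic tableau method, restricted to classical
# propositional logic. Formulas are nested tuples:
# ('atom', 'p'), ('not', f), ('and', f, g), ('or', f, g), ('implies', f, g).

def satisfiable(formulas):
    """Return True iff some fully expanded tableau branch stays open."""
    branch = list(formulas)
    # A branch closes when it contains an atom and its negation.
    atoms = {f for f in branch if f[0] == 'atom'}
    negated = {f[1] for f in branch if f[0] == 'not' and f[1][0] == 'atom'}
    if atoms & negated:
        return False
    for i, f in enumerate(branch):
        rest = branch[:i] + branch[i + 1:]
        if f[0] == 'and':                   # alpha rule: add both conjuncts
            return satisfiable(rest + [f[1], f[2]])
        if f[0] == 'or':                    # beta rule: split into two branches
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
        if f[0] == 'implies':               # A -> B  rewrites to  ~A v B
            return satisfiable(rest + [('or', ('not', f[1]), f[2])])
        if f[0] == 'not':
            g = f[1]
            if g[0] == 'not':               # double negation
                return satisfiable(rest + [g[1]])
            if g[0] == 'and':               # ~(A & B) rewrites to ~A v ~B
                return satisfiable(rest + [('or', ('not', g[1]), ('not', g[2]))])
            if g[0] == 'or':                # ~(A v B) rewrites to ~A, ~B
                return satisfiable(rest + [('not', g[1]), ('not', g[2])])
            if g[0] == 'implies':           # ~(A -> B) rewrites to A, ~B
                return satisfiable(rest + [g[1], ('not', g[2])])
    return True                             # only literals remain: branch is open


def valid(formula):
    """A formula is valid iff the tableau for its negation closes entirely."""
    return not satisfiable([('not', formula)])


if __name__ == '__main__':
    p, q = ('atom', 'p'), ('atom', 'q')
    # Peirce's law ((p -> q) -> p) -> p is classically valid.
    print(valid(('implies', ('implies', ('implies', p, q), p), p)))  # True
```

The sketch deliberately omits first-order quantifier rules, equality handling and all efficiency refinements; those are exactly the topics the Handbook's chapters develop.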
An accessible guide for those facing the study of Logic for the
first time, this book covers key thinkers, terms and texts. "The
Key Terms in Philosophy" series offers clear, concise and
accessible introductions to the central topics in philosophy. Each
book offers a comprehensive overview of the key terms, concepts,
thinkers and major works in the history of a key area of
philosophy. Ideal for first-year students starting out in
philosophy, the series will serve as the ideal companion to the
study of this fascinating subject. "Key Terms in Logic" offers the ideal
introduction to this core area in the study of philosophy,
providing detailed summaries of the important concepts in the study
of logic and the application of logic to the rest of philosophy. A
brief introduction provides context and background, while the
following chapters offer detailed definitions of key terms and
concepts, introductions to the work of key thinkers and lists of
key texts. Designed specifically to meet the needs of students and
assuming no prior knowledge of the subject, this is the ideal
reference tool for those coming to Logic for the first time. "The
Key Terms" series offers undergraduate students clear, concise and
accessible introductions to core topics. Each book includes a
comprehensive overview of the key terms, concepts, thinkers and
texts in the area covered and ends with a guide to further
resources.
Offering a bold new vision on the history of modern logic, Lukas M.
Verburgt and Matteo Cosci focus on the lasting impact of
Aristotle's syllogism between the 1820s and 1930s. For over two
millennia, deductive logic was the syllogism and syllogism was the
yardstick of sound human reasoning. During the 19th century, this
hegemony fell apart and logicians, including Boole, Frege and
Peirce, took deductive logic far beyond its Aristotelian borders.
However, contrary to common wisdom, reflections on syllogism were
also instrumental to the creation of new logical developments, such
as first-order logic and early set theory. This volume presents the
period under discussion as one of both tradition and innovation,
both continuity and discontinuity. Modern logic broke away from the
syllogistic tradition, but without Aristotle's syllogism, modern
logic would not have been born. A vital follow-up to The Aftermath
of Syllogism, this book traces the longue durée history of
syllogism from Richard Whately's revival of formal logic in the
1820s through the work of David Hilbert and the Goettingen school
up to the 1930s. Bringing together a group of major international
experts, it sheds crucial new light on the emergence of modern
logic and the roots of analytic philosophy in the 19th and early
20th centuries.
Aristotle's treatise De Interpretatione is one of his central
works; it continues to be the focus of much attention and debate.
C. W. A. Whitaker presents the first systematic study of this work,
and offers a radical new view of its aims, its structure, and its
place in Aristotle's system, basing this view upon a detailed
chapter-by-chapter analysis. By treating the work systematically,
rather than concentrating on certain selected passages, Dr Whitaker
is able to show that, contrary to traditional opinion, it forms an
organized and coherent whole. He argues that the De Interpretatione
is intended to provide the underpinning for dialectic, the system
of argument by question and answer set out in Aristotle's Topics ;
and he rejects the traditional view that the De Interpretatione
concerns the assertion and is oriented towards the formal logic of
the Prior Analytics. In doing so, he sheds valuable new light on
some of Aristotle's most famous texts.
such questions for centuries (unrestricted by the capabilities of
any hardware).
The principles governing the interaction of several processes, for
example, are abstract and similar to principles governing the
cooperation of two large organisations. A detailed rule-based,
effective but rigid
bureaucracy is very much similar to a complex computer program
handling and manipulating data. My guess is that the principles
underlying one are very much the same as those underlying the
other.
I believe the day is not far away in the future when the computer scientist will
wake up one morning with the realisation that he is actually a kind
of formal philosopher! The projected number of volumes for this
Handbook is about 18. The
subject has evolved and its areas have become interrelated to such
an extent that it no longer makes sense to dedicate volumes to
topics. However, the volumes do follow some natural groupings of chapters.
I would like to thank our authors and readers for their
contributions and their commitment in making this Handbook a
success. Thanks also to our publication administrator Mrs J. Spurr
for her usual dedication and excellence and to Kluwer Academic
Publishers (now Springer) for
their continuing support for the Handbook. Dov Gabbay King's
College London

Temporal logic:
IT and natural language processing: expressive power of tense
operators; temporal indices; separation of past from future.
Program control specification, verification, concurrency:
expressive power for recurrent events; specification of temporal
control; decision problems; model checking.
Artificial intelligence: planning; time-dependent data; event
calculus; persistence through time (the Frame Problem); temporal
query language; temporal transactions.
Logic programming: extension of Horn clauses with time capability;
event calculus; temporal logic programming.
Plural predication is a pervasive part of ordinary language. We can
say that some people are fifty in number, are surrounding a
building, come from many countries, and are classmates. These
predicates can be true of some people without being true of any one
of them; they are non-distributive predications. Yet the apparatus
of predication and quantification in standard modern logic does not
allow a place for such non-distributive predicates. Thomas McKay's
book explores the enrichment of modern logic with plural
predication and quantification. We can have genuinely
non-distributive predication without relying on singularizing
procedures from set theory and mereology. The fundamental 'among'
relation can be understood in a way that does not generate any
hierarchy of plurals analogous to a hierarchy of types or a
hierarchy of higher-order logics. Singular quantification can be
understood as a special case, with the general type being
quantifiers that allow both singular and plural quantification. The
'among' relation is formally similar to a 'part of' relation, but
the relations are distinct, so that mass quantification and plural
quantification cannot be united in the same way that plural and
singular are united. Analysis of singular and plural definite
descriptions follows, with a defense of a fundamentally Russellian
analysis, but coupled with some new ideas about how to be sensitive
to the role of context. This facilitates an analysis of some
central features of the use of pronouns, both singular and plural.
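The point about non-distributive predication can be made vivid with a small formalisation in the now-common plural idiom; the double variables and the '≺' sign for 'is among' are illustrative conventions of this sketch, not necessarily McKay's own notation.

```latex
% 'Some students surround the building': a non-distributive plural predication.
% xx is a plural variable; x \prec xx reads 'x is among xx'; b names the
% building (all notation here is illustrative).
\[
\exists xx\,\bigl(\forall x\,(x \prec xx \rightarrow \mathit{Student}(x)) \wedge \mathit{Surround}(xx,b)\bigr)
\]
% Surround is predicated of the students collectively: the formula can be true
% even though Surround(x,b) is false of each single student x, which is
% precisely what distributive, singular predication cannot express.
```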
This monograph shows that, through a recourse to the concepts and
methods of abstract algebraic logic, the algebraic theory of
regular varieties and the concept of analyticity in formal logic
can profitably interact. By extending the technique of Plonka sums
from algebras to logical matrices, the authors investigate the
different classes of models for logics of variable inclusion and
they shed new light on their formal properties. The book opens
with the historical origins of logics of variable inclusion and
their philosophical motivations. It includes the basics of the
algebraic theory of regular varieties and the construction of
Plonka sums over semilattice direct systems of algebras. The core of
the book is devoted to an abstract definition of logics of left and
right variable inclusion, respectively, and the authors study their
semantics using the construction of Plonka sums of matrix models.
The authors also cover Paraconsistent Weak Kleene logic and survey
its abstract algebraic logical properties. This book is of interest
to scholars of formal logic.
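For orientation, Paraconsistent Weak Kleene logic can be pinned down by Bochvar's weak three-valued tables together with a choice of designated values; the summary below is the standard textbook presentation, included here only as background and not as an excerpt from the monograph.

```latex
% Weak Kleene (Bochvar) tables over {t, e, f}; the third value e is
% 'infectious': any compound with an e-constituent takes the value e.
% Paraconsistent Weak Kleene (PWK) designates {t, e}; validity is
% preservation of designated values from premises to conclusion.
\[
\begin{array}{c|c}
 & \neg\\\hline
t & f\\
e & e\\
f & t
\end{array}
\qquad
\begin{array}{c|ccc}
\wedge & t & e & f\\\hline
t & t & e & f\\
e & e & e & e\\
f & f & e & f
\end{array}
\qquad
\begin{array}{c|ccc}
\vee & t & e & f\\\hline
t & t & e & t\\
e & e & e & e\\
f & t & e & f
\end{array}
\]
```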
Logic has attained in our century a development incomparably
greater than in any past age of its long history, and this has led
to such an enrichment and proliferation of its aspects, that the
problem of some kind of unified recomprehension of this discipline
seems nowadays unavoidable. This splitting into several subdomains
is the natural consequence of the fact that Logic has intended to
adopt in our century the status of a science. This always implies
that the general optics, under which a certain set of problems used
to be con sidered, breaks into a lot of specialized sectors of
inquiry, each of them being characterized by the introduction of
specific viewpoints and of technical tools of its own. The first
impression, that often accompanies the creation of one of such
specialized branches in a discipline, is that one has succeeded in
isolating the 'scientific core' of it, by restricting the somehow
vague and redundant generality of its original 'philosophical'
configuration. But, after a while, it appears that some of the
discarded aspects are indeed important and a new specialized domain
of investigation is created to explore them. By following this
procedure, one finally finds himself confronted with such a variety
of independent fields of research, that one wonders whether the
fact of labelling them under a common denomination be nothing but
the contingent effect of a pure historical tradition.