This is a monograph about logic. Specifically, it presents the mathematical theory of the logic of bunched implications, BI: I consider BI's proof theory, model theory and computation theory. However, the monograph is also about informatics in a sense which I explain. Specifically, it is about mathematical models of resources and logics for reasoning about resources. I begin with an introduction which presents my (background) view of logic from the point of view of informatics, paying particular attention to three logical topics which have arisen from the development of logic within informatics: * Resources as a basis for semantics; * Proof-search as a basis for reasoning; and * The theory of representation of object-logics in a meta-logic. The ensuing development represents a logical theory which draws upon the mathematical, philosophical and computational aspects of logic. Part I presents the logical theory of propositional BI, together with a computational interpretation. Part II presents a corresponding development for predicate BI. In both parts, I develop proof-, model- and type-theoretic analyses. I also provide semantically-motivated computational perspectives, so beginning a mathematical theory of resources. I have not included any analysis, beyond conjecture, of properties such as decidability, finite models, games or complexity. I prefer to leave these matters to other occasions, perhaps in broader contexts.
I am very happy to have this opportunity to introduce Luca Vigano's book on Labelled Non-Classical Logics. I put forward the methodology of labelled deductive systems to the participants of Logic Colloquium '90 (Labelled Deductive Systems, a Position Paper, in J. Oikkonen and J. Vaananen, editors, Logic Colloquium '90, Volume 2 of Lecture Notes in Logic, pages 66-68, Springer, Berlin, 1993), in an attempt to establish labelling as a recognised and significant component of our logic culture. It was a response to earlier isolated uses of labels by various distinguished authors, as a means to achieve local proof-theoretic goals. Labelling was used in many different areas such as resource labelling in relevance logics, prefix tableaux in modal logics, annotated logic programs in logic programming, proof tracing in truth maintenance systems, and various side annotations in higher-order proof theory, arithmetic and analysis. This widespread local use of labels was an indication of an underlying logical pattern, namely the simultaneous side-by-side manipulation of several kinds of logical information. It was clear that there was a need to establish the labelled deductive systems methodology. Modal logic is one major area where labelling can be developed quickly and systematically with a view to demonstrating its power and significant advantage. In modal logic the labels can play a double role.
Modal logics, originally conceived in philosophy, have recently found many applications in computer science, artificial intelligence, the foundations of mathematics, linguistics and other disciplines. Celebrated for their good computational behaviour, modal logics are used as effective formalisms for talking about time, space, knowledge, beliefs, actions, obligations, provability, etc. However, the nice computational properties can drastically change if we combine some of these formalisms into a many-dimensional system, say, to reason about knowledge bases developing in time or moving objects.
This book is about stochastic Petri nets (SPNs), which have proven to be a popular tool for modelling and performance analysis of complex discrete-event stochastic systems. The focus is on methods for modelling a system as an SPN with general firing times and for studying the long-run behavior of the resulting SPN model using computer simulation. Modelling techniques are illustrated in the context of computer, manufacturing, telecommunication, workflow, and transportation systems. The simulation discussion centers on the theory that underlies estimation procedures such as the regenerative method, the method of batch means, and spectral methods. Tying these topics together are conditions on the building blocks of an SPN under which the net is stable over time and specified estimation procedures are valid. In addition, the book develops techniques for comparing the modelling power of different discrete-event formalisms. These techniques provide a means for making principled choices between alternative modelling frameworks and can also be used to extend stability results and limit theorems from one framework to another. As an overview of fundamental modelling, stability, convergence, and estimation issues for discrete-event systems, this book will be of interest to researchers and graduate students in Applied Mathematics, Operations Research, Applied Probability, and Statistics. This book will also be of interest to practitioners of Industrial, Computer, Transportation, and Electrical Engineering, because it provides an introduction to a powerful set of tools both for modelling and for simulation-based performance analysis. Peter J. Haas is a member of the Research Staff at the IBM Almaden Research Center in San Jose, California. He also teaches Computer Simulation at Stanford University and is an Associate Editor (Simulation Area) for Operations Research.
"Mathematics in Kant's Critical Philosophy" provides a much needed reading (and re-reading) of Kant's theory of the construction of mathematical concepts through a fully contextualized analysis. In this work Lisa Shabel convincingly argues that only through an understanding of the relevant eighteenth-century mathematics textbooks, and the related mathematical practice, can the material and context necessary for a successful interpretation of Kant's philosophy be provided. This is borne out through sustained readings of Euclid and Wolff in particular, which, when brought together with Kant's work, allow for the elucidation of several key issues and the reinterpretation of many hitherto opaque and long-debated passages.
Domain theory is a rich interdisciplinary area at the intersection of logic, computer science, and mathematics. This volume contains selected papers presented at the International Symposium on Domain Theory which took place in Shanghai in October 1999. Topics of papers range from the encounters between topology and domain theory, sober spaces, Lawson topology, real number computability and continuous functionals to fuzzy modelling, logic programming, and pi-calculi. This book is a valuable reference for researchers and students interested in this rapidly developing area of theoretical computer science.
Per Martin-Löf's work on the development of constructive type theory has been of huge significance in the fields of logic and the foundations of mathematics. It is also of broader philosophical significance, and has important applications in areas such as computing science and linguistics. This volume draws together contributions from researchers whose work builds on the theory developed by Martin-Löf over the last twenty-five years. As well as celebrating the anniversary of the birth of the subject it covers many of the diverse fields which are now influenced by type theory. It is an invaluable record of areas of current activity, but also contains contributions from N. G. de Bruijn and William Tait, both important figures in the early development of the subject. Also published for the first time is one of Per Martin-Löf's earliest papers.
This volume in the Synthese Library Series is the result of a conference held at the University of Roskilde, Denmark, October 31st-November 1st, 1997. The aim was to provide a forum within which philosophers, mathematicians, logicians and historians of mathematics could exchange ideas pertaining to the historical and philosophical development of proof theory. Hence the conference was called Proof Theory: History and Philosophical Significance. To quote from the conference abstract: Proof theory was developed as part of Hilbert's Programme. According to Hilbert's Programme one could provide mathematics with a firm and secure foundation by formalizing all of mathematics and subsequently proving the consistency of these formal systems by finitistic means. Hence proof theory was developed as a formal tool through which this goal should be fulfilled. It is well known that Hilbert's Programme in its original form was unfeasible, mainly due to Gödel's incompleteness theorems. It proved impossible to formalize all of mathematics, and impossible even to prove the consistency of relatively simple formalized fragments of mathematics by finitistic methods. In spite of these problems, Gentzen showed that by extending Hilbert's proof theory it would be possible to prove the consistency of interesting formal systems, perhaps not by finitistic methods but still by methods of minimal strength. This generalization of Hilbert's original programme has fueled modern proof theory, which is a rich part of mathematical logic with many significant implications for the philosophy of mathematics.
From the Introduction: "We shall base our discussion on a set-theoretical foundation like that used in developing analysis, or algebra, or topology. We may consider our task as that of giving a mathematical analysis of the basic concepts of logic and mathematics themselves. Thus we treat mathematical and logical practice as given empirical data and attempt to develop a purely mathematical theory of logic abstracted from these data." There are 31 chapters in 5 parts and approximately 320 exercises marked by difficulty and whether or not they are necessary for further work in the book.
This is the first treatment in book format of proof-theoretic transformations - known as proof interpretations - that focuses on applications to ordinary mathematics. It covers both the logical machinery behind the proof interpretations used in recent applications and, via extended case studies, the carrying out of some of these applications in full detail. This subject has historical roots in the 1950s. This book tells the whole story for the first time.
This research text addresses the logical aspects of the visualization of information with papers especially commissioned for this book. The authors explore the logical properties of diagrams, charts, maps, and the like, and their use in problem solving and in teaching basic reasoning skills. As computers make visual presentations of information ever more commonplace, it becomes increasingly important for the research community to develop an understanding of such tools.
The forms and scope of logic rest on assumptions of how language and reasoning connect to experience. In this volume an analysis of meaning and truth provides a foundation for studying modern propositional and predicate logics. Chapters on propositional logic, parsing propositions, and meaning, truth and reference give a basis for criteria that can be used to judge formalizations of ordinary language arguments. Over 120 worked examples of formalizations of propositions and arguments illustrate the scope and limitations of modern logic, as analyzed in chapters on identity, quantifiers, descriptive names, functions, and second-order logic. The chapter on second-order logic illustrates how different conceptions of predicates and propositions do not lead to a common basis for quantification over predicates, as they do for quantification over things. Notable for its clarity of presentation, and supplemented by many exercises, this volume is suitable for philosophers, linguists, mathematicians, and computer scientists who wish to better understand the tools they use in formalizing reasoning.
The main aim of this monograph is to provide a structured study of the algebraic method in metalogic. In contrast to traditional algebraic logic, where the focus is on the algebraic forms of specific deductive systems, abstract algebraic logic is concerned with the process of algebraization itself. This book presents in a systematic way recent ideas in abstract algebraic logic centered around the notion of the Leibniz operator. The stress is put on the taxonomy of deductive systems. Isolating a list of plausible properties of the Leibniz operator serves as a basis for distinguishing certain natural classes of sentential logics. The hierarchy of deductive systems presented in the book comprises, among others, the following classes: protoalgebraic logics, equivalential logics, algebraizable logics, and Fregean logics. Because of the intimate connection between algebraic and logical structures, the book also provides a uniform treatment of various topics concerning deduction theorems and quasivarieties of algebras. The presentation of the above classes of logics is accompanied by a wealth of examples illustrating the general theory. An essential part of the book is formed by the numerous exercises integrated into the text. This book is suitable both for logically and algebraically minded graduate and advanced graduate students of mathematics, computer science and philosophy, and as a reference work for the expert.
Logic and the Modalities in the Twentieth Century is an indispensable research tool for anyone interested in the development of logic, including researchers, graduate and senior undergraduate students in logic, history of logic, mathematics, history of mathematics, computer science and artificial intelligence, linguistics, cognitive science, argumentation theory, philosophy, and the history of ideas.
This book is addressed primarily to researchers specializing in mathematical logic. It may also be of interest to students completing a Masters Degree in mathematics and desiring to embark on research in logic, as well as to teachers at universities and high schools, mathematicians in general, or philosophers wishing to gain a more rigorous conception of deductive reasoning. The material stems from lectures read from 1962 to 1968 at the Faculte des Sciences de Paris and since 1969 at the Universities of Provence and Paris-VI. The only prerequisites demanded of the reader are elementary combinatorial theory and set theory. We lay emphasis on the semantic aspect of logic rather than on syntax; in other words, we are concerned with the connection between formulas and the multirelations, or models, which satisfy them. In this context considerable importance attaches to the theory of relations, which yields a novel approach and algebraization of many concepts of logic. The present two-volume edition considerably widens the scope of the original French one-volume edition (1967: Relation, Formule logique, Compacite, Completude). The new Volume 1 (1971: Relation et Formule logique) reproduces the old Chapters 1, 2, 3, 4, 5 and 8, redivided as follows: Word, formula (Chapter 1), Connection (Chapter 2), Relation, operator (Chapter 3), Free formula (Chapter 4), Logical formula, denumerable-model theorem (Löwenheim-Skolem) (Chapter 5), Completeness theorem (Gödel-Herbrand) and Interpolation theorem (Craig-Lyndon) (Chapter 6), Interpretability of relations (Chapter 7).
This monograph provides a thorough analysis of two important formalisms for nonmonotonic reasoning: default logic and modal nonmonotonic logics. It is also shown how they are related to each other and how they provide the formal foundations for logic programming. The discussion is rigorous, and all main results are formally proved. Many of the results are deep and surprising, some of them previously unpublished. The book has three parts, on default logic, modal nonmonotonic logics, and connections and complexity issues, respectively. The study of general default logic is followed by a discussion of normal default logic and its connections to the closed world assumption, and also a presentation of related aspects of logic programming. The general theory of the family of modal nonmonotonic logics introduced by McDermott and Doyle is followed by studies of autoepistemic logic, the logic of reflexive knowledge, and the logic of pure necessitation, and also a short discussion of algorithms for computing knowledge and belief sets. The third part explores connections between default logic and modal nonmonotonic logics and contains results on the complexity of nonmonotonic reasoning. The ideas are presented with an elegance and unity of perspective that set a new standard of scholarship for books in this area, and the work indicates that the field has reached a very high level of maturity and sophistication. The book is intended as a reference on default logic, nonmonotonic logics, and related computational issues, and is addressed to researchers, programmers, and graduate students in the Artificial Intelligence community.
Information is precious. It reduces our uncertainty in making decisions. Knowledge about the outcome of an uncertain event gives the possessor an advantage. It changes the course of lives, nations, and history itself. Information is the food of Maxwell's demon. His power comes from knowing which particles are hot and which particles are cold. His existence was paradoxical to classical physics, and only the realization that information too was a source of power led to his taming. Information has recently become a commodity, traded and sold like orange juice or hog bellies. Colleges give degrees in information science and information management. Technology of the computer age has provided access to information in overwhelming quantity. Information has become something worth studying in its own right. The purpose of this volume is to introduce key developments and results in the area of generalized information theory, a theory that deals with uncertainty-based information within mathematical frameworks that are broader than classical set theory and probability theory. The volume is organized as follows.
This book features a unique approach to the teaching of mathematical logic by putting it in the context of the puzzles and paradoxes of common language and rational thought. It serves as a bridge from the author's puzzle books to his technical writing in the fascinating field of mathematical logic. Using the logic of lying and truth-telling, the author introduces the readers to informal reasoning, preparing them for the formal study of symbolic logic, from propositional logic to first-order logic, a subject that has many important applications to philosophy, mathematics, and computer science. The book includes a journey through the amazing labyrinths of infinity, which have stirred the imagination of mankind as much, if not more, than any other subject.
Many philosophers have considered logical reasoning as an inborn ability of mankind and as a distinctive feature of the human mind; but we all know that the distribution of this capacity, or at any rate its development, is very unequal. Few people are able to set up a cogent argument; others are at least able to follow a logical argument and even to detect logical fallacies. Nevertheless, even among educated persons there are many who do not even attain this relatively modest level of development. According to my personal observations, lack of logical ability may be due to various circumstances. In the first place, I mention lack of general intelligence, insufficient power of concentration, and absence of formal education. Secondly, however, I have noticed that many people are unable, or sometimes rather unwilling, to argue ex hypothesi; such persons cannot, or will not, start from premisses which they know or believe to be false or even from premisses whose truth is not, in their opinion, sufficiently warranted. Or, if they agree to start from such premisses, they sooner or later stray away from the argument into attempts first to settle the truth or falsehood of the premisses. Presumably this attitude results either from lack of imagination or from undue moral rectitude. On the other hand, proficiency in logical reasoning is not in itself a guarantee of a clear theoretic insight into the principles and foundations of logic.
This book is concerned with advances in serial-data computational architectures, and the CAD tools for their implementation in silicon. The bit-serial tradition at Edinburgh University (EU) stretches back some 6 years to the conception of the FIRST silicon compiler. FIRST owes much of its inspiration to Dick Lyon, then at Xerox PARC, who proposed a 'structured-design' methodology for construction of signal processing systems from bit-serial building blocks. Based on an nMOS cell-library, FIRST automates much of Lyon's physical design process. More recently, we began to feel that FIRST should be able to exploit more modern technologies. Before this could be achieved, we were faced with a massive manual re-design task, i.e. the porting of the FIRST cell-library to a new technology. As it was to avoid such tasks that FIRST was conceived in the first place, we decided to move the level of user-specification much nearer to the silicon level (while still hiding details of transistor circuit design, place and route etc., from the user), and by so doing, enable the specification of more functionally powerful libraries in technology-free form. The results of this work are in evidence as advances in serial-data design techniques, and the SECOND silicon compiler, introduced later in this book. These achievements could not have been accomplished without help from various sources. We take this opportunity to thank Profs.
Towards the end of the nineteenth century, Frege gave us the abstraction principles and the general notion of functions. Self-application of functions was at the heart of Russell's paradox. This led Russell to introduce type theory in order to avoid the paradox. Since then, the twentieth century has seen an amazing number of theories concerned with types and functions, and many applications. Progress in computer science has also meant more and more emphasis on the use of logic, types and functions to study the syntax, semantics, design and implementation of programming languages and theorem provers, and the correctness of proofs and programs. The authors of this book have themselves been leading the way by providing various extensions of type theory which have been shown to bring many advantages. This book gathers much of their influential work and is highly recommended for anyone interested in type theory. The main emphasis is on:
Mathematics is often considered as a body of knowledge that is essentially independent of linguistic formulations, in the sense that, once the content of this knowledge has been grasped, there remains only the problem of professional ability, that of clearly formulating and correctly proving it. However, the question is not so simple, and P. Weingartner's paper (Language and Coding-Dependency of Results in Logic and Mathematics) deals with some results in logic and mathematics which reveal that certain notions are in general not invariant with respect to different choices of language and of coding processes. Five examples are given: 1) The validity of axioms and rules of classical propositional logic depends on the interpretation of sentential variables; 2) The language-dependency of verisimilitude; 3) The proof of the weak and strong anti-inductivist theorems in Popper's theory of inductive support is not invariant with respect to limitative criteria put on classical logic; 4) The language-dependency of the concept of provability; 5) The language-dependency of the existence of ungrounded and paradoxical sentences (in the sense of Kripke). The requirements of logical rigour and consistency are not the only criteria for the acceptance and appreciation of mathematical propositions and theories.