The book presents a thoroughly elaborated logical theory of generalized truth values, understood as subsets of some established set of (basic) truth values. After elucidating the importance of the very notion of a truth value in logic and philosophy, we examine some possible ways of generalizing this notion. The useful four-valued logic of first-degree entailment due to Nuel Belnap and the notion of a bilattice (a lattice of truth values with two ordering relations) constitute the basis for further generalizations. On this basis we elaborate the idea of a multilattice and, most notably, a trilattice of truth values - a specific algebraic structure with an information ordering and two distinct logical orderings, one for truth and another for falsity. Each logical order not only induces its own logical vocabulary but also determines its own entailment relation. We consider both semantic and syntactic ways of formalizing these relations and construct various logical calculi.
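To make the generalization concrete, here is a minimal sketch (an illustration added by the editor, not code from the book) of Belnap's four truth values as subsets of the basic set {T, F}, together with the truth ordering and information ordering that make them a bilattice, in plain Python:

# Illustrative only: Belnap's four generalized truth values as subsets of {T, F}.
NONE_, FALSE, TRUE, BOTH = frozenset(), frozenset("F"), frozenset("T"), frozenset("TF")
VALUES = [NONE_, FALSE, TRUE, BOTH]

def leq_truth(x, y):
    # x <=_t y : y is at least as true and at most as false as x
    return ("T" not in x or "T" in y) and ("F" not in y or "F" in x)

def leq_info(x, y):
    # x <=_i y : y carries at least as much information as x (subset order)
    return x <= y

def conj(x, y):
    # conjunction is the meet under the truth ordering
    return frozenset(("T" if "T" in x and "T" in y else "") +
                     ("F" if "F" in x or "F" in y else ""))

def neg(x):
    # negation swaps the roles of T and F
    return frozenset(("T" if "F" in x else "") + ("F" if "T" in x else ""))

# FALSE and TRUE bound the truth ordering; NONE_ and BOTH bound the information ordering.
assert all(leq_truth(FALSE, v) and leq_truth(v, TRUE) for v in VALUES)
assert all(leq_info(NONE_, v) and leq_info(v, BOTH) for v in VALUES)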
From the Introduction: "We shall base our discussion on a set-theoretical foundation like that used in developing analysis, or algebra, or topology. We may consider our task as that of giving a mathematical analysis of the basic concepts of logic and mathematics themselves. Thus we treat mathematical and logical practice as given empirical data and attempt to develop a purely mathematical theory of logic abstracted from these data." There are 31 chapters in 5 parts and approximately 320 exercises marked by difficulty and whether or not they are necessary for further work in the book.
Chapter 1: The algebraic prerequisites for the book are covered here and in the appendix. This chapter should be used as reference material and consulted as needed. It gives a systematic treatment of algebras, coalgebras, bialgebras, Hopf algebras, and representations of these objects to the extent needed for the book. The material not specifically cited can, with a few exceptions, be found in one form or another in [Sweedler, 1969]. A great deal of emphasis is placed on the coalgebra which is the dual of n x n matrices over a field. This is the most basic example of a coalgebra for our purposes and is at the heart of most algebraic constructions described in this book. We have found pointed bialgebras useful in connection with solving the quantum Yang-Baxter equation, and for this reason we develop their theory in some detail. The class of examples described in Chapter 6 in connection with the quantum double consists of pointed Hopf algebras, and the quantized enveloping algebras described elsewhere are pointed Hopf algebras as well. Thus, for many reasons, pointed bialgebras are of fundamental interest in the study of the quantum Yang-Baxter equation and quantum groups.
This book presents a unifying framework for using priority arguments to prove theorems in computability. Priority arguments provide the most powerful theorem-proving technique in the field, but most of the applications of this technique are ad hoc, masking the unifying principles used in the proofs. The framework presented here isolates many of these unifying combinatorial principles and uses them to give shorter and easier-to-follow proofs of computability-theoretic theorems. Standard theorems of priority levels 1, 2, and 3 are chosen to demonstrate the framework's use, with all proofs following the same pattern. The last section features a new example requiring priority at all finite levels. The book will serve as a resource and reference for researchers in logic and computability, helping them to prove theorems in a shorter and more transparent manner.
This book collects the papers presented at the 4th International Workshop on Logic, Rationality and Interaction (LORI-4), held in October 2013 at the Center for the Study of Language and Cognition, Zhejiang University, Hangzhou, China. LORI is a series that brings together researchers from a variety of logic-related fields: Game and Decision Theory, Philosophy, Linguistics, Computer Science and AI. This edition placed a special emphasis on Norms and Argumentation. Out of 42 submissions, 23 full papers and 11 short contributions were selected through peer review for inclusion in the workshop program and in this volume. The quality and diversity of these contributions attest to a lively, fast-growing, and interdisciplinary community working at the intersection of logic and rational interaction.
Quantitative Evaluation of Fire and EMS Mobilization Times presents comprehensive empirical data on fire emergency and EMS call processing and turnout times, and aims to improve the operational benchmarks of NFPA peer consensus standards through a close examination of real-world data. The book also identifies and analyzes the elements that can influence EMS mobilization response times. Quantitative Evaluation of Fire and EMS Mobilization Times is intended for practitioners as a tool for analyzing fire emergency response times and developing methods for improving them. Researchers working in a related field will also find the book valuable.
Compactness in topology and finite generation in algebra are nice properties to start with. However, the study of compact spaces leads naturally to non-compact spaces and infinitely generated chain complexes; a classical example is the theory of covering spaces. In handling non-compact spaces we must take into account the infinity behaviour of such spaces. This necessitates modifying the usual topological and algebraic categories to obtain "proper" categories in which objects are equipped with a "topologized infinity" and in which morphisms are compatible with the topology at infinity. The origins of proper (topological) category theory go back to 1923, when Kerékjártó [VT] established the classification of non-compact surfaces by adding to orientability and genus a new invariant, consisting of a set of "ideal points" at infinity. Later, Freudenthal [ETR] gave a rigorous treatment of the topology of "ideal points" by introducing the space of "ends" of a non-compact space. In spite of its early appearance, proper category theory was not recognized as a distinct area of topology until the late 1960s with the work of Siebenmann [OFB], [IS], [DES] on non-compact manifolds.
Henkin-Keisler models emanate from a modification of the Henkin construction introduced by Keisler to motivate the definition of ultraproducts. Keisler modified the Henkin construction at the point at which 'new' individual constants are introduced, and did so in a way that illuminates a connection between Henkin-Keisler models and ultraproducts. The resulting construction can be viewed both as a specialization of the Henkin construction and as an alternative to the ultraproduct construction. These aspects of the Henkin-Keisler construction are utilized here to present a perspective on ultraproducts and their applications accessible to any reader familiar with Henkin's proof of the completeness of first-order logic and naive set theory. This approach culminates in proofs of various forms of the Keisler-Shelah characterizations of elementary equivalence and elementary classes via Henkin-Keisler models. The presentation is self-contained, and proofs of more advanced results from set theory are introduced as needed. Audience: Logicians in philosophy, computer science, linguistics and mathematics.
The theory of oppositions, based on the Aristotelian foundations of logic, has been pictured in a striking square diagram which can be understood and applied in many different ways, with repercussions in various fields: epistemology, linguistics, mathematics, sociology, physics. The square can also be generalized to other two-dimensional or multi-dimensional objects, extending the original Aristotelian theory in breadth and depth. From its origin in antiquity to the present day, the square of opposition continues to exert a profound impact on the development of deductive logic. For the past ten years there has been new and growing interest in the square, due to recent discoveries and challenging interpretations. This book presents a collection of previously unpublished papers on the square by leading specialists from all over the world.
This book, which is based on Polya's method of problem solving, aids students in their transition from calculus (or precalculus) to higher-level mathematics. The book begins by providing a great deal of guidance on how to approach definitions, examples, and theorems in mathematics and ends with suggested projects for independent study. Students will follow Polya's four-step approach: analyzing the problem, devising a plan to solve the problem, carrying out that plan, and then determining the implications of the result. In addition to the Polya approach to proofs, this book places special emphasis on reading proofs carefully and writing them well. The authors have included a wide variety of problems, examples, illustrations and exercises, some with hints and solutions, designed specifically to improve the student's ability to read and write proofs. Historical connections are made throughout the text, and students are encouraged to use the rather extensive bibliography to begin making connections of their own. While standard texts in this area prepare students for future courses in algebra, this book also includes chapters on sequences, convergence, and metric spaces for those wanting to bridge the gap between the standard course in calculus and one in analysis.
This is the first book on cut-elimination in first-order predicate logic from an algorithmic point of view. Instead of just proving the existence of cut-free proofs, it focuses on the algorithmic methods transforming proofs with arbitrary cuts to proofs with only atomic cuts (atomic cut normal forms, so-called ACNFs). The first part investigates traditional reductive methods from the point of view of proof rewriting. Within this general framework, generalizations of Gentzen's and Schütte-Tait's cut-elimination methods are defined and shown to terminate with ACNFs of the original proof. Moreover, a complexity-theoretic comparison of Gentzen's and Tait's methods is given. The core of the book centers around the cut-elimination method CERES (cut-elimination by resolution) developed by the authors. CERES is based on the resolution calculus and radically differs from the reductive cut-elimination methods. The book shows that CERES asymptotically outperforms all reductive methods based on Gentzen's cut-reduction rules. It obtains this result by heavy use of subsumption theorems in clause logic. Moreover, several applications of CERES are given (to interpolation, complexity analysis of cut-elimination, generalization of proofs, and to the analysis of real mathematical proofs). Lastly, the book demonstrates that CERES can be extended to nonclassical logics, in particular to finitely-valued logics and to Gödel logic.
There are two aspects to the theory of Boolean algebras: the algebraic and the set-theoretical. A Boolean algebra can be considered as a special kind of algebraic ring, or as a generalization of the set-theoretical notion of a field of sets. Fundamental theorems in both of these directions are due to M. H. STONE, whose papers have opened a new era in the development of this theory. This work treats the set-theoretical aspect, with little mention being made of the algebraic one. The book is composed of two chapters and an appendix. Chapter I is devoted to the study of Boolean algebras from the point of view of finite Boolean operations only; a greater part of its contents can be found in the books of BIRKHOFF [2] and HERMES [1]. Chapter II seems to be the first systematic study of Boolean algebras with infinite Boolean operations. To understand Chapters I and II it suffices only to know fundamental notions from general set theory and set-theoretical topology. No knowledge of lattice theory or of abstract algebra is presumed. Less familiar topological theorems are recalled, and only a few examples use more advanced topological means; but these may be omitted. All theorems in both chapters are given with full proofs.
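As a concrete illustration of the set-theoretical viewpoint (a sketch added by the editor, not an excerpt from the book), the power set of a finite set under union, intersection and complement is a field of sets and hence a Boolean algebra; the following Python snippet spot-checks two of the defining identities:

# Spot-check Boolean-algebra identities in the field of all subsets of {1, 2, 3}.
from itertools import combinations

UNIVERSE = frozenset({1, 2, 3})
SETS = [frozenset(c) for r in range(len(UNIVERSE) + 1)
        for c in combinations(UNIVERSE, r)]

def complement(a):
    return UNIVERSE - a

# Distributivity: A intersect (B union C) = (A intersect B) union (A intersect C)
assert all(a & (b | c) == (a & b) | (a & c)
           for a in SETS for b in SETS for c in SETS)
# De Morgan: the complement of a union is the intersection of the complements
assert all(complement(a | b) == complement(a) & complement(b)
           for a in SETS for b in SETS)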
Fuzzy Logic: State of the Art covers a wide range of both theory and applications of fuzzy sets, ranging from mathematical basics, through artificial intelligence, computer management and systems science to engineering applications. Fuzzy Logic will be of interest to researchers working in fuzzy set theory and its applications.
This book constitutes the refereed proceedings of the 11th International Conference on Typed Lambda Calculi and Applications, TLCA 2013, held in Eindhoven, The Netherlands, in June 2013 as part of RDP 2013, the 7th Federated Conference on Rewriting, Deduction, and Programming, together with the 24th International Conference on Rewriting Techniques and Applications, RTA 2013, and several related events.
In recent years, there have been several attempts to define a logic for information retrieval (IR). The aim was to provide a rich and uniform representation of information and its semantics with the goal of improving retrieval effectiveness. The basis of a logical model for IR is the assumption that queries and documents can be represented effectively by logical formulae. To retrieve a document, an IR system has to infer the formula representing the query from the formula representing the document. This logical interpretation of query and document emphasizes that relevance in IR is an inference process. The use of logic to build IR models enables one to obtain models that are more general than earlier well-known IR models. Indeed, some logical models are able to represent within a uniform framework various features of IR systems such as hypermedia links, multimedia data, and users' knowledge. Logic also provides a common approach to the integration of IR systems with logical database systems. Finally, logic makes it possible to reason about an IR model and its properties. This latter possibility is becoming increasingly important, since conventional evaluation methods, although good indicators of the effectiveness of IR systems, often give results which cannot be predicted, or for that matter satisfactorily explained. However, logic by itself cannot fully model IR. The success or failure of the inference of the query formula from the document formula is not enough to model relevance in IR. It is necessary to take into account the uncertainty inherent in such an inference process. In 1986, Van Rijsbergen proposed the uncertainty logical principle to model relevance as an uncertain inference process. When proposing the principle, Van Rijsbergen was not specific about which logic and which uncertainty theory to use. As a consequence, various logics and uncertainty theories have been proposed and investigated. The choice of an appropriate logic and uncertainty mechanism has been a main research theme in logical IR modeling, leading to a number of logical IR models over the years. Information Retrieval: Uncertainty and Logics contains a collection of exciting papers proposing, developing and implementing logical IR models. This book is appropriate for use as a text for a graduate-level course on Information Retrieval or Database Systems, and as a reference for researchers and practitioners in industry.
also in: THE KLUWER INTERNATIONAL SERIES ON ASIAN STUDIES IN COMPUTER AND INFORMATION SCIENCE, Volume 2
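The inference-plus-uncertainty view described in the Information Retrieval blurb above can be pictured with a deliberately crude toy (an example added by the editor; it is neither Van Rijsbergen's principle nor any particular model from the book), in which documents and queries are sets of atomic propositions, inference is containment, and a document's score is the fraction of the query it supports:

# Toy "uncertain inference" ranking: 1.0 means the query formula is fully inferred
# from the document formula; smaller scores reflect failed inference steps.
def uncertain_inference_score(document_terms, query_terms):
    if not query_terms:
        return 1.0
    return len(query_terms & document_terms) / len(query_terms)

docs = {
    "d1": {"retrieval", "logic", "uncertainty"},
    "d2": {"databases", "logic"},
}
query = {"logic", "uncertainty"}
ranking = sorted(docs, key=lambda d: uncertain_inference_score(docs[d], query),
                 reverse=True)
print(ranking)  # ['d1', 'd2']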
Neural Networks and Fuzzy Systems: Theory and Applications discusses theories that have proven useful in applying neural networks and fuzzy systems to real world problems. The book includes performance comparison of neural networks and fuzzy systems using data gathered from real systems. Topics covered include the Hopfield network for combinatorial optimization problems, multilayered neural networks for pattern classification and function approximation, fuzzy systems that have the same functions as multilayered networks, and composite systems that have been successfully applied to real world problems. The author also includes representative neural network models such as the Kohonen network and radial basis function network. New fuzzy systems with learning capabilities are also covered. The advantages and disadvantages of neural networks and fuzzy systems are examined. The performance of these two systems in license plate recognition, a water purification plant, blood cell classification, and other real world problems is compared.
Computer systems that analyze images are critical to a wide variety of applications, such as visual inspection systems for various manufacturing processes, remote sensing of the environment from space-borne imaging platforms, and automatic diagnosis from X-rays and other medical imaging sources. Professor Azriel Rosenfeld, the founder of the field of digital image analysis, made fundamental contributions to a wide variety of problems in image processing, pattern recognition and computer vision. Professor Rosenfeld's former students, postdoctoral scientists, and colleagues illustrate in Foundations of Image Understanding how current research has been influenced by his work as the leading researcher in the area of image analysis for over two decades. Each chapter of Foundations of Image Understanding is written by one of the world's leading experts in his area of specialization, examining digital geometry and topology (early research which laid the foundations for many industrial machine vision systems), edge detection and segmentation (fundamental to systems that analyze complex images of our three-dimensional world), multi-resolution and variable-resolution representations for images and maps, parallel algorithms and systems for image analysis, and the importance of human psychophysical studies of vision to the design of computer vision systems. Professor Rosenfeld's chapter briefly discusses topics not covered in the contributed chapters, providing a personal, historical perspective on the development of the field of image understanding. Foundations of Image Understanding is an excellent source of basic material for both graduate students entering the field and established researchers who require a compact source for many of the foundational topics in image analysis.
Logic and Philosophy of Mathematics in the Early Husserl focuses on the first ten years of Edmund Husserl's work, from the publication of his Philosophy of Arithmetic (1891) to that of his Logical Investigations (1900/01), and aims to precisely locate his early work in the fields of logic, philosophy of logic and philosophy of mathematics. Unlike most phenomenologists, the author refrains from reading Husserl's early work as a more or less immature sketch of claims consolidated only in his later phenomenology, and unlike the majority of historians of logic she emphasizes the systematic strength and the originality of Husserl's logico-mathematical work. The book attempts to reconstruct the discussion between Husserl and those philosophers and mathematicians who contributed to new developments in logic, such as Leibniz, Bolzano, the logical algebraists (especially Boole and Schröder), Frege, and Hilbert and his school. It presents both a comprehensive critical examination of some of the major works produced by Husserl and his antagonists in the last decade of the 19th century and a formal reconstruction of many texts from Husserl's Nachlass that have not yet been the object of systematic scrutiny. This volume will be of particular interest to researchers working in the history, and in the philosophy, of logic and mathematics, and more generally, to analytical philosophers and phenomenologists with a background in standard logic.
Call-by-push-value is a programming language paradigm that, surprisingly, breaks down the call-by-value and call-by-name paradigms into simple primitives. This monograph, written for graduate students and researchers, exposes the call-by-push-value structure underlying a remarkable range of semantics, including operational semantics, domains, possible worlds, continuations and games.
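A rough intuition for the decomposition (a sketch added by the editor in Python, not Levy's formal calculus) is that call-by-push-value separates values from computations and provides primitives such as thunk and force, from which both evaluation orders can be recovered:

# Informal illustration: suspending and resuming computations with thunk/force.
def thunk(comp):
    # package a computation (a zero-argument callable) as a value
    return lambda: comp()

def force(t):
    # run a suspended computation
    return t()

def loud_arg():
    print("argument evaluated")
    return 21

# Call-by-value: evaluate the argument to a value, then pass it.
def double_cbv(x):
    return 2 * x
double_cbv(loud_arg())          # prints once, before the call returns 42

# Call-by-name: pass a thunked argument; it runs only where it is forced.
def double_cbn(tx):
    return 2 * force(tx)
double_cbn(thunk(loud_arg))     # prints only when forced inside the body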
The main idea of statistical convergence is to demand convergence only for a majority of the elements of a sequence. This method of convergence has been investigated in many fundamental areas of mathematics, such as measure theory, approximation theory, fuzzy logic theory and summability theory. In this monograph we consider this concept in approximating a function by linear operators, especially when the classical limit fails. The results of this book not only cover classical and statistical approximation theory but are also applied to fuzzy logic via fuzzy-valued operators. The authors in particular treat the important Korovkin approximation theory of positive linear operators in the statistical and fuzzy senses. They also present various statistical approximation theorems for some specific real- and complex-valued linear operators that are not positive. This is the first monograph on statistical approximation theory and fuzziness. The chapters are self-contained, and several advanced courses can be taught from the book. The research findings will be useful in various applications, including applied and computational mathematics, stochastics, engineering, artificial intelligence, vision and machine learning. This monograph is directed to graduate students, researchers, practitioners and professors of all disciplines.
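For reference, the standard definition of statistical convergence (a well-known formulation added by the editor, not a quotation from this monograph) can be stated as:

\[
  \operatorname{st\text{-}lim}_{n\to\infty} x_n = L
  \quad\Longleftrightarrow\quad
  \forall \varepsilon > 0:\
  \lim_{n\to\infty} \frac{1}{n}\,
  \bigl|\{\, k \le n : |x_k - L| \ge \varepsilon \,\}\bigr| = 0,
\]

so convergence is required only outside a set of indices of natural density zero, i.e. for "a majority" of the terms.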
This monograph studies the logical aspects of domains as used in denotational semantics of programming languages. Frameworks of domain logics are introduced; these serve as foundations for systematic derivations of proof systems from denotational semantics of programming languages. Any proof system so derived is guaranteed to agree with denotational semantics in the sense that the denotation of any program coincides with the set of assertions true of it. The study focuses on two categories for denotational semantics: SFP domains, and the less standard, but important, category of stable domains. The intended readership of this monograph includes researchers and graduate students interested in the relation between semantics of programming languages and formal means of reasoning about programs. A basic knowledge of denotational semantics, mathematical logic, general topology, and category theory is helpful for a full understanding of the material. Part I: SFP Domains. Chapter 1: Introduction. This chapter provides a brief exposition of domain theory, denotational semantics, program logics, and proof systems. It discusses the importance of ideas and results on logic and topology to the understanding of the relation between denotational semantics and program logics. It also describes the motivation for the work presented in this monograph, and how that work fits into a more general program. Finally, it gives a short summary of the results of each chapter. 1.1 Domain Theory. Programming languages are languages with which to perform computation.
One of the important areas of contemporary combinatorics is Ramsey theory. Ramsey theory is basically the study of structure preserved under partitions. The general philosophy is reflected by its interdisciplinary character. The ideas of Ramsey theory are shared by logicians, set theorists and combinatorists, and have been successfully applied in other branches of mathematics. The whole subject is quickly developing and has some new and unexpected applications in areas as remote as functional analysis and theoretical computer science. This book is a homogeneous collection of research and survey articles by leading specialists. It surveys recent activity in this diverse subject and brings the reader up to the boundary of present knowledge. It covers virtually all main approaches to the subject and suggests various problems for individual research.
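As a tiny concrete instance of structure preserved under partitions (an example added by the editor, not one drawn from the volume): the classical fact R(3,3) = 6 says that every 2-colouring of the edges of the complete graph K6 contains a monochromatic triangle, which the following Python brute-force check confirms:

# Verify that every 2-colouring of the edges of K6 has a monochromatic triangle.
from itertools import combinations, product

vertices = range(6)
edges = list(combinations(vertices, 2))        # the 15 edges of K6
triangles = list(combinations(vertices, 3))    # the 20 candidate triangles

def has_monochromatic_triangle(colouring):
    colour = dict(zip(edges, colouring))
    return any(colour[(a, b)] == colour[(a, c)] == colour[(b, c)]
               for a, b, c in triangles)

# Exhaustive check over all 2^15 = 32768 two-colourings of the edge set.
assert all(has_monochromatic_triangle(c) for c in product((0, 1), repeat=15))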
This book constitutes the thoroughly refereed post-conference proceedings of the 20th International Workshop on Algebraic Development Techniques, WADT 2010, held in July 2010 in Etelsen, Germany. The 15 revised papers presented were carefully reviewed and selected from 32 presentations. The workshop deals with the following topics: foundations of algebraic specification; other approaches to formal specification, including process calculi and models of concurrent, distributed and mobile computing; specification languages, methods, and environments; semantics of conceptual modeling methods and techniques; model-driven development; graph transformations, term rewriting and proof systems; integration of formal specification techniques; formal testing and quality assurance; validation and verification.