Chapter 1. The algebraic prerequisites for the book are covered here and in the appendix. This chapter should be used as reference material and consulted as needed. A systematic treatment of algebras, coalgebras, bialgebras, Hopf algebras, and representations of these objects, to the extent needed for the book, is given. The material here not specifically cited can for the most part be found in [Sweedler, 1969] in one form or another, with a few exceptions. A great deal of emphasis is placed on the coalgebra which is the dual of n x n matrices over a field. This is the most basic example of a coalgebra for our purposes and is at the heart of most algebraic constructions described in this book. We have found pointed bialgebras useful in connection with solving the quantum Yang-Baxter equation. For this reason we develop their theory in some detail. The class of examples described in Chapter 6 in connection with the quantum double consists of pointed Hopf algebras. We note that the quantized enveloping algebras described elsewhere are pointed Hopf algebras. Thus for many reasons pointed bialgebras are of fundamental interest in the study of the quantum Yang-Baxter equation and quantum groups.
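For reference, the coalgebra dual to n x n matrices mentioned above is the standard comatrix coalgebra; on a basis of matrix units its structure maps take the familiar form (our notation, not necessarily the book's):

```latex
% Comatrix coalgebra M^c(n): basis \{e_{ij}\}_{1 \le i,j \le n},
% with comultiplication dualizing matrix multiplication and
% counit dualizing the identity matrix:
\Delta(e_{ij}) = \sum_{k=1}^{n} e_{ik} \otimes e_{kj},
\qquad
\varepsilon(e_{ij}) = \delta_{ij}.
```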
This book presents a unifying framework for using priority arguments to prove theorems in computability theory. Priority arguments provide the most powerful theorem-proving technique in the field, but most applications of this technique are ad hoc, masking the unifying principles used in the proofs. The framework presented here isolates many of these unifying combinatorial principles and uses them to give shorter and easier-to-follow proofs of computability-theoretic theorems. Standard theorems of priority levels 1, 2, and 3 are chosen to demonstrate the framework's use, with all proofs following the same pattern. The last section features a new example requiring priority at all finite levels. The book will serve as a resource and reference for researchers in logic and computability, helping them to prove theorems in a shorter and more transparent manner.
This book collects the papers presented at the 4th International Workshop on Logic, Rationality and Interaction (LORI-4), held in October 2013 at the Center for the Study of Language and Cognition, Zhejiang University, Hangzhou, China. LORI is a workshop series that brings together researchers from a variety of logic-related fields: game and decision theory, philosophy, linguistics, computer science and AI. This edition had a special emphasis on norms and argumentation. Out of 42 submissions, 23 full papers and 11 short contributions were selected through peer review for inclusion in the workshop program and in this volume. The quality and diversity of these contributions attest to a lively, fast-growing, and interdisciplinary community working at the intersection of logic and rational interaction.
Model theory has made substantial contributions to semialgebraic, subanalytic, p-adic, rigid and diophantine geometry. These applications range from a proof of the rationality of certain Poincare series associated to varieties over p-adic fields, to a proof of the Mordell-Lang conjecture for function fields in positive characteristic. In some cases (such as the latter) it is the most abstract aspects of model theory which are relevant. This book, originally published in 2000 and arising from a series of introductory lectures for graduate students, provides the necessary background for understanding both the model theory and the mathematics behind these applications. The book is unique in that the whole spectrum of contemporary model theory (stability, simplicity, o-minimality and variations) is covered and diverse areas of geometry (algebraic, diophantine, real analytic, p-adic, and rigid) are introduced and discussed, all by leading experts in their fields.
Quantitative Evaluation of Fire and EMS Mobilization Times presents comprehensive empirical data on fire emergency and EMS call processing and turnout times, and aims to improve the operational benchmarks of NFPA peer consensus standards through a close examination of real-world data. The book also identifies and analyzes the elements that can influence EMS mobilization response times. Quantitative Evaluation of Fire and EMS Mobilization Times is intended for practitioners as a tool for analyzing fire emergency response times and developing methods for improving them. Researchers working in a related field will also find the book valuable.
Compactness in topology and finite generation in algebra are nice properties to start with. However, the study of compact spaces leads naturally to non-compact spaces and infinitely generated chain complexes; a classical example is the theory of covering spaces. In handling non-compact spaces we must take into account the infinity behaviour of such spaces. This necessitates modifying the usual topological and algebraic categories to obtain "proper" categories in which objects are equipped with a "topologized infinity" and in which morphisms are compatible with the topology at infinity. The origins of proper (topological) category theory go back to 1923, when Kerékjártó [VT] established the classification of non-compact surfaces by adding to orientability and genus a new invariant, consisting of a set of "ideal points" at infinity. Later, Freudenthal [ETR] gave a rigorous treatment of the topology of "ideal points" by introducing the space of "ends" of a non-compact space. In spite of its early appearance, proper category theory was not recognized as a distinct area of topology until the late 1960s, with the work of Siebenmann [OFB], [IS], [DES] on non-compact manifolds.
Henkin-Keisler models emanate from a modification of the Henkin construction introduced by Keisler to motivate the definition of ultraproducts. Keisler modified the Henkin construction at that point at which 'new' individual constants are introduced and did so in a way that illuminates a connection between Henkin-Keisler models and ultraproducts. The resulting construction can be viewed both as a specialization of the Henkin construction and as an alternative to the ultraproduct construction. These aspects of the Henkin-Keisler construction are utilized here to present a perspective on ultraproducts and their applications accessible to the reader familiar with Henkin's proof of the completeness of first order logic and naive set theory. This approach culminates in proofs of various forms of the Keisler-Shelah characterizations of elementary equivalence and elementary classes via Henkin-Keisler models. The presentation is self-contained and proofs of more advanced results from set theory are introduced as needed. Audience: Logicians in philosophy, computer science, linguistics and mathematics.
The theory of oppositions based on Aristotelian foundations of logic has been pictured in a striking square diagram which can be understood and applied in many different ways, with repercussions in various fields: epistemology, linguistics, mathematics, sociology, physics. The square can also be generalized into other two-dimensional or multi-dimensional objects, extending in breadth and depth the original Aristotelian theory. From its origin in antiquity to the present day, the square of opposition has continued to exert a profound impact on the development of deductive logic. Over the past ten years there has been new and growing interest in the square, due to recent discoveries and challenging interpretations. This book presents a collection of previously unpublished papers on the square by leading specialists from all over the world.
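As a reminder of the structure at issue, the traditional square relates the four categorical forms A, E, I, O as follows (a standard summary, not taken from the book itself):

```latex
% Corners of the traditional square of opposition:
%   A: every S is P        E: no S is P
%   I: some S is P         O: some S is not P
\begin{array}{rcl}
  \text{A--O, E--I} &:& \text{contradictories (exactly one of each pair is true)}\\
  \text{A--E}       &:& \text{contraries (never both true)}\\
  \text{I--O}       &:& \text{subcontraries (never both false)}\\
  \text{A} \Rightarrow \text{I},\ \text{E} \Rightarrow \text{O} &:& \text{subalternation}
\end{array}
```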
This is the first book on cut-elimination in first-order predicate logic from an algorithmic point of view. Instead of just proving the existence of cut-free proofs, it focuses on the algorithmic methods transforming proofs with arbitrary cuts to proofs with only atomic cuts (atomic cut normal forms, so-called ACNFs). The first part investigates traditional reductive methods from the point of view of proof rewriting. Within this general framework, generalizations of Gentzen's and Schütte-Tait's cut-elimination methods are defined and shown terminating with ACNFs of the original proof. Moreover, a complexity theoretic comparison of Gentzen's and Tait's methods is given. The core of the book centers around the cut-elimination method CERES (cut elimination by resolution) developed by the authors. CERES is based on the resolution calculus and radically differs from the reductive cut-elimination methods. The book shows that CERES asymptotically outperforms all reductive methods based on Gentzen's cut-reduction rules. It obtains this result by heavy use of subsumption theorems in clause logic. Moreover, several applications of CERES are given (to interpolation, complexity analysis of cut-elimination, generalization of proofs, and to the analysis of real mathematical proofs). Lastly, the book demonstrates that CERES can be extended to nonclassical logics, in particular to finitely-valued logics and to Gödel logic.
There are two aspects to the theory of Boolean algebras; the algebraic and the set-theoretical. A Boolean algebra can be considered as a special kind of algebraic ring, or as a generalization of the set-theoretical notion of a field of sets. Fundamental theorems in both of these directions are due to M. H. STONE, whose papers have opened a new era in the development of this theory. This work treats the set-theoretical aspect, with little mention being made of the algebraic one. The book is composed of two chapters and an appendix. Chapter I is devoted to the study of Boolean algebras from the point of view of finite Boolean operations only; a greater part of its contents can be found in the books of BIRKHOFF [2] and HERMES [1]. Chapter II seems to be the first systematic study of Boolean algebras with infinite Boolean operations. To understand Chapters I and II it suffices only to know fundamental notions from general set theory and set-theoretical topology. No knowledge of lattice theory or of abstract algebra is presumed. Less familiar topological theorems are recalled, and only a few examples use more advanced topological means; but these may be omitted. All theorems in both chapters are given with full proofs.
Fuzzy Logic: State of the Art covers a wide range of both theory and applications of fuzzy sets, ranging from mathematical basics, through artificial intelligence, computer management and systems science to engineering applications. Fuzzy Logic will be of interest to researchers working in fuzzy set theory and its applications.
This book constitutes the refereed proceedings of the 11th
International Conference on Typed Lambda Calculi and Applications,
TLCA 2013, held in Eindhoven, The Netherlands, in June 2013 as part
of RDP 2013, the 7th Federated Conference on Rewriting, Deduction,
and Programming, together with the 24th International Conference on
Rewriting Techniques and Applications, RTA 2013, and several
related events.
also in: THE KLUWER INTERNATIONAL SERIES ON ASIAN STUDIES IN COMPUTER AND INFORMATION SCIENCE, Volume 2
Neural Networks and Fuzzy Systems: Theory and Applications discusses theories that have proven useful in applying neural networks and fuzzy systems to real world problems. The book includes performance comparison of neural networks and fuzzy systems using data gathered from real systems. Topics covered include the Hopfield network for combinatorial optimization problems, multilayered neural networks for pattern classification and function approximation, fuzzy systems that have the same functions as multilayered networks, and composite systems that have been successfully applied to real world problems. The author also includes representative neural network models such as the Kohonen network and radial basis function network. New fuzzy systems with learning capabilities are also covered. The advantages and disadvantages of neural networks and fuzzy systems are examined. The performance of these two systems in license plate recognition, a water purification plant, blood cell classification, and other real world problems is compared.
Computer systems that analyze images are critical to a wide variety of applications such as visual inspection systems for various manufacturing processes, remote sensing of the environment from space-borne imaging platforms, and automatic diagnosis from X-rays and other medical imaging sources. Professor Azriel Rosenfeld, the founder of the field of digital image analysis, made fundamental contributions to a wide variety of problems in image processing, pattern recognition and computer vision. Professor Rosenfeld's previous students, postdoctoral scientists, and colleagues illustrate in Foundations of Image Understanding how current research has been influenced by his work as the leading researcher in the area of image analysis for over two decades. Each chapter of Foundations of Image Understanding is written by one of the world's leading experts in his area of specialization, examining digital geometry and topology (early research which laid the foundations for many industrial machine vision systems), edge detection and segmentation (fundamental to systems that analyze complex images of our three-dimensional world), multi-resolution and variable resolution representations for images and maps, parallel algorithms and systems for image analysis, and the importance of human psychophysical studies of vision to the design of computer vision systems. Professor Rosenfeld's chapter briefly discusses topics not covered in the contributed chapters, providing a personal, historical perspective on the development of the field of image understanding. Foundations of Image Understanding is an excellent source of basic material for both graduate students entering the field and established researchers who require a compact source for many of the foundational topics in image analysis.
Logic and Philosophy of Mathematics in the Early Husserl focuses on the first ten years of Edmund Husserl's work, from the publication of his Philosophy of Arithmetic (1891) to that of his Logical Investigations (1900/01), and aims to precisely locate his early work in the fields of logic, philosophy of logic and philosophy of mathematics. Unlike most phenomenologists, the author refrains from reading Husserl's early work as a more or less immature sketch of claims consolidated only in his later phenomenology, and unlike the majority of historians of logic she emphasizes the systematic strength and the originality of Husserl's logico-mathematical work. The book attempts to reconstruct the discussion between Husserl and those philosophers and mathematicians who contributed to new developments in logic, such as Leibniz, Bolzano, the logical algebraists (especially Boole and Schröder), Frege, and Hilbert and his school. It presents both a comprehensive critical examination of some of the major works produced by Husserl and his antagonists in the last decade of the 19th century and a formal reconstruction of many texts from Husserl's Nachlass that have not yet been the object of systematic scrutiny. This volume will be of particular interest to researchers working in the history, and in the philosophy, of logic and mathematics, and more generally, to analytical philosophers and phenomenologists with a background in standard logic.
Call-by-push-value is a programming language paradigm that,
surprisingly, breaks down the call-by-value and call-by-name
paradigms into simple primitives. This monograph, written for
graduate students and researchers, exposes the call-by-push-value
structure underlying a remarkable range of semantics, including
operational semantics, domains, possible worlds, continuations and
games.
The main idea of statistical convergence is to demand convergence only for a majority of elements of a sequence. This method of convergence has been investigated in many fundamental areas of mathematics, such as measure theory, approximation theory, fuzzy logic theory, and summability theory. In this monograph we consider this concept in approximating a function by linear operators, especially when the classical limit fails. The results of this book not only cover classical and statistical approximation theory but are also applied in fuzzy logic via fuzzy-valued operators. The authors in particular treat the important Korovkin approximation theory of positive linear operators in the statistical and fuzzy senses. They also present various statistical approximation theorems for some specific real and complex-valued linear operators that are not positive. This is the first monograph on statistical approximation theory and fuzziness. The chapters are self-contained, and several advanced courses can be taught from them. The research findings will be useful in various applications, including applied and computational mathematics, stochastics, engineering, artificial intelligence, vision and machine learning. This monograph is directed to graduate students, researchers, practitioners and professors of all disciplines.
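The opening idea, requiring closeness to the limit only outside an index set of natural density zero, can be checked numerically. Below is a minimal illustration (the function and sequence are our own, not from the book): the indicator sequence of the perfect squares has no classical limit, yet the density of its nonzero terms shrinks to zero, so it statistically converges to 0.

```python
import math

def outlier_density(x, L, eps, N):
    """Fraction of the first N terms lying at distance >= eps from L.

    x_n statistically converges to L when, for every eps > 0,
    this fraction tends to 0 as N grows (natural density zero).
    """
    bad = sum(1 for n in range(1, N + 1) if abs(x(n) - L) >= eps)
    return bad / N

# x_n = 1 when n is a perfect square, 0 otherwise: the ordinary limit
# fails (the sequence is 1 infinitely often and 0 infinitely often),
# but the "bad" indices are the squares, whose density is ~1/sqrt(N).
x = lambda n: 1.0 if math.isqrt(n) ** 2 == n else 0.0

for N in (100, 10_000, 1_000_000):
    # densities: 0.1, 0.01, 0.001 -> tends to 0
    print(N, outlier_density(x, L=0.0, eps=0.5, N=N))
```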
This monograph studies the logical aspects of domains as used in denotational semantics of programming languages. Frameworks of domain logics are introduced; these serve as foundations for systematic derivations of proof systems from denotational semantics of programming languages. Any proof system so derived is guaranteed to agree with denotational semantics in the sense that the denotation of any program coincides with the set of assertions true of it. The study focuses on two categories for denotational semantics: SFP domains, and the less standard, but important, category of stable domains. The intended readership of this monograph includes researchers and graduate students interested in the relation between semantics of programming languages and formal means of reasoning about programs. A basic knowledge of denotational semantics, mathematical logic, general topology, and category theory is helpful for a full understanding of the material. Part I: SFP Domains. Chapter 1: Introduction. This chapter provides a brief exposition of domain theory, denotational semantics, program logics, and proof systems. It discusses the importance of ideas and results on logic and topology to the understanding of the relation between denotational semantics and program logics. It also describes the motivation for the work presented in this monograph, and how that work fits into a more general program. Finally, it gives a short summary of the results of each chapter. 1.1 Domain Theory. Programming languages are languages with which to perform computation.
One of the important areas of contemporary combinatorics is Ramsey theory. Ramsey theory is basically the study of structure preserved under partitions. The general philosophy is reflected by its interdisciplinary character. The ideas of Ramsey theory are shared by logicians, set theorists and combinatorialists, and have been successfully applied in other branches of mathematics. The whole subject is developing quickly and has some new and unexpected applications in areas as remote as functional analysis and theoretical computer science. This book is a homogeneous collection of research and survey articles by leading specialists. It surveys recent activity in this diverse subject and brings the reader up to the boundary of present knowledge. It covers virtually all main approaches to the subject and suggests various problems for individual research.
This book constitutes the thoroughly refereed post-conference proceedings of the 20th International Workshop on Algebraic Development Techniques, WADT 2010, held in July 2010 in Etelsen, Germany. The 15 revised papers presented were carefully reviewed and selected from 32 presentations. The workshop deals with the following topics: foundations of algebraic specification; other approaches to formal specification, including process calculi and models of concurrent, distributed and mobile computing; specification languages, methods, and environments; semantics of conceptual modeling methods and techniques; model-driven development; graph transformations, term rewriting and proof systems; integration of formal specification techniques; formal testing and quality assurance; validation and verification.
The concept of infinity is one of the most important, and at the same time, one of the most mysterious concepts of science. Already in antiquity many philosophers and mathematicians pondered over its contradictory nature. In mathematics, the contradictions connected with infinity intensified after the creation, at the end of the 19th century, of the theory of infinite sets and the subsequent discovery, soon after, of paradoxes in this theory. At the time, many scientists ignored the paradoxes and used set theory extensively in their work, while others subjected set-theoretic methods in mathematics to harsh criticism. The debate intensified when a group of French mathematicians, who wrote under the pseudonym of Nicolas Bourbaki, tried to erect the whole edifice of mathematics on the single notion of a set. Some mathematicians greeted this attempt enthusiastically while others regarded it as an unnecessary formalization, an attempt to tear mathematics away from life-giving practical applications that sustain it. These differences notwithstanding, Bourbaki has had a significant influence on the evolution of mathematics in the twentieth century. In this book we try to tell the reader how the idea of the infinite arose and developed in physics and in mathematics, how the theory of infinite sets was constructed, what paradoxes it has led to, what significant efforts have been made to eliminate the resulting contradictions, and what routes scientists are trying to find that would provide a way out of the many difficulties.
This is a collection of surveys and research papers on topics of interest in combinatorics, given at a conference in Mátraháza, Hungary. Originally published in journal form, it is here reissued as a book due to its special interest. It is dedicated to Paul Erdős, who attended the conference and who is represented by two articles in the collection, including one, unfinished, which he was writing on the eve of his sudden death. Erdős was one of the greatest mathematicians of his century and often the subject of anecdotes about his somewhat unusual lifestyle. A preface, written by friends and colleagues, gives a flavour of his life, including many such stories, and also describes the broad outline and importance of his work in combinatorics and other related fields. Here is a succinct introduction to important ideas in combinatorics for researchers and graduate students.
"Kind of crude, but it works, boy, it works." Alan Newell to Herb Simon, Christmas 1955. In 1954 a computer program produced what appears to be the first computer-generated mathematical proof: written by M. Davis at the Institute for Advanced Study, USA, it proved a number-theoretic theorem in Presburger arithmetic. Christmas 1955 heralded a computer program which generated the first proofs of some propositions of Principia Mathematica, developed by A. Newell, J. Shaw, and H. Simon at RAND Corporation, USA. In Sweden, H. Prawitz, D. Prawitz, and N. Voghera produced the first general program for the full first-order predicate calculus to prove mathematical theorems; their computer proofs were obtained around 1957 and 1958, about the same time that H. Gelernter finished a computer program to prove simple high-school geometry theorems. Since the field of computational logic (or automated theorem proving) is emerging from the ivory tower of academic research into real-world applications, asserting also a definite place in many university curricula, we feel the time has come to examine and evaluate its history. The article by Martin Davis in the first of this series of volumes traces the most influential ideas back to the 'prehistory' of early logical thought, showing how these ideas influenced the underlying concepts of most early automatic theorem proving programs.