Relational methods can be found at various places in computer science, notably in database theory, relational semantics of concurrency, relational type theory, analysis of rewriting systems, and modern programming language design. In addition, they appear in the analysis of algorithms and in the bulk of discrete mathematics taught to computer scientists. This book is devoted to the background of these methods. It explains how to use relational and graph-theoretic methods systematically in computer science. A powerful formal framework of relational algebra is developed with respect to applications to a diverse range of problem areas. Results are first motivated by practical examples, often visualized by both Boolean 0-1-matrices and graphs, and then derived algebraically.
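The 0-1-matrix view of relations mentioned above is easy to sketch. The following toy example (not taken from the book; the relation and names are invented) composes a relation with itself via a Boolean matrix product:

```python
# A relation R ⊆ A×B as a Boolean 0-1 matrix: R[i][j] == 1 iff a_i R b_j.
def compose(R, S):
    """Boolean matrix product: (R;S)[i][j] = OR_k (R[i][k] AND S[k][j])."""
    rows, inner, cols = len(R), len(S), len(S[0])
    return [[int(any(R[i][k] and S[k][j] for k in range(inner)))
             for j in range(cols)] for i in range(rows)]

# "i is a parent of j" composed with itself gives "i is a grandparent of j".
parent = [[0, 1, 1, 0],
          [0, 0, 0, 1],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]
grandparent = compose(parent, parent)   # only 0 -> 3, via 1
```

The same composition read off the graph view: follow two edges of the `parent` relation in succession.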
Descriptive set theory has been one of the main areas of research
in set theory for almost a century. This text attempts to present a
largely balanced approach, which combines many elements of the
different traditions of the subject. It includes a wide variety of
examples, exercises (over 400), and applications, in order to
illustrate the general concepts and results of the theory.
The popular literature on mathematical logic is rather extensive and written for the most varied categories of readers. College students or adults who read it in their free time may find here a vast number of thought-provoking logical problems. The reader who wishes to enrich his mathematical background in the hope that this will help him in his everyday life can discover detailed descriptions of practical (and quite often not so practical) applications of logic. The large number of popular books on logic has given rise to the hope that, by applying mathematical logic, students will finally learn how to distinguish between necessary and sufficient conditions and other points of logic in the college course in mathematics. But the habit of teachers of mathematical analysis, for example, of sticking to problems dealing with sequences without limits, uniformly continuous functions, etc. has, unfortunately, led to the writing of textbooks that present prescriptions for the mechanical construction of definitions of negative concepts which seem to obviate the need for any thinking on the reader's part. We are most certainly not able to enumerate everything the reader may draw out of existing books on mathematical logic, however.
This is a motivated presentation of recent results on tree transducers, applied to studying the general properties of formal models and to providing semantics for context-free languages. The authors consider top-down tree transducers, macro tree transducers, attributed tree transducers, and macro attributed tree transducers. A unified terminology is used to define them, and their transformational capacities are compared. This handbook on tree transducers will serve as a base for further research.
Now approaching its tenth year, this hugely successful book
presents an unusual attempt to publicise the field of Complex
Dynamics. The text was originally conceived as a supplemented
catalogue to the exhibition "Frontiers of Chaos," seen in Europe
and the United States, and describes the context and meaning of
these fascinating images. A total of 184 illustrations - including
88 full-colour pictures of Julia sets - are suggestive of a
coffee-table book.
Approach your problems from the right end and begin with the answers. Then one day, perhaps you will find the final question. ('The Hermit Clad in Crane Feathers' in R. van Gulik's The Chinese Maze Murders)

It isn't that they can't see the solution. It is that they can't see the problem. (G. K. Chesterton, The Scandal of Father Brown, 'The Point of a Pin')

Growing specialization and diversification have brought a host of monographs and textbooks on increasingly specialized topics. However, the "tree" of knowledge of mathematics and related fields does not grow only by putting forth new branches. It also happens, quite often in fact, that branches which were thought to be completely disparate are suddenly seen to be related. Further, the kind and level of sophistication of mathematics applied in various sciences has changed drastically in recent years: measure theory is used (non-trivially) in regional and theoretical economics; algebraic geometry interacts with physics; the Minkowski lemma, coding theory and the structure of water meet one another in packing and covering theory; quantum fields, crystal defects and mathematical programming profit from homotopy theory; Lie algebras are relevant to filtering; and prediction and electrical engineering can use Stein spaces. And in addition to this there are such new emerging subdisciplines as "experimental mathematics," "CFD," "completely integrable systems," "chaos, synergetics and large-scale order," which are almost impossible to fit into the existing classification schemes. They draw upon widely different sections of mathematics.
J. Richard Büchi is well known for his work in mathematical logic and theoretical computer science. (He himself would have sharply objected to the qualifier "theoretical," because he more or less identified science and theory, using "theory" in a broader sense and "science" in a narrower sense than usual.) We are happy to present here this collection of his papers. I (DS) worked with Büchi for many years, on and off, ever since I did my Ph.D. thesis on his Sequential Calculus. His way was to travel locally, not globally: when we met we would try some specific problem, but rarely discussed research we had done or might do. After he died in April 1984 I sifted through the manuscripts and notes left behind and was dumbfounded to see what areas he had been in. Essentially I knew about his work in finite automata, monadic second-order theories, and computability. But here were at least four layers on his writing desk, and evidently he had been working on them all in parallel. I am sure that many people who knew Büchi would tell an analogous story.
It is widely assumed that there exist certain objects which can in no way be distinguished from each other, unless by their location in space or other reference-system. Some of these are, in a broad sense, 'empirical objects', such as electrons. Their case would seem to be similar to that of certain mathematical 'objects', such as the minimum set of manifolds defining the dimensionality of an R-space. It is therefore at first sight surprising that there exists no branch of mathematics in which a third parity-relation, besides equality and inequality, is admitted; for this would seem to furnish an appropriate model for application to such instances as these. I hope, in this work, to show that such a mathematics is feasible, and could have useful applications if only in a limited field. The concept of what I here call 'indistinguishability' is not unknown in logic, albeit much neglected. It is mentioned, for example, by F. P. Ramsey [1], who criticizes Whitehead and Russell [2] for defining 'identity' in such a way as to make indistinguishables identical. But, so far as I can discover, no one has made any systematic attempt to open up the territory which lies behind these ideas. What we find, on doing so, is a body of mathematics offering only a limited prospect of practical usefulness, but which on the theoretical side presents a strong challenge to conventional ideas.
This volume presents a unified approach to the mathematical theory of a wide class of non-additive set functions, the so-called null-additive set functions, which also includes classical measure theory. It includes such important set functions as capacities, triangular set functions, some fuzzy measures, submeasures, decomposable measures, possibility measures, distorted probabilities, autocontinuous set functions, etc. The usefulness of the theory is demonstrated by applications in nonlinear differential and difference equations; fractal geometry in the theory of chaos; the approximation of functions in modular spaces by nonlinear singular integral operators; and in the theory of diagonal theorems as a universal method for proving general and fundamental theorems in functional analysis and measure theory. Audience: This book will be of value to researchers and postgraduate students in mathematics, as well as in such diverse fields as knowledge engineering, artificial intelligence, game theory, statistics, economics, sociology and industry.
D. Hilbert, in his famous program, formulated many open mathematical problems which were stimulating for the development of mathematics and a fruitful source of very deep and fundamental ideas. During the whole 20th century, mathematicians and specialists in other fields have been solving problems which can be traced back to Hilbert's program, and today there are many basic results stimulated by this program. It is certain that even at the beginning of the third millennium, mathematicians will still have much to do. One of his most interesting ideas, lying between mathematics and physics, is his sixth problem: to find a few physical axioms which, similar to the axioms of geometry, can describe a theory for a class of physical events that is as large as possible. We try to present some ideas inspired by Hilbert's sixth problem and give some partial results which may contribute to its solution. In the Thirties the situation in both physics and mathematics was very interesting. A.N. Kolmogorov published his fundamental work Grundbegriffe der Wahrscheinlichkeitsrechnung in which he, for the first time, axiomatized modern probability theory. From the mathematical point of view, in Kolmogorov's model, the set L of experimentally verifiable events forms a Boolean σ-algebra and, by the Loomis-Sikorski theorem, roughly speaking can be represented by a σ-algebra S of subsets of some non-void set Ω.
Many years of practical experience in teaching discrete mathematics form the basis of this textbook. Part I contains problems on such topics as Boolean algebra, k-valued logics, graphs and networks, elements of coding theory, automata theory, algorithms theory, combinatorics, Boolean minimization and logical design. The exercises are preceded by ample theoretical background material. For further study the reader is referred to the extensive bibliography. Part II follows the same structure as Part I, and gives helpful hints and solutions. Audience: This book will be of great value to undergraduate students of discrete mathematics, whereas the more difficult exercises, which comprise about one-third of the material, will also appeal to postgraduates and researchers.
Despite decades of work in evolutionary algorithms, there remains a lot of uncertainty as to when it is beneficial or detrimental to use recombination or mutation. This book provides a characterization of the roles that recombination and mutation play in evolutionary algorithms. It integrates prior theoretical work and introduces new theoretical techniques for studying evolutionary algorithms. An aggregation algorithm for Markov chains is introduced which is useful for studying not only evolutionary algorithms specifically, but also complex systems in general. Practical consequences of the theory are explored and a novel method for comparing search and optimization algorithms is introduced. A focus on discrete rather than real-valued representations allows the book to bridge multiple communities, including evolutionary biologists and population geneticists.
The purpose of this book is to provide the reader who is interested in applications of fuzzy set theory, in the first place with a text to which he or she can refer for the basic theoretical ideas, concepts and techniques in this field, and in the second place with a vast and up-to-date account of the literature. Although there are now many books about fuzzy set theory, and mainly about its applications, e.g. in control theory, there is not really a book available which introduces the elementary theory of fuzzy sets in what I would like to call "a good degree of generality." To write a book which would treat the entire range of results concerning the basic theoretical concepts in great detail and which would also deal with all possible variants and alternatives of the theory, such as e.g. rough sets and L-fuzzy sets for arbitrary lattices L, with the possibility-probability theories and interpretations, with the foundation of fuzzy set theory via multi-valued logic or via categorical methods and so on, would have been an altogether different project. This book is far more modest in its mathematical content and in its scope.
This volume, the 7th volume in the DRUMS Handbook series, is part of the aftermath of the successful ESPRIT project DRUMS (Defeasible Reasoning and Uncertainty Management Systems) which took place in two stages from 1989-1996. In the second stage (1993-1996) a work package was introduced devoted to the topics Reasoning and Dynamics, covering both the topics of "Dynamics of Reasoning," where reasoning is viewed as a process, and "Reasoning about Dynamics," which must be understood as pertaining to how both designers of and agents within dynamic systems may reason about these systems. The present volume presents work done in this context, extended with some work done by outstanding researchers outside the project on related issues. While the previous volume in this series had its focus on the dynamics of reasoning processes, the present volume is more focused on "Reasoning about Dynamics," viz. how (human and artificial) agents reason about (systems in) dynamic environments in order to control them. In particular we consider modelling frameworks and generic agent models for modelling these dynamic systems, and formal approaches to these systems such as logics for agents and formal means to reason about agent-based and compositional systems, and action and change more generally. We take this opportunity to mention that we have very pleasant recollections of the project, with its lively workshops and other meetings, and with the many sites and researchers involved, both within and outside our own work package.
The first edition of the monograph Information and Randomness: An Algorithmic Perspective by Cristian Calude was published in 1994. In my Foreword I said: "The research in algorithmic information theory is already some 30 years old. However, only the recent years have witnessed a really vigorous growth in this area. ... The present book by Calude fits very well in our series. Much original research is presented ... making the approach richer in consequences than the classical one. Remarkably, however, the text is so self-contained and coherent that the book may also serve as a textbook. All proofs are given in the book and, thus, it is not necessary to consult other sources for classroom instruction." The vigorous growth in the study of algorithmic information theory has continued during the past few years, which is clearly visible in the present second edition. Many new results, examples, exercises and open problems have been added. The additions include two entirely new chapters: "Computably Enumerable Random Reals" and "Randomness and Incompleteness." The really comprehensive new bibliography makes the book very valuable for a researcher. The new results about the characterization of computably enumerable random reals, as well as the fascinating Omega Numbers, should contribute much to the value of the book as a textbook. The author has been directly involved in these results, which have appeared in the prestigious journals Nature, New Scientist and Pour la Science.
The subject of Time has a wide intellectual appeal across different disciplines. This has shown in the variety of reactions received from readers of the first edition of the present Book. Many have reacted to issues raised in its philosophical discussions, while some have even solved a number of the open technical questions raised in the logical elaboration of the latter. These results will be recorded below, at a more convenient place. In the seven years after the first publication, there have been some noticeable newer developments in the logical study of Time and temporal expressions. As far as Temporal Logic proper is concerned, it seems fair to say that these amount to an increase in coverage and sophistication, rather than further breakthrough innovation. In fact, perhaps the most significant sources of new activity have been the applied areas of Linguistics and Computer Science (including Artificial Intelligence), where many intriguing new ideas have appeared presenting further challenges to temporal logic. Now, since this Book has a rather tight composition, it would have been difficult to interpolate this new material without endangering intelligibility.
This volume contains a collection of research papers centered around the concept of quantifier. Recently this concept has become the central point of research in logic. It is one of the important logical concepts whose exact domain and applications have so far been insufficiently explored, especially in the area of inferential and semantic properties of languages. It should thus remain the central point of research in the future. Moreover, during the last twenty years generalized quantifiers and logical techniques based on them have proved their utility in various applications. The example of natural language semantics has been particularly striking. For a long time it has been believed that elementary logic, also called first-order logic, was an adequate theory of the logical forms of natural language sentences. Recently it has been accepted that the semantics of many natural language constructions cannot be properly represented in elementary logic. It has turned out, however, that they can be described by means of generalized quantifiers. As far as computational applications of logic are concerned, particularly interesting are semantics restricted to finite models. Under this restriction elementary logic loses several of its advantages such as axiomatizability and compactness. And for various purposes we can use equally well some semantically richer languages, of which generalized quantifiers offer the most universal methods of describing extensions of elementary logic. Moreover we can look at generalized quantifiers as an explication of some specific mathematical concepts, e.g.
Simplicity theory is an extension of stability theory to a wider class of structures, containing, among others, the random graph, pseudo-finite fields, and fields with a generic automorphism. Following Kim's proof of forking symmetry, which implies good behaviour of model-theoretic independence, this area of model theory has been a field of intense study. It has necessitated the development of some important new tools, most notably the model-theoretic treatment of hyperimaginaries (classes modulo type-definable equivalence relations). It thus provides a general notion of independence (and of rank in the supersimple case) applicable to a wide class of algebraic structures. The basic theory of forking independence is developed, and its properties in a simple structure are analyzed. No prior knowledge of stability theory is assumed; in fact many stability-theoretic results follow either from more general propositions, or are developed in side remarks. Audience: This book is intended both as an introduction to simplicity theory accessible to graduate students with some knowledge of model theory, and as a reference work for research in the field.
Intelligent decision support is based on human knowledge related to a specific part of a real or abstract world. When the knowledge is gained by experience, it is induced from empirical data. The data structure, called an information system, is a record of objects described by a set of attributes. Knowledge is understood here as an ability to classify objects. Objects in the same class are indiscernible by means of the attributes and form elementary building blocks (granules, atoms). In particular, the granularity of knowledge means that some notions cannot be expressed precisely within the available knowledge and can be defined only vaguely. In the rough set theory created by Z. Pawlak, each imprecise concept is replaced by a pair of precise concepts called its lower and upper approximation. These approximations are fundamental tools for reasoning about knowledge. The rough sets philosophy turned out to be a very effective new tool with many successful real-life applications to its credit. It is worth stressing that no auxiliary assumptions about the data, like probability or membership function values, are needed, which is its great advantage. The present book reveals a wide spectrum of applications of the rough set concept, giving the reader the flavor of, and insight into, the methodology of the newly developed disciplines. Although the book emphasizes applications, comparison with other related methods and further developments receive due attention.
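The lower and upper approximations just described can be sketched in a few lines. The toy information system below is invented for illustration (none of the names come from the book): objects with identical attribute values fall into the same granule, and a concept is approximated from below by granules entirely inside it and from above by granules touching it.

```python
from collections import defaultdict

def approximations(objects, attrs, target):
    """Rough-set lower/upper approximation of `target` under the
    indiscernibility relation induced by the attributes `attrs`."""
    granules = defaultdict(set)              # indiscernibility classes
    for name, values in objects.items():
        granules[tuple(values[a] for a in attrs)].add(name)
    lower, upper = set(), set()
    for g in granules.values():
        if g <= target:                      # granule certainly inside
            lower |= g
        if g & target:                       # granule possibly inside
            upper |= g
    return lower, upper

# Toy information system: objects described by colour and size.
objs = {"a": {"colour": "red",  "size": "big"},
        "b": {"colour": "red",  "size": "big"},
        "c": {"colour": "blue", "size": "big"},
        "d": {"colour": "blue", "size": "small"}}
lower, upper = approximations(objs, ["colour", "size"], {"a", "c"})
```

Here the concept {a, c} is rough: a and b are indiscernible, so only c is certainly in the concept, while a, b and c are possibly in it.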
We are happy to present the first volume of the Handbook of Defeasible Reasoning and Uncertainty Management Systems. Uncertainty pervades the real world and must therefore be addressed by every system that attempts to represent reality. The representation of uncertainty is a major concern of philosophers, logicians, artificial intelligence researchers and computer scientists, psychologists, statisticians, economists and engineers. The present Handbook volumes provide frontline coverage of this area. This Handbook was produced in the style of previous handbook series like the Handbook of Philosophical Logic, the Handbook of Logic in Computer Science, and the Handbook of Logic in Artificial Intelligence and Logic Programming, and can be seen as a companion to them in covering the wide applications of logic and reasoning. We hope it will answer the needs for adequate representations of uncertainty. This Handbook series grew out of the ESPRIT Basic Research Project DRUMS II, where the acronym is made out of the Handbook series title. This project was financially supported by the European Union and regroups 20 major European research teams working in the general domain of uncertainty. As a fringe benefit of the DRUMS project, the research community was able to create this Handbook series, relying on the DRUMS participants as the core of the authors for the Handbook, together with external international experts.
0.1. General remarks. For any algebraic system A, the set SubA of all subsystems of A partially ordered by inclusion forms a lattice. This is the subsystem lattice of A. (In certain cases, such as that of semigroups, in order always to have the right to say that SubA is a lattice, we have to treat the empty set as a subsystem.) The study of various inter-relationships between systems and their subsystem lattices is a rather large field of investigation developed over many years. This line of research first took shape in group theory; basic relevant information up to the early seventies is contained in the book [Suz] and the surveys [K Pek St], [Sad 2], [Ar Sad]; there is also a quite recent book [Schm 2]. As another inspiring source, one should point out a branch of mathematics to which the book [Baer] was devoted. One of the key objects of examination in this branch is the subspace lattice of a vector space over a skew field. A more general approach deals with modules and their submodule lattices. The examination of subsystem lattices for the case of modules, as well as for rings and algebras (both associative and non-associative, in particular Lie algebras), began more than thirty years ago; there are results on this subject also for lattices, Boolean algebras and some other types of algebraic systems, both concrete and general. Many works, including several surveys, have been published in this area.
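A concrete instance of such a subsystem lattice (an illustrative sketch, not from the book): the subgroups of the cyclic group Z_12 are exactly the sets of multiples of the divisors of 12, ordered by inclusion, with meet given by intersection and join by the generated subgroup.

```python
from math import gcd

n = 12
divisors = [d for d in range(1, n + 1) if n % d == 0]
# Each divisor d generates the subgroup {0, d, 2d, ...} of Z_n.
subgroup = {d: frozenset(range(0, n, d)) for d in divisors}

def meet(d, e):   # intersection of subgroups: generated by lcm(d, e)
    return d * e // gcd(d, e)

def join(d, e):   # subgroup generated by both: generated by gcd(d, e)
    return gcd(d, e)

# Lattice laws: meet is set intersection, join contains the union.
assert subgroup[meet(4, 6)] == subgroup[4] & subgroup[6]
assert subgroup[join(4, 6)] >= subgroup[4] | subgroup[6]
```

The inclusion order mirrors divisibility: the subgroup generated by d lies inside the one generated by e exactly when e divides d.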
In 1953, exactly 50 years ago to this day, the first volume of
Studia Logica appeared under the auspices of The Philosophical
Committee of The Polish Academy of Sciences. Now, five decades
later, the present volume is dedicated to a celebration of this 50th
Anniversary of Studia Logica. The volume features a series of
papers by distinguished scholars reflecting both the aim and scope
of this journal for symbolic logic.
This book presents the logical foundations of dual tableaux together with a number of their applications, both to logics traditionally dealt with in mathematics and philosophy (such as modal, intuitionistic, relevant, and many-valued logics) and to various applied theories of computational logic (such as temporal reasoning, spatial reasoning, fuzzy-set-based reasoning, rough-set-based reasoning, order-of-magnitude reasoning, reasoning about programs, threshold logics, and logics of conditional decisions). The distinguishing feature of most of these applications is that the corresponding dual tableaux are built in a relational language which provides useful means of presentation of the theories. In this way modularity of dual tableaux is ensured. We do not need to develop and implement each dual tableau from scratch; we need only extend the relational core common to many theories with the rules specific to a particular theory.
In the summer of 1991 the Department of Mathematics and Statistics of the Université de Montréal was fortunate to host the NATO Advanced Study Institute "Algebras and Orders" as its 30th Séminaire de mathématiques supérieures (SMS), a summer school with a long tradition and well-established reputation. This book contains the contributions of the invited speakers. Universal algebra - which established itself only in the 1930's - grew from traditional algebra (e.g., groups, modules, rings and lattices) and logic (e.g., propositional calculus, model theory and the theory of relations). It started by extending results from these fields, but by now it is a well-established and dynamic discipline in its own right. One of the objectives of the ASI was to cover a broad spectrum of topics in this field, and to put in evidence the natural links to, and interactions with, Boolean algebra, lattice theory, topology, graphs, relations, automata, theoretical computer science and (partial) orders. The theory of orders is a relatively young and vigorous discipline sharing certain topics as well as many researchers and meetings with universal algebra and lattice theory. W. Taylor surveyed abstract clone theory, which formalizes the process of composing operations (i.e., the formation of term operations) of an algebra as a special category with countably many objects, leading naturally to the interpretation and equivalence of varieties.