This volume presents the results of approximately 15 years of work from researchers around the world on the use of fuzzy set theory to represent imprecision in databases. The maturity of the research in the discipline and the recent developments in commercial/industrial fuzzy databases provided an opportunity to produce this survey. In this introduction we will describe briefly how fuzzy databases fit into the overall design of database systems and then overview the organization of the text. FUZZY DATABASE LANDSCAPE The last five years have been witness to a revolution in the database research community. The dominant data models have changed and the consensus on what constitutes worthwhile research is in flux. Also, at this time, it is possible to gain a perspective on what has been accomplished in the area of fuzzy databases. Therefore, now is an opportune time to take stock of the past and establish a framework. A framework should assist in evaluating future research through a better understanding of the different aspects of imprecision that a database can model [1].
It has been shown how the common structure that defines a family of proofs can be expressed as a proof plan [5]. This common structure can be exploited in the search for particular proofs. A proof plan has two complementary components: a proof method and a proof tactic. By prescribing the structure of a proof at the level of primitive inferences, a tactic [11] provides the guarantee part of the proof. In contrast, a method provides a more declarative explanation of the proof by means of preconditions. Each method has associated effects. The execution of the effects simulates the application of the corresponding tactic. Theorem proving in the proof planning framework is a two-phase process: 1. Tactic construction is by a process of method composition: Given a goal, an applicable method is selected. The applicability of a method is determined by evaluating the method's preconditions. The method effects are then used to calculate subgoals. This process is applied recursively until no more subgoals remain. Because of the one-to-one correspondence between methods and tactics, the output from this process is a composite tactic tailored to the given goal. 2. Tactic execution generates a proof in the object-level logic. Note that no search is involved in the execution of the tactic. All the search is taken care of during the planning process. The real benefits of having separate planning and execution phases become apparent when a proof attempt fails.
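The two-phase loop the blurb describes (select an applicable method, compute subgoals from its effects, recurse, and emit a composite tactic) can be sketched in miniature. The `Method` class, the toy goal encoding, and the example methods below are hypothetical illustrations of the general scheme, not the data structures of any actual proof planner:

```python
# Minimal sketch of proof planning: methods carry a precondition (when the
# method applies) and effects (the subgoals the paired tactic would leave).
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Method:
    name: str                            # also names the paired tactic
    precondition: Callable[[str], bool]  # is the method applicable to the goal?
    effects: Callable[[str], List[str]]  # simulate the tactic: compute subgoals

def plan(goal: str, methods: List[Method]) -> list:
    """Phase 1: build a composite tactic by recursive method composition."""
    for m in methods:
        if m.precondition(goal):
            subplans = [plan(sub, methods) for sub in m.effects(goal)]
            return [m.name, subplans]    # tactic tailored to this goal
    raise ValueError(f"no applicable method for {goal!r}")

# Toy goals: "and(a,b)" splits into two subgoals; "trivial" closes at once.
methods = [
    Method("split_and",
           lambda g: g.startswith("and("),
           lambda g: g[4:-1].split(",")),
    Method("close",
           lambda g: g == "trivial",
           lambda g: []),
]

tactic = plan("and(trivial,trivial)", methods)
print(tactic)  # ['split_and', [['close', []], ['close', []]]]
```

Phase 2 (tactic execution) would then replay this tree against the object-level logic with no further search, which is exactly why a failed proof attempt can be diagnosed at the planning level.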
The volume is the outgrowth of a workshop with the same title held at MSRI in the week of November 13-17, 1989, and for those who did not get it, Logic from Computer Science is the converse of Logic in Computer Science, the full name of the highly successful annual LICS conferences. We meant to have a conference which would bring together the LICS community with some of the more traditional "mathematical logicians" and where the emphasis would be on the flow of ideas from computer science to logic rather than the other way around. In a LICS talk, sometimes, the speaker presents a perfectly good theorem about (say) the A-calculus or finite model theory in terms of its potential applications rather than its (often more obvious) intrinsic, foundational interest and intricate proof. This is not meant to be a criticism; the LICS meetings are, after all, organized by the IEEE Computer Society. We thought, for once, it would be fun to see what we would get if we asked the speakers to emphasize the relevance of their work for logic rather than computer science and to point out what is involved in the proofs. I think, mostly, it worked. In any case, the group of people represented as broad a selection of logicians as I have seen in recent years, and the quality of the talks was (in my view) exceptionally, unusually high. I learned a lot and (I think) others did too.
By North-American standards, philosophy is not new in Quebec: the first mention of philosophy lectures given by a Jesuit in the College de Quebec (founded 1635) dates from 1665, and the oldest logic manuscript dates from 1679. In English-speaking universities such as McGill (founded 1829), philosophy began to be taught later, during the second half of the 19th century. The major influence on English-speaking philosophers was, at least initially, that of Scottish Empiricism. On the other hand, the strong influence of the Catholic Church on French-Canadian society meant that the staff of the facultés of the French-speaking universities consisted, until recently, almost entirely of Thomist philosophers. There was accordingly little or no work in modern Formal Logic and Philosophy of Science and precious few contacts between the philosophical communities. In the late forties, Hugues Leblanc was a young student wanting to learn Formal Logic. He could not find anyone in Quebec to teach him and he went to study at Harvard University under the supervision of W. V. Quine. His best friend Maurice L'Abbe had left, a year earlier, for Princeton to study with Alonzo Church. After receiving his Ph.D. from Harvard in 1948, Leblanc started his professional career at Bryn Mawr College, where he stayed until 1967. He then went to Temple University, where he taught until his retirement in 1992, serving as Chair of the Department of Philosophy from 1973 until 1979.
This present volume is the Proceedings of the 14th International Conference on Nearrings and Nearfields held in Hamburg at the Universität der Bundeswehr Hamburg, from July 30 to August 6, 1995. This Conference was attended by 70 mathematicians and many accompanying persons who represented 22 different countries from all five continents. Thus it was the largest conference devoted entirely to nearrings and nearfields. The first of these conferences took place in 1968 at the Mathematisches Forschungsinstitut Oberwolfach, Germany. This was also the site of the conferences in 1972, 1976, 1980 and 1989. The other eight conferences held before the Hamburg Conference took place in eight different countries. For details about this and, moreover, for a general historical overview of the development of the subject, we refer to the article "On the beginnings and development of near-ring theory" by G. Betsch [3]. During the last forty years the theory of nearrings and related algebraic structures like nearfields, nearmodules, nearalgebras and seminearrings has developed into an extensive branch of algebra with its own features. In its position between group theory and ring theory, this relatively young branch of algebra has not only a close relationship to these two better-known areas of algebra, but it also has, just as these two theories, very intensive connections to many further branches of mathematics.
It is a pleasure and an honor to be able to present this collection of papers to Ray Reiter on the occasion of his 60th birthday. To say that Ray's research has had a deep impact on the field of Artificial Intelligence is a considerable understatement. Better to say that anyone thinking of doing work in areas like deductive databases, default reasoning, diagnosis, reasoning about action, and others should realize that they are likely to end up proving corollaries to Ray's theorems. Sometimes studying related work makes us think harder about the way we approach a problem; studying Ray's work is as likely to make us want to drop our way of doing things and take up his. This is because more than a mere visionary, Ray has always been a true leader. He shows us how to proceed not by pointing from his armchair, but by blazing a trail himself, setting up camp, and waiting for the rest of us to arrive. The International Joint Conference on Artificial Intelligence clearly recognized this and awarded Ray its highest honor, the Research Excellence award in 1993, before it had even finished acknowledging all the founders of the field. The papers collected here sample from many of the areas where Ray has done pioneering work. One of his earliest areas of application was databases, and this is reflected in the chapters by Bertossi et al. and the survey chapter by Minker.
The present text resulted from lectures given by the authors at the Rijks Universiteit at Utrecht. These lectures were part of a series on 'History of Contemporary Mathematics'. The need for such an enterprise was generally felt, since the curriculum at many universities is designed to suit an efficient treatment of advanced subjects rather than to reflect the development of notions and techniques. As it is very likely that this trend will continue, we decided to offer lectures of a less technical nature to provide students and interested listeners with a survey of the history of topics in our present-day mathematics. We consider it very useful for a mathematician to have an acquaintance with the history of the development of his subject, especially in the nineteenth century, where the germs of many modern disciplines can be found. Our attention has therefore been mainly directed to relatively young developments. In the lectures we tried to stay clear of both oversimplification and extreme technicality. The result is a text that should not cause difficulties to a reader with a working knowledge of mathematics. The developments sketched in this book are fundamental for many areas in mathematics and the notions considered are crucial almost everywhere. The book may be most useful, in particular, for those teaching mathematics.
Mathematical Foundations of Computer Science, Volume I is the first of two volumes presenting topics from mathematics (mostly discrete mathematics) which have proven relevant and useful to computer science. This volume treats basic topics, mostly of a set-theoretical nature (sets, functions and relations, partially ordered sets, induction, enumerability, and diagonalization) and illustrates the usefulness of mathematical ideas by presenting applications to computer science. Readers will find useful applications in algorithms, databases, semantics of programming languages, formal languages, theory of computation, and program verification. The material is treated in a straightforward, systematic, and rigorous manner. The volume is organized by mathematical area, making the material easily accessible to upper-undergraduate students in mathematics as well as in computer science, and each chapter contains a large number of exercises. The volume can be used as a textbook, but it will also be useful to researchers and professionals who want a thorough presentation of the mathematical tools they need in a single source. In addition, the book can be used effectively as supplementary reading material in computer science courses, particularly those courses which involve the semantics of programming languages, formal languages and automata, and logic programming.
Drinfeld Moduli Schemes and Automorphic Forms: The Theory of Elliptic Modules with Applications is based on the author's original work establishing the correspondence between ℓ-adic rank r Galois representations and automorphic representations of GL(r) over a function field, in the local case, and, in the global case, under a restriction at a single place. It develops Drinfeld's theory of elliptic modules, their moduli schemes and covering schemes, the simple trace formula, the fixed point formula, as well as the congruence relations and a "simple" converse theorem, not yet published anywhere. This version, based on a recent course taught by the author at The Ohio State University, is updated with references to research that has extended and developed the original work. The use of the theory of elliptic modules in the present work makes it accessible to graduate students, and it will serve as a valuable resource to facilitate an entrance to this fascinating area of mathematics.
This is the first book-length treatment of hybrid logic and its proof-theory. Hybrid logic is an extension of ordinary modal logic which allows explicit reference to individual points in a model (where the points represent times, possible worlds, states in a computer, or something else). This is useful for many applications, for example when reasoning about time one often wants to formulate a series of statements about what happens at specific times. There is little consensus about proof-theory for ordinary modal logic. Many modal-logical proof systems lack important properties and the relationships between proof systems for different modal logics are often unclear. In the present book we demonstrate that hybrid-logical proof-theory remedies these deficiencies by giving a spectrum of well-behaved proof systems (natural deduction, Gentzen, tableau, and axiom systems) for a spectrum of different hybrid logics (propositional, first-order, intensional first-order, and intuitionistic).
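The "explicit reference to individual points" that distinguishes hybrid logic can be illustrated with the standard machinery of nominals and satisfaction operators (a generic textbook example, not necessarily one drawn from this book): a nominal $i$ is an atomic formula true at exactly one point of the model, and $@_i \varphi$ asserts that $\varphi$ holds at the point named by $i$. Reading points as times,

\[
@_i \Diamond j \;\wedge\; @_j p
\]

says that the time named $j$ is accessible from (lies in the future of) the time named $i$, and that $p$ holds at time $j$; this is a statement about what happens at specific times, which ordinary modal logic cannot express directly.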
This book constitutes the proceedings of the 12th Biennial Meeting on Mathematics in Language, MOL 12, held in Nara, Japan, in September 2011. Presented in this volume are 12 carefully selected papers, as well as the paper of the invited speaker Andreas Maletti. The papers cover such diverse topics as formal languages (string and tree transducers, grammar-independent syntactic structures, probabilistic and weighted context-free grammars, formalization of minimalist syntax), parsing and unification, lexical and compositional semantics, statistical language models, and theories of truth.
Resolution Proof Systems: An Algebraic Theory presents a new algebraic framework for the design and analysis of resolution-based automated reasoning systems for a range of non-classical logics. It develops an algebraic theory of resolution proof systems focusing on the problems of proof theory, representation and efficiency of the deductive process. A new class of logical calculi, the class of resolution logics, emerges as a second theme of the book. The logical and computational aspects of the relationship between resolution logics and resolution proof systems are explored in the context of monotonic as well as nonmonotonic reasoning. This book is aimed primarily at researchers and graduate students in artificial intelligence, symbolic and computational logic. The material is suitable as a reference book for researchers and as a text book for graduate courses on the theoretical aspects of automated reasoning and computational logic.
Assuming that the reader is familiar with sheaf theory, the book gives a self-contained introduction to the theory of constructible sheaves related to many kinds of singular spaces, such as cell complexes, triangulated spaces, semialgebraic and subanalytic sets, complex algebraic or analytic sets, stratified spaces, and quotient spaces. The relation to the underlying geometrical ideas is worked out in detail, together with many applications to the topology of such spaces. All chapters have their own detailed introduction, containing the main results and definitions, illustrated in simple terms by a number of examples. The technical details of the proofs are postponed to later sections, since these are not needed for the applications.
Before his death in March, 1976, A. H. Lightstone delivered the manuscript for this book to Plenum Press. Because he died before the editorial work on the manuscript was completed, I agreed (in the fall of 1976) to serve as a surrogate author and to see the project through to completion. I have changed the manuscript as little as possible, altering certain passages to correct oversights. But the alterations are minor; this is Lightstone's book. H. B. Enderton. This is a treatment of the predicate calculus in a form that serves as a foundation for nonstandard analysis. Classically, the predicates and variables of the predicate calculus are kept distinct, inasmuch as no variable is also a predicate; moreover, each predicate is assigned an order, a unique natural number that indicates the length of each tuple to which the predicate can be prefixed. These restrictions are dropped here, in order to develop a flexible, expressive language capable of exploiting the potential of nonstandard analysis. To assist the reader in grasping the basic ideas of logic, we begin in Part I by presenting the propositional calculus and statement systems. This provides a relatively simple setting in which to grapple with the sometimes foreign ideas of mathematical logic. These ideas are repeated in Part II, where the predicate calculus and semantical systems are studied.
Relevance logics came of age with the one and only International Conference on relevant logics in 1974. They did not however become accepted, or easy to promulgate. In March 1981 we received most of the typescript of IN MEMORIAM: ALAN ROSS ANDERSON Proceedings of the International Conference of Relevant Logic from the original editors, Kenneth W. Collier, Ann Gasper and Robert G. Wolf of Southern Illinois University. They had, most unfortunately, failed to find a publisher - not, it appears, because of overall lack of merit of the essays, but because of the expense of producing the collection, lack of institutional subsidization, and doubts of publishers as to whether an expensive collection of essays on such an esoteric, not to say deviant, subject would sell. We thought that the collection of essays was still (even after more than six years in the publishing trade limbo) well worth publishing, that the subject would remain undeservedly esoteric in North America while work on it could not find publishers (it is not so esoteric in academic circles in Continental Europe, Latin America and the Antipodes) and, quite important, that we could get the collection published, and furthermore, by resorting to local means, published comparatively cheaply. It is indeed no ordinary collection. It contains work by pioneers of the main types of broadly relevant systems, and by several of the most innovative non-classical logicians of the present flourishing logical period. We have slowly re-edited and reorganised the collection and made it camera-ready.
Fuzzy Modelling: Paradigms and Practice provides an up-to-date and authoritative compendium of fuzzy models, identification algorithms and applications. Chapters in this book have been written by the leading scholars and researchers in their respective subject areas. Several of these chapters include both theoretical material and applications. The editor of this volume has organized and edited the chapters into a coherent and uniform framework. The objective of this book is to provide researchers and practitioners involved in the development of models for complex systems with an understanding of fuzzy modelling, and an appreciation of what makes these models unique. The chapters are organized into three major parts covering relational models, fuzzy neural networks and rule-based models. The material on relational models includes theory along with a large number of implemented case studies, including some on speech recognition, prediction, and ecological systems. The part on fuzzy neural networks covers some fundamentals, such as neurocomputing, fuzzy neurocomputing, etc., identifies the nature of the relationship that exists between fuzzy systems and neural networks, and includes extensive coverage of their architectures. The last part addresses the main design principles governing the development of rule-based models. Fuzzy Modelling: Paradigms and Practice provides a wealth of specific fuzzy modelling paradigms, algorithms and tools used in systems modelling. Also included is a panoply of case studies from various computer, engineering and science disciplines. This should be a primary reference work for researchers and practitioners developing models of complex systems.
This volume contains the proceedings of the Second Joint IFSA-EC and EURO-WGFS Workshop on Progress in Fuzzy Sets in Europe held on April 6-8, 1989 in Vienna, Austria. The workshop was organized by Prof. Dr. Wolfgang H. Janko from the University of Economics in Vienna under the auspices of IFSA-EC, the European chapter of the International Fuzzy Systems Association, and EURO-WGFS, the working group on Fuzzy Sets of the Association of European Operational Research Societies. The workshop gathered more than 30 participants coming from Western European countries (Austria, Belgium, England, Germany, Finland, France, Hungary, Italy, Scotland and Spain), Eastern European countries (Bulgaria, the German Federal Republic, Hungary and Poland) and non-European countries such as China and Japan. The 15 selected and refereed papers included in the volume are in principle the authors' own versions, with limited editorial changes and small corrections. They are arranged in alphabetical order. I wish to thank all the contributors for their valuable papers and an outstanding cooperation in the editorial project. I also would like to express my sincere thanks to Professor Dr. H. J. Zimmermann for the cooperation in the refereeing procedure.
The History of the Book In August 1992 the author had the opportunity to give a course on resolution theorem proving at the Summer School for Logic, Language, and Information in Essex. The challenge of this course (a total of five two-hour lectures) consisted in the selection of the topics to be presented. Clearly the first selection has already been made by calling the course "resolution theorem proving" instead of "automated deduction". In the latter discipline a remarkable body of knowledge has been created during the last 35 years, which hardly can be presented exhaustively, deeply and uniformly at the same time. In this situation one has to make a choice between a survey and a detailed presentation with a more limited scope. The author decided for the second alternative, but does not suggest that the other is less valuable. Today resolution is only one among several calculi in computational logic and automated reasoning. However, this does not imply that resolution is no longer up to date or its potential exhausted. Indeed the loss of the "monopoly" is compensated by new applications and new points of view. It was the purpose of the course mentioned above to present such new developments of resolution theory. Thus besides the traditional topics of completeness of refinements and redundancy, aspects of termination (resolution decision procedures) and of complexity are treated on an equal basis.
Categories, homological algebra, sheaves and their cohomology furnish useful methods for attacking problems in a variety of mathematical fields. This textbook provides an introduction to these methods, describing their elements and illustrating them by examples.
The volume is almost entirely composed of research and expository papers by the participants of the International Workshop "Groups, Rings, Lie and Hopf Algebras," which was held at the Memorial University of Newfoundland, St. John's, NF, Canada. All four areas from the title of the workshop are covered. In addition, some chapters touch upon topics that belong to two or more areas at the same time. Audience: The readership targeted includes researchers, graduate and senior undergraduate students in mathematics and its applications.
This text covers the parts of contemporary set theory relevant to other areas of pure mathematics. After a review of "naive" set theory, it develops the Zermelo-Fraenkel axioms of the theory before discussing the ordinal and cardinal numbers. It then delves into contemporary set theory, covering such topics as the Borel hierarchy and Lebesgue measure. A final chapter presents an alternative conception of set theory useful in computer science.
When I first participated in exploring theories of nonmonotonic reasoning in the late 1970s, I had no idea of the wealth of conceptual and mathematical results that would emerge from those halting first steps. This book by Wiktor Marek and Miroslaw Truszczynski is an elegant treatment of a large body of these results. It provides the first comprehensive treatment of two influential nonmonotonic logics - autoepistemic and default logic - and describes a number of surprising and deep unifying relationships between them. It also relates them to various modal logics studied in the philosophical logic literature, and provides a thorough treatment of their applications as foundations for logic programming semantics and for truth maintenance systems. It is particularly appropriate that Marek and Truszczynski should have authored this book, since so much of the research that went into these results is due to them. Both authors were trained in the Polish school of logic and they bring to their research and writing the logical insights and sophisticated mathematics that one would expect from such a background. I believe that this book is a splendid example of the intellectual maturity of the field of artificial intelligence, and that it will provide a model of scholarship for us all for many years to come. Ray Reiter, Department of Computer Science, University of Toronto, Toronto, Canada, and The Canadian Institute for Advanced Research.
Dr. KURT GODEL'S sixtieth birthday (April 28, 1966) and the thirty-fifth anniversary of the publication of his theorems on undecidability were celebrated during the 75th Anniversary Meeting of the Ohio Academy of Science at The Ohio State University, Columbus, on April 22, 1966. The celebration took the form of a Festschrift Symposium on a theme supported by the late Director of The Institute for Advanced Study at Princeton, New Jersey, Dr. J. ROBERT OPPENHEIMER: "Logic, and Its Relations to Mathematics, Natural Science, and Philosophy." The symposium also celebrated the founding of Section L (Mathematical Sciences) of the Ohio Academy of Science. Salutations to Dr. GODEL were followed by the reading of papers by S. F. BARKER, H. B. CURRY, H. RUBIN, G. E. SACKS, and G. TAKEUTI, and by the announcement of in-absentia papers contributed in honor of Dr. GODEL by A. LEVY, B. MELTZER, R. M. SOLOVAY, and E. WETTE. A short discussion of "The II Beyond Godel's I" concluded the session.
On January 22, 1990, the late John Bell gave a seminar at CERN (European Laboratory for Particle Physics) in Geneva, organized by the Center of Quantum Philosophy, which at that time was an association of scientists interested in the interpretation of quantum mechanics. In this seminar Bell presented once again his famous theorem. Thereafter a discussion took place in which not only physical but also highly speculative epistemological and philosophical questions were vividly debated. The list of topics included: the assumption of free will in Bell's theorem, the understanding of mind, the relationship between the mathematical and the physical world, the existence of unobservable causes and the limits of human knowledge in mathematics and physics. Encouraged by this stimulating discussion, some of the participants decided to found an Institute for Interdisciplinary Studies (IIS) to promote philosophical and interdisciplinary reflection on the advances of science. Meanwhile the IIS has associated its activities with the Swiss foundation, Fondation du Leman, and the Dutch foundation, Stichting Instudo, registered in Geneva and Amsterdam, respectively. With its activities the IIS intends to strengthen the unity between the professional activities in science and the reflection on fundamental philosophical questions. In addition, the interdisciplinary approach is expected to contribute to the progress of science and to socio-economic development. At present three working groups are active within the IIS, i.e.: the Center for Quantum Philosophy, the Wealth Creation and Sustainable Development Group, and the Neural Science Group.