Since its inception by Professor Lotfi Zadeh about 18 years ago, the theory of fuzzy sets has evolved in many directions, and is finding applications in a wide variety of fields in which the phenomena under study are too complex or too ill-defined to be analyzed by conventional techniques. Thus, by providing a basis for a systematic approach to approximate reasoning and inexact inference, the theory of fuzzy sets may well have a substantial impact on scientific methodology in the years ahead, particularly in the realms of psychology, economics, engineering, law, medicine, decision analysis, information retrieval, and artificial intelligence. This volume consists of 24 selected papers invited by the editor, Professor Paul P. Wang. These papers cover the theory and applications of fuzzy sets in almost equal number. We are very fortunate to have Professor A. Kaufmann contribute an overview paper on the advances in fuzzy sets. One special feature of this volume is the strong participation of Chinese researchers in this area. Chinese mathematicians, scientists and engineers have made important contributions to the theory and applications of fuzzy sets over the past decade. However, not until the visits of Professor A. Kaufmann to China in 1974 and again in 1980 did the Western world become fully aware of the important work of Chinese researchers. Now Professor Paul Wang has initiated the effort to document these important contributions in this volume and expose them to Western researchers.
This book is about some recent work in a subject usually considered part of "logic" and the "foundations of mathematics," but also having close connections with philosophy and computer science: namely, the creation and study of "formal systems for constructive mathematics." The general organization of the book is described in the "User's Manual" which follows this introduction, and the contents of the book are described in more detail in the introductions to Part One, Part Two, Part Three, and Part Four. This introduction has a different purpose; it is intended to provide the reader with a general view of the subject. This requires, to begin with, an elucidation of both the concepts mentioned in the phrase "formal systems for constructive mathematics." "Constructive mathematics" refers to mathematics in which, when you prove that a thing exists (having certain desired properties), you show how to find it. Proof by contradiction is the most common way of proving something exists without showing how to find it: one assumes that nothing exists with the desired properties, and derives a contradiction. It was only in the last two decades of the nineteenth century that mathematicians began to exploit this method of proof in ways that nobody had previously done; that was partly made possible by the creation and development of set theory by Georg Cantor and Richard Dedekind.
Rough Sets and Data Mining: Analysis of Imprecise Data is an edited collection of research chapters on the most recent developments in rough set theory and data mining. The chapters in this work cover a range of topics that focus on discovering dependencies among data, and reasoning about vague, uncertain and imprecise information. The authors of these chapters have been careful to include fundamental research with explanations as well as coverage of rough set tools that can be used for mining databases. The contributing authors consist of some of the leading scholars in the fields of rough sets, data mining, machine learning and other areas of artificial intelligence. Among the list of contributors are Z. Pawlak, J. Grzymala-Busse, K. Slowinski, and others. Rough Sets and Data Mining: Analysis of Imprecise Data will be a useful reference work for rough set researchers, database designers and developers, and for researchers new to the areas of data mining and rough sets.
The concept of likeness to truth, like that of truth itself, is fundamental to a realist conception of inquiry. To demonstrate this we need only make two rather modest realist assumptions: the truth doctrine (that the aim of an inquiry, as an inquiry, is the truth of some matter) and the progress doctrine (that one false theory may realise this aim better than another). Together these yield the conclusion that a false theory may be more truthlike, or closer to the truth, than another. It is the aim of this book to give a rigorous philosophical analysis of the concept of likeness to truth, and to examine the consequences, some of them no doubt surprising to those who have been unduly impressed by the (admittedly important) true/false dichotomy. Truthlikeness is not only a requirement of a particular philosophical outlook; it is as deeply embedded in common sense as the concept of truth. Everyone seems to be capable of grading various propositions, in different (hypothetical) situations, according to their closeness to the truth in those situations. And (if my experience is anything to go by) there is remarkable unanimity on these pretheoretical judgements. This is not proof that there is a single coherent concept underlying these judgements; the whole point of engaging in philosophical analysis is to make this claim plausible.
Problems in decision making and in other areas such as pattern recognition, control, structural engineering, etc. involve numerous aspects of uncertainty. Additional vagueness is introduced as models become more complex, but not necessarily more meaningful, by the added details. During the last two decades one has become more and more aware of the fact that not all this uncertainty is of stochastic (random) character and that, therefore, it cannot be modelled appropriately by probability theory. This becomes the more obvious the more we want to represent formally human knowledge. As far as uncertain data are concerned, we have neither instruments nor reasoning at our disposal as well defined and unquestionable as those used in probability theory. This almost infallible domain is the result of a tremendous work by the whole scientific world. But when measures are dubious, bad or no longer possible, and when we really have to make use of the richness of human reasoning in its variety, then the theories dealing with the treatment of uncertainty, some quite new and others older, provide the required complement, and fill in the gap left in the field of knowledge representation. Nowadays various theories are widely used: fuzzy sets, belief functions, the convenient associations between probability and fuzziness, etc. We are more and more in need of a wide range of instruments and theories to build models that are more and more adapted to the most complex systems.
The words "microdifferential systems in the complex domain" refer to several branches of mathematics: microlocal analysis, linear partial differential equations, algebra, and complex analysis. The microlocal point of view first appeared in the study of propagation of singularities of differential equations, and is spreading now to other fields of mathematics such as algebraic geometry or algebraic topology. However, it seems that many analysts neglect very elementary tools of algebra, which forces them to confine themselves to the study of a single equation or particular square matrices, or to carry on heavy and non-intrinsic formulations when studying more general systems. On the other hand, many algebraists ignore everything about partial differential equations, such as, for example, the "Cauchy problem," although it is a very natural and geometrical setting of "inverse image." Our aim will be to present to the analyst the algebraic methods which naturally appear in such problems, and to make available to the algebraist some topics from the theory of partial differential equations, stressing its geometrical aspects. Keeping this goal in mind, one can only remain at an elementary level.
Heyting '88, the Summer School and Conference on Mathematical Logic held September 13-23, 1988 in Chaika, Bulgaria, was dedicated to Arend Heyting's 90th anniversary. It was organized by Sofia University "Kliment Ohridski" on the occasion of its centenary and by the Bulgarian Academy of Sciences, with sponsorship of the Association for Symbolic Logic. The Meeting gathered some 115 participants from 19 countries. The present volume consists of invited and selected papers. Included are all the invited lectures submitted for publication and the 14 selected contributions, chosen out of 56 submissions by the Selection Committee. The selection was made on the basis of reports by Programme Committee members, an average of 4 per submission. All the papers are concentrated on the topics of the Meeting: Recursion Theory, Modal and Non-classical Logics, Intuitionism and Constructivism, Related Applications to Computer and Other Sciences, Life and Work of Arend Heyting. I am pleased to thank all persons and institutions that contributed to the success of the Meeting: sponsors, Programme Committee members and additional referees, the members of the Organizing Committee, our secretaries K. Lozanova and L. Nikolova, as well as K. Angelov, V. Bozhichkova, A. Ditchev, D. Dobrev, N. Dimitrov, R. Draganova, G. Gargov, N. Georgieva, M. Janchev, P. Marinov, S. Nikolova, S. Radev, I. Soskov, A. Soskova and V. Sotirov, who helped in the organization, Plenum Press, and last but not least all participants in the Meeting and contributors to this volume.
This volume presents the results of approximately 15 years of work from researchers around the world on the use of fuzzy set theory to represent imprecision in databases. The maturity of the research in the discipline and the recent developments in commercial/industrial fuzzy databases provided an opportunity to produce this survey. In this introduction we will describe briefly how fuzzy databases fit into the overall design of database systems and then overview the organization of the text. FUZZY DATABASE LANDSCAPE: The last five years have been witness to a revolution in the database research community. The dominant data models have changed and the consensus on what constitutes worthwhile research is in flux. Also, at this time, it is possible to gain a perspective on what has been accomplished in the area of fuzzy databases. Therefore, now is an opportune time to take stock of the past and establish a framework. A framework should assist in evaluating future research through a better understanding of the different aspects of imprecision that a database can model [1].
It has been shown how the common structure that defines a family of proofs can be expressed as a proof plan [5]. This common structure can be exploited in the search for particular proofs. A proof plan has two complementary components: a proof method and a proof tactic. By prescribing the structure of a proof at the level of primitive inferences, a tactic [11] provides the guarantee part of the proof. In contrast, a method provides a more declarative explanation of the proof by means of preconditions. Each method has associated effects. The execution of the effects simulates the application of the corresponding tactic. Theorem proving in the proof planning framework is a two-phase process: 1. Tactic construction is by a process of method composition: Given a goal, an applicable method is selected. The applicability of a method is determined by evaluating the method's preconditions. The method effects are then used to calculate subgoals. This process is applied recursively until no more subgoals remain. Because of the one-to-one correspondence between methods and tactics, the output from this process is a composite tactic tailored to the given goal. 2. Tactic execution generates a proof in the object-level logic. Note that no search is involved in the execution of the tactic. All the search is taken care of during the planning process. The real benefits of having separate planning and execution phases become apparent when a proof attempt fails.
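The two-phase process described above can be sketched in a few lines. The goal representation, the `Method` record, and the toy `split`/`trivial` methods below are hypothetical illustrations of the idea, not the API of any actual proof-planning system:

```python
# A minimal sketch of phase 1 (tactic construction by method composition).
# Goals are plain strings; a Method pairs a precondition (is it applicable?)
# with effects (which subgoals does it produce?). These names are invented
# for illustration only.
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Method:
    name: str
    precondition: Callable[[str], bool]   # applicability test on a goal
    effects: Callable[[str], List[str]]   # subgoals the method generates

def plan(goal: str, methods: List[Method]) -> Optional[list]:
    """Recursively compose a tactic: pick an applicable method,
    plan each subgoal, and combine the results."""
    for m in methods:
        if m.precondition(goal):
            subplans = []
            for sub in m.effects(goal):
                sp = plan(sub, methods)
                if sp is None:
                    break          # this method fails; try the next one
                subplans.append(sp)
            else:
                # one-to-one method/tactic correspondence: the method
                # name stands in for the tactic to run at this node
                return [m.name, subplans]
    return None

# Toy domain: prove conjunctions of trivially provable atoms.
split = Method("split",
               precondition=lambda g: "&" in g,
               effects=lambda g: g.split("&"))
trivial = Method("trivial",
                 precondition=lambda g: "&" not in g,
                 effects=lambda g: [])

tactic = plan("a&b&c", [split, trivial])
# Phase 2 (tactic execution) would now replay `tactic` in the
# object-level logic, with no further search.
```

The point of the sketch is the division of labour: all search happens in `plan`, and the composite tactic it returns is a search-free recipe for the object-level proof.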
The volume is the outgrowth of a workshop with the same title held at MSRI in the week of November 13-17, 1989, and for those who did not get it, Logic from Computer Science is the converse of Logic in Computer Science, the full name of the highly successful annual LICS conferences. We meant to have a conference which would bring together the LICS community with some of the more traditional "mathematical logicians" and where the emphasis would be on the flow of ideas from computer science to logic rather than the other way around. In a LICS talk, sometimes, the speaker presents a perfectly good theorem about (say) the λ-calculus or finite model theory in terms of its potential applications rather than its (often more obvious) intrinsic, foundational interest and intricate proof. This is not meant to be a criticism; the LICS meetings are, after all, organized by the IEEE Computer Society. We thought, for once, it would be fun to see what we would get if we asked the speakers to emphasize the relevance of their work for logic rather than computer science and to point out what is involved in the proofs. I think, mostly, it worked. In any case, the group of people represented as broad a selection of logicians as I have seen in recent years, and the quality of the talks was (in my view) exceptionally, unusually high. I learned a lot and (I think) others did too.
By North-American standards, philosophy is not new in Quebec: the first mention of philosophy lectures given by a Jesuit in the College de Quebec (founded 1635) dates from 1665, and the oldest logic manuscript dates from 1679. In English-speaking universities such as McGill (founded 1829), philosophy began to be taught later, during the second half of the 19th century. The major influence on English-speaking philosophers was, at least initially, that of Scottish Empiricism. On the other hand, the strong influence of the Catholic Church on French-Canadian society meant that the staff of the facultes of the French-speaking universities consisted, until recently, almost entirely of Thomist philosophers. There was accordingly little or no work in modern Formal Logic and Philosophy of Science and precious few contacts between the philosophical communities. In the late forties, Hugues Leblanc was a young student wanting to learn Formal Logic. He could not find anyone in Quebec to teach him, and he went to study at Harvard University under the supervision of W. V. Quine. His best friend Maurice L'Abbe had left, a year earlier, for Princeton to study with Alonzo Church. After receiving his Ph.D. from Harvard in 1948, Leblanc started his professional career at Bryn Mawr College, where he stayed until 1967. He then went to Temple University, where he taught until his retirement in 1992, serving as Chair of the Department of Philosophy from 1973 until 1979.
This present volume is the Proceedings of the 14th International Conference on Nearrings and Nearfields, held in Hamburg at the Universität der Bundeswehr Hamburg from July 30 to August 6, 1995. This Conference was attended by 70 mathematicians and many accompanying persons who represented 22 different countries from all five continents. Thus it was the largest conference devoted entirely to nearrings and nearfields. The first of these conferences took place in 1968 at the Mathematisches Forschungsinstitut Oberwolfach, Germany. This was also the site of the conferences in 1972, 1976, 1980 and 1989. The other eight conferences held before the Hamburg Conference took place in eight different countries. For details about this and, moreover, for a general historical overview of the development of the subject, we refer to the article "On the beginnings and development of near-ring theory" by G. Betsch [3]. During the last forty years the theory of nearrings and related algebraic structures like nearfields, nearmodules, nearalgebras and seminearrings has developed into an extensive branch of algebra with its own features. In its position between group theory and ring theory, this relatively young branch of algebra has not only a close relationship to these two more well-known areas of algebra, but it also has, just as these two theories, very intensive connections to many further branches of mathematics.
It is a pleasure and an honor to be able to present this collection of papers to Ray Reiter on the occasion of his 60th birthday. To say that Ray's research has had a deep impact on the field of Artificial Intelligence is a considerable understatement. Better to say that anyone thinking of doing work in areas like deductive databases, default reasoning, diagnosis, reasoning about action, and others should realize that they are likely to end up proving corollaries to Ray's theorems. Sometimes studying related work makes us think harder about the way we approach a problem; studying Ray's work is as likely to make us want to drop our way of doing things and take up his. This is because more than a mere visionary, Ray has always been a true leader. He shows us how to proceed not by pointing from his armchair, but by blazing a trail himself, setting up camp, and waiting for the rest of us to arrive. The International Joint Conference on Artificial Intelligence clearly recognized this and awarded Ray its highest honor, the Research Excellence award, in 1993, before it had even finished acknowledging all the founders of the field. The papers collected here sample from many of the areas where Ray has done pioneering work. One of his earliest areas of application was databases, and this is reflected in the chapters by Bertossi et al. and the survey chapter by Minker.
The present text resulted from lectures given by the authors at the Rijksuniversiteit at Utrecht. These lectures were part of a series on 'History of Contemporary Mathematics'. The need for such an enterprise was generally felt, since the curriculum at many universities is designed to suit an efficient treatment of advanced subjects rather than to reflect the development of notions and techniques. As it is very likely that this trend will continue, we decided to offer lectures of a less technical nature to provide students and interested listeners with a survey of the history of topics in our present-day mathematics. We consider it very useful for a mathematician to have an acquaintance with the history of the development of his subject, especially in the nineteenth century, where the germs of many modern disciplines can be found. Our attention has therefore been mainly directed to relatively young developments. In the lectures we tried to stay clear of both oversimplification and extreme technicality. The result is a text that should not cause difficulties to a reader with a working knowledge of mathematics. The developments sketched in this book are fundamental for many areas in mathematics and the notions considered are crucial almost everywhere. The book may be most useful, in particular, for those teaching mathematics.
Mathematical Foundations of Computer Science, Volume I is the first of two volumes presenting topics from mathematics (mostly discrete mathematics) which have proven relevant and useful to computer science. This volume treats basic topics, mostly of a set-theoretical nature (sets, functions and relations, partially ordered sets, induction, enumerability, and diagonalization) and illustrates the usefulness of mathematical ideas by presenting applications to computer science. Readers will find useful applications in algorithms, databases, semantics of programming languages, formal languages, theory of computation, and program verification. The material is treated in a straightforward, systematic, and rigorous manner. The volume is organized by mathematical area, making the material easily accessible to upper-undergraduate students in mathematics as well as in computer science, and each chapter contains a large number of exercises. The volume can be used as a textbook, but it will also be useful to researchers and professionals who want a thorough presentation of the mathematical tools they need in a single source. In addition, the book can be used effectively as supplementary reading material in computer science courses, particularly those courses which involve the semantics of programming languages, formal languages and automata, and logic programming.
Drinfeld Moduli Schemes and Automorphic Forms: The Theory of Elliptic Modules with Applications is based on the author's original work establishing the correspondence between ℓ-adic rank r Galois representations and automorphic representations of GL(r) over a function field, in the local case, and, in the global case, under a restriction at a single place. It develops Drinfeld's theory of elliptic modules, their moduli schemes and covering schemes, the simple trace formula, the fixed point formula, as well as the congruence relations and a "simple" converse theorem, not yet published anywhere. This version, based on a recent course taught by the author at The Ohio State University, is updated with references to research that has extended and developed the original work. The use of the theory of elliptic modules in the present work makes it accessible to graduate students, and it will serve as a valuable resource to facilitate an entrance to this fascinating area of mathematics.
This is the first book-length treatment of hybrid logic and its proof-theory. Hybrid logic is an extension of ordinary modal logic which allows explicit reference to individual points in a model (where the points represent times, possible worlds, states in a computer, or something else). This is useful for many applications, for example when reasoning about time one often wants to formulate a series of statements about what happens at specific times. There is little consensus about proof-theory for ordinary modal logic. Many modal-logical proof systems lack important properties and the relationships between proof systems for different modal logics are often unclear. In the present book we demonstrate that hybrid-logical proof-theory remedies these deficiencies by giving a spectrum of well-behaved proof systems (natural deduction, Gentzen, tableau, and axiom systems) for a spectrum of different hybrid logics (propositional, first-order, intensional first-order, and intuitionistic).
This book constitutes the proceedings of the 12th Biennial Meeting on Mathematics in Language, MOL 12, held in Nara, Japan, in September 2011. Presented in this volume are 12 carefully selected papers, as well as the paper of the invited speaker Andreas Maletti. The papers cover such diverse topics as formal languages (string and tree transducers, grammar-independent syntactic structures, probabilistic and weighted context-free grammars, formalization of minimalist syntax), parsing and unification, lexical and compositional semantics, statistical language models, and theories of truth.
Resolution Proof Systems: An Algebraic Theory presents a new algebraic framework for the design and analysis of resolution-based automated reasoning systems for a range of non-classical logics. It develops an algebraic theory of resolution proof systems focusing on the problems of proof theory, representation and efficiency of the deductive process. A new class of logical calculi, the class of resolution logics, emerges as a second theme of the book. The logical and computational aspects of the relationship between resolution logics and resolution proof systems are explored in the context of monotonic as well as nonmonotonic reasoning. This book is aimed primarily at researchers and graduate students in artificial intelligence, symbolic and computational logic. The material is suitable as a reference book for researchers and as a textbook for graduate courses on the theoretical aspects of automated reasoning and computational logic.
Assuming that the reader is familiar with sheaf theory, the book gives a self-contained introduction to the theory of constructible sheaves related to many kinds of singular spaces, such as cell complexes, triangulated spaces, semialgebraic and subanalytic sets, complex algebraic or analytic sets, stratified spaces, and quotient spaces. The relation to the underlying geometrical ideas is worked out in detail, together with many applications to the topology of such spaces. All chapters have their own detailed introduction, containing the main results and definitions, illustrated in simple terms by a number of examples. The technical details of the proofs are postponed to later sections, since these are not needed for the applications.
Before his death in March 1976, A. H. Lightstone delivered the manuscript for this book to Plenum Press. Because he died before the editorial work on the manuscript was completed, I agreed (in the fall of 1976) to serve as a surrogate author and to see the project through to completion. I have changed the manuscript as little as possible, altering certain passages to correct oversights. But the alterations are minor; this is Lightstone's book. (H. B. Enderton) This is a treatment of the predicate calculus in a form that serves as a foundation for nonstandard analysis. Classically, the predicates and variables of the predicate calculus are kept distinct, inasmuch as no variable is also a predicate; moreover, each predicate is assigned an order, a unique natural number that indicates the length of each tuple to which the predicate can be prefixed. These restrictions are dropped here, in order to develop a flexible, expressive language capable of exploiting the potential of nonstandard analysis. To assist the reader in grasping the basic ideas of logic, we begin in Part I by presenting the propositional calculus and statement systems. This provides a relatively simple setting in which to grapple with the sometimes foreign ideas of mathematical logic. These ideas are repeated in Part II, where the predicate calculus and semantical systems are studied.
Relevance logics came of age with the one and only International Conference on relevant logics in 1974. They did not, however, become accepted, or easy to promulgate. In March 1981 we received most of the typescript of IN MEMORIAM: ALAN ROSS ANDERSON, Proceedings of the International Conference of Relevant Logic, from the original editors, Kenneth W. Collier, Ann Gasper and Robert G. Wolf of Southern Illinois University. They had, most unfortunately, failed to find a publisher; not, it appears, because of overall lack of merit of the essays, but because of the expense of producing the collection, lack of institutional subsidization, and doubts of publishers as to whether an expensive collection of essays on such an esoteric, not to say deviant, subject would sell. We thought that the collection of essays was still (even after more than six years in the publishing trade limbo) well worth publishing, that the subject would remain undeservedly esoteric in North America while work on it could not find publishers (it is not so esoteric in academic circles in Continental Europe, Latin America and the Antipodes) and, quite important, that we could get the collection published, and furthermore, by resorting to local means, published comparatively cheaply. It is indeed no ordinary collection. It contains work by pioneers of the main types of broadly relevant systems, and by several of the most innovative non-classical logicians of the present flourishing logical period. We have slowly re-edited and reorganised the collection and made it camera-ready.
Fuzzy Modelling: Paradigms and Practice provides an up-to-date and authoritative compendium of fuzzy models, identification algorithms and applications. Chapters in this book have been written by the leading scholars and researchers in their respective subject areas. Several of these chapters include both theoretical material and applications. The editor of this volume has organized and edited the chapters into a coherent and uniform framework. The objective of this book is to provide researchers and practitioners involved in the development of models for complex systems with an understanding of fuzzy modelling, and an appreciation of what makes these models unique. The chapters are organized into three major parts covering relational models, fuzzy neural networks and rule-based models. The material on relational models includes theory along with a large number of implemented case studies, including some on speech recognition, prediction, and ecological systems. The part on fuzzy neural networks covers some fundamentals, such as neurocomputing, fuzzy neurocomputing, etc., identifies the nature of the relationship that exists between fuzzy systems and neural networks, and includes extensive coverage of their architectures. The last part addresses the main design principles governing the development of rule-based models. Fuzzy Modelling: Paradigms and Practice provides a wealth of specific fuzzy modelling paradigms, algorithms and tools used in systems modelling. Also included is a panoply of case studies from various computer, engineering and science disciplines. This should be a primary reference work for researchers and practitioners developing models of complex systems.
This volume contains the proceedings of the Second Joint IFSA-EC and EURO-WGFS Workshop on Progress in Fuzzy Sets in Europe, held on April 6-8, 1989 in Vienna, Austria. The workshop was organized by Prof. Dr. Wolfgang H. Janko from the University of Economics in Vienna under the auspices of IFSA-EC, the European chapter of the International Fuzzy Systems Association, and EURO-WGFS, the working group on Fuzzy Sets of the Association of European Operational Research Societies. The workshop gathered more than 30 participants coming from Western European countries (Austria, Belgium, England, Germany, Finland, France, Hungary, Italy, Scotland and Spain), Eastern European countries (Bulgaria, the German Federal Republic, Hungary and Poland) and non-European countries such as China and Japan. The 15 selected and refereed papers included in the volume are in principle the authors' own versions, with limited editorial changes and small corrections. They are arranged in alphabetical order. I wish to thank all the contributors for their valuable papers and an outstanding cooperation in the editorial project. I also would like to express my sincere thanks to Professor Dr. H. J. Zimmermann for the cooperation in the refereeing procedure.