In contributing a foreword to this book I am complying with a wish my husband expressed a few days before his death. He had completed the manuscript of this work, which may be considered a companion volume to his book Formal Methods. The task of seeing it through the press was undertaken by Mr. J. J. A. Mooij, acting director of the Institute for Research in Foundations and the Philosophy of Science (Instituut voor Grondslagenonderzoek en Filosofie der Exacte Wetenschappen) of the University of Amsterdam, with the help of Mrs. E. M. Barth, lecturer at the Institute. I wish to thank Mr. Mooij and Mrs. Barth most cordially for the care with which they have acquitted themselves of this delicate task and for the speed with which they have brought it to completion. I also wish to express my gratitude to Miss L. E. Minning, M. A., for the helpful advice she has so kindly given to Mr. Mooij and Mrs. Barth during the proof reading. C. P. C. BETH-PASTOOR
Reflexive Structures: An Introduction to Computability Theory is concerned with the foundations of the theory of recursive functions. The approach taken presents the fundamental structures in a fairly general setting, while avoiding the introduction of abstract axiomatic domains. Natural numbers and numerical functions are considered exclusively, which results in a concrete theory conceptually organized around Church's thesis. The book develops the important structures in recursive function theory: closure properties, reflexivity, enumeration, and hyperenumeration. Of particular interest is the treatment of recursion, which is considered from two different points of view: via the minimal fixed point theory of continuous transformations, and via the well-known stack algorithm. Reflexive Structures is intended as an introduction to the general theory of computability. It can be used as a text or reference in senior undergraduate and first-year graduate level classes in computer science or mathematics.
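The minimal fixed point view of recursion that this blurb mentions can be made concrete. The following is an illustrative Python sketch, not material from the book: a recursive definition is read as a transformation on partial functions, and iterating that transformation from the everywhere-undefined function yields better and better approximations of its least fixed point.

```python
def F(f):
    # A continuous transformation on partial functions: one unfolding
    # of the usual recursive definition of factorial in terms of f.
    def g(n):
        return 1 if n == 0 else n * f(n - 1)
    return g

def bottom(n):
    # The everywhere-undefined partial function.
    raise ValueError("undefined")

# After k iterations of F, the approximation is defined on 0..k-1;
# the least fixed point of F is the factorial function itself.
approx = bottom
for _ in range(6):
    approx = F(approx)
print(approx(5))  # 120
```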
The purpose of this book is to provide the reader who is interested in applications of fuzzy set theory, in the first place, with a text to which he or she can refer for the basic theoretical ideas, concepts and techniques in this field, and in the second place, with a vast and up-to-date account of the literature. Although there are now many books about fuzzy set theory, and mainly about its applications, e.g. in control theory, there is not really a book available which introduces the elementary theory of fuzzy sets in what I would like to call "a good degree of generality." To write a book which would treat the entire range of results concerning the basic theoretical concepts in great detail, and which would also deal with all possible variants and alternatives of the theory, such as rough sets and L-fuzzy sets for arbitrary lattices L, with the possibility-probability theories and interpretations, with the foundation of fuzzy set theory via multi-valued logic or via categorical methods, and so on, would have been an altogether different project. This book is far more modest in its mathematical content and in its scope.
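The elementary theory the blurb refers to starts from membership degrees in [0, 1]. As a minimal illustrative sketch (the names and values are invented, not taken from the book), the standard Zadeh operations on fuzzy sets over a finite universe look like this:

```python
# A fuzzy set assigns each element of the universe a membership
# degree in [0, 1]. These example sets are purely illustrative.
cold = {"0C": 1.0, "10C": 0.7, "20C": 0.2, "30C": 0.0}
warm = {"0C": 0.0, "10C": 0.3, "20C": 0.8, "30C": 1.0}

def f_union(a, b):
    # Standard (Zadeh) union: pointwise maximum.
    return {x: max(a[x], b[x]) for x in a}

def f_intersection(a, b):
    # Standard intersection: pointwise minimum.
    return {x: min(a[x], b[x]) for x in a}

def f_complement(a):
    # Standard complement: 1 minus the membership degree.
    return {x: 1.0 - a[x] for x in a}

print(f_union(cold, warm))         # {'0C': 1.0, '10C': 0.7, '20C': 0.8, '30C': 1.0}
print(f_intersection(cold, warm))  # {'0C': 0.0, '10C': 0.3, '20C': 0.2, '30C': 0.0}
```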
With the vision that machines can be rendered smarter, we have witnessed for more than a decade tremendous engineering efforts to implement intelligent systems. These attempts involve emulating human reasoning, and researchers have tried to model such reasoning from various points of view. But we know precious little about human reasoning processes, learning mechanisms and the like, and in particular about reasoning with limited, imprecise knowledge. In a sense, intelligent systems are machines which use the most general form of human knowledge together with human reasoning capability to reach decisions. Thus the general problem of reasoning with knowledge is the core of design methodology. The attempt to use human knowledge in its most natural sense, that is, through linguistic descriptions, is novel and controversial. The novelty lies in the recognition of a new type of uncertainty, namely fuzziness in natural language, and the controversy lies in the mathematical modeling process. As R. Bellman [7] once said, decision making under uncertainty is one of the attributes of human intelligence. When uncertainty is understood as the impossibility to predict occurrences of events, the context is familiar to statisticians. As such, efforts to use probability theory as an essential tool for building intelligent systems have been pursued (Pearl [203], Neapolitan [182]). The methodology seems all right if the uncertain knowledge in a given problem can be modeled as probability measures.
We are happy to present the first volume of the Handbook of Defeasible Reasoning and Uncertainty Management Systems. Uncertainty pervades the real world and must therefore be addressed by every system that attempts to represent reality. The representation of uncertainty is a major concern of philosophers, logicians, artificial intelligence researchers and computer scientists, psychologists, statisticians, economists and engineers. The present Handbook volumes provide frontline coverage of this area. This Handbook was produced in the style of previous handbook series like the Handbook of Philosophical Logic, the Handbook of Logic in Computer Science, and the Handbook of Logic in Artificial Intelligence and Logic Programming, and can be seen as a companion to them in covering the wide applications of logic and reasoning. We hope it will answer the need for adequate representations of uncertainty. This Handbook series grew out of the ESPRIT Basic Research Project DRUMS II, where the acronym is made out of the Handbook series title. This project was financially supported by the European Union and regroups 20 major European research teams working in the general domain of uncertainty. As a fringe benefit of the DRUMS project, the research community was able to create this Handbook series, relying on the DRUMS participants as the core of the authors for the Handbook together with external international experts.
Metric fixed point theory encompasses the branch of fixed point theory in which metric conditions on the underlying space and/or on the mappings play a fundamental role. In some sense the theory is a far-reaching outgrowth of Banach's contraction mapping principle. A natural extension of the study of contractions is the limiting case when the Lipschitz constant is allowed to equal one. Such mappings are called nonexpansive. Nonexpansive mappings arise in a variety of natural ways, for example in the study of holomorphic mappings and hyperconvex metric spaces. Because most of the spaces studied in analysis share many algebraic and topological properties as well as metric properties, there is no clear line separating metric fixed point theory from the topological or set-theoretic branch of the theory. Also, because of its metric underpinnings, metric fixed point theory has provided the motivation for the study of many geometric properties of Banach spaces. The contents of this Handbook reflect all of these facts. The purpose of the Handbook is to provide a primary resource for anyone interested in fixed point theory with a metric flavor. The goal is to provide information for those wishing to find results that might apply to their own work and for those wishing to obtain a deeper understanding of the theory. The book should be of interest to a wide range of researchers in mathematical analysis as well as to those whose primary interest is the study of fixed point theory and the underlying spaces. The level of exposition is directed to a wide audience, including students and established researchers.
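Banach's contraction mapping principle, the starting point this blurb names, can be illustrated numerically. The following is a generic sketch, not material from the Handbook: iterating a map whose Lipschitz constant is below one converges to its unique fixed point from any starting value.

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x -> f(x) until successive values agree to within tol.

    For a contraction (Lipschitz constant k < 1 on a complete metric
    space), Banach's principle guarantees a unique fixed point and
    convergence of this iteration from any starting point."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("did not converge")

# cos is a contraction on [0, 1] (its derivative is bounded by
# sin(1) < 1 there); the iteration converges to the unique fixed point.
p = fixed_point(math.cos, 1.0)
print(round(p, 6))  # 0.739085
```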
Fuzzy Sets, Logics and Reasoning about Knowledge reports recent results concerning the genuinely logical aspects of fuzzy sets in relation to algebraic considerations, knowledge representation and commonsense reasoning. It takes a state-of-the-art look at multiple-valued and fuzzy set-based logics from an artificial intelligence perspective. The papers, all of which are written by leading contributors in their respective fields, are grouped into four sections. The first section presents a panorama of many-valued logics in connection with fuzzy sets. The second explores algebraic foundations, with an emphasis on MV algebras. The third is devoted to approximate reasoning methods and similarity-based reasoning. The fourth explores connections within fuzzy knowledge representation, especially possibilistic logic and prioritized knowledge bases. Readership: Scholars and graduate students in logic, algebra, knowledge representation, and formal aspects of artificial intelligence.
In this monograph, questions of extensions and relaxations are considered. These questions arise in many applied problems in connection with the operation of perturbations. In some cases, the operation of "small" perturbations generates "small" deviations of basic indexes; a corresponding stability takes place. In other cases, small perturbations generate a spasmodic change of a result and of the solutions defining this result. These cases correspond to unstable problems. The effect of instability can arise in extremal problems or in other related problems. In this connection, we note the known problem of constructing the attainability domain in control theory. Of course, extremal problems and those of attainability (in abstract control theory) are connected. We exploit this connection here (see Chapter 5). However, basic attention is paid to the problem of the attainability of elements of a topological space under vanishing perturbations of restrictions. The stability property is frequently missing; the world of unstable problems is of interest to us. We construct regularizing procedures. However, in many cases, it is possible to establish a certain property similar to partial stability. We call this property asymptotic nonsensitivity or roughness under the perturbation of some restrictions. The given property means the following: in the corresponding problem, it is the same whether constraints are weakened in some "directions" or not. On this basis, it is possible to construct a certain classification of constraints, selecting "directions of roughness" and "precision directions."
Multi-Criteria Decision Making (MCDM) has been one of the fastest growing problem areas in many disciplines. The central problem is how to evaluate a set of alternatives in terms of a number of criteria. Although this problem is very relevant in practice, there are few methods available and their quality is hard to determine. Thus, the question 'Which is the best method for a given problem?' has become one of the most important and challenging ones. This is exactly what this book has as its focus and why it is important. The author extensively compares, both theoretically and empirically, real-life MCDM issues and makes the reader aware of quite a number of surprising 'abnormalities' with some of these methods. What makes this book so valuable and different is that even though the analyses are rigorous, the results can be understood even by the non-specialist. Audience: Researchers, practitioners, and students; it can be used as a textbook for senior undergraduate or graduate courses in business and engineering.
The algorithmic solution of problems has always been one of the major concerns of mathematics. For a long time such solutions were based on an intuitive notion of algorithm. It is only in this century that metamathematical problems have led to the intensive search for a precise and sufficiently general formalization of the notions of computability and algorithm. In the 1930s, a number of quite different concepts for this purpose were proposed, such as Turing machines, WHILE-programs, recursive functions, Markov algorithms, and Thue systems. All these concepts turned out to be equivalent, a fact summarized in Church's thesis, which says that the resulting definitions form an adequate formalization of the intuitive notion of computability. This had and continues to have an enormous effect. First of all, with these notions it has been possible to prove that various problems are algorithmically unsolvable. Among these undecidable problems are the halting problem, the word problem of group theory, the Post correspondence problem, and Hilbert's tenth problem. Secondly, concepts like Turing machines and WHILE-programs had a strong influence on the development of the first computers and programming languages. In the era of digital computers, the question of finding efficient solutions to algorithmically solvable problems has become increasingly important. In addition, the fact that some problems can be solved very efficiently, while others seem to defy all attempts to find an efficient solution, has called for a deeper understanding of the intrinsic computational difficulty of problems.
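The models of computation listed above can all be made concrete. As one illustrative sketch (not taken from the book), here is a minimal simulator for one-tape Turing machines, running a hypothetical machine that complements the bits of its input and halts at the first blank:

```python
def run_tm(transitions, tape, state="q0", accept="halt"):
    """Simulate a one-tape Turing machine.

    transitions maps (state, symbol) -> (new_state, written_symbol, move),
    where move is -1 (left), +1 (right) or 0 (stay). The tape is stored
    sparsely as a dict from position to symbol; unvisited cells read as
    the blank '_'."""
    cells = dict(enumerate(tape))
    pos = 0
    while state != accept:
        sym = cells.get(pos, "_")
        state, cells[pos], move = transitions[(state, sym)]
        pos += move
    end = max(cells) + 1 if cells else 0
    return "".join(cells.get(i, "_") for i in range(end)).rstrip("_")

# A machine that flips every bit of a binary string, then halts
# when it reaches the blank after the input.
flip = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
    ("q0", "_"): ("halt", "_", 0),
}
print(run_tm(flip, "10110"))  # 01001
```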
This book proposes a uniform logic and probabilistic (LP) approach to risk estimation and analysis in engineering and economics. It covers the methodological and theoretical basis of risk management at the design, test, and operation stages of economic, banking, and engineering systems with groups of incompatible events (GIE). This edition includes new chapters providing a detailed treatment of scenario logic and probabilistic models for revealing bribes. It also contains clear definitions and notations, revised sections and chapters, an extended list of references, and a new subject index, as well as more than a hundred illustrations and tables which motivate the presentation.
The present monograph intends to establish a solid link among three fields: fuzzy set theory, information retrieval, and cluster analysis. Fuzzy set theory supplies new concepts and methods for the other two fields, and provides a common framework within which they can be reorganized. Four principal groups of readers are assumed: researchers or students who are interested in (a) applications of fuzzy sets, (b) the theory of information retrieval or bibliographic databases, (c) hierarchical clustering, and (d) applications of methods in systems science. Readers in group (a) may notice that the fuzzy set theory used here is very simple, since only finite sets are dealt with. This simplification enables the max-min algebra to deal with fuzzy relations and matrices as equivalent entities. Fuzzy graphs are also used for describing theoretical properties of fuzzy relations. This assumption of finite sets is sufficient for applying fuzzy sets to information retrieval and cluster analysis. This means that little theory, beyond the basic theory of fuzzy sets, is required. Although readers in group (b) with little background in the theory of fuzzy sets may have difficulty with a few sections, they will also find enough in this monograph to support an intuitive grasp of this new concept of fuzzy information retrieval. Chapter 4 provides fuzzy retrieval without the use of mathematical symbols. Also, fuzzy graphs will serve as an aid to the intuitive understanding of fuzzy relations.
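The max-min algebra on finite fuzzy relations that this blurb mentions can be sketched as ordinary matrix multiplication with max replacing sum and min replacing product. The following is an illustrative sketch with invented example data, not code from the monograph:

```python
def maxmin_compose(R, S):
    """Max-min composition of fuzzy relations given as matrices.

    (R o S)[i][j] = max over k of min(R[i][k], S[k][j]) --
    the ordinary matrix product with (max, min) in place of (+, *).
    This is the composition used when treating finite fuzzy relations
    and matrices as equivalent entities."""
    n, m, p = len(R), len(S), len(S[0])
    return [[max(min(R[i][k], S[k][j]) for k in range(m))
             for j in range(p)] for i in range(n)]

# Illustrative fuzzy relations on two-element sets.
R = [[0.8, 0.3],
     [0.2, 0.9]]
S = [[0.5, 0.7],
     [0.4, 0.6]]
print(maxmin_compose(R, S))  # [[0.5, 0.7], [0.4, 0.6]]
```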
Internal logic is the logic of content. The content is here arithmetic and the emphasis is on a constructive logic of arithmetic (arithmetical logic). Kronecker's general arithmetic of forms (polynomials) together with Fermat's infinite descent is put to use in an internal consistency proof. The view is developed in the context of a radical arithmetization of mathematics and logic and covers the many-faceted heritage of Kronecker's work, which includes not only Hilbert, but also Frege, Cantor, Dedekind, Husserl and Brouwer. The book will be of primary interest to logicians, philosophers and mathematicians interested in the foundations of mathematics and the philosophical implications of constructivist mathematics. It may also be of interest to historians, since it covers a fifty-year period, from 1880 to 1930, which has been crucial in the foundational debates and their repercussions on the contemporary scene.
Constructive mathematics is based on the thesis that the meaning of a mathematical formula is given, not by its truth-conditions, but in terms of what constructions count as a proof of it. However, the meaning of the terms 'construction' and 'proof' has never been adequately explained (although Kreisel, Goodman and Martin-Löf have attempted axiomatisations). This monograph develops precise (though not wholly formal) definitions of construction and proof, and describes the algorithmic substructure underlying intuitionistic logic. Interpretations of Heyting arithmetic and constructive analysis are given. The philosophical basis of constructivism is explored thoroughly in Part I. The author seeks to answer objections from platonists and to reconcile his position with the central insights of Hilbert's formalism and logic. Audience: Philosophers of mathematics and logicians, both academic and graduate students, particularly those interested in Brouwer and Hilbert; theoretical computer scientists interested in the foundations of functional programming languages and program correctness calculi.
The present monograph is a slightly revised version of my Habilitationsschrift Proof-theoretic Aspects of Intensional and Non-Classical Logics, successfully defended at Leipzig University, November 1997. It collects work on proof systems for modal and constructive logics I have done over the last few years. The main concern is display logic, a certain refinement of Gentzen's sequent calculus developed by Nuel D. Belnap. This book is far from offering a comprehensive presentation of generalized sequent systems for modal logics broadly conceived. The proof-theory of non-classical logics is a rapidly developing field, and even the generalizations of the ordinary notion of sequent listed in Chapter 1 can hardly be presented in great detail within a single volume. In addition to further investigating the various approaches toward generalized Gentzen systems, it is important to compare them and to discuss their relative advantages and disadvantages. An initial attempt at bringing together work on different kinds of proof systems for modal logics has been made in [188]. Another step in the same direction is [196]. Since Chapter 1 contains introductory considerations and, moreover, every remaining chapter begins with some surveying or summarizing remarks, in this preface I shall only emphasize a relation to philosophy that is important to me, register the sources of papers that have entered this book in some form or another, and acknowledge advice and support.
Fuzzy Control of Industrial Systems: Theory and Applications presents the basic theoretical framework of crisp and fuzzy set theory, relating these concepts to control engineering based on the analogy between the Laplace transfer function of linear systems and the fuzzy relation of a nonlinear fuzzy system. Included are generic aspects of fuzzy systems, with an emphasis on their many degrees of freedom and the practical design implications; modeling and systems identification techniques based on fuzzy rules, parametrized rules and relational equations; and the principles of adaptive fuzzy and neurofuzzy systems. Practical design aspects of fuzzy controllers are covered by the detailed treatment of fuzzy and neurofuzzy software design tools, with an emphasis on iterative fuzzy tuning; novel stability limit testing methods and the definition and practical examples of the new concept of collaborative control systems are also given. In addition, case studies of successful applications in industrial automation, process control, electric power technology, electric traction, traffic engineering, wastewater treatment, manufacturing, mineral processing and automotive engineering are presented, in order to assist industrial control systems engineers in recognizing situations where fuzzy and neurofuzzy methods would offer advantages over traditional ones, particularly in controlling highly nonlinear and time-variant plants and processes.
On the history of the book: In the early 1990s several new methods and perspectives in automated deduction emerged. We just mention the superposition calculus, meta-term inference and schematization, deductive decision procedures, and automated model building. It was this last field which brought the authors of this book together. In 1994 they met at the Conference on Automated Deduction (CADE-12) in Nancy and agreed upon the general point of view that semantics and, in particular, construction of models should play a central role in the field of automated deduction. In the following years the deduction groups of the laboratory LEIBNIZ at IMAG Grenoble and the University of Technology in Vienna organized several bilateral projects promoting this topic. This book emerged as a main result of this cooperation. The authors are aware of the fact that the book does not cover all relevant methods of automated model building (also called model construction or model generation); instead the book focuses on deduction-based symbolic methods for the construction of Herbrand models developed in the last 12 years. Other methods of automated model building, in particular also finite model building, are mainly treated in the final chapter; this chapter is less formal and detailed but gives a broader view on the topic and a comparison of different approaches. How to read this book: In the introduction we give an overview of automated deduction in a historical context, taking into account its relationship with the human views on formal and informal proofs.
The notion of a dominated or majorized operator rests on a simple idea that goes as far back as the Cauchy method of majorants. Loosely speaking, the idea can be expressed as follows. If an operator (equation) under study is dominated by another operator (equation), called a dominant or majorant, then the properties of the latter have a substantial influence on the properties of the former. Thus, operators or equations that have "nice" dominants must possess "nice" properties. In other words, an operator with a somehow qualified dominant must be qualified itself. Mathematical tools putting the idea of domination into a natural and complete form were suggested by L. V. Kantorovich in 1935-36. He introduced the fundamental notion of a vector space normed by elements of a vector lattice and that of a linear operator between such spaces which is dominated by a positive linear or monotone sublinear operator. He also applied these notions to solving functional equations. In the succeeding years many authors studied various particular cases of lattice normed spaces and different classes of dominated operators. However, research was performed within, and in the spirit of, the theory of vector and normed lattices. So, it is not an exaggeration to say that dominated operators, as independent objects of investigation, were beyond the reach of specialists for half a century. As a consequence, the most important structural properties and some interesting applications of dominated operators have only recently become available.
A practical introduction to the development of proofs and certified programs using Coq. An invaluable tool for researchers, students, and engineers interested in formal methods and the development of zero-fault software.
In Commutative Algebra certain I-adic filtrations of Noetherian rings, i.e. the so-called Zariski rings, are at the basis of singularity theory. Apart from that, it is mainly in the context of Homological Algebra that filtered rings and the associated graded rings are being studied, not in the least because of the importance of double complexes and their spectral sequences. Where non-commutative algebra is concerned, applications of the theory of filtrations were mainly restricted to the study of enveloping algebras of Lie algebras and, more extensively even, to the study of rings of differential operators. It is clear that the operation of completion at a filtration has an algebraic genotype but a topological phenotype, and it is exactly the symbiosis of Algebra and Topology that works so well in the commutative case, e.g. ideles and adeles in number theory or the theory of local fields, Puiseux series etc. In non-commutative algebra the bridge between Algebra and Analysis is much more narrow and it seems that many analytic techniques of the non-commutative kind are still to be developed. Nevertheless there is the magnificent example of the analytic theory of rings of differential operators and D-modules a la Kashiwara-Schapira.
Quantum Structures and the Nature of Reality is a collection of papers written for an interdisciplinary audience about the quantum structure research within the International Quantum Structures Association. The advent of quantum mechanics has changed our scientific worldview in a fundamental way. Many popular and semi-popular books have been published about the paradoxical aspects of quantum mechanics. Usually, however, these reflections find their origin in the standard views on quantum mechanics, most of all the wave-particle duality picture. Contrary to relativity theory, where the meaning of its revolutionary ideas was linked from the start with deep structural changes in the geometrical nature of our world, the deep structural changes about the nature of our reality that are indicated by quantum mechanics cannot be traced within the standard formulation. The study of the structure of quantum theory, its logical content, its axiomatic foundation, has been motivated primarily by the search for these structural changes. Due to the high mathematical sophistication of this quantum structure research, no books have been published which try to explain the recent results for an interdisciplinary audience. This book tries to fill this gap by collecting contributions from some of the main researchers in the field. They reveal the steps that have been taken towards a deeper structural understanding of quantum theory.
Semiorder is probably one of the most frequently encountered ordered structures in science. It naturally appears in fields like psychometrics, economics, decision sciences, linguistics and archaeology. It explicitly takes into account the inevitable imprecisions of scientific instruments by allowing the replacement of precise numbers by intervals. The purpose of this book is to dissect this structure and to study its fundamental properties. The main subjects treated are the numerical representations of semiorders, the generalizations of the concept to valued relations, the aggregation of semiorders and their basic role in a general theoretical framework for multicriteria decision-aid methods. Audience: This volume is intended for students and researchers in the fields of decision analysis, management science, operations research, discrete mathematics, classification, social choice theory, and order theory, as well as for practitioners in the design of decision tools.
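The numerical representation of semiorders mentioned in this blurb can be sketched in a few lines. The following is an illustrative sketch, not the book's own development: a semiorder arises when an alternative is strictly preferred only if its value exceeds the other's by more than a fixed discrimination threshold, so that values closer together than the threshold are treated as indistinguishable.

```python
def prefers(fx, fy, q=1.0):
    # Threshold (Scott-Suppes style) representation of a semiorder:
    # x is strictly preferred to y iff its numerical value exceeds
    # y's by more than the threshold q. Values within q of each
    # other fall inside the instrument's imprecision interval.
    # The values and threshold here are purely illustrative.
    return fx > fy + q

print(prefers(5.0, 3.5))  # True
print(prefers(5.0, 4.5))  # False  (within the threshold, indistinguishable)
```

Note that the resulting "indifference" relation is not transitive: a chain of pairwise-indistinguishable values can still connect two clearly distinguishable ones, which is exactly what distinguishes semiorders from ordinary weak orders.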
During the last few decades the ideas, methods, and results of the theory of Boolean algebras have played an increasing role in various branches of mathematics and cybernetics. This monograph is devoted to the fundamentals of the theory of Boolean constructions in universal algebra. Also considered are the problems of presenting different varieties of universal algebra with these constructions, and applications for investigating the spectra and skeletons of varieties of universal algebras. For researchers whose work involves universal algebra and logic.
Fuzzy logics are many-valued logics that are well suited to reasoning in the context of vagueness. They provide the basis for the wider field of Fuzzy Logic, encompassing diverse areas such as fuzzy control, fuzzy databases, and fuzzy mathematics. This book provides an accessible and up-to-date introduction to this fast-growing and increasingly popular area. It focuses in particular on the development and applications of "proof-theoretic" presentations of fuzzy logics; the result of more than ten years of intensive work by researchers in the area, including the authors. In addition to providing alternative elegant presentations of fuzzy logics, proof-theoretic methods are useful for addressing theoretical problems (including key standard completeness results) and developing efficient deduction and decision algorithms. Proof-theoretic presentations also place fuzzy logics in the broader landscape of non-classical logics, revealing deep relations with other logics studied in Computer Science, Mathematics, and Philosophy. The book builds methodically from the semantic origins of fuzzy logics to proof-theoretic presentations such as Hilbert and Gentzen systems, introducing both theoretical and practical applications of these presentations.
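To make the semantic origins this blurb refers to concrete, here is an illustrative sketch (not an excerpt from the book) of the truth functions of Łukasiewicz logic, one of the standard fuzzy logics: connectives operate on truth degrees in [0, 1] rather than on {0, 1}.

```python
def luk_and(x, y):
    # Łukasiewicz t-norm (strong conjunction) on degrees in [0, 1].
    return max(0.0, x + y - 1.0)

def luk_implies(x, y):
    # Residuum of the t-norm: fully true (1.0) exactly when x <= y.
    return min(1.0, 1.0 - x + y)

def luk_not(x):
    # Negation, definable as implication into falsity.
    return luk_implies(x, 0.0)

# Illustrative truth degrees for two vague statements.
tall, fast = 0.7, 0.4
print(round(luk_and(tall, fast), 2))      # 0.1
print(round(luk_implies(tall, fast), 2))  # 0.7
print(round(luk_not(tall), 2))            # 0.3
```

On the two classical values 0.0 and 1.0 these functions agree with the Boolean connectives, which is why such logics are described as generalizing classical logic to reasoning under vagueness.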
In the 20th century philosophy of mathematics has to a great extent been dominated by views developed during the so-called foundational crisis in the beginning of that century. These views have primarily focused on questions pertaining to the logical structure of mathematics and questions regarding the justification and consistency of mathematics. Paradigmatic in this respect is Hilbert's program, which inherits from Frege and Russell the project to formalize all areas of ordinary mathematics and then adds the requirement of a proof, by epistemically privileged means (finitistic reasoning), of the consistency of such formalized theories. While interest in modified versions of the original foundational programs is still thriving, in the second part of the twentieth century several philosophers and historians of mathematics have questioned whether such foundational programs could exhaust the realm of important philosophical problems to be raised about the nature of mathematics. Some have done so in open confrontation (and hostility) to the logically based analysis of mathematics which characterized the classical foundational programs, while others (and many of the contributors to this book belong to this tradition) have only called for an extension of the range of questions and problems that should be raised in connection with an understanding of mathematics. The focus has turned thus to a consideration of what mathematicians are actually doing when they produce mathematics. Questions concerning concept-formation, understanding, heuristics, changes in style of reasoning, the role of analogies and diagrams, etc.