Fuzzy theory is an interesting name for a method that has been highly effective in a wide variety of significant, real-world applications. A few examples make this readily apparent. As a result of a faulty design of the method of computer-programmed trading, the biggest stock market crash in history was triggered by a small fraction of a percent change in the interest rate in a Western European country. A fuzzy theory approach would have weighed a number of relevant variables and the ranges of values for each of these variables. Another example, which is rather simple but pervasive, is that of an electronic thermostat that turns on heat or air conditioning at a specific temperature setting. In fact, actual comfort level involves other variables such as humidity and the location of the sun with respect to windows in a home, among others. Because of its great applied significance, fuzzy theory has generated widespread activity internationally. In fact, institutions devoted to research in this area have come into being. As the above examples suggest, Fuzzy Systems Theory is of fundamental importance for the analysis and design of a wide variety of dynamic systems. This clearly manifests the fundamental importance of time considerations in the Fuzzy Systems design approach to dynamic systems. This textbook by Prof. Dr. Jernej Virant provides what is evidently a uniquely significant and comprehensive treatment of this subject on the international scene.
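To make the thermostat example concrete, the following is a minimal sketch of how a fuzzy controller might weigh temperature and humidity together instead of switching at a single crisp set point; it is not taken from Virant's book, and the membership ranges, rules, and variable names are illustrative assumptions.

    # Minimal fuzzy comfort-controller sketch (illustrative only; not from the book).
    # It weighs two variables - temperature and humidity - instead of one crisp threshold.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside (a, c)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def heating_power(temp_c, humidity_pct):
        # Degrees of membership for the current readings (ranges are assumed).
        cold = tri(temp_c, 5, 12, 19)
        comfortable = tri(temp_c, 17, 21, 25)
        humid = tri(humidity_pct, 40, 70, 100)

        # Rule activations (min acts as fuzzy AND), each tied to a heating level in [0, 1].
        rules = [
            (cold, 1.0),              # IF cold           THEN heat strongly
            (min(cold, humid), 0.7),  # IF cold AND humid THEN heat moderately
            (comfortable, 0.0),       # IF comfortable    THEN do not heat
        ]
        total = sum(w for w, _ in rules)
        # Weighted-average defuzzification; fall back to "off" if no rule fires.
        return sum(w * level for w, level in rules) / total if total else 0.0

    print(round(heating_power(10, 80), 2))  # a cold, humid reading -> noticeable heating

The output is a graded heating level rather than an on/off decision, which is the kind of smooth weighing of several variables described above.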
L.E.J. Brouwer (1881-1966) is best known for his revolutionary ideas on topology and foundations of mathematics (intuitionism). The present collection contains a mixture of letters; university and faculty correspondence has been included, some of which sheds light on his student years, and in particular on the exchange of letters with his PhD adviser, Korteweg. Acting as the natural sequel to the publication of Brouwer's biography, this book provides instrumental reading for those wishing to gain a deeper understanding of Brouwer and his role in the twentieth century. Striking a good balance of biographical and scientific information, the latter deals with innovations in topology (Cantor-Schoenflies style and the new topology) and foundations. The topological period in his research is well represented in correspondence with Hilbert, Schoenflies, Poincaré, Blumenthal, Lebesgue, Baire, and Koebe, and foundational topics are discussed in letters exchanged with Weyl, Fraenkel, Heyting, van Dantzig and others. There is also a large part of the correspondence on matters related to interbellum scientific politics. This book will appeal to both graduate students and researchers with an interest in topology, the history of mathematics, the foundations of mathematics, philosophy and general science.
Drinfeld Moduli Schemes and Automorphic Forms: The Theory of Elliptic Modules with Applications is based on the author's original work establishing the correspondence between ℓ-adic rank r Galois representations and automorphic representations of GL(r) over a function field, in the local case, and, in the global case, under a restriction at a single place. It develops Drinfeld's theory of elliptic modules, their moduli schemes and covering schemes, the simple trace formula, the fixed point formula, as well as the congruence relations and a "simple" converse theorem, not yet published anywhere. This version, based on a recent course taught by the author at The Ohio State University, is updated with references to research that has extended and developed the original work. The use of the theory of elliptic modules in the present work makes it accessible to graduate students, and it will serve as a valuable resource to facilitate an entrance to this fascinating area of mathematics.
This book presents the logical foundations of dual tableaux together with a number of their applications, both to logics traditionally dealt with in mathematics and philosophy (such as modal, intuitionistic, relevant, and many-valued logics) and to various applied theories of computational logic (such as temporal reasoning, spatial reasoning, fuzzy-set-based reasoning, rough-set-based reasoning, order-of-magnitude reasoning, reasoning about programs, threshold logics, and logics of conditional decisions). The distinguishing feature of most of these applications is that the corresponding dual tableaux are built in a relational language which provides useful means of presentation of the theories. In this way modularity of dual tableaux is ensured. We do not need to develop and implement each dual tableau from scratch; we need only extend the relational core common to many theories with the rules specific to a particular theory.
This systematic and historical treatment of Russell's contributions to analytic philosophy, from his embrace of analysis in 1898 to his landmark theory of descriptions in 1905, draws important connections between his philosophically motivated conception of analysis and the technical apparatus he devised to facilitate analyses in mathematics.
This book grew out of my confusion. If logic is objective, how can there be so many logics? Is there one right logic, or many right ones? Is there some underlying unity that connects them? What is the significance of the mathematical theorems about logic which I've learned if they have no connection to our everyday reasoning? The answers I propose revolve around the perception that what one pays attention to in reasoning determines which logic is appropriate. The act of abstracting from our reasoning in our usual language is the stepping stone from reasoned argument to logic. We cannot take this step alone, for we reason together: logic is reasoning which has some objective value. For you to understand my answers, or perhaps better, my conjectures, I have retraced my steps: from the concrete to the abstract, from examples to general theory, to further confirming examples, to reflections on the significance of the work.
In recent years, an impetuous development of new, unconventional theories, methods, techniques and technologies in computer and information sciences, systems analysis, decision-making and control, expert systems, data modelling, engineering, etc., resulted in a considerable increase of interest in adequate mathematical description and analysis of objects, phenomena, and processes which are vague or imprecise by their very nature. Classical two-valued logic and the related notion of a set, together with its mathematical consequences, are then often inadequate or insufficient formal tools, and can even become useless for applications because of their (too) categorical character: 'true - false', 'belongs - does not belong', 'is - is not', 'black - white', '0 - 1', etc. This is why one replaces classical logic by various types of many-valued logics and, on the other hand, more general notions are introduced instead of or beside that of a set. Let us mention, for instance, fuzzy sets and derivative concepts, flou sets and twofold fuzzy sets, which have been created for different purposes as well as using distinct formal and informal motivations. Numerical information about 'how many' elements such objects are composed of seems to be one of the simplest and most important types of information about them. To get it, one needs a suitable notion of cardinality and, moreover, a possibility to calculate with such cardinalities. Unfortunately, neither fuzzy sets nor the other nonclassical concepts have been equipped with a satisfactory (nonclassical) cardinality theory.
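One of the simplest nonclassical cardinalities discussed in the literature is the scalar sigma-count, the sum of membership degrees; the tiny sketch below (the set and its membership values are made up for illustration) shows the idea, though, as the paragraph above notes, a satisfactory cardinality theory requires considerably more than this single number.

    # Scalar "sigma-count" of a fuzzy set: the sum of its membership degrees.
    # The elements and degrees below are purely illustrative.
    fuzzy_set = {"alice": 1.0, "bob": 0.7, "carol": 0.3, "dave": 0.0}

    sigma_count = sum(fuzzy_set.values())
    print(sigma_count)  # 2.0 - roughly "about two" elements belong to the set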
When I first participated in exploring theories of nonmonotonic reasoning in the late 1970s, I had no idea of the wealth of conceptual and mathematical results that would emerge from those halting first steps. This book by Wiktor Marek and Miroslaw Truszczynski is an elegant treatment of a large body of these results. It provides the first comprehensive treatment of two influential nonmonotonic logics - autoepistemic and default logic - and describes a number of surprising and deep unifying relationships between them. It also relates them to various modal logics studied in the philosophical logic literature, and provides a thorough treatment of their applications as foundations for logic programming semantics and for truth maintenance systems. It is particularly appropriate that Marek and Truszczynski should have authored this book, since so much of the research that went into these results is due to them. Both authors were trained in the Polish school of logic and they bring to their research and writing the logical insights and sophisticated mathematics that one would expect from such a background. I believe that this book is a splendid example of the intellectual maturity of the field of artificial intelligence, and that it will provide a model of scholarship for us all for many years to come. Ray Reiter, Department of Computer Science, University of Toronto, and The Canadian Institute for Advanced Research
This volume contains the accounts of papers delivered at the NATO Advanced Study Institute on Finite and Infinite Combinatorics in Sets and Logic held at the Banff Centre, Alberta, Canada from April 21 to May 4, 1991. As the title suggests, the meeting brought together workers interested in the interplay between finite and infinite combinatorics, set theory, graph theory and logic. It used to be that infinite set theory, finite combinatorics and logic could be viewed as quite separate and independent subjects, but more and more these disciplines are growing together and becoming interdependent, with ever more problems and results appearing which concern all of them. I appreciate the financial support which was provided by the NATO Advanced Study Institute programme, the Natural Sciences and Engineering Research Council of Canada and the Department of Mathematics and Statistics of the University of Calgary. The meeting on Finite and Infinite Combinatorics in Sets and Logic followed two other meetings on discrete mathematics held in Banff, the Symposium on Ordered Sets in 1981 and the Symposium on Graphs and Order in 1984. The growing inter-relation between the different areas in discrete mathematics is perhaps best illustrated by the fact that many of the participants who were present at the previous meetings also attended this meeting on Finite and Infinite Combinatorics in Sets and Logic.
This volume offers comprehensive coverage of intelligent systems, including fundamental aspects and software-, sensor-, and hardware-related issues. Moreover, the contributors to this volume provide, beyond a systematic overview of intelligent interfaces and systems, deep, practical knowledge in building and using intelligent systems in various applications. Special emphasis is placed on specific aspects and requirements in applications.
This book constitutes the refereed proceedings of the International Symposium on Logical Foundations of Computer Science, LFCS 2013, held in San Diego, CA, USA in January 2013. The volume presents 29 revised refereed papers carefully selected by the program committee. The scope of the Symposium is broad and includes constructive mathematics and type theory; logic, automata and automatic structures; computability and randomness; logical foundations of programming; logical aspects of computational complexity; logic programming and constraints; automated deduction and interactive theorem proving; logical methods in protocol and program verification; logical methods in program specification and extraction; domain theory logic; logical foundations of database theory; equational logic and term rewriting; lambda and combinatory calculi; categorical logic and topological semantics; linear logic; epistemic and temporal logics; intelligent and multiple agent system logics; logics of proof and justification; nonmonotonic reasoning; logic in game theory and social software; logic of hybrid systems; distributed system logics; mathematical fuzzy logic; system design logics; and other logics in computer science.
Before his death in March, 1976, A. H. Lightstone delivered the manuscript for this book to Plenum Press. Because he died before the editorial work on the manuscript was completed, I agreed (in the fall of 1976) to serve as a surrogate author and to see the project through to completion. I have changed the manuscript as little as possible, altering certain passages to correct oversights. But the alterations are minor; this is Lightstone's book. H. B. Enderton. This is a treatment of the predicate calculus in a form that serves as a foundation for nonstandard analysis. Classically, the predicates and variables of the predicate calculus are kept distinct, inasmuch as no variable is also a predicate; moreover, each predicate is assigned an order, a unique natural number that indicates the length of each tuple to which the predicate can be prefixed. These restrictions are dropped here, in order to develop a flexible, expressive language capable of exploiting the potential of nonstandard analysis. To assist the reader in grasping the basic ideas of logic, we begin in Part I by presenting the propositional calculus and statement systems. This provides a relatively simple setting in which to grapple with the sometimes foreign ideas of mathematical logic. These ideas are repeated in Part II, where the predicate calculus and semantical systems are studied.
Fuzzy Set Theory and Advanced Mathematical Applications contains contributions by many of the leading experts in the field, including coverage of the mathematical foundations of the theory, decision making and systems science, and recent developments in fuzzy neural control. The book supplies a readable, practical toolkit with a clear introduction to fuzzy set theory and its evolution in mathematics and new results on foundations of fuzzy set theory, decision making and systems science, and fuzzy control and neural systems. Each chapter is self-contained, providing up-to-date coverage of its subject. Audience: An important reference work for university students, and researchers and engineers working in both industrial and academic settings.
Knowledge discovery is an area of computer science that attempts to uncover interesting and useful patterns in data that permit a computer to perform a task autonomously or assist a human in performing a task more efficiently. Soft Computing for Knowledge Discovery provides a self-contained and systematic exposition of the key theory and algorithms that form the core of knowledge discovery from a soft computing perspective. It focuses on knowledge representation, machine learning, and the key methodologies that make up the fabric of soft computing - fuzzy set theory, fuzzy logic, evolutionary computing, and various theories of probability (e.g. naive Bayes and Bayesian networks, Dempster-Shafer theory, mass assignment theory, and others). In addition to describing many state-of-the-art soft computing approaches to knowledge discovery, the author introduces Cartesian granule features and their corresponding learning algorithms as an intuitive approach to knowledge discovery. This new approach embraces the synergistic spirit of soft computing and exploits uncertainty in order to achieve tractability, transparency and generalization. Parallels are drawn between this approach and other well known approaches (such as naive Bayes and decision trees) leading to equivalences under certain conditions. The approaches presented are further illustrated in a battery of both artificial and real-world problems. Knowledge discovery in real-world problems, such as object recognition in outdoor scenes, medical diagnosis and control, is described in detail. These case studies provide further examples of how to apply the presented concepts and algorithms to practical problems. The author provides web page access to an online bibliography, datasets, source codes for several algorithms described in the book, and other information. Soft Computing for Knowledge Discovery is for advanced undergraduates, professionals and researchers in computer science, engineering and business information systems who work or have an interest in the dynamic fields of knowledge discovery and soft computing.
Advances in Computational Intelligence and Learning: Methods and Applications presents new developments and applications in the area of Computational Intelligence, which essentially describes methods and approaches that mimic biologically intelligent behavior in order to solve problems that have been difficult to solve by classical mathematics. Generally, Fuzzy Technology, Artificial Neural Nets and Evolutionary Computing are considered to be such approaches. The Editors have assembled new contributions in the areas of fuzzy sets, neural nets and machine learning, as well as combinations of them (so-called hybrid methods), in the first part of the book. The second part of the book is dedicated to applications in the areas that are considered to be most relevant to Computational Intelligence.
Mathematics of Fuzzy Sets: Logic, Topology and Measure Theory is a major attempt to provide much-needed coherence for the mathematics of fuzzy sets. Much of this book is new material required to standardize this mathematics, making this volume a reference tool with broad appeal as well as a platform for future research. Fourteen chapters are organized into three parts: mathematical logic and foundations (Chapters 1-2), general topology (Chapters 3-10), and measure and probability theory (Chapters 11-14). Chapter 1 deals with non-classical logics and their syntactic and semantic foundations. Chapter 2 details the lattice-theoretic foundations of image and preimage powerset operators. Chapters 3 and 4 lay down the axiomatic and categorical foundations of general topology using lattice-valued mappings as a fundamental tool. Chapter 3 focuses on the fixed-basis case, including a convergence theory demonstrating the utility of the underlying axioms. Chapter 4 focuses on the more general variable-basis case, providing a categorical unification of locales, fixed-basis topological spaces, and variable-basis compactifications. Chapter 5 relates lattice-valued topologies to probabilistic topological spaces and fuzzy neighborhood spaces. Chapter 6 investigates the important role of separation axioms in lattice-valued topology from the perspective of space embedding and mapping extension problems, while Chapter 7 examines separation axioms from the perspective of Stone-Čech compactification and Stone representation theorems. Chapters 8 and 9 introduce the most important concepts and properties of uniformities, including the covering and entourage approaches and the basic theory of precompact or complete [0,1]-valued uniform spaces. Chapter 10 sets out the algebraic, topological, and uniform structures of the fundamentally important fuzzy real line and fuzzy unit interval. Chapter 11 lays the foundations of generalized measure theory and representation by Markov kernels. Chapter 12 develops the important theory of conditioning operators with applications to measure-free conditioning. Chapter 13 presents elements of pseudo-analysis with applications to the Hamilton-Jacobi equation and optimization problems. Chapter 14 surveys briefly the fundamentals of fuzzy random variables which are [0,1]-valued interpretations of random sets.
Stochastic analysis is not only a thriving area of pure mathematics with intriguing connections to partial differential equations and differential geometry. It also has numerous applications in the natural and social sciences (for instance in financial mathematics or theoretical quantum mechanics) and therefore appears in physics and economics curricula as well. However, existing approaches to stochastic analysis either presuppose various concepts from measure theory and functional analysis or lack full mathematical rigour. This short book proposes to solve the dilemma: By adopting E. Nelson's "radically elementary" theory of continuous-time stochastic processes, it is based on a demonstrably consistent use of infinitesimals and thus permits a radically simplified, yet perfectly rigorous approach to stochastic calculus and its fascinating applications, some of which (notably the Black-Scholes theory of option pricing and the Feynman path integral) are also discussed in the book.
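For reference, the Black-Scholes theory mentioned above culminates in the familiar closed-form price of a European call; the formula below is the standard textbook statement in standard notation, not a quotation from this book, whose point is precisely that such results can be reached with radically elementary, infinitesimal methods.

    C = S_0 \Phi(d_1) - K e^{-rT} \Phi(d_2), \qquad
    d_{1,2} = \frac{\ln(S_0/K) + (r \pm \sigma^2/2)\,T}{\sigma\sqrt{T}},

where \Phi is the standard normal distribution function, S_0 the current asset price, K the strike, r the riskless rate, \sigma the volatility and T the time to maturity.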
This book is about Granular Computing (GC) - an emerging conceptual and computing paradigm of information processing. As the name suggests, GC concerns the processing of complex information entities - information granules. In essence, information granules arise in the process of abstraction of data and derivation of knowledge from information. Information granules are everywhere. We commonly use granules of time (seconds, months, years). We granulate images; millions of pixels manipulated individually by computers appear to us as granules representing physical objects. In natural language, we operate on the basis of word-granules that become crucial entities used to realize interaction and communication between humans. Intuitively, we sense that information granules are at the heart of all our perceptual activities. In the past, several formal frameworks and tools, geared for processing specific information granules, have been proposed. Interval analysis, rough sets, and fuzzy sets have all played an important role in knowledge representation and processing. Subsequently, information granulation and information granules arose in numerous application domains. Well-known ideas of rule-based systems dwell inherently on information granules. Qualitative modeling, being one of the leading threads of AI, operates on a level of information granules. Multi-tier architectures and hierarchical systems (such as those encountered in control engineering), planning and scheduling systems all exploit information granularity. We also utilize information granules when it comes to functionality granulation, reusability of information and efficient ways of developing underlying information infrastructures.
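As a toy illustration of that abstraction step (our own example, not the book's), the sketch below collapses hypothetical hourly sensor readings into one interval granule per day:

    # Toy information-granulation sketch: fine-grained hourly readings are
    # abstracted into one interval granule [min, max] per day (data is hypothetical).
    hourly = {  # (day, hour) -> reading
        (1, 6): 14.2, (1, 12): 19.8, (1, 18): 17.5,
        (2, 6): 13.1, (2, 12): 21.4, (2, 18): 18.0,
    }

    granules = {}
    for (day, _), value in hourly.items():
        lo, hi = granules.get(day, (value, value))
        granules[day] = (min(lo, value), max(hi, value))

    print(granules)  # {1: (14.2, 19.8), 2: (13.1, 21.4)}

Each granule keeps just enough information (a daily range) to reason about the data at a coarser, more human level of detail.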
Since the introduction of genetic algorithms in the 1970s, an enormous number of articles together with several significant monographs and books have been published on this methodology. As a result, genetic algorithms have made a major contribution to optimization, adaptation, and learning in a wide variety of unexpected fields. Over the years, many excellent books in genetic algorithm optimization have been published; however, they focus mainly on single-objective discrete or other hard optimization problems under certainty. There appears to be no book that is designed to present genetic algorithms for solving not only single-objective but also fuzzy and multiobjective optimization problems in a unified way. Genetic Algorithms And Fuzzy Multiobjective Optimization introduces the latest advances in the field of genetic algorithm optimization for 0-1 programming, integer programming, nonconvex programming, and job-shop scheduling problems under multiobjectiveness and fuzziness. In addition, the book treats a wide range of actual real world applications. The theoretical material and applications place special stress on interactive decision-making aspects of fuzzy multiobjective optimization for human-centered systems in most realistic situations when dealing with fuzziness. The intended readers of this book are senior undergraduate students, graduate students, researchers, and practitioners in the fields of operations research, computer science, industrial engineering, management science, systems engineering, and other engineering disciplines that deal with the subjects of multiobjective programming for discrete or other hard optimization problems under fuzziness. Real world research applications are used throughout the book to illustrate the presentation. These applications are drawn from complex problems. Examples include flexible scheduling in a machine center, operation planning of district heating and cooling plants, and coal purchase planning in an actual electric power plant.
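As a minimal sketch of the machinery such a book builds on, the following toy genetic algorithm attacks a small 0-1 knapsack problem with a fuzzy "value is high" goal; the data, membership function, and GA settings are arbitrary assumptions for illustration and do not reproduce any method from the book.

    # Toy genetic algorithm for a 0-1 knapsack with a fuzzy goal (illustrative only).
    import random

    values = [6, 5, 8, 3, 7]
    weights = [3, 2, 5, 1, 4]
    CAPACITY = 8

    def fitness(bits):
        value = sum(v for v, b in zip(values, bits) if b)
        weight = sum(w for w, b in zip(weights, bits) if b)
        if weight > CAPACITY:          # infeasible selections score zero
            return 0.0
        return min(1.0, value / 20.0)  # fuzzy satisfaction of "total value is high"

    def evolve(pop_size=20, generations=50, p_mut=0.1):
        random.seed(0)
        pop = [[random.randint(0, 1) for _ in values] for _ in range(pop_size)]
        for _ in range(generations):
            def pick():  # tournament selection of size 3
                return max(random.sample(pop, 3), key=fitness)
            nxt = []
            while len(nxt) < pop_size:
                a, b = pick(), pick()
                cut = random.randrange(1, len(values))  # one-point crossover
                child = a[:cut] + b[cut:]
                child = [1 - g if random.random() < p_mut else g for g in child]
                nxt.append(child)
            pop = nxt
        return max(pop, key=fitness)

    best = evolve()
    print(best, fitness(best))

Selection, crossover, and mutation are the same ingredients the book combines with interactive, fuzzy multiobjective formulations; here they are reduced to their simplest single-objective form.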
The significance of the foundational debate in mathematics that took place in the 1920s seems to have been recognized only in circles of mathematicians and philosophers. It was a period in the history of mathematics when mathematics and philosophy, usually so far away from each other, seemed to meet. The foundational debate is presented here with all its brilliant contributions and its shortcomings, its new ideas and its misunderstandings.
Fuzzy Sets in Decision Analysis, Operations Research and Statistics includes chapters on fuzzy preference modeling, multiple criteria analysis, ranking and sorting methods, group decision-making and fuzzy game theory. It also presents optimization techniques such as fuzzy linear and non-linear programming, applications to graph problems and fuzzy combinatorial methods such as fuzzy dynamic programming. In addition, the book accounts for advances in fuzzy data analysis, fuzzy statistics, and applications to reliability analysis. These topics are covered within four parts: Decision Making; Mathematical Programming; Statistics and Data Analysis; and Reliability, Maintenance and Replacement. The scope and content of the book have resulted from multiple interactions between the editor of the volume, the series editors, the series advisory board, and experts in each chapter area. Each chapter was written by a well-known researcher on the topic and reviewed by other experts in the area. These expert reviewers sometimes became co-authors because of the extent of their contribution to the chapter. As a result, twenty-five authors from twelve countries and four continents were involved in the creation of the 13 chapters, which enhances the international character of the project and gives an idea of how carefully the Handbook has been developed.
This book contains the lectures given at the NATO ASI 910820 "Cellular Automata and Cooperative Systems" meeting, which was held at the Centre de Physique des Houches, France, from June 22 to July 2, 1992. This workshop brought together mathematical physicists, theoretical physicists and mathematicians working in fields related to local interacting systems, cellular and probabilistic automata, statistical physics, and complexity theory, as well as applications of these fields. We would like to thank our sponsors and supporters whose interest and help was essential for the success of the meeting: the NATO Scientific Affairs Division, the DRET (Direction des Recherches, Études et Techniques), the Ministère des Affaires Étrangères, and the National Science Foundation. We would also like to thank all the secretaries who helped us during the preparation of the meeting, in particular Maryse Cohen-Solal (CPT, Marseille) and Janice Nowinski (Courant Institute, New York). We are grateful for the fine work of Mrs. Gladys Cavallone in preparing this volume.
Uncertainty has been of concern to engineers, managers and scientists for many centuries. In management sciences there have existed definitions of uncertainty in a rather narrow sense since the beginning of this century. In engineering and in the sciences, however, uncertainty has for a long time been considered synonymous with random, stochastic, statistic, or probabilistic. Only since the early sixties have views on uncertainty become more heterogeneous, and more tools to model uncertainty than statistics have been proposed by several scientists. The problem of modeling uncertainty adequately has become more important the more complex systems have become, the faster the scientific and engineering world develops, and the more important, but also more difficult, forecasting of future states of systems has become. The first question one should probably ask is whether uncertainty is a phenomenon, a feature of real-world systems, a state of mind, or a label for a situation in which a human being wants to make statements about phenomena, i.e., reality, models, and theories, respectively. One can also ask whether uncertainty is an objective fact or just a subjective impression which is closely related to individual persons. Whether uncertainty is an objective feature of physical real systems seems to be a philosophical question. This shall not be answered in this volume.
Assuming that the reader is familiar with sheaf theory, the book gives a self-contained introduction to the theory of constructible sheaves related to many kinds of singular spaces, such as cell complexes, triangulated spaces, semialgebraic and subanalytic sets, complex algebraic or analytic sets, stratified spaces, and quotient spaces. The relations to the underlying geometrical ideas are worked out in detail, together with many applications to the topology of such spaces. All chapters have their own detailed introduction, containing the main results and definitions, illustrated in simple terms by a number of examples. The technical details of the proofs are postponed to later sections, since these are not needed for the applications.
The theory of finite automata on finite strings, infinite strings, and trees has had a distinguished history. First, automata were introduced to represent idealized switching circuits augmented by unit delays. This was the period of Shannon, McCulloch and Pitts, and Howard Aiken, ending about 1950. Then in the 1950s there was the work of Kleene on representable events, of Myhill and Nerode on finite coset congruence relations on strings, and of Rabin and Scott on power set automata. In the 1960s, there was the work of Büchi on automata on infinite strings and the second order theory of one successor, then Rabin's 1968 result on automata on infinite trees and the second order theory of two successors. The latter was a mystery until the introduction of forgetful determinacy games by Gurevich and Harrington in 1982. Each of these developments has successful and prospective applications in computer science. They should all be part of every computer scientist's toolbox. Suppose that we take a computer scientist's point of view. One can think of finite automata as the mathematical representation of programs that run using fixed finite resources. Then Büchi's S1S can be thought of as a theory of programs which run forever (like operating systems or banking systems) and are deterministic. Finally, Rabin's S2S is a theory of programs which run forever and are nondeterministic. Indeed many questions of verification can be decided in the decidable theories of these automata.
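To make the "fixed finite resources" point of view concrete, here is a minimal sketch of a deterministic finite automaton as a program that uses only a constant-size transition table and a single state variable regardless of input length; the example language (binary strings with an even number of 1s) is our own choice, not one taken from the book.

    # A deterministic finite automaton as a fixed-resource program (illustrative).
    # It accepts binary strings containing an even number of 1s.
    TRANSITIONS = {("even", "0"): "even", ("even", "1"): "odd",
                   ("odd", "0"): "odd", ("odd", "1"): "even"}

    def accepts(word, start="even", accepting=("even",)):
        state = start
        for symbol in word:  # one table lookup per symbol: memory never grows
            state = TRANSITIONS[(state, symbol)]
        return state in accepting

    print(accepts("1011"))  # False: three 1s
    print(accepts("1001"))  # True: two 1s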