During 1996-97 MSRI held a full academic-year program on combinatorics, with special emphasis on its connections to other branches of mathematics, such as algebraic geometry, topology, commutative algebra, representation theory, and convex geometry. The rich combinatorial problems arising from the study of various algebraic structures are the subject of this book, which features work done or presented at the program's seminars. The text contains contributions on matroid bundles, combinatorial representation theory, lattice points in polyhedra, bilinear forms, combinatorial differential topology and geometry, Macdonald polynomials and geometry, enumeration of matchings, the generalized Baues problem, and Littlewood-Richardson semigroups. These expository articles, written by some of the most respected researchers in the field, present the state of the art to graduate students and researchers in combinatorics as well as in algebra, geometry, and topology.
Mountaineers use pitons to protect themselves from falls. The lead climber wears a harness to which a rope is tied. As the climber ascends, the rope is paid out by a partner on the ground. As described thus far, the climber receives no protection from the rope or the partner. However, the climber generally carries several spike-like pitons and stops when possible to drive one into a small crack or crevice in the rock face. After climbing just above the piton, the climber clips the rope to the piton, using slings and carabiners. A subsequent fall would result in the climber hanging from the piton if the piton stays in the rock, the slings and carabiners do not fail, the rope does not break, the partner is holding the rope taut and secure, and the climber has not climbed too high above the piton before falling. The climber's safety clearly depends on all of the components of the system. But the piton is distinguished because it connects the natural to the artificial. In 1987 I designed an assembly-level language for Warren Hunt's FM8501 verified microprocessor. I wanted the language to be conveniently used as the object code produced by verified compilers. Thus, I envisioned the language as the first software link in a trusted chain from verified hardware to verified applications programs. Thinking of the hardware as the "rock," I named the language "Piton."
One of the attractions of fuzzy logic is its utility in solving many real engineering problems. As many have realised, the major obstacles in building a real intelligent machine involve dealing with random disturbances, processing large amounts of imprecise data, interacting with a dynamically changing environment, and coping with uncertainty. Neural-fuzzy techniques help one to solve many of these problems. Fuzzy Logic and Intelligent Systems reflects the most recent developments in neural networks and fuzzy logic, and their application in intelligent systems. In addition, the balance between theoretical work and applications makes the book suitable for both researchers and engineers, as well as for graduate students.
Change, Choice and Inference unifies lively and significant strands of research in logic, philosophy, economics and artificial intelligence.
Fuzzy theory is an interesting name for a method that has been highly effective in a wide variety of significant, real-world applications. A few examples make this readily apparent. As the result of a faulty design of the method of computer-programmed trading, the biggest stock market crash in history was triggered by a small fraction of a percent change in the interest rate in a Western European country. A fuzzy theory approach would have weighed a number of relevant variables and the ranges of values for each of these variables. Another example, which is rather simple but pervasive, is that of an electronic thermostat that turns on heat or air conditioning at a specific temperature setting. In fact, actual comfort level involves other variables, such as humidity and the location of the sun with respect to the windows in a home, among others. Because of its great applied significance, fuzzy theory has generated widespread activity internationally; indeed, institutions devoted to research in this area have come into being. As the above examples suggest, Fuzzy Systems Theory is of fundamental importance for the analysis and design of a wide variety of dynamic systems, and this clearly manifests the fundamental importance of time considerations in the fuzzy-systems design approach to dynamic systems. This textbook by Prof. Dr. Jernej Virant provides what is evidently a uniquely significant and comprehensive treatment of this subject on the international scene.
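As a rough, hypothetical sketch of the thermostat example above (not taken from the book), a fuzzy approach replaces the single crisp setpoint with graded memberships that can combine several variables; the function names and breakpoints below are illustrative assumptions only:

```python
def too_cold(temp_c: float) -> float:
    """Degree to which the room feels too cold: 1 at or below 14 C,
    0 at or above 20 C, linear in between (hypothetical breakpoints)."""
    return max(0.0, min(1.0, (20.0 - temp_c) / 6.0))

def too_humid(rel_humidity: float) -> float:
    """Degree to which the air feels too humid: 0 up to 0.5, 1 from 0.8."""
    return max(0.0, min(1.0, (rel_humidity - 0.5) / 0.3))

def heating_drive(temp_c: float, rel_humidity: float) -> float:
    """Blend the two discomfort degrees (in this toy model, humid air is
    assumed to feel colder), instead of switching heat fully on or off at
    one crisp temperature."""
    return min(1.0, too_cold(temp_c) * (1.0 + 0.5 * too_humid(rel_humidity)))

print(heating_drive(17.0, 0.4))  # 0.5  -- partly cold, dry air
print(heating_drive(17.0, 0.9))  # 0.75 -- same temperature, humid air
```

The point of the sketch is only that the output varies smoothly with several inputs rather than flipping at one threshold, which is the behaviour the blurb's thermostat example criticises.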
L.E.J. Brouwer (1881-1966) is best known for his revolutionary ideas on topology and the foundations of mathematics (intuitionism). The present collection contains a mixture of letters; university and faculty correspondence has been included, some of which sheds light on the student years, and in particular on the exchange of letters with his PhD adviser, Korteweg. Acting as the natural sequel to the publication of Brouwer's biography, this book provides instrumental reading for those wishing to gain a deeper understanding of Brouwer and his role in the twentieth century. Striking a good balance of biographical and scientific information, the latter deals with innovations in topology (Cantor-Schoenflies style and the new topology) and foundations. The topological period in his research is well represented in correspondence with Hilbert, Schoenflies, Poincaré, Blumenthal, Lebesgue, Baire and Koebe, and foundational topics are discussed in letters exchanged with Weyl, Fraenkel, Heyting, van Dantzig and others. There is also a substantial body of correspondence on matters related to interbellum scientific politics. This book will appeal to both graduate students and researchers with an interest in topology, the history of mathematics, the foundations of mathematics, philosophy and general science.
This book introduces a new approach to building models of bounded arithmetic, with techniques drawn from recent results in computational complexity. Propositional proof systems and bounded arithmetics are closely related. In particular, proving lower bounds on the lengths of proofs in propositional proof systems is equivalent to constructing certain extensions of models of bounded arithmetic. This offers a clean and coherent framework for thinking about lower bounds for proof lengths, and it has proved quite successful in the past. This book outlines a new method for constructing models of bounded arithmetic, and thus for proving independence results and establishing lower bounds for proof lengths. The models are built from random variables defined on a sample space which is a non-standard finite set, sampled by functions of some restricted computational complexity. The book will appeal to anyone interested in logical approaches to fundamental problems in complexity theory.
In recent years, an impetuous development of new, unconventional theories, methods, techniques and technologies in computer and information sciences, systems analysis, decision-making and control, expert systems, data modelling, engineering, etc., resulted in a considerable increase of interest in adequate mathematical description and analysis of objects, phenomena, and processes which are vague or imprecise by their very nature. Classical two-valued logic and the related notion of a set, together with its mathematical consequences, are then often inadequate or insufficient formal tools, and can even become useless for applications because of their (too) categorical character: 'true - false', 'belongs - does not belong', 'is - is not', 'black - white', '0 - 1', etc. This is why one replaces classical logic by various types of many-valued logics and, on the other hand, introduces more general notions instead of or beside that of a set. Let us mention, for instance, fuzzy sets and derivative concepts, flou sets and twofold fuzzy sets, which have been created for different purposes as well as using distinct formal and informal motivations. Numerical information concerning how many elements such objects are composed of is one of the simplest and most important types of information about them. To obtain it, one needs a suitable notion of cardinality and, moreover, a possibility to calculate with such cardinalities. Unfortunately, neither fuzzy sets nor the other nonclassical concepts have been equipped with a satisfactory (nonclassical) cardinality theory.
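For orientation only (this is a standard textbook notion, not a quotation from the book): the simplest scalar cardinality of a fuzzy set $A$ on a finite universe $X$, with membership function $\mu_A : X \to [0,1]$, is its sigma-count, $\mathrm{sc}(A) = \sum_{x \in X} \mu_A(x)$. For example, a fuzzy set to which three elements belong to degrees $1$, $0.6$ and $0.4$ has sigma-count $1 + 0.6 + 0.4 = 2$; collapsing the whole membership structure into one crisp number of this kind already shows how much a genuinely satisfactory nonclassical cardinality theory has to go beyond a single value.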
This book grew out of my confusion. If logic is objective how can there be so many logics? Is there one right logic, or many right ones? Is there some underlying unity that connects them? What is the significance of the mathematical theorems about logic which I've learned if they have no connection to our everyday reasoning? The answers I propose revolve around the perception that what one pays attention to in reasoning determines which logic is appropriate. The act of abstracting from our reasoning in our usual language is the stepping stone from reasoned argument to logic. We cannot take this step alone, for we reason together: logic is reasoning which has some objective value. For you to understand my answers, or perhaps better, conjectures, I have retraced my steps: from the concrete to the abstract, from examples, to general theory, to further confirming examples, to reflections on the significance of the work.
This book presents logical foundations of dual tableaux together with a number of their applications, both to logics traditionally dealt with in mathematics and philosophy (such as modal, intuitionistic, relevant, and many-valued logics) and to various applied theories of computational logic (such as temporal reasoning, spatial reasoning, fuzzy-set-based reasoning, rough-set-based reasoning, order-of-magnitude reasoning, reasoning about programs, threshold logics, and logics of conditional decisions). The distinguishing feature of most of these applications is that the corresponding dual tableaux are built in a relational language, which provides useful means of presentation of the theories. In this way the modularity of dual tableaux is ensured: we do not need to develop and implement each dual tableau from scratch; we need only extend the relational core common to many theories with the rules specific to a particular theory.
This is the first book devoted to the systematic study of sparse graphs and sparse finite structures. Although the notion of sparsity appears in various contexts and is a typical example of a hard-to-define notion, the authors devised a unifying classification of general classes of structures. This approach is very robust and has many remarkable properties. For example, the classification is expressible in many different ways involving most extremal combinatorial invariants. This study of sparse structures has found applications in such diverse areas as algorithmic graph theory, complexity of algorithms, property testing, descriptive complexity and mathematical logic (homomorphism preservation, fixed-parameter tractability and constraint satisfaction problems). It should be stressed that, despite its generality, this approach leads to linear (and nearly linear) algorithms. Jaroslav Nesetril is a professor at Charles University, Prague; Patrice Ossona de Mendez is a CNRS researcher at EHESS, Paris. This book is related to the material presented by the first author at ICM 2010.
Originally published in 1981, this book forms volume 15 of the Encyclopedia of Mathematics and its Applications. The text provides a clear and thorough treatment of its subject, adhering to a clean exposition of the mathematical content of serious formulations of rational physical alternatives to quantum theory as elaborated in the influential works of the period, to which the authors made a significant contribution. The treatment falls into three distinct, logical parts: in the first part, the modern version of accumulated wisdom is presented, avoiding as far as possible the traditional language of classical physics because of its interpretational character; in the second part, the individual structural elements for the logical content of the theory are laid out; in the third part, the results of the second part are used to reconstruct the usual Hilbert space formulation of quantum mechanics in a novel way.
Paolo Mancosu presents a series of innovative studies in the history and the philosophy of logic and mathematics in the first half of the twentieth century. The Adventure of Reason is divided into five main sections: history of logic (from Russell to Tarski); foundational issues (Hilbert's program, constructivity, Wittgenstein, Goedel); mathematics and phenomenology (Weyl, Becker, Mahnke); nominalism (Quine, Tarski); semantics (Tarski, Carnap, Neurath). Mancosu exploits extensive untapped archival sources to make available a wealth of new material that deepens in significant ways our understanding of these fascinating areas of modern intellectual history. At the same time, the book is a contribution to recent philosophical debates, in particular on the prospects for a successful nominalist reconstruction of mathematics, the nature of finitist intuition, the viability of alternative definitions of logical consequence, and the extent to which phenomenology can hope to account for the exact sciences.
In this 1987 text Professor Jech gives a unified treatment of the various forcing methods used in set theory and presents their important applications. Product forcing, iterated forcing and proper forcing have proved powerful tools when studying the foundations of mathematics, for instance in consistency proofs. The book is based on graduate courses, though some results are also included, making it attractive to set theorists and logicians.
Graph theory meets number theory in this stimulating book. Ihara zeta functions of finite graphs are reciprocals of polynomials, sometimes in several variables. Analogies abound with number-theoretic functions such as Riemann and Dedekind zeta functions. For example, there is a Riemann hypothesis (which may be false) and a prime number theorem for graphs. Explicit constructions of graph coverings use Galois theory to generalize Cayley and Schreier graphs. Then non-isomorphic simple graphs with the same zeta function are produced, showing that you cannot hear the shape of a graph. The spectra of matrices such as the adjacency and edge adjacency matrices of a graph are essential to the plot of this book, which makes connections with quantum chaos and random matrix theory, plus expander and Ramanujan graphs of interest in computer science. Created for beginning graduate students, the book will also appeal to researchers. Many well-chosen illustrations and exercises, both theoretical and computer-based, are included throughout.
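For readers who want the formula behind "reciprocals of polynomials" (a standard statement of Ihara's theorem in Bass's form, not a quotation from the book): for a finite connected graph $G$ with adjacency matrix $A$, $Q = D - I$ (the diagonal degree matrix minus the identity), and $r = |E| - |V| + 1$,
$$\zeta_G(u)^{-1} = (1 - u^2)^{\,r-1} \det\!\left(I - Au + Qu^2\right),$$
so the Ihara zeta function, defined as a product $\prod_{[P]} \bigl(1 - u^{\ell(P)}\bigr)^{-1}$ over equivalence classes of prime cycles $P$ of length $\ell(P)$, is indeed the reciprocal of an explicit polynomial in $u$.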
This volume contains the accounts of papers delivered at the NATO Advanced Study Institute on Finite and Infinite Combinatorics in Sets and Logic, held at the Banff Centre, Alberta, Canada, from April 21 to May 4, 1991. As the title suggests, the meeting brought together workers interested in the interplay between finite and infinite combinatorics, set theory, graph theory and logic. It used to be that infinite set theory, finite combinatorics and logic could be viewed as quite separate and independent subjects. But more and more these disciplines grow together and become interdependent, with ever more problems and results appearing which concern all of them. I appreciate the financial support which was provided by the NATO Advanced Study Institute programme, the Natural Sciences and Engineering Research Council of Canada and the Department of Mathematics and Statistics of the University of Calgary. The meeting on Finite and Infinite Combinatorics in Sets and Logic followed two other meetings on discrete mathematics held in Banff, the Symposium on Ordered Sets in 1981 and the Symposium on Graphs and Order in 1984. The growing inter-relation between the different areas in discrete mathematics is perhaps best illustrated by the fact that many of the participants who were present at the previous meetings also attended this meeting on Finite and Infinite Combinatorics in Sets and Logic.
Relational mathematics is to operations research and informatics what numerical mathematics is to engineering: it is intended to help modelling, reasoning, and computing. Its applications are therefore diverse, ranging from psychology, linguistics, decision aid, and ranking to machine learning and spatial reasoning. Although many developments have been made in recent years, they have rarely been shared amongst this broad community of researchers. This comprehensive 2010 overview begins with an easy introduction to the topic, assuming a minimum of prerequisites; but it is nevertheless theoretically sound and up to date. It is suitable for applied scientists, explaining all the necessary mathematics from scratch using a multitude of visualised examples, via matrices and graphs. It ends with tangible results on the research level. The author illustrates the theory and demonstrates practical tasks in operations research, social sciences and the humanities.
This volume offers comprehensive coverage of intelligent systems, including fundamental aspects and software-, sensor-, and hardware-related issues. Moreover, the contributors to this volume provide, beyond a systematic overview of intelligent interfaces and systems, deep practical knowledge in building and using intelligent systems in various applications. Special emphasis is placed on specific aspects and requirements in applications.
Knowledge discovery is an area of computer science that attempts to uncover interesting and useful patterns in data that permit a computer to perform a task autonomously or assist a human in performing a task more efficiently. Soft Computing for Knowledge Discovery provides a self-contained and systematic exposition of the key theory and algorithms that form the core of knowledge discovery from a soft computing perspective. It focuses on knowledge representation, machine learning, and the key methodologies that make up the fabric of soft computing - fuzzy set theory, fuzzy logic, evolutionary computing, and various theories of probability (e.g. naive Bayes and Bayesian networks, Dempster-Shafer theory, mass assignment theory, and others). In addition to describing many state-of-the-art soft computing approaches to knowledge discovery, the author introduces Cartesian granule features and their corresponding learning algorithms as an intuitive approach to knowledge discovery. This new approach embraces the synergistic spirit of soft computing and exploits uncertainty in order to achieve tractability, transparency and generalization. Parallels are drawn between this approach and other well known approaches (such as naive Bayes and decision trees) leading to equivalences under certain conditions. The approaches presented are further illustrated in a battery of both artificial and real-world problems. Knowledge discovery in real-world problems, such as object recognition in outdoor scenes, medical diagnosis and control, is described in detail. These case studies provide further examples of how to apply the presented concepts and algorithms to practical problems. The author provides web page access to an online bibliography, datasets, source codes for several algorithms described in the book, and other information. Soft Computing for Knowledge Discovery is for advanced undergraduates, professionals and researchers in computer science, engineering and business information systems who work or have an interest in the dynamic fields of knowledge discovery and soft computing.
Fuzzy Set Theory and Advanced Mathematical Applications contains contributions by many of the leading experts in the field, including coverage of the mathematical foundations of the theory, decision making and systems science, and recent developments in fuzzy neural control. The book supplies a readable, practical toolkit: a clear introduction to fuzzy set theory and its evolution in mathematics, together with new results on the foundations of fuzzy set theory, decision making and systems science, and fuzzy control and neural systems. Each chapter is self-contained, providing up-to-date coverage of its subject. Audience: an important reference work for university students, researchers, and engineers working in both industrial and academic settings.
Advances in Computational Intelligence and Learning: Methods and Applications presents new developments and applications in the area of Computational Intelligence, which essentially describes methods and approaches that mimic biologically intelligent behavior in order to solve problems that have been difficult to solve by classical mathematics. Generally, Fuzzy Technology, Artificial Neural Nets and Evolutionary Computing are considered to be such approaches. The Editors have assembled new contributions in the areas of fuzzy sets, neural nets and machine learning, as well as combinations of them (so-called hybrid methods), in the first part of the book. The second part of the book is dedicated to applications in the areas that are considered to be most relevant to Computational Intelligence.
This book is about Granular Computing (GC) - an emerging conceptual and computing paradigm of information processing. As the name suggests, GC concerns the processing of complex information entities - information granules. In essence, information granules arise in the process of abstraction of data and derivation of knowledge from information. Information granules are everywhere. We commonly use granules of time (seconds, months, years). We granulate images; millions of pixels manipulated individually by computers appear to us as granules representing physical objects. In natural language, we operate on the basis of word-granules that become crucial entities used to realize interaction and communication between humans. Intuitively, we sense that information granules are at the heart of all our perceptual activities. In the past, several formal frameworks and tools, geared for processing specific information granules, have been proposed. Interval analysis, rough sets and fuzzy sets have all played an important role in knowledge representation and processing. Subsequently, information granulation and information granules arose in numerous application domains. Well-known ideas of rule-based systems dwell inherently on information granules. Qualitative modeling, being one of the leading threads of AI, operates on a level of information granules. Multi-tier architectures and hierarchical systems (such as those encountered in control engineering), planning and scheduling systems all exploit information granularity. We also utilize information granules when it comes to functionality granulation, reusability of information and efficient ways of developing underlying information infrastructures.
Since the introduction of genetic algorithms in the 1970s, an enormous number of articles, together with several significant monographs and books, have been published on this methodology. As a result, genetic algorithms have made a major contribution to optimization, adaptation, and learning in a wide variety of unexpected fields. Over the years, many excellent books on genetic algorithm optimization have been published; however, they focus mainly on single-objective discrete or other hard optimization problems under certainty. There appears to be no book that is designed to present genetic algorithms for solving not only single-objective but also fuzzy and multiobjective optimization problems in a unified way. Genetic Algorithms and Fuzzy Multiobjective Optimization introduces the latest advances in the field of genetic algorithm optimization for 0-1 programming, integer programming, nonconvex programming, and job-shop scheduling problems under multiobjectiveness and fuzziness. In addition, the book treats a wide range of real-world applications. The theoretical material and applications place special stress on interactive decision-making aspects of fuzzy multiobjective optimization for human-centered systems in most realistic situations when dealing with fuzziness. The intended readers of this book are senior undergraduate students, graduate students, researchers, and practitioners in the fields of operations research, computer science, industrial engineering, management science, systems engineering, and other engineering disciplines that deal with the subjects of multiobjective programming for discrete or other hard optimization problems under fuzziness. Real-world research applications are used throughout the book to illustrate the presentation. These applications are drawn from complex problems; examples include flexible scheduling in a machine center, operation planning of district heating and cooling plants, and coal purchase planning in an actual electric power plant.
The theory of finite automata on finite strings, infinite strings, and trees has had a distinguished history. First, automata were introduced to represent idealized switching circuits augmented by unit delays. This was the period of Shannon, McCulloch and Pitts, and Howard Aiken, ending about 1950. Then in the 1950s there was the work of Kleene on representable events, of Myhill and Nerode on finite coset congruence relations on strings, and of Rabin and Scott on power set automata. In the 1960s, there was the work of Büchi on automata on infinite strings and the second-order theory of one successor, then Rabin's 1968 result on automata on infinite trees and the second-order theory of two successors. The latter was a mystery until the introduction of forgetful determinacy games by Gurevich and Harrington in 1982. Each of these developments has successful and prospective applications in computer science. They should all be part of every computer scientist's toolbox. Suppose that we take a computer scientist's point of view. One can think of finite automata as the mathematical representation of programs that run using fixed finite resources. Then Büchi's S1S can be thought of as a theory of programs which run forever (like operating systems or banking systems) and are deterministic. Finally, Rabin's S2S is a theory of programs which run forever and are nondeterministic. Indeed many questions of verification can be decided in the decidable theories of these automata.
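As a toy illustration of "programs that run using fixed finite resources" (not drawn from the book), the sketch below runs a deterministic finite automaton; the two-state example language, binary strings with an even number of 1s, is chosen purely for illustration:

```python
from typing import Dict, Set, Tuple

def run_dfa(delta: Dict[Tuple[str, str], str],
            start: str, accepting: Set[str], word: str) -> bool:
    """Run a DFA: a fixed finite set of states, one transition per
    (state, symbol) pair, and no memory beyond the current state."""
    state = start
    for symbol in word:
        state = delta[(state, symbol)]
    return state in accepting

# Two states suffice to track the parity of the 1s read so far.
even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
print(run_dfa(even_ones, "even", {"even"}, "1001"))   # True: two 1s
print(run_dfa(even_ones, "even", {"even"}, "10110"))  # False: three 1s
```

However long the input string, the machine's resources stay fixed at the two states declared up front, which is the sense in which the blurb likens finite automata to resource-bounded programs.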
The significance of the foundational debate in mathematics that took place in the 1920s seems to have been recognized only in circles of mathematicians and philosophers. It was a period in the history of mathematics when mathematics and philosophy, usually so far away from each other, seemed to meet. The foundational debate is presented here with all its brilliant contributions and its shortcomings, its new ideas and its misunderstandings.