Originally published in 1981, this book forms volume 15 of the Encyclopedia of Mathematics and its Applications. The text provides a clear and thorough treatment of its subject, offering a clean exposition of the mathematical content of serious formulations of rational physical alternatives to quantum theory, as elaborated in the influential works of the period, to which the authors made a significant contribution. The treatment falls into three distinct, logical parts. In the first part, the modern version of accumulated wisdom is presented, avoiding as far as possible the traditional language of classical physics because of its interpretational character. In the second part, the individual structural elements of the logical content of the theory are laid out. In the third part, the results of the second part are used to reconstruct the usual Hilbert space formulation of quantum mechanics in a novel way.
Graph theory meets number theory in this stimulating book. Ihara zeta functions of finite graphs are reciprocals of polynomials, sometimes in several variables. Analogies abound with number-theoretic functions such as Riemann/Dedekind zeta functions. For example, there is a Riemann hypothesis (which may be false) and a prime number theorem for graphs. Explicit constructions of graph coverings use Galois theory to generalize Cayley and Schreier graphs. Then non-isomorphic simple graphs with the same zeta function are produced, showing you cannot hear the shape of a graph. The spectra of matrices such as the adjacency and edge adjacency matrices of a graph are essential to the plot of this book, which makes connections with quantum chaos and random matrix theory, plus expander/Ramanujan graphs of interest in computer science. Created for beginning graduate students, the book will also appeal to researchers. Many well-chosen illustrations and exercises, both theoretical and computer-based, are included throughout.
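As a small illustration of the statement that Ihara zeta functions of finite graphs are reciprocals of polynomials, the following sketch (the helper name and the choice of Bass's determinant formula are ours, not taken from the book) evaluates 1/ζ(u) and checks it against the known closed form (1 - u³)² for the triangle graph.

```python
# Bass's determinant formula for the reciprocal Ihara zeta function:
#   1/zeta(u) = (1 - u^2)^(r-1) * det(I - A*u + Q*u^2)
# where A is the adjacency matrix, Q = D - I (D the degree matrix),
# and r = |E| - |V| + 1 is the first Betti number of the graph.
import numpy as np

def ihara_zeta_reciprocal(A, u):
    """Evaluate 1/zeta(u) for the graph with adjacency matrix A."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    degrees = A.sum(axis=1)
    m = int(A.sum()) // 2          # number of edges
    r = m - n + 1                  # first Betti number
    Q = np.diag(degrees - 1.0)
    I = np.eye(n)
    return (1 - u**2) ** (r - 1) * np.linalg.det(I - A * u + Q * u**2)

# Sanity check on the triangle C3: for the n-cycle, 1/zeta(u) = (1 - u^n)^2.
A3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
for u in (0.1, 0.25, 0.5):
    assert abs(ihara_zeta_reciprocal(A3, u) - (1 - u**3) ** 2) < 1e-12
```

The check confirms the "reciprocal of a polynomial" claim numerically: for the triangle the determinant formula reproduces (1 - u³)² exactly, up to floating-point error.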
Many mechanics and physics problems have variational formulations making them appropriate for numerical treatment by finite element techniques and efficient iterative methods. This book describes the mathematical background and reviews the techniques for solving problems, including those that require large computations such as transonic flows for compressible fluids and the Navier-Stokes equations for incompressible viscous fluids. Finite element approximations and non-linear relaxation, augmented Lagrangians, and nonlinear least square methods are all covered in detail, as are many applications. "Numerical Methods for Nonlinear Variational Problems," originally published in the Springer Series in Computational Physics, is a classic in applied mathematics and computational physics and engineering. This long-awaited softcover re-edition is still a valuable resource for practitioners in industry and physics and for advanced students.
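The augmented Lagrangian technique mentioned above can be sketched on a toy problem (the problem instance, penalty parameter and iteration count below are illustrative choices, not from the book): minimize x² + y² subject to x + y = 1, whose solution is x = y = 1/2 with multiplier λ = -1.

```python
# Augmented Lagrangian sketch for: minimize x^2 + y^2  s.t.  x + y = 1.
# Each outer iteration minimizes
#   L(x, y) = x^2 + y^2 + lam*(x + y - 1) + (rho/2)*(x + y - 1)^2
# exactly (it is an unconstrained quadratic), then updates the
# multiplier with lam += rho * (constraint violation).
rho, lam = 10.0, 0.0
x = y = 0.0
for _ in range(50):
    # Stationarity: 2x + lam + rho*(x + y - 1) = 0 (same for y),
    # so x = y and (2 + 2*rho)*x = rho - lam.
    x = y = (rho - lam) / (2.0 + 2.0 * rho)
    lam += rho * (x + y - 1.0)

assert abs(x - 0.5) < 1e-9 and abs(lam + 1.0) < 1e-9
```

The multiplier error contracts by a factor 1/(1 + ρ) per outer iteration here, so convergence to (1/2, 1/2) and λ = -1 is rapid; for nonlinear problems the inner minimization would of course be iterative rather than closed-form.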
This volume contains the accounts of papers delivered at the NATO Advanced Study Institute on Finite and Infinite Combinatorics in Sets and Logic held at the Banff Centre, Alberta, Canada from April 21 to May 4, 1991. As the title suggests, the meeting brought together workers interested in the interplay between finite and infinite combinatorics, set theory, graph theory and logic. It used to be that infinite set theory, finite combinatorics and logic could be viewed as quite separate and independent subjects. But more and more those disciplines grow together and become interdependent, with ever more problems and results appearing which concern all of those disciplines. I appreciate the financial support which was provided by the NATO Advanced Study Institute programme, the Natural Sciences and Engineering Research Council of Canada and the Department of Mathematics and Statistics of the University of Calgary. The meeting on Finite and Infinite Combinatorics in Sets and Logic followed two other meetings on discrete mathematics held in Banff, the Symposium on Ordered Sets in 1981 and the Symposium on Graphs and Order in 1984. The growing inter-relation between the different areas in discrete mathematics is maybe best illustrated by the fact that many of the participants who were present at the previous meetings also attended this meeting on Finite and Infinite Combinatorics in Sets and Logic.
This monograph studies the logical aspects of domains as used in denotational semantics of programming languages. Frameworks of domain logics are introduced; these serve as foundations for systematic derivations of proof systems from denotational semantics of programming languages. Any proof system so derived is guaranteed to agree with denotational semantics in the sense that the denotation of any program coincides with the set of assertions true of it. The study focuses on two categories for denotational semantics: SFP domains, and the less standard, but important, category of stable domains. The intended readership of this monograph includes researchers and graduate students interested in the relation between semantics of programming languages and formal means of reasoning about programs. A basic knowledge of denotational semantics, mathematical logic, general topology, and category theory is helpful for a full understanding of the material. Part I, SFP Domains. Chapter 1, Introduction. This chapter provides a brief exposition of domain theory, denotational semantics, program logics, and proof systems. It discusses the importance of ideas and results on logic and topology to the understanding of the relation between denotational semantics and program logics. It also describes the motivation for the work presented by this monograph, and how that work fits into a more general program. Finally, it gives a short summary of the results of each chapter. 1.1 Domain Theory. Programming languages are languages with which to perform computation.
Before his death in March, 1976, A. H. Lightstone delivered the manuscript for this book to Plenum Press. Because he died before the editorial work on the manuscript was completed, I agreed (in the fall of 1976) to serve as a surrogate author and to see the project through to completion. I have changed the manuscript as little as possible, altering certain passages to correct oversights. But the alterations are minor; this is Lightstone's book. (H. B. Enderton) This is a treatment of the predicate calculus in a form that serves as a foundation for nonstandard analysis. Classically, the predicates and variables of the predicate calculus are kept distinct, inasmuch as no variable is also a predicate; moreover, each predicate is assigned an order, a unique natural number that indicates the length of each tuple to which the predicate can be prefixed. These restrictions are dropped here, in order to develop a flexible, expressive language capable of exploiting the potential of nonstandard analysis. To assist the reader in grasping the basic ideas of logic, we begin in Part I by presenting the propositional calculus and statement systems. This provides a relatively simple setting in which to grapple with the sometimes foreign ideas of mathematical logic. These ideas are repeated in Part II, where the predicate calculus and semantical systems are studied.
Knowledge discovery is an area of computer science that attempts to uncover interesting and useful patterns in data that permit a computer to perform a task autonomously or assist a human in performing a task more efficiently. Soft Computing for Knowledge Discovery provides a self-contained and systematic exposition of the key theory and algorithms that form the core of knowledge discovery from a soft computing perspective. It focuses on knowledge representation, machine learning, and the key methodologies that make up the fabric of soft computing - fuzzy set theory, fuzzy logic, evolutionary computing, and various theories of probability (e.g. naive Bayes and Bayesian networks, Dempster-Shafer theory, mass assignment theory, and others). In addition to describing many state-of-the-art soft computing approaches to knowledge discovery, the author introduces Cartesian granule features and their corresponding learning algorithms as an intuitive approach to knowledge discovery. This new approach embraces the synergistic spirit of soft computing and exploits uncertainty in order to achieve tractability, transparency and generalization. Parallels are drawn between this approach and other well known approaches (such as naive Bayes and decision trees) leading to equivalences under certain conditions. The approaches presented are further illustrated in a battery of both artificial and real-world problems. Knowledge discovery in real-world problems, such as object recognition in outdoor scenes, medical diagnosis and control, is described in detail. These case studies provide further examples of how to apply the presented concepts and algorithms to practical problems. The author provides web page access to an online bibliography, datasets, source codes for several algorithms described in the book, and other information. 
Soft Computing for Knowledge Discovery is for advanced undergraduates, professionals and researchers in computer science, engineering and business information systems who work or have an interest in the dynamic fields of knowledge discovery and soft computing.
Fuzzy Set Theory and Advanced Mathematical Applications contains contributions by many of the leading experts in the field, including coverage of the mathematical foundations of the theory, decision making and systems science, and recent developments in fuzzy neural control. The book supplies a readable, practical toolkit with a clear introduction to fuzzy set theory and its evolution in mathematics and new results on foundations of fuzzy set theory, decision making and systems science, and fuzzy control and neural systems. Each chapter is self-contained, providing up-to-date coverage of its subject. Audience: An important reference work for university students, and researchers and engineers working in both industrial and academic settings.
The theory of fuzzy sets became known in Czechoslovakia in the early seventies. Since then, it has been applied in various areas of science, engineering and economics where indeterminate concepts had to be handled. There have been a number of national seminars and conferences devoted to this topic. However, the International Symposium on Fuzzy Approach to Reasoning and Decision-Making, held in 1990, was the first really representative international meeting of this kind organized in Czechoslovakia. The symposium took place in the House of Scientists of the Czechoslovak Academy of Sciences in Bechyně from June 25 till 29, 1990. Its main organizer was the Mining Institute of the Czechoslovak Academy of Sciences in Ostrava, in cooperation with, and with the support of, several other institutions and organizations. A crucial role in preparing the Symposium was played by the working group for Fuzzy Sets and Systems, which is active within the Society of Czechoslovak Mathematicians and Physicists. The organizing and program committee was headed by Dr. Vilém Novák from the Mining Institute in Ostrava. Its members (in alphabetical order) were Dr. Martin Černý (Prague), Prof. Blahoslav Harman (Liptovský Mikuláš), Ema Hyklová (Prague), Prof. Zdeněk Karpíšek (Brno), Jan Laub (Prague), Dr. Milan Mareš - vice-chairman (Prague), Prof. Radko Mesiar (Bratislava), Dr. Jiří Nekola - vice-chairman (Prague), Daria Nováková (Ostrava), Dr. Jaroslav Ramík (Ostrava), Prof. Dr. Beloslav Riečan (Bratislava), Dr. Jana Talašová (Přerov) and Dr. Miloš Vítek (Pardubice).
Games, Norms, and Reasons: Logic at the Crossroads provides an overview of modern logic focusing on its relationships with other disciplines, including new interfaces with rational choice theory, epistemology, game theory and informatics. This book continues a series called "Logic at the Crossroads" whose title reflects a view that the deep insights from the classical phase of mathematical logic can form a harmonious mixture with a new, more ambitious research agenda of understanding and enhancing human reasoning and intelligent interaction. The editors have gathered together articles from active authors in this new area that explore dynamic logical aspects of norms, reasons, preferences and beliefs in human agency, human interaction and groups. The book pays a special tribute to Professor Rohit Parikh, a pioneer in this movement.
The subject of this book is Semi-Infinite Algebra, or more specifically, Semi-Infinite Homological Algebra. The term "semi-infinite" is loosely associated with objects that can be viewed as extending in both a "positive" and a "negative" direction, with some natural position in between, perhaps defined up to a "finite" movement. Geometrically, this would mean an infinite-dimensional variety with a natural class of "semi-infinite" cycles or subvarieties, having always a finite codimension in each other, but infinite dimension and codimension in the whole variety [37]. (For further instances of semi-infinite mathematics see, e.g., [38] and [57], and references below.) Examples of algebraic objects of the semi-infinite type range from certain infinite-dimensional Lie algebras to locally compact totally disconnected topological groups to ind-schemes of ind-infinite type to discrete valuation fields. From an abstract point of view, these are ind-pro-objects in various categories, often endowed with additional structures. One contribution we make in this monograph is the demonstration of another class of algebraic objects that should be thought of as "semi-infinite", even though they do not at first glance look quite similar to the ones in the above list. These are semialgebras over coalgebras, or more generally over corings - the associative algebraic structures of semi-infinite nature. The subject lies on the border of Homological Algebra with Representation Theory, and the introduction of semialgebras into it provides an additional link with the theory of corings [23], as the semialgebras are the natural objects dual to corings.
also in: THE KLUWER INTERNATIONAL SERIES ON ASIAN STUDIES IN COMPUTER AND INFORMATION SCIENCE, Volume 2
Advances in Computational Intelligence and Learning: Methods and Applications presents new developments and applications in the area of Computational Intelligence, which essentially describes methods and approaches that mimic biologically intelligent behavior in order to solve problems that have been difficult to solve by classical mathematics. Generally Fuzzy Technology, Artificial Neural Nets and Evolutionary Computing are considered to be such approaches. The Editors have assembled new contributions in the areas of fuzzy sets, neural nets and machine learning, as well as combinations of them (so called hybrid methods) in the first part of the book. The second part of the book is dedicated to applications in the areas that are considered to be most relevant to Computational Intelligence.
A new foundation of Topology, summarized under the name Convenient Topology, is considered such that several deficiencies of topological and uniform spaces are remedied. This does not mean that these spaces are superfluous. It means exactly that a better framework for handling problems of a topological nature is used. In this setting semiuniform convergence spaces play an essential role. They include not only convergence structures such as topological structures and limit space structures, but also uniform convergence structures such as uniform structures and uniform limit space structures, and they are suitable for studying continuity, Cauchy continuity and uniform continuity as well as convergence structures in function spaces, e.g. simple convergence, continuous convergence and uniform convergence. Various interesting results are presented which cannot be obtained by using topological or uniform spaces in the usual context. The text is self-contained with the exception of the last chapter, where the intuitive concept of nearness is incorporated in Convenient Topology (there exist already excellent expositions on nearness spaces).
Every mathematician agrees that every mathematician must know some set theory; the disagreement begins in trying to decide how much is some. This book contains my answer to that question. The purpose of the book is to tell the beginning student of advanced mathematics the basic set theoretic facts of life, and to do so with the minimum of philosophical discourse and logical formalism. The point of view throughout is that of a prospective mathematician anxious to study groups, or integrals, or manifolds. From this point of view the concepts and methods of this book are merely some of the standard mathematical tools; the expert specialist will find nothing new here. Scholarly bibliographical credits and references are out of place in a purely expository book such as this one. The student who gets interested in set theory for its own sake should know, however, that there is much more to the subject than there is in this book. One of the most beautiful sources of set-theoretic wisdom is still Hausdorff's Set theory. A recent and highly readable addition to the literature, with an extensive and up-to-date bibliography, is Axiomatic set theory by Suppes.
Computer systems that analyze images are critical to a wide variety of applications such as visual inspection systems for various manufacturing processes, remote sensing of the environment from space-borne imaging platforms, and automatic diagnosis from X-rays and other medical imaging sources. Professor Azriel Rosenfeld, the founder of the field of digital image analysis, made fundamental contributions to a wide variety of problems in image processing, pattern recognition and computer vision. Professor Rosenfeld's previous students, postdoctoral scientists, and colleagues illustrate in Foundations of Image Understanding how current research has been influenced by his work as the leading researcher in the area of image analysis for over two decades. Each chapter of Foundations of Image Understanding is written by one of the world's leading experts in his area of specialization, examining digital geometry and topology (early research which laid the foundations for many industrial machine vision systems), edge detection and segmentation (fundamental to systems that analyze complex images of our three-dimensional world), multi-resolution and variable resolution representations for images and maps, parallel algorithms and systems for image analysis, and the importance of human psychophysical studies of vision to the design of computer vision systems. Professor Rosenfeld's chapter briefly discusses topics not covered in the contributed chapters, providing a personal, historical perspective on the development of the field of image understanding. Foundations of Image Understanding is an excellent source of basic material for both graduate students entering the field and established researchers who require a compact source for many of the foundational topics in image analysis.
This book is about Granular Computing (GC) - an emerging conceptual and computing paradigm of information processing. As the name suggests, GC concerns processing of complex information entities - information granules. In essence, information granules arise in the process of abstraction of data and derivation of knowledge from information. Information granules are everywhere. We commonly use granules of time (seconds, months, years). We granulate images; millions of pixels manipulated individually by computers appear to us as granules representing physical objects. In natural language, we operate on the basis of word-granules that become crucial entities used to realize interaction and communication between humans. Intuitively, we sense that information granules are at the heart of all our perceptual activities. In the past, several formal frameworks and tools, geared for processing specific information granules, have been proposed. Interval analysis, rough sets and fuzzy sets have all played an important role in knowledge representation and processing. Subsequently, information granulation and information granules arose in numerous application domains. Well-known ideas of rule-based systems dwell inherently on information granules. Qualitative modeling, being one of the leading threads of AI, operates on a level of information granules. Multi-tier architectures and hierarchical systems (such as those encountered in control engineering), planning and scheduling systems all exploit information granularity. We also utilize information granules when it comes to functionality granulation, reusability of information and efficient ways of developing underlying information infrastructures.
Since the introduction of genetic algorithms in the 1970s, an enormous number of articles together with several significant monographs and books have been published on this methodology. As a result, genetic algorithms have made a major contribution to optimization, adaptation, and learning in a wide variety of unexpected fields. Over the years, many excellent books in genetic algorithm optimization have been published; however, they focus mainly on single-objective discrete or other hard optimization problems under certainty. There appears to be no book that is designed to present genetic algorithms for solving not only single-objective but also fuzzy and multiobjective optimization problems in a unified way. Genetic Algorithms And Fuzzy Multiobjective Optimization introduces the latest advances in the field of genetic algorithm optimization for 0-1 programming, integer programming, nonconvex programming, and job-shop scheduling problems under multiobjectiveness and fuzziness. In addition, the book treats a wide range of actual real world applications. The theoretical material and applications place special stress on interactive decision-making aspects of fuzzy multiobjective optimization for human-centered systems in most realistic situations when dealing with fuzziness. The intended readers of this book are senior undergraduate students, graduate students, researchers, and practitioners in the fields of operations research, computer science, industrial engineering, management science, systems engineering, and other engineering disciplines that deal with the subjects of multiobjective programming for discrete or other hard optimization problems under fuzziness. Real world research applications are used throughout the book to illustrate the presentation. These applications are drawn from complex problems. 
Examples include flexible scheduling in a machine center, operation planning of district heating and cooling plants, and coal purchase planning in an actual electric power plant.
Rule-based fuzzy modeling has been recognised as a powerful technique for the modeling of partly-known nonlinear systems. Fuzzy models can effectively integrate information from different sources, such as physical laws, empirical models, measurements and heuristics. Application areas of fuzzy models include prediction, decision support, system analysis, control design, etc. Fuzzy Modeling for Control addresses fuzzy modeling from the systems and control engineering points of view. It focuses on the selection of appropriate model structures, on the acquisition of dynamic fuzzy models from process measurements (fuzzy identification), and on the design of nonlinear controllers based on fuzzy models. To automatically generate fuzzy models from measurements, a comprehensive methodology is developed which employs fuzzy clustering techniques to partition the available data into subsets characterized by locally linear behaviour. The relationships between the presented identification method and linear regression are exploited, allowing for the combination of fuzzy logic techniques with standard system identification tools. Attention is paid to the trade-off between the accuracy and transparency of the obtained fuzzy models. Control design based on a fuzzy model of a nonlinear dynamic process is addressed, using the concepts of model-based predictive control and internal model control with an inverted fuzzy model. To this end, methods to exactly invert specific types of fuzzy models are presented. In the context of predictive control, branch-and-bound optimization is applied. The main features of the presented techniques are illustrated by means of simple examples. In addition, three real-world applications are described. Finally, software tools for building fuzzy models from measurements are available from the author.
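The idea of blending locally linear models through fuzzy membership functions, described above, can be sketched in a few lines (the rule base, membership shapes and target function here are our own illustrative choices, not the book's identification method): two rules with Gaussian memberships blend two local linear models to approximate y = |x|.

```python
# A minimal Takagi-Sugeno-style fuzzy model: each rule pairs a Gaussian
# membership function with a local linear model; the prediction is the
# membership-weighted average of the local model outputs.
import math

# Rule 1: "x is negative" -> y = -x ;  Rule 2: "x is positive" -> y = x
centers = [-1.0, 1.0]
sigma = 1.0
local_models = [lambda x: -x, lambda x: x]

def membership(x, c):
    """Gaussian degree of fulfilment of a rule with center c."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

def predict(x):
    w = [membership(x, c) for c in centers]
    y = [m(x) for m in local_models]
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

assert predict(0.0) == 0.0              # rules balance exactly at x = 0
assert abs(predict(2.0) - 2.0) < 0.1    # close to |x| away from the overlap
assert abs(predict(-2.0) - 2.0) < 0.1
```

In the identification methodology the book describes, the rule centers and local models would be extracted from data by fuzzy clustering rather than written down by hand as they are here.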
The significance of the foundational debate in mathematics that took place in the 1920s seems to have been recognized only in circles of mathematicians and philosophers. It was a period in the history of mathematics when mathematics and philosophy, usually so far away from each other, seemed to meet. The foundational debate is presented here with all its brilliant contributions and its shortcomings, its new ideas and its misunderstandings.
Assuming that the reader is familiar with sheaf theory, the book gives a self-contained introduction to the theory of constructible sheaves related to many kinds of singular spaces, such as cell complexes, triangulated spaces, semialgebraic and subanalytic sets, complex algebraic or analytic sets, stratified spaces, and quotient spaces. The relations to the underlying geometrical ideas are worked out in detail, together with many applications to the topology of such spaces. All chapters have their own detailed introduction, containing the main results and definitions, illustrated in simple terms by a number of examples. The technical details of the proofs are postponed to later sections, since these are not needed for the applications.
This book contains the lectures given at the NATO ASI 910820 "Cellular Automata and Cooperative Systems" Meeting which was held at the Centre de Physique des Houches, France, from June 22 to July 2, 1992. This workshop brought together mathematical physicists, theoretical physicists and mathematicians working in fields related to local interacting systems, cellular and probabilistic automata, statistical physics, and complexity theory, as well as applications of these fields. We would like to thank our sponsors and supporters whose interest and help was essential for the success of the meeting: the NATO Scientific Affairs Division, the DRET (Direction des Recherches, Etudes et Techniques), the Ministere des Affaires Etrangeres, the National Science Foundation. We would also like to thank all the secretaries who helped us during the preparation of the meeting, in particular Maryse Cohen-Solal (CPT, Marseille) and Janice Nowinski (Courant Institute, New York). We are grateful for the fine work of Mrs. Gladys Cavallone in preparing this volume.
Uncertainty has been of concern to engineers, managers and scientists for many centuries. In management sciences there have existed definitions of uncertainty in a rather narrow sense since the beginning of this century. In engineering and the sciences, however, uncertainty has for a long time been considered as synonymous with random, stochastic, statistic, or probabilistic. Only since the early sixties have views on uncertainty become more heterogeneous, and more tools to model uncertainty than statistics have been proposed by several scientists. The problem of modeling uncertainty adequately has become more important the more complex systems have become, the faster the scientific and engineering world develops, and the more important, but also more difficult, forecasting of future states of systems has become. The first question one should probably ask is whether uncertainty is a phenomenon, a feature of real world systems, a state of mind or a label for a situation in which a human being wants to make statements about phenomena, i.e., reality, models, and theories, respectively. One can also ask whether uncertainty is an objective fact or just a subjective impression which is closely related to individual persons. Whether uncertainty is an objective feature of physical real systems seems to be a philosophical question. This shall not be answered in this volume.
Fuzzy Sets in Decision Analysis, Operations Research and Statistics includes chapters on fuzzy preference modeling, multiple criteria analysis, ranking and sorting methods, group decision-making and fuzzy game theory. It also presents optimization techniques such as fuzzy linear and non-linear programming, applications to graph problems and fuzzy combinatorial methods such as fuzzy dynamic programming. In addition, the book also accounts for advances in fuzzy data analysis, fuzzy statistics, and applications to reliability analysis. These topics are covered within four parts: * Decision Making, * Mathematical Programming, * Statistics and Data Analysis, and * Reliability, Maintenance and Replacement. The scope and content of the book has resulted from multiple interactions between the editor of the volume, the series editors, the series advisory board, and experts in each chapter area. Each chapter was written by a well-known researcher on the topic and reviewed by other experts in the area. These expert reviewers sometimes became co-authors because of the extent of their contribution to the chapter. As a result, twenty-five authors from twelve countries and four continents were involved in the creation of the 13 chapters, which enhances the international character of the project and gives an idea of how carefully the Handbook has been developed.
The theory of finite automata on finite strings, infinite strings, and trees has had a distinguished history. First, automata were introduced to represent idealized switching circuits augmented by unit delays. This was the period of Shannon, McCulloch and Pitts, and Howard Aiken, ending about 1950. Then in the 1950s there was the work of Kleene on representable events, of Myhill and Nerode on finite coset congruence relations on strings, of Rabin and Scott on power set automata. In the 1960s, there was the work of Büchi on automata on infinite strings and the second order theory of one successor, then Rabin's 1968 result on automata on infinite trees and the second order theory of two successors. The latter was a mystery until the introduction of forgetful determinacy games by Gurevich and Harrington in 1982. Each of these developments has successful and prospective applications in computer science. They should all be part of every computer scientist's toolbox. Suppose that we take a computer scientist's point of view. One can think of finite automata as the mathematical representation of programs that run using fixed finite resources. Then Büchi's S1S can be thought of as a theory of programs which run forever (like operating systems or banking systems) and are deterministic. Finally, Rabin's S2S is a theory of programs which run forever and are nondeterministic. Indeed many questions of verification can be decided in the decidable theories of these automata.
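To make the "programs that run using fixed finite resources" picture concrete, here is a minimal finite automaton sketch (an illustrative example of ours, not one from the book): a two-state deterministic automaton over the alphabet {a, b} that accepts exactly the strings containing an even number of a's.

```python
# A two-state DFA over {a, b} accepting strings with an even number of
# 'a' characters. State 0 is both the start state and the only
# accepting state; the transition table is the automaton's entire
# "fixed finite resource".
transitions = {
    (0, "a"): 1, (0, "b"): 0,
    (1, "a"): 0, (1, "b"): 1,
}

def accepts(word):
    """Run the DFA on word and report whether it ends in an accepting state."""
    state = 0
    for ch in word:
        state = transitions[(state, ch)]
    return state == 0

assert accepts("") and accepts("bb") and accepts("abab")
assert not accepts("ab") and not accepts("aaa")
```

However long the input, the machine touches only its current state and the transition table, which is exactly the "fixed finite resources" reading of finite automata given above.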