This work grew out of Errett Bishop's fundamental treatise 'Foundations of Constructive Analysis' (FCA), which appeared in 1967 and which contained the bountiful harvest of a remarkably short period of research by its author. Truly, FCA was an exceptional book, not only because of the quantity of original material it contained, but also as a demonstration of the practicability of a program which most mathematicians believed impossible to carry out. Errett's book went out of print shortly after its publication, and no second edition was produced by its publishers. Some years later, 'by a set of curious chances', it was agreed that a new edition of FCA would be published by Springer Verlag, the revision being carried out by me under Errett's supervision; at the same time, Errett generously insisted that I become a joint author. The revision turned out to be much more substantial than we had anticipated, and took longer than we would have wished. Indeed, tragically, Errett died before the work was completed. The present book is the result of our efforts. Although substantially based on FCA, it contains so much new material, and such full revision and expansion of the old, that it is essentially a new book. For this reason, and also to preserve the integrity of the original, I decided to give our joint work a title of its own. Most of the new material outside Chapter 5 originated with Errett.
In 1965 Juris Hartmanis and Richard E. Stearns published a paper "On the Computational Complexity of Algorithms." The field of complexity theory takes its name from this seminal paper and many of the major concepts and issues of complexity theory were introduced by Hartmanis in subsequent work. In honor of the contribution of Juris Hartmanis to the field of complexity theory, a special session of invited talks by Richard E. Stearns, Allan Borodin and Paul Young was held at the third annual meeting of the Structure in Complexity conference, and the first three chapters of this book are the final versions of these talks. They recall intellectual and professional trends in Hartmanis' contributions. All but one of the remainder of the chapters in this volume originated as a presentation at one of the recent meetings of the Structure in Complexity Theory Conference and appeared in preliminary form in the conference proceedings. In all, these expositions form an excellent description of much of contemporary complexity theory.
It is widely assumed that there exist certain objects which can in no way be distinguished from each other, unless by their location in space or other reference-system. Some of these are, in a broad sense, 'empirical objects', such as electrons. Their case would seem to be similar to that of certain mathematical 'objects', such as the minimum set of manifolds defining the dimensionality of an R-space. It is therefore at first sight surprising that there exists no branch of mathematics in which a third parity-relation, besides equality and inequality, is admitted; for this would seem to furnish an appropriate model for application to such instances as these. I hope, in this work, to show that such a mathematics is feasible, and could have useful applications if only in a limited field. The concept of what I here call 'indistinguishability' is not unknown in logic, albeit much neglected. It is mentioned, for example, by F. P. Ramsey [1], who criticizes Whitehead and Russell [2] for defining 'identity' in such a way as to make indistinguishables identical. But, so far as I can discover, no one has made any systematic attempt to open up the territory which lies behind these ideas. What we find, on doing so, is a body of mathematics, offering only a limited prospect of practical usefulness, but which on the theoretical side presents a strong challenge to conventional ideas.
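To make the intended 'third parity-relation' concrete: one standard reading of indistinguishability in the logical literature (our illustration, not necessarily the book's own axiomatization) is a tolerance relation, which behaves like equality except that transitivity is not assumed, so chains of pairwise indistinguishable items may connect distinguishable ones:

```latex
% A tolerance relation \approx as a hedged model of 'indistinguishability':
x \approx x \quad \text{(reflexivity)}, \qquad
x \approx y \;\Rightarrow\; y \approx x \quad \text{(symmetry)}, \qquad
x \approx y \,\wedge\, y \approx z \;\not\Rightarrow\; x \approx z \quad \text{(transitivity may fail)}.
```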
Edited in collaboration with FoLLI, the Association of Logic, Language and Information, this book constitutes the refereed proceedings of the Third International Workshop on Logic, Rationality, and Interaction, LORI 2011, held in Guangzhou, China, in October 2011. The 25 revised full papers presented together with 12 posters were carefully reviewed and selected from 52 submissions. Among the topics covered are semantic models for knowledge, for belief, and for uncertainty; dynamic logics of knowledge, information flow, and action; logical analysis of the structure of games; belief revision, belief merging; logics and preferences, compact preference representation; logics of intentions, plans, and goals; logics of probability and uncertainty; logical approaches to decision making and planning; argument systems and their role in interaction; norms, normative interaction, and normative multiagent systems; and logical and computational approaches to social choice.
With the vision that machines can be rendered smarter, we have witnessed for more than a decade tremendous engineering efforts to implement intelligent systems. These attempts involve emulating human reasoning, and researchers have tried to model such reasoning from various points of view. But we know precious little about human reasoning processes, learning mechanisms and the like, and in particular about reasoning with limited, imprecise knowledge. In a sense, intelligent systems are machines which use the most general form of human knowledge together with human reasoning capability to reach decisions. Thus the general problem of reasoning with knowledge is the core of design methodology. The attempt to use human knowledge in its most natural sense, that is, through linguistic descriptions, is novel and controversial. The novelty lies in the recognition of a new type of uncertainty, namely fuzziness in natural language, and the controversy lies in the mathematical modeling process. As R. Bellman [7] once said, decision making under uncertainty is one of the attributes of human intelligence. When uncertainty is understood as the impossibility to predict occurrences of events, the context is familiar to statisticians. As such, efforts to use probability theory as an essential tool for building intelligent systems have been pursued (Pearl [203], Neapolitan [182]). The methodology seems alright if the uncertain knowledge in a given problem can be modeled as probability measures.
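For readers new to the area, the formal object behind 'fuzziness in natural language' is the fuzzy set in Zadeh's sense; a minimal statement (the numeric thresholds below are illustrative choices, not taken from the book):

```latex
% A fuzzy set A on a universe X is its membership function \mu_A : X \to [0,1].
% E.g. for the linguistic predicate 'tall', over heights h in cm:
\mu_{\text{tall}}(h) =
\begin{cases}
  0 & h \le 160,\\
  (h-160)/30 & 160 < h < 190,\\
  1 & h \ge 190.
\end{cases}
```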
Operations Research is a field whose major contribution has been to propose a rigorous formulation of often ill-defined problems pertaining to the organization or the design of large scale systems, such as resource allocation problems, scheduling and the like. While this effort did help a lot in understanding the nature of these problems, the mathematical models have proved only partially satisfactory due to the difficulty in gathering precise data, and in formulating objective functions that reflect the multi-faceted notion of optimal solution according to human experts. In this respect linear programming is a typical example of an impressive achievement of Operations Research, but in its deterministic form it is not always adapted to real world decision-making: everything must be expressed in terms of linear constraints; yet the coefficients that appear in these constraints may not be so well-defined, either because their value depends upon other parameters (not accounted for in the model) or because they cannot be precisely assessed, and only qualitative estimates of these coefficients are available. Similarly the best solution to a linear programming problem may be more a matter of compromise between various criteria rather than just minimizing or maximizing a linear objective function. Lastly the constraints, expressed by equalities or inequalities between linear expressions, are often softer in reality than what their mathematical expression might let us believe, and infeasibility as detected by the linear programming techniques can often be coped with by making trade-offs with the real world.
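As a concrete anchor for this discussion, here is the crisp linear program and one common way of softening its constraints, in the spirit of Zimmermann's max-min approach (the tolerances p_i are modelling choices introduced for illustration, not data from the book):

```latex
% Crisp LP:  \max\; c^{\top}x \ \ \text{s.t.}\ \ Ax \le b,\; x \ge 0.
% Soft version: constraint i may be violated by up to a tolerance p_i > 0,
% and one maximizes the worst degree of satisfaction \lambda \in [0,1]:
\max_{\lambda,\,x}\; \lambda
\quad\text{s.t.}\quad
(Ax)_i \;\le\; b_i + (1-\lambda)\,p_i \quad (i = 1,\dots,m), \qquad x \ge 0.
```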
'Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé.' ('And I, ..., had I known how to come back, I would never have gone.') - Jules Verne. 'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' - Eric T. Bell. 'The series is divergent; therefore we may be able to do something with it.' - O. Heaviside. Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the E. T. Bell quote above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.
Most papers published in this volume are based on lectures presented at the Chico Conference on Semigroups held on the Chico campus of the California State University on April 10-12, 1986. The conference was sponsored by the California State University, Chico in cooperation with the Engineering Computer Sciences Department of the Pacific Gas and Electric Company. The program included seven 50-minute addresses and seventeen 30-minute lectures. Speakers were invited by the organizing committee consisting of S. M. Goberstein and P. M. Higgins. The purpose of the conference was to bring together some of the leading researchers in the area of semigroup theory for a discussion of major recent developments in the field. The algebraic theory of semigroups is growing so rapidly and new important results are being produced at such a rate that the need for another meeting was well justified. It was hoped that the conference would help to disseminate new results more rapidly among those working in semigroups and related areas and that the exchange of ideas would stimulate research in the subject even further. These hopes were realized beyond all expectations.
As understanding of the engineering design and configuration processes grows, the recognition that these processes intrinsically involve imprecise information is also growing. This book collects some of the most recent work in the area of representation and manipulation of imprecise information during the synthesis of new designs and selection of configurations. These authors all utilize the mathematics of fuzzy sets to represent information that has not yet been reduced to precise descriptions, and in most cases also use the mathematics of probability to represent more traditional stochastic uncertainties such as uncontrolled manufacturing variations, etc. These advances form the nucleus of new formal methods to solve design, configuration, and concurrent engineering problems. Hans-Jürgen Sebastian, Aachen, Germany; Erik K. Antonsson, Pasadena, California. ACKNOWLEDGMENTS: We wish to thank H.-J. Zimmermann for inviting us to write this book. We are also grateful to him for many discussions about this new field, Fuzzy Engineering Design, which have been very stimulating. We wish to thank our collaborators, in particular B. Funke, M. Tharigen, K. Müller, S. Jarvinen, T. Goudarzi-Pour, and T. Kriese in Aachen, who worked in the PROKON project and who elaborated some of the results presented in the book. We also wish to thank Michael J. Scott for providing invaluable editorial assistance. Finally, the book would not have been possible without the many contributions and suggestions of Alex Greene of Kluwer Academic Publishers. 1. MODELING IMPRECISION IN ENGINEERING DESIGN, Erik K. Antonsson, Ph.D., P.E.
Fuzzy Logic Foundations and Industrial Applications is an organized edited collection of contributed chapters covering basic fuzzy logic theory, fuzzy linear programming, and applications. Special emphasis has been given to coverage of recent research results, and to industrial applications of fuzzy logic. The chapters are new works that have been written exclusively for this book by many of the leading and prominent researchers (such as Ronald Yager, Ellen Hisdal, Etienne Kerre, and others) in this field. The contributions are original and each chapter is self-contained. The authors have been careful to indicate direct links between fuzzy set theory and its industrial applications. Fuzzy Logic Foundations and Industrial Applications is an invaluable work that provides researchers and industrial engineers with up-to-date coverage of new results on fuzzy logic and relates these results to their industrial use.
The present collection of papers derives from a philosophy conference organised in the Sicilian town of Mussomeli in September 1991. The conference aimed at providing an analysis of certain aspects of the thought of Michael Dummett, whose contributions have been very influential in several aspects of the philosophical debate continuing within the analytical tradition. Logic, the philosophy of mathematics, the interpretation of Frege's philosophy, and metaphysics are only some of the areas within which Dummett's ideas have been fruitful over the years. The papers contained in this book, and Dummett's replies, will, it is hoped, not merely offer a partial reconstruction of a philosopher's life work, but provide an exciting and challenging vantage point from which to look at some of the main problems of contemporary philosophy. The First International Philosophy Conference of Mussomeli - this is what the conference was called - was an extraordinary event in many ways. The quality of the papers presented, the international reputation of many of the participants, the venue itself, together with the unavoidable, and sometimes quite funny, organisational hiccups, made that meeting memorable. Perhaps principally memorable was the warmth and sympathy of the people of Mussomeli who strongly supported and encouraged this initiative. A special mention is also due to the City Council Administrators, who spared no effort to make the Conference a success.
On the history of the book: In the early 1990s several new methods and perspectives in automated deduction emerged. We just mention the superposition calculus, meta-term inference and schematization, deductive decision procedures, and automated model building. It was this last field which brought the authors of this book together. In 1994 they met at the Conference on Automated Deduction (CADE-12) in Nancy and agreed upon the general point of view that semantics and, in particular, construction of models should play a central role in the field of automated deduction. In the following years the deduction groups of the laboratory LEIBNIZ at IMAG Grenoble and the University of Technology in Vienna organized several bilateral projects promoting this topic. This book emerged as a main result of this cooperation. The authors are aware of the fact that the book does not cover all relevant methods of automated model building (also called model construction or model generation); instead the book focuses on deduction-based symbolic methods for the construction of Herbrand models developed in the last 12 years. Other methods of automated model building, in particular also finite model building, are mainly treated in the final chapter; this chapter is less formal and detailed but gives a broader view on the topic and a comparison of different approaches. How to read this book: In the introduction we give an overview of automated deduction in a historical context, taking into account its relationship with the human views on formal and informal proofs.
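For orientation, a minimal example (ours, not drawn from the book) of the kind of Herbrand model such methods construct: the Herbrand universe H consists of all ground terms buildable from the clause set's symbols, and the least Herbrand model makes exactly the derivable ground atoms true.

```latex
% Clause set, Herbrand universe, and least Herbrand model:
S = \{\, P(a),\;\; \lnot P(x) \lor P(f(x)) \,\}, \qquad
H = \{\, a,\, f(a),\, f^{2}(a),\, \dots \,\}, \qquad
M = \{\, P(f^{n}(a)) \mid n \ge 0 \,\}.
```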
The work of which this is an English translation appeared originally in French as 'Précis de logique mathématique'. In 1954 Dr. Albert Menne brought out a revised and somewhat enlarged edition in German (Grundriss der Logistik, F. Schöningh, Paderborn). In making my translation I have used both editions. For the most part I have followed the original French edition, since I thought there was some advantage in keeping the work as short as possible. However, I have included the more extensive historical notes of Dr. Menne, his bibliography, and the two sections on modal logic and the syntactical categories (§ 25 and § 27), which were not in the original. I have endeavored to correct the typographical errors that appeared in the original editions and have made a few additions to the bibliography. In making the translation I have profited more than words can tell from the ever-generous help of Fr. Bochenski while he was teaching at the University of Notre Dame during 1955-56. OTTO BIRD, Notre Dame, 1959. I. GENERAL PRINCIPLES. 0. INTRODUCTION. 0.1. Notion and history. Mathematical logic, also called 'logistic', 'symbolic logic', the 'algebra of logic', and, more recently, simply 'formal logic', is the set of logical theories elaborated in the course of the last century with the aid of an artificial notation and a rigorously deductive method.
Metric fixed point theory encompasses the branch of fixed point theory in which metric conditions on the underlying space and/or on the mappings play a fundamental role. In some sense the theory is a far-reaching outgrowth of Banach's contraction mapping principle. A natural extension of the study of contractions is the limiting case when the Lipschitz constant is allowed to equal one. Such mappings are called nonexpansive. Nonexpansive mappings arise in a variety of natural ways, for example in the study of holomorphic mappings and hyperconvex metric spaces. Because most of the spaces studied in analysis share many algebraic and topological properties as well as metric properties, there is no clear line separating metric fixed point theory from the topological or set-theoretic branch of the theory. Also, because of its metric underpinnings, metric fixed point theory has provided the motivation for the study of many geometric properties of Banach spaces. The contents of this Handbook reflect all of these facts. The purpose of the Handbook is to provide a primary resource for anyone interested in fixed point theory with a metric flavor. The goal is to provide information for those wishing to find results that might apply to their own work and for those wishing to obtain a deeper understanding of the theory. The book should be of interest to a wide range of researchers in mathematical analysis as well as to those whose primary interest is the study of fixed point theory and the underlying spaces. The level of exposition is directed to a wide audience, including students and established researchers.
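Spelled out, the two standard definitions behind that limiting case, for a map T on a metric space (X, d):

```latex
d(Tx, Ty) \le k\, d(x, y)\ \ (0 \le k < 1) \quad \text{(contraction)}, \qquad
d(Tx, Ty) \le d(x, y) \quad \text{(nonexpansive)}.
```

Banach's principle gives a unique fixed point for contractions on a complete metric space, but at k = 1 even existence can fail (a translation of the real line is nonexpansive and fixed-point-free), which is why geometric properties of the underlying space become central.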
1. Interpolation problems play an important role both in theoretical and applied investigations. This explains the great number of works dedicated to classical and new interpolation problems ([1]-[5], [8], [13]-[16], [26]-[30], [57]). In this book we use a method of operator identities for investigating interpolation problems. Following the method of operator identities we formulate a general interpolation problem containing the classical interpolation problems (Nevanlinna-Pick, Carathéodory, Schur, Hamburger, Krein) as particular cases. We write down the abstract form of the Potapov inequality. By solving this inequality we give the description of the set of solutions of the general interpolation problem in terms of a linear-fractional transformation. Then we apply the obtained general results to a number of classical and new interpolation problems. Some chapters of the book are dedicated to the application of the interpolation theory results to several other problems (the extension problem, generalized stationary processes, spectral theory, nonlinear integrable equations, functions with operator arguments). 2. Now we shall proceed to a more detailed description of the book contents.
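For concreteness, here is the Nevanlinna-Pick problem in its classical form (the standard statement, not the book's operator-identity formulation): given distinct points z_1, ..., z_n in the open unit disc D and targets w_1, ..., w_n in D, find a holomorphic f : D -> D with f(z_j) = w_j. Pick's classical criterion says a solution exists precisely when the Pick matrix is positive semidefinite:

```latex
\Bigl[\, \frac{1 - w_i \overline{w_j}}{1 - z_i \overline{z_j}} \,\Bigr]_{i,j=1}^{n} \;\succeq\; 0 .
```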
This book contains the proceedings of the International Symposium on Mathematical Morphology and its Applications to Image and Signal Processing IV, held June 3-5, 1998, in Amsterdam, The Netherlands. The purpose of the work is to provide the image analysis community with a sampling of recent developments in theoretical and practical aspects of mathematical morphology and its applications to image and signal processing. Among the areas covered are: digitization and connectivity, skeletonization, multivariate morphology, morphological segmentation, color image processing, filter design, gray-scale morphology, fuzzy morphology, decomposition of morphological operators, random sets and statistical inference, differential morphology and scale-space, morphological algorithms and applications. Audience: This volume will be of interest to research mathematicians and computer scientists whose work involves mathematical morphology, image and signal processing.
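For readers unfamiliar with the field's two primitive operators, here is a minimal, hedged Python sketch of binary dilation and erosion (the wrap-around borders via np.roll are a simplification; production code would pad instead, as the routines in scipy.ndimage do):

```python
import numpy as np

def dilate(img: np.ndarray, se: list[tuple[int, int]]) -> np.ndarray:
    """Binary dilation: union of the image translated by each (row, col)
    offset of the structuring element `se`."""
    out = np.zeros_like(img)
    for di, dj in se:
        out |= np.roll(img, (di, dj), axis=(0, 1))
    return out

def erode(img: np.ndarray, se: list[tuple[int, int]]) -> np.ndarray:
    """Binary erosion: intersection of the image translated by the negated
    offsets, i.e. the structuring element must fit inside the object."""
    out = np.ones_like(img)
    for di, dj in se:
        out &= np.roll(img, (-di, -dj), axis=(0, 1))
    return out

# A 3x3 cross-shaped structuring element and a 3x3 square of foreground.
cross = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
img = np.zeros((7, 7), dtype=bool)
img[2:5, 2:5] = True
print(dilate(img, cross).sum(), erode(img, cross).sum())  # 21 1
```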
1. BASIC CONCEPTS OF INTERACTIVE THEOREM PROVING. Interactive Theorem Proving ultimately aims at the construction of powerful reasoning tools that let us (computer scientists) prove things we cannot prove without the tools, and the tools cannot prove without us. Interaction typically is needed, for example, to direct and control the reasoning, to speculate or generalize strategic lemmas, and sometimes simply because the conjecture to be proved does not hold. In software verification, for example, correct versions of specifications and programs typically are obtained only after a number of failed proof attempts and subsequent error corrections. Different interactive theorem provers may actually look quite different: They may support different logics (first- or higher-order, logics of programs, type theory etc.), may be generic or special-purpose tools, or may be targeted to different applications. Nevertheless, they share common concepts and paradigms (e.g. architectural design, tactics, tactical reasoning etc.). The aim of this chapter is to describe the common concepts, design principles, and basic requirements of interactive theorem provers, and to explore the bandwidth of variations. Having a 'person in the loop' strongly influences the design of the proof tool:
- proofs must remain comprehensible,
- proof rules must be high-level and human-oriented,
- persistent proof presentation and visualization become very important.
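As a flavour of what tactical reasoning looks like in practice, here is a tiny Lean 4 proof script (our illustration; the chapter itself is prover-agnostic). The user steers the prover step by step, inspecting the remaining subgoals after each tactic:

```lean
-- Each tactic transforms the current goal state.
example (p q : Prop) (hp : p) (hq : q) : p ∧ q := by
  constructor   -- splits ⊢ p ∧ q into two subgoals: ⊢ p and ⊢ q
  · exact hp    -- discharges the first subgoal
  · exact hq    -- discharges the second subgoal
```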
The nationwide research project 'Deduktion', funded by the Deutsche Forschungsgemeinschaft (DFG) for a period of six years, brought together almost all research groups within Germany engaged in the field of automated reasoning. Intensive cooperation and exchange of ideas led to considerable progress both in the theoretical foundations and in the application of deductive knowledge. This three-volume book covers these original contributions moulded into the state of the art of automated deduction. The three volumes are intended to document and advance a development in the field of automated deduction that can now be observed all over the world. Rather than restricting the interest to purely academic research, the focus now is on the investigation of problems derived from realistic applications. In fact industrial applications are already pursued on a trial basis. In consequence the emphasis of the volumes is not on the presentation of the theoretical foundations of logical deduction as such, as in a handbook; rather the books present the concepts and methods now available in automated deduction in a form which can be easily accessed by scientists working in applications outside of the field of deduction. This reflects the strong conviction that automated deduction is on the verge of being fully included in the evolution of technology. Volume I focuses on basic research in deduction and on the knowledge on which modern deductive systems are based. Volume II presents techniques of implementation and details about system building. Volume III deals with applications of deductive techniques mainly, but not exclusively, to mathematics and the verification of software. Each chapter was read by two referees, one an international expert from abroad and the other a knowledgeable participant in the national project, and was accepted for inclusion on the basis of these review reports. Audience: Researchers and developers in software engineering, formal methods, certification, verification, validation, specification of complex systems and software, expert systems, natural language processing.
The already broad range of applications of ring theory has been enhanced in the eighties by the increasing interest in algebraic structures of considerable complexity, the so-called class of quantum groups. One of the fundamental properties of quantum groups is that they are modelled by associative coordinate rings possessing a canonical basis, which allows for the use of algorithmic structures based on Groebner bases to study them. This book develops these methods in a self-contained way, concentrating on an in-depth study of a vast class of non-commutative rings (encompassing most quantum groups), the so-called Poincaré-Birkhoff-Witt rings. We include algorithms which treat essential aspects like ideals and (bi)modules, the calculation of homological dimension and of the Gelfand-Kirillov dimension, the Hilbert-Samuel polynomial, primality tests for prime ideals, etc.
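As a small taste of the algorithmic machinery, here is a hedged sketch using SymPy's Groebner basis routine. Note the caveat: SymPy handles only the commutative case, whereas the Poincaré-Birkhoff-Witt rings studied in the book require non-commutative extensions as implemented in specialized systems (e.g. Singular's Plural):

```python
from sympy import symbols, groebner

x, y = symbols('x y')
# Groebner basis of the ideal <x^2 + y, x*y - 1> under lex order x > y;
# it rewrites the generators so that ideal membership and normal forms
# become effectively computable.
G = groebner([x**2 + y, x*y - 1], x, y, order='lex')
print(G)
```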
'A Geometry of Approximation' addresses Rough Set Theory, a field of interdisciplinary research first proposed by Zdzislaw Pawlak in 1982, and focuses mainly on its logic-algebraic interpretation. The theory is embedded in a broader perspective that includes logical and mathematical methodologies pertaining to the theory, as well as related epistemological issues. Any mathematical technique that is introduced in the book is preceded by logical and epistemological explanations. Intuitive justifications are also provided, insofar as possible, so that the general perspective is not lost. Such an approach endows the present treatise with a unique character. Due to this uniqueness in the treatment of the subject, the book will be useful to researchers, graduate and pre-graduate students from various disciplines, such as computer science, mathematics and philosophy. It features an impressive number of examples supported by about 40 tables and 230 figures. The comprehensive index of concepts turns the book into a sort of encyclopaedia for researchers from a number of fields. 'A Geometry of Approximation' links many areas of academic pursuit without losing track of its focal point, Rough Sets.
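To fix ideas, a minimal Python sketch (our notation, not the book's) of the two approximations at the heart of Rough Set Theory: the lower approximation collects the indiscernibility classes wholly inside a target set, the upper those merely touching it.

```python
from collections import defaultdict

def approximations(universe, key, target):
    """Lower/upper approximations of `target` under the equivalence
    'same value of key(obj)', i.e. an indiscernibility relation."""
    classes = defaultdict(set)
    for obj in universe:
        classes[key(obj)].add(obj)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:      # class entirely inside the target
            lower |= cls
        if cls & target:       # class meets the target
            upper |= cls
    return lower, upper

# Objects 0..7, indiscernible in pairs {0,1}, {2,3}, {4,5}, {6,7}:
low, up = approximations(range(8), lambda n: n // 2, {1, 2, 3})
print(sorted(low), sorted(up))  # [2, 3] [0, 1, 2, 3]
```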
This book constitutes the refereed proceedings of the 7th Conference on Computability in Europe, CiE 2011, held in Sofia, Bulgaria, in June/July 2011. The 22 revised papers presented together with 11 invited lectures were carefully reviewed and selected with an acceptance rate of under 40%. The papers cover the topics computability in analysis, algebra, and geometry; classical computability theory; natural computing; relations between the physical world and formal models of computability; theory of transfinite computations; and computational linguistics.
This book constitutes the refereed proceedings of the 10th International Conference on Typed Lambda Calculi and Applications, TLCA 2011, held in Novi Sad, Serbia, in June 2011 as part of RDP 2011, the 6th Federated Conference on Rewriting, Deduction, and Programming. The 15 revised full papers presented were carefully reviewed and selected from 44 submissions. The papers provide prevailing research results on all current aspects of typed lambda calculi, ranging from theoretical and methodological issues to applications in various contexts addressing a wide variety of topics such as proof-theory, semantics, implementation, types, and programming.
There is no branch of mathematics, however abstract, which may not some day be applied to phenomena of the real world. - Nikolai Ivanovich Lobatchevsky. This book is an extensively-revised and expanded version of "The Theory of Semirings, with Applications in Mathematics and Theoretical Computer Science" [Golan, 1992], first published by Longman. When that book went out of print, it became clear - in light of the significant advances in semiring theory over the past years and its new important applications in such areas as idempotent analysis and the theory of discrete-event dynamical systems - that a second edition incorporating minor changes would not be sufficient and that a major revision of the book was in order. Therefore, though the structure of the first edition was preserved, the text was extensively rewritten and substantially expanded. In particular, references to many interesting and important applications of semiring theory, developed in the past few years, had to be added. Unfortunately, I find that it is best not to go into these applications in detail, for that would entail long digressions into various domains of pure and applied mathematics which would only detract from the unity of the volume and increase its length considerably. However, I have tried to provide an extensive collection of examples to arouse the reader's interest in applications, as well as sufficient citations to allow the interested reader to locate them. For the reader's convenience, an index to these citations is given at the end of the book.
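Since the blurb cites idempotent analysis and discrete-event systems, the min-plus (tropical) semiring is worth a concrete look. A small Python sketch (our illustration, not from the book): 'addition' is min, 'multiplication' is +, and matrix powers over this semiring compute shortest paths.

```python
import math

INF = math.inf  # additive identity of the min-plus semiring

def minplus_matmul(A, B):
    """Matrix product over the min-plus semiring:
    C[i][j] = min_k (A[i][k] + B[k][j])."""
    n, m, p = len(A), len(B), len(B[0])
    return [[min(A[i][k] + B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Edge-weight matrix of a 3-node graph (INF = no edge, 0 on the diagonal).
W = [[0,   4,   INF],
     [INF, 0,   1],
     [2,   INF, 0]]

W2 = minplus_matmul(W, W)  # W2[i][j] = cheapest path using at most 2 edges
print(W2[0][2])            # 5, via the path 0 -> 1 -> 2
```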
Algebra has moved well beyond the topics discussed in standard undergraduate texts on 'modern algebra'. Those books typically dealt with algebraic structures such as groups, rings and fields: still very important concepts! However, Quantum Groups: A Path to Current Algebra is written for the reader at ease with at least one such structure and keen to learn the latest algebraic concepts and techniques. A key to understanding these new developments is categorical duality. A quantum group is a vector space with structure. Part of the structure is standard: a multiplication making it an 'algebra'. Another part is not in those standard books at all: a comultiplication, which is dual to multiplication in the precise sense of category theory, making it a 'coalgebra'. While coalgebras, bialgebras and Hopf algebras have been around for half a century, the term 'quantum group', along with revolutionary new examples, was launched by Drinfel'd in 1986.
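The duality alluded to here can be stated in one line (a standard category-theoretic fact): coassociativity of the comultiplication is the associativity axiom with all arrows reversed.

```latex
% Algebra:   multiplication \mu and its associativity axiom
\mu : A \otimes A \to A, \qquad \mu \circ (\mu \otimes \mathrm{id}) = \mu \circ (\mathrm{id} \otimes \mu);
% Coalgebra: comultiplication \Delta and the dual (coassociativity) axiom
\Delta : C \to C \otimes C, \qquad (\Delta \otimes \mathrm{id}) \circ \Delta = (\mathrm{id} \otimes \Delta) \circ \Delta .
```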