An in-depth look at soft computing methods and their applications in the human sciences, such as the social and behavioral sciences. Soft computing methods, including fuzzy systems, neural networks, evolutionary computing and probabilistic reasoning, are state-of-the-art methods in theory formation and model construction. The powerful application areas of these methods in the human sciences are demonstrated, including the replacement of statistical models by simpler numerical or linguistic soft computing models and the use of computer simulations with approximate and linguistic constituents. "Dr. Niskanen's work opens new vistas in the application of soft computing, fuzzy logic and fuzzy set theory to the human sciences. This book is likely to be viewed in retrospect as a landmark in its field."
In the eyes of the editors, this book will be considered a success if it can convince its readers of the following: that it is warranted to dream of a realistic and full-fledged theory of mathematical practices, in the plural. If such a theory is possible, it would mean that a number of the fierce oppositions that presently exist between philosophers, sociologists, educators, and other parties involved are in fact illusory.
The theory of Boolean algebras was created in 1847 by the English mathematician George Boole. He conceived it as a calculus (or arithmetic) suitable for a mathematical analysis of logic. The form of his calculus was rather different from the modern version, which came into being during the period 1864-1895 through the contributions of William Stanley Jevons, Augustus De Morgan, Charles Sanders Peirce, and Ernst Schröder. A foundation of the calculus as an abstract algebraic discipline, axiomatized by a set of equations, and admitting many different interpretations, was carried out by Edward Huntington in 1904. Only with the work of Marshall Stone and Alfred Tarski in the 1930s, however, did Boolean algebra free itself completely from the bonds of logic and become a modern mathematical discipline, with deep theorems and important connections to several other branches of mathematics, including algebra, analysis, logic, measure theory, probability and statistics, set theory, and topology. For instance, in logic, beyond its close connection to propositional logic, Boolean algebra has found applications in such diverse areas as the proof of the completeness theorem for first-order logic, the proof of the Łoś conjecture for countable first-order theories categorical in power, and proofs of the independence of the axiom of choice and the continuum hypothesis in set theory. In analysis, Stone's discoveries of the Stone-Čech compactification and the Stone-Weierstrass approximation theorem were intimately connected to his study of Boolean algebras.
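The equational axiomatization mentioned above can be made concrete with a small sketch. The Python check below exhaustively verifies a Huntington-style set of equations on the two-element Boolean algebra {0, 1}; the particular axiom list is illustrative, not a quotation of Huntington's 1904 system.

```python
from itertools import product

# The two-element Boolean algebra on {0, 1}.
join = lambda x, y: x | y
meet = lambda x, y: x & y
comp = lambda x: 1 - x

# A Huntington-style equational axiomatization (illustrative selection).
axioms = [
    lambda x, y:    join(x, y) == join(y, x),                             # commutativity
    lambda x, y:    meet(x, y) == meet(y, x),
    lambda x, y, z: join(x, meet(y, z)) == meet(join(x, y), join(x, z)),  # distributivity
    lambda x, y, z: meet(x, join(y, z)) == join(meet(x, y), meet(x, z)),
    lambda x:       join(x, 0) == x,                                      # identities
    lambda x:       meet(x, 1) == x,
    lambda x:       join(x, comp(x)) == 1,                                # complements
    lambda x:       meet(x, comp(x)) == 0,
]

def holds(ax):
    """Check an equation for all assignments of {0, 1} to its variables."""
    n = ax.__code__.co_argcount
    return all(ax(*args) for args in product((0, 1), repeat=n))

assert all(holds(ax) for ax in axioms)
```

The same `holds` routine would accept any other finite interpretation, which is exactly the "many different interpretations" point the blurb makes.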
We are happy to present the second volume of the Handbook of Defeasible Reasoning and Uncertainty Management Systems. Uncertainty pervades the real world and must therefore be addressed by every system that attempts to represent reality. The representation of uncertainty is a major concern of philosophers, logicians, artificial intelligence researchers and computer scientists, psychologists, statisticians, economists and engineers. The present Handbook volumes provide frontline coverage of this area. This Handbook was produced in the style of previous handbook series like the Handbook of Philosophical Logic, the Handbook of Logic in Computer Science, and the Handbook of Logic in Artificial Intelligence and Logic Programming, and can be seen as a companion to them in covering the wide applications of logic and reasoning. We hope it will answer the need for adequate representations of uncertainty. This Handbook series grew out of the ESPRIT Basic Research Project DRUMS II, whose acronym is formed from the Handbook series title. This project was financially supported by the European Union and brought together 20 major European research teams working in the general domain of uncertainty. As a fringe benefit of the DRUMS project, the research community was able to create this Handbook series, relying on the DRUMS participants as the core of the authors for the Handbook, together with external international experts.
Paul Williams, a leading authority on modeling in integer programming, has written a concise, readable introduction to the science and art of modeling in logic for integer programming. Written for graduate and postgraduate students, as well as academics and practitioners, the book is divided into four chapters that all avoid the typical format of definitions, theorems and proofs, and instead introduce concepts and results within the text through examples. References are given at the end of each chapter to the more mathematical papers and texts on the subject, and exercises are included to reinforce and expand on the material in the chapter. Methods of solution with both logic and integer programming (IP) are given and their connections are described. Applications in diverse fields are discussed, and Williams shows how IP models can be expressed as satisfiability problems and solved as such.
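The correspondence between satisfiability and IP that the blurb mentions can be sketched in a few lines. The standard encoding maps a clause to a 0-1 inequality: literal x_i contributes x_i, literal NOT x_i contributes (1 - x_i), and the clause holds iff the sum is at least 1. The three-variable instance below is hypothetical, not taken from Williams' book.

```python
from itertools import product

# Clauses as lists of signed variable indices: +i means x_i, -i means NOT x_i.
clauses = [[1, -2], [2, 3], [-1, -3]]          # a hypothetical instance

def sat(assignment, clause):
    """Classical truth evaluation of one clause (assignment of booleans)."""
    return any(assignment[abs(l) - 1] == (l > 0) for l in clause)

def ip_row(values, clause):
    """Left-hand side of the clause's 0-1 inequality (values are 0/1)."""
    return sum(values[abs(l) - 1] if l > 0 else 1 - values[abs(l) - 1]
               for l in clause)

# The logical and the integer-programming readings agree on every 0-1 point.
for values in product((0, 1), repeat=3):
    logical = all(sat([bool(v) for v in values], c) for c in clauses)
    integer = all(ip_row(values, c) >= 1 for c in clauses)
    assert logical == integer
```

In the IP direction one would hand the inequalities `ip_row(x, c) >= 1` to a solver with 0-1 variables; the brute-force loop here only demonstrates that the two formulations coincide.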
The larger part of Yearbook 6 of the Institute Vienna Circle constitutes the proceedings of a symposium on Alfred Tarski and his influence on and interchanges with the Vienna Circle, especially those on and with Rudolf Carnap and Kurt Goedel. It is the first time that this topic has been treated on such a scale and in such depth. Attention is mainly paid to the origins, development and subsequent role of Tarski's definition of truth. Some contributions are primarily historical, others analyze logical aspects of the concept of truth. Contributors include Anita and Saul Feferman, Jan Woleński, Jan Tarski and Hans Sluga. Several Polish logicians contributed: Grzegorczyk, Wójcicki, Murawski and Rojszczak. The volume presents entirely new biographical material on Tarski, both from his Polish period and on his influential career in the United States: at Harvard, in Princeton, at Hunter, and at the University of California at Berkeley. The high point of the analysis involves Tarski's influence on Carnap's evolution from a narrow syntactical view of language to the ontologically more sophisticated but more controversial semantical view. Another highlight involves the interchange between Tarski and Goedel on the connection between truth and proof and on the nature of metalanguages. The concluding part of Yearbook 6 includes documentation, book reviews and a summary of current activities of the Institute Vienna Circle. Jan Tarski introduces letters written by his father to Goedel; Paolo Parrini reports on the Vienna Circle's influence in Italy; several reviews cover recent books on logical empiricism, on Goedel, on cosmology, on holistic approaches in Germany, and on Mauthner.
The contents of this volume range from expository papers on several aspects of number theory, intended for general readers (the Steinhaus property of planar regions; experiments with computers; Diophantine approximation; the number field sieve), to a collection of research papers for specialists, which are at the level of prestigious journals. Thus, Number Theory and Its Applications leads the reader in many ways not only to the state of the art of number theory but also to its rich garden.
At first glance, Robinson's original form of nonstandard analysis appears nonconstructive in essence, because it makes a rather unrestricted use of classical logic and set theory and, in particular, of the axiom of choice. Recent developments, however, have given rise to the hope that the distance between constructive and nonstandard mathematics is actually much smaller than it appears. So the time was ripe for the first meeting dedicated simultaneously to both ways of doing mathematics and to the current and future reunion of these seeming opposites. Consisting of peer-reviewed research and survey articles written on the occasion of such an event, this volume offers views of the continuum from various standpoints. Including historical and philosophical issues, the topics of the contributions range from the foundations, the practice, and the applications of constructive and nonstandard mathematics, to the interplay of these areas and the development of a unified theory. This book will be of interest to mathematicians, logicians, and philosophers, as well as theoretical computer scientists, physicists, and economists who are interested in theories of the continuum and in constructive or nonstandard mathematics. The major part is accessible to the non-expert professional reader, from graduate student to academic level.
The λ-calculus was invented by Church in the 1930s with the purpose of supplying a logical foundation for logic and mathematics [25]. Its use by Kleene as a coding for computable functions makes it the first programming language, in an abstract sense, exactly as the Turing machine can be considered the first computer machine [57]. The λ-calculus has quite a simple syntax (with just three formation rules for terms) and a simple operational semantics (with just one operation, substitution), and so it is a very basic setting for studying computation properties. The first contact between λ-calculus and real programming languages was in the years 1956-1960, when McCarthy developed the LISP programming language, inspired by λ-calculus, which is the first "functional" programming language, i.e., one where functions are first-class citizens [66]. But the use of λ-calculus as an abstract paradigm for programming languages started later with the work of three important scientists: Strachey, Landin and Böhm.
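The "three formation rules plus one operation" picture can be sketched directly. The Python fragment below represents terms as tuples (variable, abstraction, application) and performs normal-order beta reduction. It is a deliberately naive sketch: substitution does not rename bound variables, so it is only safe when all bound names are distinct, as in the example.

```python
# Term syntax, mirroring the three formation rules:
#   ('var', x) | ('lam', x, body) | ('app', fun, arg)

def subst(t, name, val):
    """Substitute val for free occurrences of name (no capture avoidance)."""
    if t[0] == 'var':
        return val if t[1] == name else t
    if t[0] == 'lam':
        return t if t[1] == name else ('lam', t[1], subst(t[2], name, val))
    return ('app', subst(t[1], name, val), subst(t[2], name, val))

def step(t):
    """One normal-order beta-reduction step, or None if t is normal."""
    if t[0] == 'app':
        if t[1][0] == 'lam':                      # beta redex: the one operation
            return subst(t[1][2], t[1][1], t[2])
        for i in (1, 2):
            s = step(t[i])
            if s is not None:
                return ('app', s, t[2]) if i == 1 else ('app', t[1], s)
    if t[0] == 'lam':
        s = step(t[2])
        if s is not None:
            return ('lam', t[1], s)
    return None

def normalise(t):
    while (s := step(t)) is not None:
        t = s
    return t

I = ('lam', 'x', ('var', 'x'))                    # identity
K = ('lam', 'a', ('lam', 'b', ('var', 'a')))      # constant combinator
assert normalise(('app', ('app', K, I), ('var', 'y'))) == I
```

A production interpreter would need capture-avoiding substitution (or de Bruijn indices), but the sketch shows how little machinery the calculus itself demands.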
This book is based on lectures delivered at Harvard in the Spring of 1991 and at the University of Utah during the academic year 1992-93. Formally, the book assumes only general algebraic knowledge (rings, modules, groups, Lie algebras, functors etc.). It is helpful, however, to know some basics of algebraic geometry and representation theory. Each chapter begins with its own introduction, and most sections even have a short overview. The purpose of what follows is to explain the spirit of the book and how different parts are linked together without entering into details. The point of departure is the notion of the left spectrum of an associative ring, and the first natural steps of a general theory of noncommutative affine, quasi-affine, and projective schemes. This material is presented in Chapter I. Further developments originated from the requirements of several important examples I tried to understand, beginning with the first Weyl algebra and the quantum plane. The book reflects these developments as I worked them out in real life and in my lectures. In Chapter II, we study the left spectrum and irreducible representations of a whole lot of rings which are of interest for modern mathematical physics. The classes of rings we consider include as special cases: the quantum plane, the algebra of q-differential operators, (quantum) Heisenberg and Weyl algebras, the (quantum) enveloping algebra of the Lie algebra sl(2), the coordinate algebra of the quantum group SL(2), the twisted SL(2) of Woronowicz, the so-called dispin algebra and many others.
The aim of this book is to give self-contained proofs of all basic results concerning the infinite-valued propositional calculus of Łukasiewicz and its algebras, Chang's MV-algebras. This book is for self-study: with the possible exception of Chapter 9 on advanced topics, the only prerequisite for the reader is some acquaintance with classical propositional logic, and elementary algebra and topology. In this book it is not our aim to give an account of Łukasiewicz's motivations for adding new truth values: readers interested in this topic will find appropriate references in Chapter 10. Also, we shall not explain why Łukasiewicz infinite-valued propositional logic is a basic ingredient of any logical treatment of imprecise notions: Hájek's book in this series on Trends in Logic contains the most authoritative explanations. However, in order to show that MV-algebras stand to infinite-valued logic as Boolean algebras stand to two-valued logic, we shall devote Chapter 5 to Ulam's game of Twenty Questions with lies/errors, as a natural context where infinite-valued propositions, connectives and inferences are used. While several other semantics for infinite-valued logic are known in the literature, notably Giles' game-theoretic semantics based on subjective probabilities, still the transition from two-valued to many-valued propositional logic can hardly be modelled by anything simpler than the transformation of the familiar game of Twenty Questions into the Ulam game with lies/errors.
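The analogy "MV-algebras are to infinite-valued logic as Boolean algebras are to two-valued logic" can be made concrete on the standard MV-algebra, whose universe is the real interval [0, 1]. The sketch below defines the usual Łukasiewicz connectives and checks that on the extreme values 0 and 1 they collapse to the classical Boolean ones.

```python
# Łukasiewicz connectives on [0, 1] (the standard MV-algebra).
def lneg(x):     return 1.0 - x                   # negation
def lor_(x, y):  return min(1.0, x + y)           # strong disjunction (x (+) y)
def land(x, y):  return max(0.0, x + y - 1.0)     # strong conjunction
def limp(x, y):  return min(1.0, 1.0 - x + y)     # implication

# On {0, 1} the connectives agree with two-valued logic.
for x in (0.0, 1.0):
    for y in (0.0, 1.0):
        assert lor_(x, y) == (1.0 if x or y else 0.0)
        assert land(x, y) == (1.0 if x and y else 0.0)

# Intermediate truth values behave genuinely differently:
assert lneg(0.5) == 0.5
assert land(0.5, 0.5) == 0.0 and lor_(0.5, 0.5) == 1.0
```

The last two lines show why "half true" cannot be simulated in a two-element algebra: a proposition and its negation can both take value 0.5.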
I am very happy to have this opportunity to introduce Luca Viganò's book on Labelled Non-Classical Logics. I put forward the methodology of labelled deductive systems to the participants of Logic Colloquium '90 (Labelled Deductive Systems, a Position Paper, in J. Oikkonen and J. Väänänen, editors, Logic Colloquium '90, Volume 2 of Lecture Notes in Logic, pages 66-68, Springer, Berlin, 1993), in an attempt to establish labelling as a recognised and significant component of our logic culture. It was a response to earlier isolated uses of labels by various distinguished authors as a means to achieve local proof-theoretic goals. Labelling was used in many different areas, such as resource labelling in relevance logics, prefix tableaux in modal logics, annotated logic programs in logic programming, proof tracing in truth maintenance systems, and various side annotations in higher-order proof theory, arithmetic and analysis. This widespread local use of labels was an indication of an underlying logical pattern, namely the simultaneous side-by-side manipulation of several kinds of logical information. It was clear that there was a need to establish the labelled deductive systems methodology. Modal logic is one major area where labelling can be developed quickly and systematically with a view to demonstrating its power and significant advantage. In modal logic the labels can play a double role.
Nowadays algebra is understood basically as the general theory of algebraic operations and relations. It is characterised by a considerable intrinsic naturalness of its initial notions and problems, the unity of its methods, and a breadth that far exceeds that of its basic concepts. More often than not, its power begins to be displayed when one moves outside its own limits. This characteristic ability is seen when one investigates not only complete operations, but also partial operations. To a considerable extent these are related to algebraic operators and algebraic operations. The tendency to ever greater generality is among the reasons that play a role in explaining this development. But other important reasons play an even greater role. Within the theory of total operations (that is, operations defined everywhere), there persistently arises, in its different sections, the necessity of examining various partial operations. It is particularly important that this has been found in those parts of algebra it brings together and the other areas of mathematics it interacts with, as well as where algebra finds application at the very limits of mathematics. In this connection we mention the theory of the composition of mappings, category theory, the theory of formal languages and the related theory of mathematical linguistics, coding theory, information theory, and algebraic automata theory. In all these areas (as well as in others) from time to time there arises the need to consider one or another partial operation.
Recent years have been blessed with an abundance of logical systems, arising from a multitude of applications. A logic can be characterised in many different ways. Traditionally, a logic is presented via the following three components: 1. an intuitive non-formal motivation, perhaps tying it in to some application area; 2. a semantical interpretation; 3. a proof-theoretical formulation. There are several types of proof-theoretical methodologies: Hilbert style, Gentzen style, goal-directed style, labelled deductive system style, and so on. The tableau methodology, invented in the 1950s by Beth and Hintikka and later perfected by Smullyan and Fitting, is today one of the most popular, since it appears to bring together the proof-theoretical and the semantical approaches to the presentation of a logical system and is also very intuitive. In many universities it is the style first taught to students. Recently interest in tableaux has become more widespread and a community has crystallised around the subject. An annual tableaux conference is held and proceedings are published. The present volume is a Handbook of Tableaux presenting to the community a wide coverage of tableaux systems for a variety of logics. It is written by active members of the community and brings the reader up to frontline research. It will be of interest to any formal logician from any area.
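The intuition behind the tableau methodology fits in a short sketch: a formula is valid iff every branch of the tableau for its negation closes on a contradictory pair of literals. The minimal propositional prover below is an illustration of the general idea, not any particular system from the Handbook; formulas are nested tuples and the usual alpha (branch-extending) and beta (branch-splitting) rules are applied recursively.

```python
# Formula syntax: 'p' (atom) | ('not', f) | ('and', f, g) | ('or', f, g)

def open_branch(todo, lits):
    """True iff some branch of the tableau for `todo` stays open."""
    if not todo:
        return True                                  # no contradiction found
    f, rest = todo[0], todo[1:]
    if isinstance(f, str):                           # positive literal
        return ('not', f) not in lits and open_branch(rest, lits | {f})
    if f[0] == 'and':                                # alpha rule: extend branch
        return open_branch([f[1], f[2]] + rest, lits)
    if f[0] == 'or':                                 # beta rule: split branch
        return (open_branch([f[1]] + rest, lits) or
                open_branch([f[2]] + rest, lits))
    g = f[1]                                         # f = ('not', g)
    if isinstance(g, str):                           # negative literal
        return g not in lits and open_branch(rest, lits | {f})
    if g[0] == 'not':                                # double negation
        return open_branch([g[1]] + rest, lits)
    if g[0] == 'and':                                # not-(a and b): split
        return (open_branch([('not', g[1])] + rest, lits) or
                open_branch([('not', g[2])] + rest, lits))
    return open_branch([('not', g[1]), ('not', g[2])] + rest, lits)

def valid(f):
    """f is valid iff the tableau for its negation closes completely."""
    return not open_branch([('not', f)], frozenset())

assert valid(('or', 'p', ('not', 'p')))              # excluded middle
assert not valid(('and', 'p', ('not', 'p')))
```

The open branches double as the semantical side of the method: an open branch's literals describe a countermodel, which is why tableaux are said to unite proof theory and semantics.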
After the pioneering works by Robbins (1944, 1945) and Choquet (1955), the notion of a set-valued random variable (called a random closed set in the literature) was systematically introduced by Kendall (1974) and Matheron (1975). It is well known that the theory of set-valued random variables is a natural extension of that of general real-valued random variables or random vectors. However, owing to the topological structure of the space of closed sets and special features of set-theoretic operations (cf. Beer [27]), set-valued random variables have many special properties. This gives new meaning to classical probability theory. As a result of more than 30 years of development in this area, the theory of set-valued random variables, with its many applications, has become one of the new and active branches of probability theory. In practice, too, we are often faced with random experiments whose outcomes are not numbers but are expressed in inexact linguistic terms.
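A tiny discrete example gives the flavour of set-valued random variables. For an interval-valued random variable, the standard (Aumann) expectation reduces to the interval spanned by the expectations of the endpoints; the numbers below are hypothetical, chosen only to make the computation visible.

```python
# A discrete interval-valued random variable: outcomes are intervals [a, b]
# with probabilities p (a hypothetical example, not from the book).
outcomes = [(0.5, (0.0, 2.0)),     # with prob 0.5 the outcome is [0, 2]
            (0.5, (1.0, 3.0))]     # with prob 0.5 the outcome is [1, 3]

# For interval values, the Aumann expectation is the interval of the
# endpoint expectations.
lo = sum(p * a for p, (a, b) in outcomes)
hi = sum(p * b for p, (a, b) in outcomes)

assert (lo, hi) == (0.5, 2.5)      # expectation is the interval [0.5, 2.5]
```

The interesting mathematics begins where this sketch stops: for general closed-set values the expectation is defined via measurable selections, which is where the topological subtleties the blurb alludes to enter.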
Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing; all are bio-inspired and can easily be combined synergetically.
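Of the core ingredients listed above, evolutionary algorithms are the easiest to show in miniature. The sketch below is a (1+1) evolution strategy maximising a toy fitness function: mutate the current candidate, keep the child if it is no worse. It is an illustrative sketch of the general technique, not a method from any particular text; the fitness function and mutation scale are arbitrary choices.

```python
import random

random.seed(0)                           # deterministic for reproducibility

def fitness(x):
    """Toy objective with a single peak at x = 3."""
    return -(x - 3.0) ** 2

# (1+1) evolution strategy: one parent, one mutated child per generation.
x = 0.0
for _ in range(2000):
    child = x + random.gauss(0.0, 0.5)   # Gaussian mutation
    if fitness(child) >= fitness(x):     # greedy survivor selection
        x = child

assert abs(x - 3.0) < 0.2                # converged near the optimum
```

The "tracking a changing environment" point follows from the same loop: if `fitness` drifts over time, the strategy keeps re-adapting, which a one-shot conventional optimiser would not.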
Homology is a powerful tool used by mathematicians to study the properties of spaces and maps that are insensitive to small perturbations. This book uses a computer to develop a combinatorial computational approach to the subject. The core of the book deals with homology theory and its computation. Following this is a section containing extensions to further developments in algebraic topology, applications to computational dynamics, and applications to image processing. Included are exercises and software that can be used to compute homology groups and maps. The book will appeal to researchers and graduate students in mathematics, computer science, engineering, and nonlinear dynamics.
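The combinatorial computational approach described above can be illustrated on the smallest interesting space. Working over GF(2), Betti numbers come from ranks of boundary matrices; the sketch below computes them for a circle triangulated as the boundary of a triangle. It is a toy illustration of the idea, not the book's own (far more general) software.

```python
def gf2_rank(rows):
    """Rank of a GF(2) matrix whose rows are given as integer bitmasks."""
    rank = 0
    rows = list(rows)
    while rows:
        pivot = rows.pop()
        if pivot == 0:
            continue
        rank += 1
        low = pivot & -pivot                       # lowest set bit of the pivot
        rows = [r ^ pivot if r & low else r for r in rows]
    return rank

# Circle as the boundary of a triangle: vertices 0,1,2; edges below; no 2-cells.
edges = [(0, 1), (0, 2), (1, 2)]

# Boundary map d1: each edge maps to the mod-2 sum of its two endpoints.
d1 = [(1 << a) ^ (1 << b) for a, b in edges]
rank_d1 = gf2_rank(d1)
rank_d2 = 0                                        # no triangles in the complex

b0 = 3 - rank_d1                                   # connected components
b1 = len(edges) - rank_d1 - rank_d2                # independent 1-cycles

assert (b0, b1) == (1, 1)                          # one component, one loop
```

The same rank computation scales to the cubical complexes arising from images and dynamics, which is exactly the setting the book targets.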
These notes were first used in an introductory course team taught by the authors at Appalachian State University to advanced undergraduates and beginning graduates. The text was written with four pedagogical goals in mind: offer a variety of topics in one course, get to the main themes and tools as efficiently as possible, show the relationships between the different topics, and include recent results to convince students that mathematics is a living discipline.
This monograph details several important advances in the direction of a practical proofs-as-programs paradigm, which constitutes a set of approaches to developing programs from proofs in constructive logic, with applications to industrial-scale, complex software engineering problems. One of the book's central themes is a general, abstract framework for developing new systems of program synthesis by adapting proofs-as-programs to new contexts.
This volume provides a series of tutorials on mathematical structures which have recently gained prominence in physics, ranging from quantum foundations, via quantum information, to quantum gravity. These include the theory of monoidal categories and corresponding graphical calculi, Girard's linear logic, Scott domains, lambda calculus and corresponding logics for typing, topos theory, and more general process structures. Most of these structures are very prominent in computer science; the chapters here are tailored towards an audience of physicists.
This book describes new methods for building intelligent systems using type-2 fuzzy logic and soft computing (SC) techniques. The authors extend the use of fuzzy logic to a higher order, called type-2 fuzzy logic. By combining type-2 fuzzy logic with traditional SC techniques, powerful hybrid intelligent systems can be built that exploit the advantages each technique offers. This book is intended to be a major reference tool and can be used as a textbook.
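The "higher order" in the blurb can be sketched concretely: where a type-1 fuzzy set assigns each element a single membership grade, an interval type-2 fuzzy set assigns an interval of grades, capturing uncertainty about the membership function itself. The "warm temperature" set and its numbers below are hypothetical, chosen only for illustration.

```python
# An interval type-2 fuzzy set: membership is an interval [lower, upper].

def warm(t):
    """Hypothetical 'warm' set: a triangular type-1 grade peaking at 25 C,
    blurred by a fixed footprint of uncertainty of width 0.2."""
    primary = max(0.0, 1.0 - abs(t - 25.0) / 10.0)   # type-1 grade
    spread = 0.1                                      # modelling uncertainty
    return (max(0.0, primary - spread), min(1.0, primary + spread))

def meet(iv1, iv2):
    """Interval meet: the min t-norm applied endpoint-wise."""
    return (min(iv1[0], iv2[0]), min(iv1[1], iv2[1]))

# At the peak the grade is only known to lie in [0.9, 1.0].
assert warm(25.0) == (0.9, 1.0)
# Conjunction with a cooler reading is dominated by the cooler reading.
assert meet(warm(25.0), warm(20.0)) == warm(20.0)
```

Type-2 inference propagates these intervals through the rule base and only "type-reduces" to a single number at the very end, which is where the robustness benefits over type-1 systems come from.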
This book is not a textbook for becoming acquainted with the laws of nature. An elementary knowledge about laws of nature, in particular the laws of physics, is presupposed. The book is rather intended to provide a clarification of concepts and properties of the laws of nature. The authors would like to emphasise that this book has been developed, created, as a real teamwork. Although the chapters (and in some cases parts of the chapters) were originally written by one of the two authors, all of them were discussed thoroughly and in detail and have been revised and complemented afterwards. Even if both authors were in agreement on most of the foundational issues discussed in the book, they did not feel it necessary to balance every viewpoint. Thus some individual and personal differences in emphasis will still be recognisable from the chapters written by the different authors. In this sense the authors feel specifically responsible for the chapters as follows: Mittelstaedt for Chaps. 4, 9.3, 10, 11.2, 12, 13 and Weingartner for Chaps. 1, 2, 3, 5, 7, 8.2, 9.2, 9.4. The remaining parts are joint sections. Most of the chapters are formulated as questions and they begin with arguments pro and contra. Then a detailed answer is proposed which contains a systematic discussion of the question. This is the respective main part of the chapter. It sometimes begins with a survey of the problem by giving some important answers to it from history (cf. Chaps. 6 and 9
This book provides an overview of type theory. The first part of the book is historical, yet at the same time places historical systems in the modern setting. The second part deals with modern type theory as it has developed since the 1940s, and with the role of propositions as types (or proofs as terms). The third part proposes new systems that bring more advantages together.
This book is an example of fruitful interaction between (non-classical) propositional logics and (classical) model theory which was made possible due to categorical logic. Its main aim consists in investigating the existence of model completions for equational theories arising from propositional logics (such as the theory of Heyting algebras and various kinds of theories related to propositional modal logic). The existence of model completions turns out to be related to proof-theoretic facts concerning the interpretability of second-order propositional logic into ordinary propositional logic through the so-called 'Pitts quantifiers' or 'bisimulation quantifiers'. On the other hand, the book develops a large number of topics concerning the categorical structure of finitely presented algebras, with related applications to propositional logics, both standard (like Beth's theorems) and new (like effectiveness of internal equivalence relations, projectivity and definability of dual connectives such as difference). A special emphasis is put on sheaf representation, showing that much of the nice categorical structure of finitely presented algebras is in fact only a restriction of natural structure in sheaves. Applications to the theory of classifying toposes are also covered, yielding new examples. The book has to be considered mainly as a research book, reporting recent and often completely new results in the field; we believe it can also be fruitfully used as a complementary book for graduate courses in categorical and algebraic logic, universal algebra, model theory, and non-classical logics.
THIRTY-FIVE YEARS OF AUTOMATING MATHEMATICS: DEDICATED TO 35 YEARS OF DE BRUIJN'S AUTOMATH. N. G. de Bruijn was a well-established mathematician before deciding in 1967, at the age of 49, to work in a new direction related to automating mathematics. By then, his contributions in mathematics were numerous and extremely influential. His book on advanced asymptotic methods (North-Holland, 1958) was a classic and was subsequently turned into a book in the well-known Dover book series. His work on combinatorics yielded influential notions and theorems, of which we mention the de Bruijn sequences of 1946 and the de Bruijn-Erdős theorem of 1948. De Bruijn's contributions to mathematics also included his work on generalized function theory, analytic number theory, optimal control, quasicrystals, the mathematical analysis of games and much more. In the 1960s de Bruijn became fascinated by the new computer technology and, as a result, decided to start the new AUTOMATH project, in which he could check, with the help of the computer, the correctness of books of mathematics. In each area that de Bruijn approached, he shed new light and was known for his originality and for making deep intellectual contributions. And when it came to automating mathematics, he again did it his way and introduced the highly influential AUTOMATH. In the past decade he has also been working on theories of the human brain.