In the aftermath of the discoveries in foundations of mathematics there was surprisingly little effect on mathematics as a whole. If one looks at standard textbooks in different mathematical disciplines, especially those closer to what is referred to as applied mathematics, there is little trace of those developments outside of mathematical logic and model theory. But it seems fair to say that there is a widespread conviction that the principles embodied in the Zermelo-Fraenkel theory with Choice (ZFC) are a correct description of the set-theoretic underpinnings of mathematics. In most textbooks of the kind referred to above, there is, of course, no discussion of these matters, and set theory is assumed informally, although more advanced principles like Choice or sometimes Replacement are often mentioned explicitly. This implicitly fixes a point of view of the mathematical universe which is at odds with the results in foundations. For example, most mathematicians still take it for granted that the real number system is uniquely determined up to isomorphism, which is a correct point of view as long as one does not agree to look at "unnatural" interpretations of the membership relation.
We are invited to deal with mathematical activity in a systematic way [...] one does expect and look for pleasant surprises in this requirement of a novel combination of psychology, logic, mathematics and technology. Hao Wang, 1970, quoted from (Wang, 1970). The field of mathematics has been a key application area for automated theorem proving from the start; in fact, the very first automatically found theorem was that the sum of two even numbers is even (Davis, 1983). The field of automated deduction has witnessed considerable progress, and in the last decade automated deduction methods have made their way into many areas of research and product development in computer science. For instance, deduction systems are increasingly used in software and hardware verification to ensure the correctness of computer hardware and computer programs with respect to a given specification. Logic programming, while still falling somewhat short of its expectations, is now widely used, deductive databases are well developed, and logic-based description and analysis of hardware and software is commonplace today.
The present collection of papers derives from a philosophy conference organised in the Sicilian town of Mussomeli in September 1991. The conference aimed at providing an analysis of certain aspects of the thought of Michael Dummett, whose contributions have been very influential in several aspects of the philosophical debate continuing within the analytical tradition. Logic, the philosophy of mathematics, the interpretation of Frege's philosophy, and metaphysics are only some of the areas within which Dummett's ideas have been fruitful over the years. The papers contained in this book, and Dummett's replies, will, it is hoped, not merely offer a partial reconstruction of a philosopher's life work, but provide an exciting and challenging vantage point from which to look at some of the main problems of contemporary philosophy. The First International Philosophy Conference of Mussomeli - this is what the conference was called - was an extraordinary event in many ways. The quality of the papers presented, the international reputation of many of the participants, the venue itself, together with the unavoidable, and sometimes quite funny, organisational hiccups, made that meeting memorable. Perhaps principally memorable was the warmth and sympathy of the people of Mussomeli, who strongly supported and encouraged this initiative. A special mention is also due to the City Council Administrators, who spared no effort to make the Conference a success.
Information is precious. It reduces our uncertainty in making decisions. Knowledge about the outcome of an uncertain event gives the possessor an advantage. It changes the course of lives, nations, and history itself. Information is the food of Maxwell's demon. His power comes from knowing which particles are hot and which particles are cold. His existence was paradoxical to classical physics, and only the realization that information too was a source of power led to his taming. Information has recently become a commodity, traded and sold like orange juice or hog bellies. Colleges give degrees in information science and information management. Technology of the computer age has provided access to information in overwhelming quantity. Information has become something worth studying in its own right. The purpose of this volume is to introduce key developments and results in the area of generalized information theory, a theory that deals with uncertainty-based information within mathematical frameworks that are broader than classical set theory and probability theory. The volume is organized as follows.
This ambitious exposition by Malik and Mordeson on the fuzzification of discrete structures not only supplies a solid basic text on this key topic, but also serves as a viable tool for learning basic fuzzy set concepts "from the ground up" due to its unusual lucidity of exposition. While the entire presentation of this book is in a completely traditional setting, with all propositions and theorems given totally rigorous proofs, the readability of the presentation is not compromised in any way; in fact, the many excellently chosen examples illustrate the often tricky concepts the authors address. The book's specific topics - including fuzzy versions of decision trees, networks, graphs, automata, etc. - are so well presented that it is clear that even those researchers not primarily interested in these topics will, after a cursory reading, choose to return to a more in-depth viewing of its pages. Naturally, when I come across such a well-written book, I not only think of how much better I could have written my co-authored monographs, but also of how this work, as distant as it seems to be from my own area of interest, could nevertheless connect with it. Before presenting the briefest of some ideas in this direction, let me state that my interest in fuzzy set theory (FST) has been, since about 1975, in connecting aspects of FST directly with corresponding probability concepts. One chief vehicle in carrying this out involves the concept of random sets.
This is the first treatment in book format of proof-theoretic transformations - known as proof interpretations - that focuses on applications to ordinary mathematics. It covers both the necessary logical machinery behind the proof interpretations that are used in recent applications and, via extended case studies, carries out some of these applications in full detail. This subject has historical roots in the 1950s. This book for the first time tells the whole story.
The two internationally renowned authors elucidate the structure of "fast" parallel computation. Its complexity is emphasised through a variety of techniques ranging from finite combinatorics, probability theory and finite group theory to finite model theory and proof theory. Non-uniform computation models are studied in the form of Boolean circuits; uniform ones in a variety of forms. Steps in the investigation of non-deterministic polynomial time are surveyed as is the complexity of various proof systems. Providing a survey of research in the field, the book will benefit advanced undergraduates and graduate students as well as researchers.
The idea of this book evolved during the process of its preparation, as some of the results were achieved in parallel with its writing. One reason for this is that in this area of research results are updated very quickly. Another is, possibly, that a strong, unchallenged theoretical basis in this field still does not fully exist. On the other hand, the rate of innovation, competition and demand from different branches of industry (from the biotech industry to civil and building engineering, from market forecasting to civil aviation, from robotics to emerging e-commerce) is increasingly pressing for more customised solutions based on learning consumer behaviour. A highly interdisciplinary and rapidly innovating field is forming whose focus is the design of intelligent, self-adapting systems and machines. It lies at the crossroads of control theory, artificial and computational intelligence, and different engineering disciplines, borrowing heavily from biology and the life sciences. It is often called intelligent control, soft computing or intelligent technology. Some other branches have appeared recently, like intelligent agents (which migrated from robotics to different engineering fields), data fusion, knowledge extraction etc., which are inherently related to this field. At its core are attempts to enhance the abilities of classical control theory in order to have more adequate, flexible, and adaptive models and control algorithms.
When we learn from books or daily experience, we make associations and draw inferences on the basis of information that is insufficient for understanding. One example of insufficient information may be a small sample derived from observing experiments. With this perspective, the need for developing a better understanding of the behavior of a small sample presents a problem that is far beyond purely academic importance. During the past 15 years considerable progress has been achieved in the study of this issue in China. One distinguished result is the principle of information diffusion. According to this principle, it is possible to partly fill gaps caused by incomplete information by changing crisp observations into fuzzy sets, so that one can improve the recognition of relationships between input and output. The principle of information diffusion has been proven successful for the estimation of a probability density function. Many successful applications reflect the advantages of this new approach. It also supports an argument that fuzzy set theory can be used not only in "soft" science, where some subjective adjustment is necessary, but also in "hard" science, where all data are recorded.
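The idea of diffusing a small crisp sample into a smooth density estimate can be illustrated with a minimal sketch. This is not the book's own algorithm, only a hedged illustration of the general principle: each crisp observation is spread over neighbouring grid points by an assumed Gaussian diffusion function (the names `diffuse`, `sample`, `grid` and the bandwidth `h` are all hypothetical choices, not taken from the source).

```python
import math

def diffuse(observations, grid, h):
    """Spread each crisp observation over a grid of points using a
    Gaussian diffusion function, then normalise so that the resulting
    values estimate a probability density."""
    density = []
    for x in grid:
        total = sum(math.exp(-((x - y) ** 2) / (2 * h * h))
                    for y in observations)
        density.append(total / (len(observations) * h * math.sqrt(2 * math.pi)))
    return density

# A small sample of five observations, estimated on a coarse grid.
sample = [1.0, 1.5, 2.0, 2.5, 3.0]
grid = [0.0, 1.0, 2.0, 3.0, 4.0]
est = diffuse(sample, grid, h=0.5)
```

With so few observations, a histogram would be mostly empty; the diffused estimate is positive everywhere and peaks near the centre of the sample, which is the gap-filling effect the principle describes.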
This book describes new methods for building intelligent systems using type-2 fuzzy logic and soft computing (SC) techniques. The authors extend the use of fuzzy logic to a higher order, which is called type-2 fuzzy logic. Combining type-2 fuzzy logic with traditional SC techniques, we can build powerful hybrid intelligent systems that can use the advantages that each technique offers. This book is intended to be a major reference tool and can be used as a textbook.
We do not perceive the present as it is and in totality, nor do we infer the future from the present with any high degree of dependability, nor yet do we accurately know the consequences of our own actions. In addition, there is a fourth source of error to be taken into account, for we do not execute actions in the precise form in which they are imaged and willed. Frank H. Knight [R4.34, p. 202] The "degree" of certainty or confidence felt in the conclusion after it is reached cannot be ignored, for it is of the greatest practical significance. The action which follows upon an opinion depends as much upon the amount of confidence in that opinion as it does upon the favorableness of the opinion itself. The ultimate logic, or psychology, of these deliberations is obscure, a part of the scientifically unfathomable mystery of life and mind. Frank H. Knight [R4.34, pp. 226-227] With some inaccuracy, descriptions of uncertain consequences can be classified into two categories: those which use exclusively the language of probability distributions and those which call for some other principle, either to replace or to supplement it.
In the mid-1960s I had the pleasure of attending a talk by Lotfi Zadeh at which he presented some of his basic (and at the time, recent) work on fuzzy sets. Lotfi's algebra of fuzzy subsets of a set struck me as very nice; in fact, as a graduate student in the mid-1950s, I had suggested similar ideas about a continuous-truth-valued propositional calculus (inf for "and," sup for "or") to my advisor, but he didn't go for it (and in fact confused it with the foundations of probability theory), so I ended up writing a thesis in a more conventional area of mathematics (differential algebra). I especially enjoyed Lotfi's discussion of fuzzy convexity; I remember talking to him about possible ways of extending this work, but I didn't pursue it at the time. I have elsewhere told the story of how, when I saw C. L. Chang's 1968 paper on fuzzy topological spaces, I was impelled to try my hand at fuzzifying algebra. This led to my 1971 paper "Fuzzy groups," which became the starting point of an entire literature on fuzzy algebraic structures. In 1974 King-Sun Fu invited me to speak at a U.S.-Japan seminar on Fuzzy Sets and their Applications, which was to be held that summer in Berkeley.
It is the business of science not to create laws, but to discover them. We do not originate the constitution of our own minds, greatly as it may be in our power to modify their character. And as the laws of the human intellect do not depend upon our will, so the forms of science, of which they constitute the basis, are in all essential regards independent of individual choice. George Boole [10, p. 11] 1.1 Comparison with Traditional Logic. The logic of this book is a probability logic built on top of a yes-no or 2-valued logic. It is divided into two parts, part I: BP Logic, and part II: M Logic. 'BP' stands for 'Bayes Postulate'. This postulate says that in the absence of knowledge concerning a probability distribution over a universe or space one should assume a uniform distribution. The M logic of part II does not make use of Bayes' postulate or of any other postulates or axioms. It relies exclusively on purely deductive reasoning following from the definition of probabilities. The M logic goes an important step further than the BP logic in that it can distinguish between certain types of information supply sentences which have the same representation in the BP logic as well as in traditional first-order logic, although they clearly have different meanings (see example 6.1.2; also comments to the Paris-Rome problem of eqs. (1.8), (1.9) below).
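The Bayes Postulate described above has a very direct computational reading, which can be sketched as follows. This is only an illustrative sketch, not the book's own formalism; the function name and the die example are hypothetical, and exact rational arithmetic is used so the probabilities sum to exactly one.

```python
from fractions import Fraction

def bayes_postulate_prior(universe):
    """Bayes' postulate: in the absence of any knowledge about the
    distribution over a finite universe, assign every element the
    same probability 1/n."""
    n = len(universe)
    return {element: Fraction(1, n) for element in universe}

# Knowing nothing about a die, every face receives probability 1/6.
prior = bayes_postulate_prior(["1", "2", "3", "4", "5", "6"])
```

The M logic of part II, by contrast, would avoid this assumption entirely and derive only what follows deductively from the definition of probabilities.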
This volume collects together a number of important papers concerning both the method of abstraction generally and the use of particular abstraction principles to reconstruct central areas of mathematics along logicist lines. Gottlob Frege's original logicist project was, in effect, refuted by Russell's paradox. Crispin Wright has recently revived Frege's enterprise, however, providing a philosophical and technical framework within which a reconstruction of arithmetic is possible. While the Neo-Fregean project has received extensive attention and discussion, the present volume is unique in presenting a thoroughgoing examination of the mathematical aspects of the neo-logicist project (and the particular philosophical issues arising from these technical concerns). Attention is focused on extending the Neo-Fregean treatment to all of mathematics, with the reconstruction of real analysis from various cut- or Cauchy-sequence-related abstraction principles and the reconstruction of set theory from various restricted versions of Basic Law V as case studies. As a result, the volume provides a test of the scope and limits of the neo-logicist project, detailing what has been accomplished and outlining the desiderata still outstanding. All papers in the anthology have their origins in presentations at Arche events, thus providing a volume that is both a survey of the cutting edge in research on the technical aspects of abstraction and a catalogue of the work in this area that has been supported in various ways by Arche.
This is an overview of the current state of knowledge along with open problems and perspectives, clarified in such fields as non-standard inferences in description logics, logic of provability, logical dynamics and computability theory. The book includes contributions concerning the role of logic today, including unexpected aspects of contemporary logic and the application of logic. This book will be of interest to logicians and mathematicians in general.
Homology is a powerful tool used by mathematicians to study the properties of spaces and maps that are insensitive to small perturbations. This book uses a computer to develop a combinatorial computational approach to the subject. The core of the book deals with homology theory and its computation. Following this is a section containing extensions to further developments in algebraic topology, applications to computational dynamics, and applications to image processing. Included are exercises and software that can be used to compute homology groups and maps. The book will appeal to researchers and graduate students in mathematics, computer science, engineering, and nonlinear dynamics.
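The combinatorial, computer-oriented flavour of this approach can be suggested with a tiny sketch. This is not the book's own software, only a hedged illustration of the standard linear-algebra recipe: Betti numbers of a small complex are obtained from ranks of boundary matrices over the field GF(2). The helper name `rank_gf2` and the hollow-triangle example are hypothetical choices.

```python
def rank_gf2(matrix):
    """Rank of a 0/1 matrix over GF(2), by Gaussian elimination with
    XOR as addition."""
    rows = [row[:] for row in matrix]
    rank, ncols = 0, len(rows[0]) if rows else 0
    for col in range(ncols):
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r][col]:
                rows[r] = [a ^ b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank

# Hollow triangle: vertices {0,1,2}, edges {01, 12, 02}, no 2-cells.
# Boundary matrix d1 has one column per edge, one row per vertex.
d1 = [[1, 0, 1],
      [1, 1, 0],
      [0, 1, 1]]
betti0 = 3 - rank_gf2(d1)       # connected components
betti1 = 3 - rank_gf2(d1) - 0   # independent 1-cycles (rank of d2 is 0)
```

The hollow triangle is topologically a circle, so the computation yields one connected component and one independent cycle, exactly the kind of perturbation-insensitive information homology extracts.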
In the last two decades modal logic has undergone an explosive growth, to the point that a complete bibliography of this branch of logic, supposing that someone were capable of compiling it, would itself fill a ponderous volume. What is impressive in the growth of modal logic has not been so much the quick accumulation of results as the richness of its thematic developments. In the 1960s, when Kripke semantics gave new credibility to the logic of modalities, which was already known and appreciated in Ancient and Medieval times, no one could have foreseen that in a short time modal logic would become a lively source of ideas and methods for analytical philosophers, historians of philosophy, linguists, epistemologists and computer scientists. The aim which oriented the composition of this book was not to write a new manual of modal logic (there are a lot of excellent textbooks on the market, and the expert reader will realize how much we benefited from many of them) but to offer to every reader, even with no specific background in logic, a conceptually linear path through the labyrinth of the current panorama of modal logic. The notion which in our opinion looked suitable to work as a compass in this enterprise was the notion of multimodality, or, more specifically, the basic idea of grounding systems on languages admitting more than one primitive modal operator.
This book reflects the progress made in the forty years since the appearance of Abraham Robinson's revolutionary book Nonstandard Analysis: in the foundations of mathematics and logic, in number theory, statistics and probability, in ordinary, partial and stochastic differential equations, and in education. The contributions are clear and essentially self-contained.
The capabilities of modern technology are rapidly increasing, spurred on to a large extent by the tremendous advances in communications and computing. Automated vehicles and global wireless connections are some examples of these advances. In order to take advantage of such enhanced capabilities, our need to model and manipulate our knowledge of the geophysical world, using compatible representations, is also rapidly increasing. In response to this, one fundamental issue of great concern in modern geographical research is how to most effectively capture the physical world around us in systems like geographical information systems (GIS). Making this task even more challenging is the fact that uncertainty plays a pervasive role in the representation, analysis and use of geospatial information. The types of uncertainty that appear in geospatial information systems are not just the simple randomness of observation, as in weather data, but are manifested in many other forms, including imprecision, incompleteness and granularization. Describing the uncertainty of the boundaries of deserts and mountains clearly requires different tools than those provided by probability theory. The multiplicity of modalities of uncertainty appearing in GIS requires a variety of formalisms to model these uncertainties. In light of this, it is natural that fuzzy set theory has become a topic of intensive interest in many areas of geographical research and applications. This volume, Fuzzy Modeling with Spatial Information for Geographic Problems, provides many stimulating examples of advances in geographical research based on approaches using fuzzy sets and related technologies.
Both modern mathematical music theory and computer science are strongly influenced by the theory of categories and functors. One outcome of this research is the data format of denotators, which is based on set-valued presheaves over the category of modules and diaffine homomorphisms. The functorial approach of denotators deals with generalized points in the form of arrows and allows the construction of a universal concept architecture. This architecture is ideal for handling all aspects of music, especially for the analysis and composition of highly abstract musical works. This book presents an introduction to the theory of module categories and the theory of denotators, as well as the design of a software system, called Rubato Composer, which is an implementation of the category-theoretic concept framework. The application is written in portable Java and relies on plug-in components, so-called rubettes, which may be combined in data flow networks for the generation and manipulation of denotators. The Rubato Composer system is open to arbitrary extension and is freely available under the GPL license. It allows the developer to build specialized rubettes for tasks that are of interest to composers, who in turn combine them to create music. It equally serves music theorists, who use them to extract information from and manipulate musical structures. They may even develop new theories by experimenting with the many parameters that are at their disposal thanks to the increased flexibility of the functorial concept architecture. Two contributed chapters by Guerino Mazzola and Florian Thalmann illustrate the application of the theory as well as the software in the development of compositional tools and the creation of a musical work with the help of the Rubato framework.
This biography attempts to shed light on all facets of Zermelo's life and achievements. Personal and scientific aspects are kept separate as far as coherence allows, in order to enable the reader to follow the one or the other of these threads. The presentation of his work explores motivations, aims, acceptance, and influence. Selected proofs and information gleaned from unpublished notes and letters add to the analysis.
The theory of Boolean algebras was created in 1847 by the English mathematician George Boole. He conceived it as a calculus (or arithmetic) suitable for a mathematical analysis of logic. The form of his calculus was rather different from the modern version, which came into being during the period 1864-1895 through the contributions of William Stanley Jevons, Augustus De Morgan, Charles Sanders Peirce, and Ernst Schröder. A foundation of the calculus as an abstract algebraic discipline, axiomatized by a set of equations and admitting many different interpretations, was carried out by Edward Huntington in 1904. Only with the work of Marshall Stone and Alfred Tarski in the 1930s, however, did Boolean algebra free itself completely from the bonds of logic and become a modern mathematical discipline, with deep theorems and important connections to several other branches of mathematics, including algebra, analysis, logic, measure theory, probability and statistics, set theory, and topology. For instance, in logic, beyond its close connection to propositional logic, Boolean algebra has found applications in such diverse areas as the proof of the completeness theorem for first-order logic, the proof of the Łoś conjecture for countable first-order theories categorical in power, and proofs of the independence of the axiom of choice and the continuum hypothesis in set theory. In analysis, Stone's discoveries of the Stone-Čech compactification and the Stone-Weierstrass approximation theorem were intimately connected to his study of Boolean algebras.
In the eyes of the editors, this book will be considered a success if it can convince its readers of the following: that it is warranted to dream of a realistic and full-fledged theory of mathematical practices, in the plural. If such a theory is possible, it would mean that a number of presently existing fierce oppositions between philosophers, sociologists, educators, and other parties involved, are in fact illusory.
No scientific theory has caused more puzzlement and confusion than quantum theory. Physics is supposed to help us to understand the world, but quantum theory makes it seem a very strange place. This book is about how mathematical innovation can help us gain deeper insight into the structure of the physical world. Chapters by top researchers in the mathematical foundations of physics explore new ideas, especially novel mathematical concepts, at the cutting edge of future physics. These creative developments in mathematics may catalyze the advances that enable us to understand our current physical theories, especially quantum theory. The authors bring diverse perspectives, unified only by the attempt to introduce fresh concepts that will open up new vistas in our understanding of future physics.
Over the last decade, and particularly in recent years, the macroscopic porous media theory has made decisive progress concerning the fundamentals of the theory and the development of mathematical models in various fields of engineering and biomechanics. This progress has attracted some attention, and therefore conferences devoted almost exclusively to the macroscopic porous media theory have been organized in order to collect all findings, to present new results, and to discuss new trends. Many important contributions have also been published in national and international journals, which have brought the porous media theory, in some parts, to a close. Therefore, the time seems to be ripe to review the state of the art and to show new trends in the continuum mechanical treatment of saturated and unsaturated capillary and non-capillary porous solids. This book addresses postgraduate students and scientists working in engineering, physics, and mathematics. It provides an outline of the modern theory of porous media and shows some trends in theory and in applications.