Welcome to Loot.co.za!
One can distinguish, roughly speaking, two different approaches to the philosophy of mathematics. On the one hand, some philosophers (and some mathematicians) take the nature and the results of mathematicians' activities as given, and go on to ask what philosophical morals one might perhaps find in their story. On the other hand, some philosophers, logicians and mathematicians have tried or are trying to subject the very concepts which mathematicians are using in their work to critical scrutiny. In practice this usually means scrutinizing the logical and linguistic tools mathematicians wield. Such scrutiny can scarcely help relying on philosophical ideas and principles. In other words it can scarcely help being literally a study of language, truth and logic in mathematics, albeit not necessarily in the spirit of A. J. Ayer. As its title indicates, the essays included in the present volume represent the latter approach. In most of them one of the fundamental concepts in the foundations of mathematics and logic is subjected to a scrutiny from a largely novel point of view. Typically, it turns out that the concept in question is in need of a revision or reconsideration, or at least can be given a new twist. The results of such a re-examination are not primarily critical, however, but typically open up new constructive possibilities. The consequences of such deconstructions and reconstructions are often quite sweeping, and are explored in the same paper or in others.
Simplicity theory is an extension of stability theory to a wider class of structures, containing, among others, the random graph, pseudo-finite fields, and fields with a generic automorphism. Following Kim's proof of forking symmetry, which implies a good behaviour of model-theoretic independence, this area of model theory has been a field of intense study. It has necessitated the development of some important new tools, most notably the model-theoretic treatment of hyperimaginaries (classes modulo type-definable equivalence relations). It thus provides a general notion of independence (and of rank in the supersimple case) applicable to a wide class of algebraic structures. The basic theory of forking independence is developed, and its properties in a simple structure are analyzed. No prior knowledge of stability theory is assumed; in fact many stability-theoretic results follow either from more general propositions, or are developed in side remarks. Audience: This book is intended both as an introduction to simplicity theory accessible to graduate students with some knowledge of model theory, and as a reference work for research in the field.
The first edition of the monograph Information and Randomness: An Algorithmic Perspective by Cristian Calude was published in 1994. In my Foreword I said: "The research in algorithmic information theory is already some 30 years old. However, only the recent years have witnessed a really vigorous growth in this area. ... The present book by Calude fits very well in our series. Much original research is presented ... making the approach richer in consequences than the classical one. Remarkably, however, the text is so self-contained and coherent that the book may also serve as a textbook. All proofs are given in the book and, thus, it is not necessary to consult other sources for classroom instruction." The vigorous growth in the study of algorithmic information theory has continued during the past few years, which is clearly visible in the present second edition. Many new results, examples, exercises and open problems have been added. The additions include two entirely new chapters: "Computably Enumerable Random Reals" and "Randomness and Incompleteness." The really comprehensive new bibliography makes the book very valuable for a researcher. The new results about the characterization of computably enumerable random reals, as well as the fascinating Omega Numbers, should contribute much to the value of the book as a textbook. The author has been directly involved in these results, which have appeared in the prestigious journals Nature, New Scientist and Pour la Science.
The subject of Time has a wide intellectual appeal across different disciplines. This has shown in the variety of reactions received from readers of the first edition of the present Book. Many have reacted to issues raised in its philosophical discussions, while some have even solved a number of the open technical questions raised in the logical elaboration of the latter. These results will be recorded below, at a more convenient place. In the seven years after the first publication, there have been some noticeable newer developments in the logical study of Time and temporal expressions. As far as Temporal Logic proper is concerned, it seems fair to say that these amount to an increase in coverage and sophistication, rather than further break-through innovation. In fact, perhaps the most significant sources of new activity have been the applied areas of Linguistics and Computer Science (including Artificial Intelligence), where many intriguing new ideas have appeared presenting further challenges to temporal logic. Now, since this Book has a rather tight composition, it would have been difficult to interpolate this new material without endangering intelligibility.
Intelligent decision support is based on human knowledge related to a specific part of a real or abstract world. When the knowledge is gained by experience, it is induced from empirical data. The data structure, called an information system, is a record of objects described by a set of attributes. Knowledge is understood here as an ability to classify objects. Objects in the same class are indiscernible by means of the attributes and form elementary building blocks (granules, atoms). In particular, the granularity of knowledge means that some notions cannot be expressed precisely within the available knowledge and can be defined only vaguely. In rough set theory, created by Z. Pawlak, each imprecise concept is replaced by a pair of precise concepts called its lower and upper approximation. These approximations are fundamental tools for reasoning about knowledge. The rough sets philosophy turned out to be a very effective new tool with many successful real-life applications to its credit. It is worth stressing that no auxiliary assumptions are needed about the data, such as probability or membership function values, which is its great advantage. The present book reveals a wide spectrum of applications of the rough set concept, giving the reader the flavor of, and insight into, the methodology of the newly developed disciplines. Although the book emphasizes applications, comparison with other related methods and further developments receive due attention.
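The lower/upper approximation scheme described in the blurb can be sketched in a few lines of Python. The toy information system below (patients described by symptoms) is invented for illustration and does not come from the book:

```python
# Minimal sketch of Pawlak-style rough approximation (illustrative data).
# Objects with identical attribute values are indiscernible and form
# granules; a vague concept is bracketed by two precise sets.

def granules(objects, attributes):
    """Partition objects into indiscernibility classes over the attributes."""
    classes = {}
    for obj, desc in objects.items():
        key = tuple(desc[a] for a in attributes)
        classes.setdefault(key, set()).add(obj)
    return list(classes.values())

def approximations(objects, attributes, concept):
    """Lower approximation: granules wholly inside the concept.
    Upper approximation: granules that intersect the concept."""
    lower, upper = set(), set()
    for g in granules(objects, attributes):
        if g <= concept:
            lower |= g
        if g & concept:
            upper |= g
    return lower, upper

# Toy information system: patients described by two symptoms.
patients = {
    "p1": {"fever": "yes", "cough": "yes"},
    "p2": {"fever": "yes", "cough": "yes"},
    "p3": {"fever": "yes", "cough": "no"},
    "p4": {"fever": "no",  "cough": "no"},
}
flu = {"p1", "p3"}  # the (vague) concept to approximate
lo, up = approximations(patients, ["fever", "cough"], flu)
print(lo)  # {'p3'}: only p3's granule lies wholly inside the concept
print(up)  # {'p1', 'p2', 'p3'}: p1 is indiscernible from p2, which is outside
```

Note how p1 and p2 share all attribute values, so the concept {p1, p3} cannot be expressed exactly: it is squeezed between its two approximations.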
The encounter, in the late seventies, between the theory of triangular norms, issuing from stochastic geometry, especially the works of Menger, Schweizer and Sklar, on the one hand, and the theory of fuzzy sets due to Zadeh, on the other hand, has been very fruitful. Triangular norms have proved to be ready-made mathematical models of fuzzy set intersections and have shed light on the algebraic foundations of fuzzy sets. One basic idea behind the study of triangular norms is to solve functional equations that stem from prescribed axioms describing algebraic properties such as associativity. Alternative operations such as means have been characterized in a similar way by Kolmogorov, for instance, and the methods for solving functional equations are now well established thanks to the efforts of Aczél, among others. One can say without overstatement that the introduction of triangular norms in fuzzy sets has strongly influenced further developments in fuzzy set theory, and has significantly contributed to its better acceptance in pure and applied mathematics circles. The book by Fodor and Roubens systematically exploits the benefits of this encounter in the analysis of fuzzy relations. The authors apply functional equation methods to notions such as equivalence relations, and various kinds of orderings, for the purpose of preference modelling. Central to this book is the multivalued extension of the well-known result claiming that any relation expressing weak preference can be separated into three components respectively describing strict preference, indifference and incomparability.
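The axioms behind the functional equations mentioned above can be checked concretely. The Python sketch below defines three standard t-norms (minimum, product, Łukasiewicz) and brute-force verifies commutativity, associativity, and the neutral element on a sample grid; the function names are ours, but the t-norms themselves are classical:

```python
# Three classical t-norms used as models of fuzzy set intersection,
# with a numerical check of the defining axioms on sample points.

def t_min(x, y):   # minimum (Gödel) t-norm
    return min(x, y)

def t_prod(x, y):  # product t-norm
    return x * y

def t_luk(x, y):   # Łukasiewicz t-norm
    return max(0.0, x + y - 1.0)

pts = [0.0, 0.25, 0.5, 0.75, 1.0]
for T in (t_min, t_prod, t_luk):
    for x in pts:
        assert abs(T(x, 1.0) - x) < 1e-12          # 1 is the neutral element
        for y in pts:
            assert abs(T(x, y) - T(y, x)) < 1e-12  # commutativity
            for z in pts:
                # associativity: the axiom that drives the functional equations
                assert abs(T(T(x, y), z) - T(x, T(y, z))) < 1e-12
print("all t-norm axioms hold on the sample grid")
```

Each of these operations restricts to the classical intersection on {0, 1}, which is why they serve as candidate fuzzy-intersection models.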
This volume contains a collection of research papers centered around the concept of quantifier. Recently this concept has become the central point of research in logic. It is one of the important logical concepts whose exact domain and applications have so far been insufficiently explored, especially in the area of inferential and semantic properties of languages. It should thus remain the central point of research in the future. Moreover, during the last twenty years generalized quantifiers and logical techniques based on them have proved their utility in various applications. The example of natural language semantics has been particularly striking. For a long time it had been believed that elementary logic, also called first-order logic, was an adequate theory of the logical forms of natural language sentences. Recently it has been accepted that the semantics of many natural language constructions cannot be properly represented in elementary logic. It has turned out, however, that they can be described by means of generalized quantifiers. As far as computational applications of logic are concerned, particularly interesting are semantics restricted to finite models. Under this restriction elementary logic loses several of its advantages such as axiomatizability and compactness. And for various purposes we can use equally well some semantically richer languages, of which generalized quantifiers offer the most universal methods of describing extensions of elementary logic. Moreover, we can look at generalized quantifiers as an explication of some specific mathematical concepts, e.g.
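As a small illustration of the point about natural language, the proportional quantifier "most" reduces on finite models to a cardinality comparison, one that is known not to be first-order definable. The Python sketch below (all names invented) evaluates it directly:

```python
# The generalized quantifier "most": Q_most(A, B) holds iff more
# elements of A are in B than outside B, i.e. |A ∩ B| > |A \ B|.
# This counting condition is not expressible in first-order logic.

def most(A, B):
    return len(A & B) > len(A - B)

# "Most students passed" evaluated on a tiny finite model:
students = {"ann", "bob", "cat", "dan", "eve"}
passed   = {"ann", "bob", "cat"}
print(most(students, passed))   # True: 3 of 5 passed
print(most(students, {"ann"}))  # False: only 1 of 5
```

On finite models such quantifiers are trivial to evaluate, which is exactly why they are attractive for the computational applications the blurb mentions.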
This volume contains a selection of papers presented at a Seminar on Intensional Logic held at the University of Amsterdam during the period September 1990-May 1991. Modal logic, either as a topic or as a tool, is common to most of the papers in this volume. A number of the papers are concerned with what may be called well-known or traditional modal systems, but, as a quick glance through this volume will reveal, this by no means implies that they walk the beaten tracks. Indeed, such contributions display new directions, new results, and new techniques to obtain familiar results. Other papers in this volume are representative examples of a current trend in modal logic: the study of extensions or adaptations of the standard systems that have been introduced to overcome various shortcomings of the latter, especially their limited expressive power. Finally, there is another major theme that can be discerned in the volume, a theme that may be described by the slogan 'representing changing information'. Papers falling under this heading address long-standing issues in the area, or present a systematic approach, while a critical survey and a report contributing new techniques are also included. The bulk of the papers on pure modal logic deal with theoretical or even foundational aspects of modal systems.
0.1. General remarks. For any algebraic system A, the set SubA of all subsystems of A, partially ordered by inclusion, forms a lattice. This is the subsystem lattice of A. (In certain cases, such as that of semigroups, in order to be able to say in all cases that SubA is a lattice, we have to treat the empty set as a subsystem.) The study of various inter-relationships between systems and their subsystem lattices is a rather large field of investigation developed over many years. This line of research first took shape in group theory; basic relevant information up to the early seventies is contained in the book [Suz] and the surveys [K Pek St], [Sad 2], [Ar Sad], and there is also a quite recent book [Schm 2]. As another inspiring source, one should point out the branch of mathematics to which the book [Baer] was devoted. One of the key objects of examination in this branch is the subspace lattice of a vector space over a skew field. A more general approach deals with modules and their submodule lattices. The examination of subsystem lattices for modules, as well as for rings and algebras (both associative and non-associative, in particular Lie algebras), began more than thirty years ago; there are results on this subject also for lattices, Boolean algebras and some other types of algebraic systems, both concrete and general. A large number of works, including several surveys, have been published in this area.
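For a concrete instance of a subsystem lattice, the subgroups of the cyclic group Z_12 can be enumerated directly. The Python sketch below is our own illustration, not taken from the book; it orders the subgroups by inclusion, which is exactly the partial order of SubA:

```python
# The subgroup lattice of Z_12 (addition mod 12): every subgroup is
# cyclic, generated by some element, and there is one subgroup per
# divisor of 12. Inclusion makes the set of subgroups a lattice.

def subgroup(g, n=12):
    """Cyclic subgroup of Z_n generated by g."""
    return frozenset((g * k) % n for k in range(n))

subgroups = {subgroup(g) for g in range(12)}
print(len(subgroups))  # 6 distinct subgroups, one per divisor of 12

# The partial order of the subsystem lattice is plain set inclusion:
whole = subgroup(1)
assert all(H <= whole for H in subgroups)
# The meet of two subgroups is their intersection; here <2> ∧ <3> = <6>:
assert subgroup(2) & subgroup(3) == subgroup(6)
```

For Z_12 this lattice is isomorphic to the divisor lattice of 12, a simple case of the system/lattice inter-relationships the introduction describes.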
This monograph develops projective geometries and provides a systematic treatment of morphisms. It introduces a new fundamental theorem and its applications describing morphisms of projective geometries in homogeneous coordinates by semilinear maps. Other topics treated include three equivalent definitions of projective geometries and their correspondence with certain lattices; quotients of projective geometries and isomorphism theorems; and recent results in dimension theory.
In 1953, exactly 50 years ago to this day, the first volume of Studia Logica appeared under the auspices of The Philosophical Committee of The Polish Academy of Sciences. Now, five decades later, the present volume is dedicated to a celebration of this 50th Anniversary of Studia Logica. The volume features a series of papers by distinguished scholars reflecting both the aim and scope of this journal for symbolic logic.
This is a monograph about logic. Specifically, it presents the mathematical theory of the logic of bunched implications, BI: I consider BI's proof theory, model theory and computation theory. However, the monograph is also about informatics in a sense which I explain. Specifically, it is about mathematical models of resources and logics for reasoning about resources. I begin with an introduction which presents my (background) view of logic from the point of view of informatics, paying particular attention to three logical topics which have arisen from the development of logic within informatics: * Resources as a basis for semantics; * Proof-search as a basis for reasoning; and * The theory of representation of object-logics in a meta-logic. The ensuing development represents a logical theory which draws upon the mathematical, philosophical and computational aspects of logic. Part I presents the logical theory of propositional BI, together with a computational interpretation. Part II presents a corresponding development for predicate BI. In both parts, I develop proof-, model- and type-theoretic analyses. I also provide semantically-motivated computational perspectives, so beginning a mathematical theory of resources. I have not included any analysis, beyond conjecture, of properties such as decidability, finite models, games or complexity. I prefer to leave these matters to other occasions, perhaps in broader contexts.
In this book I argue that a reason for the limited success of various studies under the general heading of cybernetics is failure to appreciate the importance of continuity, in a simple metrical sense of the term. It is with particular, but certainly not exclusive, reference to the Artificial Intelligence (AI) effort that the shortcomings of established approaches are most easily seen. One reason for the relative failure of attempts to analyse and model intelligence is the customary assumption that the processing of continuous variables and the manipulation of discrete concepts should be considered separately, frequently with the assumption that continuous processing plays no part in thought. There is much evidence to the contrary, including the observation that the remarkable ability of people and animals to learn from experience finds similar expression in tasks of both discrete and continuous nature and in tasks that require intimate mixing of the two. Such tasks include everyday voluntary movement while preserving balance and posture, with competitive games and athletics offering extreme examples. Continuous measures enter into many tasks that are usually presented as discrete. In tasks of pattern recognition, for example, there is often a continuous measure of the similarity of an imposed pattern to each of a set of paradigms, of which the most similar is selected. The importance of continuity is also indicated by the fact that adjectives and adverbs in everyday verbal communication have comparative and superlative forms.
Constructibility and complexity play central roles in recent research in computer science, mathematics and physics. For example, scientists are investigating the complexity of computer programs, constructive proofs in mathematics and the randomness of physical processes. But there are different approaches to the explication of these concepts. This volume presents important research on the state of this discussion, especially as it refers to quantum mechanics. This 'foundational debate' in computer science, mathematics and physics was already fully developed in 1930 in the Vienna Circle. A special section is devoted to its real founder Hans Hahn, referring to his contribution to the history and philosophy of science. The documentation section presents articles on the early Philipp Frank and on the Vienna Circle in exile. Reviews cover important recent literature on logical empiricism and related topics.
In the summer of 1991 the Department of Mathematics and Statistics of the Université de Montréal was fortunate to host the NATO Advanced Study Institute "Algebras and Orders" as its 30th Séminaire de mathématiques supérieures (SMS), a summer school with a long tradition and well-established reputation. This book contains the contributions of the invited speakers. Universal algebra, which established itself only in the 1930s, grew from traditional algebra (e.g., groups, modules, rings and lattices) and logic (e.g., propositional calculus, model theory and the theory of relations). It started by extending results from these fields but by now it is a well-established and dynamic discipline in its own right. One of the objectives of the ASI was to cover a broad spectrum of topics in this field, and to put in evidence the natural links to, and interactions with, boolean algebra, lattice theory, topology, graphs, relations, automata, theoretical computer science and (partial) orders. The theory of orders is a relatively young and vigorous discipline sharing certain topics as well as many researchers and meetings with universal algebra and lattice theory. W. Taylor surveyed abstract clone theory, which formalizes the process of composing operations (i.e., the formation of term operations) of an algebra as a special category with countably many objects, leading naturally to the interpretation and equivalence of varieties.
This volume in the Synthese Library Series is the result of a conference held at the University of Roskilde, Denmark, October 31st-November 1st, 1997. The aim was to provide a forum within which philosophers, mathematicians, logicians and historians of mathematics could exchange ideas pertaining to the historical and philosophical development of proof theory. Hence the conference was called Proof Theory: History and Philosophical Significance. To quote from the conference abstract: Proof theory was developed as part of Hilbert's Programme. According to Hilbert's Programme one could provide mathematics with a firm and secure foundation by formalizing all of mathematics and subsequently proving the consistency of these formal systems by finitistic means. Hence proof theory was developed as a formal tool through which this goal should be fulfilled. It is well known that Hilbert's Programme in its original form was unfeasible, mainly due to Gödel's incompleteness theorems. Additionally, it proved impossible to formalize all of mathematics, and impossible even to prove the consistency of relatively simple formalized fragments of mathematics by finitistic methods. In spite of these problems, Gentzen showed that by extending Hilbert's proof theory it would be possible to prove the consistency of interesting formal systems, perhaps not by finitistic methods but still by methods of minimal strength. This generalization of Hilbert's original programme has fueled modern proof theory, which is a rich part of mathematical logic with many significant implications for the philosophy of mathematics.
This book is devoted to some results from classical Point Set Theory and their applications to certain problems in the mathematical analysis of the real line. Notice that various topics from this theory are presented in several books and surveys. From among the most important works devoted to Point Set Theory, let us first of all mention the excellent book by Oxtoby [83] in which a deep analogy between measure and category is discussed in detail. Further, an interesting general approach to problems concerning measure and category is developed in the well-known monograph by Morgan [79] where a fundamental concept of a category base is introduced and investigated. We also wish to mention that the monograph by Cichoń, Węglorz and the author [19] has recently been published. In that book, certain classes of subsets of the real line are studied and various cardinal valued functions (characteristics) closely connected with those classes are investigated. Obviously, the σ-ideal of all Lebesgue measure zero subsets of the real line and the σ-ideal of all first category subsets of the same line are extensively studied in [19], and several relatively new results concerning this topic are presented. Finally, it is reasonable to notice here that some special sets of points, the so-called singular spaces, are considered in the classi
without a properly developed inconsistent calculus based on infinitesimals, then inconsistent claims from the history of the calculus might well simply be symptoms of confusion. This is addressed in Chapter 5. It is further argued that mathematics has a certain primacy over logic, in that paraconsistent or relevant logics have to be based on inconsistent mathematics. If the latter turns out to be reasonably rich then paraconsistentism is vindicated; while if inconsistent mathematics has serious restrictions then the case for being interested in inconsistency-tolerant logics is weakened. (On such restrictions, see this chapter, section 3.) It must be conceded that fault-tolerant computer programming (e.g. Chapter 8) finds a substantial and important use for paraconsistent logics, albeit with an epistemological motivation (see this chapter, section 3). But even here it should be noted that if inconsistent mathematics turned out to be functionally impoverished then so would inconsistent databases. 2. Summary In Chapter 2, Meyer's results on relevant arithmetic are set out, and his view that they have a bearing on Gödel's incompleteness theorems is discussed. Model theory for nonclassical logics is also set out so as to be able to show that the inconsistency of inconsistent theories can be controlled or limited, but in this book model theory is kept in the background as much as possible. This is then used to study the functional properties of various equational number theories.
This book presents a specific and unified approach to Knowledge Discovery and Data Mining, termed IFN, for the Information Fuzzy Network methodology. Data Mining (DM) is the science of modelling and generalizing common patterns from large sets of multi-type data. DM is a part of KDD, which is the overall process for Knowledge Discovery in Databases. The accessibility and abundance of information today makes this a topic of particular importance and need. The book has three main parts complemented by appendices as well as software and project data that are accessible from the book's web site (http://www.eng.tau.ac.il/~maimon/ifn-kdg). Part I (Chapters 1-4) starts with the topic of KDD and DM in general and makes reference to other works in the field, especially those related to the information theoretic approach. The remainder of the book presents our work, starting with the IFN theory and algorithms. Part II (Chapters 5-6) discusses the methodology of application and includes case studies. Then in Part III (Chapters 7-9) a comparative study is presented, concluding with some advanced methods and open problems. The IFN, being a generic methodology, applies to a variety of fields, such as manufacturing, finance, health care, medicine, insurance, and human resources. The appendices expand on the relevant theoretical background and present descriptions of sample projects (including detailed results).
To our wives, Masha and Marian. Interest in the so-called completely integrable systems with an infinite number of degrees of freedom was aroused immediately after publication of the famous series of papers by Gardner, Greene, Kruskal, Miura, and Zabusky [75, 77, 96, 18, 66, 19] (see also [76]) on striking properties of the Korteweg-de Vries (KdV) equation. It soon became clear that systems of such a kind possess a number of characteristic properties, such as infinite series of symmetries and/or conservation laws, inverse scattering problem formulation, L-A pair representation, existence of prolongation structures, etc. And though no satisfactory definition of complete integrability was yet invented, a need of testing a particular system for these properties appeared. Probably one of the most efficient tests of this kind was first proposed by Lenard [19], who constructed a recursion operator for symmetries of the KdV equation. It was a strange operator, in a sense: being formally integro-differential, its action on the first classical symmetry (x-translation) was well-defined and produced the entire series of higher KdV equations; but applied to the scaling symmetry, it gave expressions containing terms of the type ∫ u dx which had no adequate interpretation in the framework of the existing theories. It is not surprising that P. Olver wrote "The deduction of the form of the recursion operator (if it exists) requires a certain amount of inspired guesswork..." [80, p.
1. BASIC CONCEPTS OF INTERACTIVE THEOREM PROVING. Interactive Theorem Proving ultimately aims at the construction of powerful reasoning tools that let us (computer scientists) prove things we cannot prove without the tools, and the tools cannot prove without us. Interaction typically is needed, for example, to direct and control the reasoning, to speculate or generalize strategic lemmas, and sometimes simply because the conjecture to be proved does not hold. In software verification, for example, correct versions of specifications and programs typically are obtained only after a number of failed proof attempts and subsequent error corrections. Different interactive theorem provers may actually look quite different: they may support different logics (first- or higher-order, logics of programs, type theory etc.), may be generic or special-purpose tools, or may be targeted to different applications. Nevertheless, they share common concepts and paradigms (e.g. architectural design, tactics, tactical reasoning etc.). The aim of this chapter is to describe the common concepts, design principles, and basic requirements of interactive theorem provers, and to explore the bandwidth of variations. Having a 'person in the loop' strongly influences the design of the proof tool: proofs must remain comprehensible, proof rules must be high-level and human-oriented, and persistent proof presentation and visualization becomes very important.
This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solution, numerical methods for solving integral equations of the second kind, and boundary integral equations for planar regions. The presentation of each topic is meant to be an introduction with a certain degree of depth. Comprehensive references on a particular topic are listed at the end of each chapter for further reading and study. Because of their relevance in solving real world problems, multivariable polynomials are playing an ever more important role in research and applications. In this third edition, a new chapter on this topic has been included and some major changes are made to two chapters from the previous edition. In addition, there are numerous minor changes throughout the entire text and new exercises are added. Review of an earlier edition: "...the book is clearly written, quite pleasant to read, and contains a lot of important material; and the authors have done an excellent job at balancing theoretical developments, interesting examples and exercises, numerical experiments, and bibliographical references." R. Glowinski, SIAM Review, 2003
Lattice-valued logic aims at establishing the logical foundation for the uncertain-information processing routinely performed by humans and artificial intelligence systems. This textbook gives, for the first time, a general introduction to lattice-valued logic. It systematically summarizes research from the basic notions up to recent results on lattice implication algebras, lattice-valued logic systems based on lattice implication algebras, and the corresponding reasoning theories and methods. The book provides a suitable theoretical logical background for lattice-valued logic systems and supports newly designed intelligent uncertain-information-processing systems and a wide spectrum of intelligent learning tasks.
Recent major advances in model theory include connections between model theory and Diophantine and real analytic geometry, permutation groups, and finite algebras. The present book contains lectures on recent results in algebraic model theory, covering topics from the following areas: geometric model theory, the model theory of analytic structures, permutation groups in model theory, the spectra of countable theories, and the structure of finite algebras. Audience: Graduate students in logic and others wishing to keep abreast of current trends in model theory. The lectures contain sufficient introductory material to grasp the recent results presented.
Reasoning under uncertainty is always based on a specified language or formalism, including its particular syntax and semantics, but also on its associated inference mechanism. In the present volume of the handbook the last aspect, the algorithmic aspects of uncertainty calculi, are presented. Theory has sufficiently advanced to unfold some generally applicable fundamental structures and methods. On the other hand, particular features of specific formalisms and approaches to uncertainty of course still influence strongly the computational methods to be used. Both general as well as specific methods are included in this volume. Broadly speaking, symbolic or logical approaches to uncertainty and numerical approaches are often distinguished. Although this distinction is somewhat misleading, it is used as a means to structure the present volume. This is even to some degree reflected in the first two chapters, which treat fundamental, general methods of computation in systems designed to represent uncertainty. It was noted early on by Shenoy and Shafer that computations in different domains have an underlying common structure. Essentially, pieces of knowledge or information are to be combined together and then focused on some particular question or domain. This can be captured in an algebraic structure called a valuation algebra, which is described in the first chapter. Here the basic operations of combination and focusing (marginalization) of knowledge and information are modeled abstractly subject to simple axioms.
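The two basic valuation-algebra operations, combination and focusing, can be sketched on tiny probability tables. The factors and variable names below are illustrative assumptions, not taken from the handbook; the representation (dicts keyed by binary assignments) is ours:

```python
from itertools import product

def combine(vars1, f1, vars2, f2):
    """Combination: pointwise product on the union of the variable sets."""
    vars_out = list(dict.fromkeys(vars1 + vars2))
    out = {}
    for assign in product([0, 1], repeat=len(vars_out)):
        env = dict(zip(vars_out, assign))
        v1 = f1[tuple(env[v] for v in vars1)]
        v2 = f2[tuple(env[v] for v in vars2)]
        out[assign] = v1 * v2
    return vars_out, out

def marginalize(vars_in, f, keep):
    """Focusing: sum out every variable not in `keep`."""
    out = {}
    for assign, val in f.items():
        key = tuple(a for v, a in zip(vars_in, assign) if v in keep)
        out[key] = out.get(key, 0.0) + val
    return [v for v in vars_in if v in keep], out

# Two binary factors, p(A) and p(B|A), combined then focused on B:
pA  = {(0,): 0.6, (1,): 0.4}
pBA = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}
vs, joint = combine(["A"], pA, ["A", "B"], pBA)
_, pB = marginalize(vs, joint, {"B"})
print({k: round(v, 2) for k, v in pB.items()})  # {(0,): 0.62, (1,): 0.38}
```

Combination and marginalization satisfying the valuation-algebra axioms is what lets local-computation schemes of the Shenoy-Shafer kind answer queries without ever building the full joint table.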