
Language, Truth and Logic in Mathematics (Paperback, Softcover reprint of hardcover 1st ed. 1997)
Jaakko Hintikka
R4,011 Discovery Miles 40 110 Ships in 18 - 22 working days

One can distinguish, roughly speaking, two different approaches to the philosophy of mathematics. On the one hand, some philosophers (and some mathematicians) take the nature and the results of mathematicians' activities as given, and go on to ask what philosophical morals one might perhaps find in their story. On the other hand, some philosophers, logicians and mathematicians have tried or are trying to subject the very concepts which mathematicians are using in their work to critical scrutiny. In practice this usually means scrutinizing the logical and linguistic tools mathematicians wield. Such scrutiny can scarcely help relying on philosophical ideas and principles. In other words it can scarcely help being literally a study of language, truth and logic in mathematics, albeit not necessarily in the spirit of A.J. Ayer. As its title indicates, the essays included in the present volume represent the latter approach. In most of them one of the fundamental concepts in the foundations of mathematics and logic is subjected to a scrutiny from a largely novel point of view. Typically, it turns out that the concept in question is in need of a revision or reconsideration, or at least can be given a new twist. The results of such a re-examination are not primarily critical, however, but typically open up new constructive possibilities. The consequences of such deconstructions and reconstructions are often quite sweeping, and are explored in the same paper or in others.

Simple Theories (Paperback, Softcover reprint of hardcover 1st ed. 2000)
Frank O. Wagner
R1,408 Discovery Miles 14 080 Ships in 18 - 22 working days

Simplicity theory is an extension of stability theory to a wider class of structures, containing, among others, the random graph, pseudo-finite fields, and fields with a generic automorphism. Following Kim's proof of 'forking symmetry', which implies good behaviour of model-theoretic independence, this area of model theory has been a field of intense study. It has necessitated the development of some important new tools, most notably the model-theoretic treatment of hyperimaginaries (classes modulo type-definable equivalence relations). It thus provides a general notion of independence (and of rank in the supersimple case) applicable to a wide class of algebraic structures. The basic theory of forking independence is developed, and its properties in a simple structure are analyzed. No prior knowledge of stability theory is assumed; in fact many stability-theoretic results follow either from more general propositions, or are developed in side remarks. Audience: This book is intended both as an introduction to simplicity theory accessible to graduate students with some knowledge of model theory, and as a reference work for research in the field.

Information and Randomness - An Algorithmic Perspective (Paperback, Softcover reprint of hardcover 2nd ed. 2002)
Cristian S. Calude
R1,687 Discovery Miles 16 870 Ships in 18 - 22 working days

The first edition of the monograph Information and Randomness: An Algorithmic Perspective by Cristian Calude was published in 1994. In my Foreword I said: "The research in algorithmic information theory is already some 30 years old. However, only the recent years have witnessed a really vigorous growth in this area... The present book by Calude fits very well in our series. Much original research is presented... making the approach richer in consequences than the classical one. Remarkably, however, the text is so self-contained and coherent that the book may also serve as a textbook. All proofs are given in the book and, thus, it is not necessary to consult other sources for classroom instruction." The vigorous growth in the study of algorithmic information theory has continued during the past few years, which is clearly visible in the present second edition. Many new results, examples, exercises and open problems have been added. The additions include two entirely new chapters: "Computably Enumerable Random Reals" and "Randomness and Incompleteness." The really comprehensive new bibliography makes the book very valuable for a researcher. The new results about the characterization of computably enumerable random reals, as well as the fascinating Omega Numbers, should contribute much to the value of the book as a textbook. The author has been directly involved in these results that have appeared in the prestigious journals Nature, New Scientist and Pour la Science.

The Logic of Time - A Model-Theoretic Investigation into the Varieties of Temporal Ontology and Temporal Discourse (Paperback, Softcover reprint of hardcover 2nd ed. 1991)
Johan Van Benthem
R4,016 Discovery Miles 40 160 Ships in 18 - 22 working days

The subject of Time has a wide intellectual appeal across different disciplines. This has shown in the variety of reactions received from readers of the first edition of the present Book. Many have reacted to issues raised in its philosophical discussions, while some have even solved a number of the open technical questions raised in the logical elaboration of the latter. These results will be recorded below, at a more convenient place. In the seven years after the first publication, there have been some noticeable newer developments in the logical study of Time and temporal expressions. As far as Temporal Logic proper is concerned, it seems fair to say that these amount to an increase in coverage and sophistication, rather than further breakthrough innovation. In fact, perhaps the most significant sources of new activity have been the applied areas of Linguistics and Computer Science (including Artificial Intelligence), where many intriguing new ideas have appeared presenting further challenges to temporal logic. Now, since this Book has a rather tight composition, it would have been difficult to interpolate this new material without endangering intelligibility.

Intelligent Decision Support - Handbook of Applications and Advances of the Rough Sets Theory (Paperback, Softcover reprint of hardcover 1st ed. 1992)
Shi-Yu Huang
R7,688 Discovery Miles 76 880 Ships in 18 - 22 working days

Intelligent decision support is based on human knowledge related to a specific part of a real or abstract world. When the knowledge is gained by experience, it is induced from empirical data. The data structure, called an information system, is a record of objects described by a set of attributes. Knowledge is understood here as an ability to classify objects. Objects in the same class are indiscernible by means of the attributes and form elementary building blocks (granules, atoms). In particular, the granularity of knowledge means that some notions cannot be expressed precisely within the available knowledge and can be defined only vaguely. In the rough sets theory created by Z. Pawlak, each imprecise concept is replaced by a pair of precise concepts called its lower and upper approximation. These approximations are fundamental tools for reasoning about knowledge. The rough sets philosophy turned out to be a very effective new tool with many successful real-life applications to its credit. It is worth stressing that no auxiliary assumptions are needed about the data, such as probability or membership function values, which is its great advantage. The present book reveals a wide spectrum of applications of the rough set concept, giving the reader the flavor of, and insight into, the methodology of the newly developed disciplines. Although the book emphasizes applications, comparison with other related methods and further developments receive due attention.
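The lower/upper approximation construction described in the blurb can be sketched in a few lines. The toy information system below (patients and symptoms) and all names in it are illustrative assumptions, not examples drawn from the handbook.

```python
def granules(objects, attrs):
    """Partition objects into indiscernibility classes for the given attributes."""
    classes = {}
    for name, desc in objects.items():
        key = tuple(desc[a] for a in attrs)
        classes.setdefault(key, set()).add(name)
    return list(classes.values())

def approximations(objects, attrs, target):
    """Lower approximation: union of granules wholly inside the target concept.
    Upper approximation: union of granules that overlap it."""
    lower, upper = set(), set()
    for g in granules(objects, attrs):
        if g <= target:
            lower |= g
        if g & target:
            upper |= g
    return lower, upper

# Hypothetical information system: patients described by two attributes.
data = {
    "p1": {"fever": "yes", "cough": "yes"},
    "p2": {"fever": "yes", "cough": "yes"},
    "p3": {"fever": "yes", "cough": "no"},
    "p4": {"fever": "no",  "cough": "no"},
}
flu = {"p1", "p3"}  # the vague concept to approximate
lo, up = approximations(data, ["fever", "cough"], flu)
print(sorted(lo), sorted(up))  # ['p3'] ['p1', 'p2', 'p3']
```

Here p1 and p2 are indiscernible, so the concept {p1, p3} cannot be expressed exactly; the pair of precise sets {p3} and {p1, p2, p3} brackets it, exactly as the blurb describes.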

Fuzzy Preference Modelling and Multicriteria Decision Support (Paperback, Softcover reprint of hardcover 1st ed. 1995)
J. C. Fodor, M. R. Roubens
R4,006 Discovery Miles 40 060 Ships in 18 - 22 working days

The encounter, in the late seventies, between the theory of triangular norms, issuing from stochastic geometry, especially the works of Menger, Schweizer and Sklar, on the one hand, and the theory of fuzzy sets due to Zadeh, on the other hand, has been very fruitful. Triangular norms have proved to be ready-made mathematical models of fuzzy set intersections and have shed light on the algebraic foundations of fuzzy sets. One basic idea behind the study of triangular norms is to solve functional equations that stem from prescribed axioms describing algebraic properties such as associativity. Alternative operations such as means have been characterized in a similar way by Kolmogorov, for instance, and the methods for solving functional equations are now well established thanks to the efforts of Aczel, among others. One can say without overstatement that the introduction of triangular norms in fuzzy sets has strongly influenced further developments in fuzzy set theory, and has significantly contributed to its better acceptance in pure and applied mathematics circles. The book by Fodor and Roubens systematically exploits the benefits of this encounter in the analysis of fuzzy relations. The authors apply functional equation methods to notions such as equivalence relations, and various kinds of orderings, for the purpose of preference modelling. Central to this book is the multivalued extension of the well-known result claiming that any relation expressing weak preference can be separated into three components respectively describing strict preference, indifference and incomparability.
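For readers unfamiliar with triangular norms, here is a quick sketch (not taken from the book) of three classical t-norms, with a numerical check of the associativity and unit axioms the blurb alludes to:

```python
# Three classical t-norms on [0, 1]; each is commutative, associative,
# monotone, and has 1 as neutral element -- the axioms behind the
# functional equations mentioned in the blurb.
tnorms = {
    "minimum": lambda x, y: min(x, y),
    "product": lambda x, y: x * y,
    "lukasiewicz": lambda x, y: max(0.0, x + y - 1.0),
}

grid = [i / 4 for i in range(5)]  # 0, 0.25, 0.5, 0.75, 1
for name, T in tnorms.items():
    # associativity: T(T(a, b), c) == T(a, T(b, c))
    assert all(abs(T(T(a, b), c) - T(a, T(b, c))) < 1e-9
               for a in grid for b in grid for c in grid), name
    # neutral element: T(a, 1) == a
    assert all(abs(T(a, 1.0) - a) < 1e-9 for a in grid), name
print("associativity and unit checked for", ", ".join(tnorms))
```

Each t-norm is one solution of the same associativity functional equation, which is why the choice of model for fuzzy intersection is a genuine design decision.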

Quantifiers: Logics, Models and Computation - Volume Two: Contributions (Paperback, Softcover reprint of 1st ed. 1995)
Michal Krynicki, M. Mostowski, L.W. Szczerba
R2,651 Discovery Miles 26 510 Ships in 18 - 22 working days

This volume contains a collection of research papers centered around the concept of quantifier. Recently this concept has become the central point of research in logic. It is one of the important logical concepts whose exact domain and applications have so far been insufficiently explored, especially in the area of inferential and semantic properties of languages. It should thus remain the central point of research in the future. Moreover, during the last twenty years generalized quantifiers and logical techniques based on them have proved their utility in various applications. The example of natural language semantics has been particularly striking. For a long time it has been believed that elementary logic, also called first-order logic, was an adequate theory of the logical forms of natural language sentences. Recently it has been accepted that the semantics of many natural language constructions cannot be properly represented in elementary logic. It has turned out, however, that they can be described by means of generalized quantifiers. As far as computational applications of logic are concerned, particularly interesting are semantics restricted to finite models. Under this restriction elementary logic loses several of its advantages, such as axiomatizability and compactness. And for various purposes we can use equally well some semantically richer languages, of which generalized quantifiers offer the most universal methods of describing extensions of elementary logic. Moreover, we can look at generalized quantifiers as an explication of some specific mathematical concepts, e.g.
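A standard illustration of the point about natural language constructions (the example is mine, not from this volume): the quantifier "most" is not definable in first-order logic, yet on a finite model it reduces to a simple cardinality comparison.

```python
def most(A, B):
    """'Most A are B': true iff |A intersect B| > |A minus B| on a finite model."""
    return len(A & B) > len(A - B)

# Hypothetical finite model.
students = {"ann", "bob", "cat", "dan", "eve"}
passed = {"ann", "bob", "cat", "dan"}
print(most(students, passed))  # True: 4 of the 5 students passed
```

This is exactly the sense in which generalized quantifiers extend elementary logic over finite models: the semantics is easy to compute even though no first-order formula expresses it.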

Diamonds and Defaults - Studies in Pure and Applied Intensional Logic (Paperback, Softcover reprint of the original 1st ed. 1993)
Maarten de Rijke
R4,038 Discovery Miles 40 380 Ships in 18 - 22 working days

This volume contains a selection of papers presented at a Seminar on Intensional Logic held at the University of Amsterdam during the period September 1990-May 1991. Modal logic, either as a topic or as a tool, is common to most of the papers in this volume. A number of the papers are concerned with what may be called well-known or traditional modal systems, but, as a quick glance through this volume will reveal, this by no means implies that they walk the beaten tracks. Indeed, such contributions display new directions, new results, and new techniques to obtain familiar results. Other papers in this volume are representative examples of a current trend in modal logic: the study of extensions or adaptations of the standard systems that have been introduced to overcome various shortcomings of the latter, especially their limited expressive power. Finally, there is another major theme that can be discerned in the volume, a theme that may be described by the slogan 'representing changing information.' Papers falling under this heading address long-standing issues in the area, or present a systematic approach, while a critical survey and a report contributing new techniques are also included. The bulk of the papers on pure modal logic deal with theoretical or even foundational aspects of modal systems.

Semigroups and Their Subsemigroup Lattices (Paperback, Softcover reprint of hardcover 1st ed. 1996)
L. N. Shevrin, A. J. Ovsyannikov
R2,680 Discovery Miles 26 800 Ships in 18 - 22 working days

0.1. General remarks. For any algebraic system A, the set SubA of all subsystems of A partially ordered by inclusion forms a lattice. This is the subsystem lattice of A. (In certain cases, such as that of semigroups, in order to have the right always to say that SubA is a lattice, we have to treat the empty set as a subsystem.) The study of various inter-relationships between systems and their subsystem lattices is a rather large field of investigation developed over many years. This trend was formed first in group theory; basic relevant information up to the early seventies is contained in the book [Suz] and the surveys [K Pek St], [Sad 2], [Ar Sad], there is also a quite recent book [Schm 2]. As another inspiring source, one should point out a branch of mathematics to which the book [Baer] was devoted. One of the key objects of examination in this branch is the subspace lattice of a vector space over a skew field. A more general approach deals with modules and their submodule lattices. Examining subsystem lattices for the case of modules as well as for rings and algebras (both associative and non-associative, in particular, Lie algebras) began more than thirty years ago; there are results on this subject also for lattices, Boolean algebras and some other types of algebraic systems, both concrete and general. A lot of works including several surveys have been published here.
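The opening construction, SubA ordered by inclusion with the empty set counted as a subsemigroup, can be checked mechanically on a toy example. The semigroup below ({0, 1, 2, 3} under multiplication mod 4) is an illustrative assumption, not an example from the book.

```python
from itertools import chain, combinations

def op(a, b):
    return (a * b) % 4  # toy semigroup: {0, 1, 2, 3} under multiplication mod 4

elements = [0, 1, 2, 3]

def closed(S):
    """A subset is a subsemigroup iff it is closed under the operation."""
    return all(op(a, b) in S for a in S for b in S)

subsets = chain.from_iterable(combinations(elements, r) for r in range(5))
subs = [frozenset(s) for s in subsets if closed(frozenset(s))]

# Intersections of subsemigroups are again subsemigroups: these are the
# meets of the subsemigroup lattice (the empty set is allowed, as in 0.1).
assert all((A & B) in subs for A in subs for B in subs)
print(len(subs), "subsemigroups, closed under intersection")
```

Joins are obtained dually as the smallest subsemigroup containing a union, so SubA is indeed a lattice under inclusion.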

Modern Projective Geometry (Paperback, Softcover reprint of the original 1st ed. 2000)
Claude-Alain Faure, Alfred Froelicher
R5,167 Discovery Miles 51 670 Ships in 18 - 22 working days

This monograph develops projective geometries and provides a systematic treatment of morphisms. It introduces a new fundamental theorem and its applications describing morphisms of projective geometries in homogeneous coordinates by semilinear maps. Other topics treated include three equivalent definitions of projective geometries and their correspondence with certain lattices; quotients of projective geometries and isomorphism theorems; and recent results in dimension theory.

Trends in Logic - 50 Years of Studia Logica (Paperback, Softcover reprint of hardcover 1st ed. 2003)
Vincent F. Hendricks, Jacek Malinowski
R4,039 Discovery Miles 40 390 Ships in 18 - 22 working days

In 1953, exactly 50 years ago to this day, the first volume of Studia Logica appeared under the auspices of The Philosophical Committee of The Polish Academy of Sciences. Now, five decades later the present volume is dedicated to a celebration of this 50th Anniversary of Studia Logica. The volume features a series of papers by distinguished scholars reflecting both the aim and scope of this journal for symbolic logic.
The Anniversary volume offers contributions from J. van Benthem, W. Buszkowski, M.L. Dalla Chiara, M. Fitting, J.M. Font, R. Giuntini, R. Goldblatt, V. Marra, D. Mundici, R. Leporini, S.P. Odintsov, H. Ono, G. Priest, H. Wansing, V.R. Wojcicki and J. Zygmunt.

The Semantics and Proof Theory of the Logic of Bunched Implications (Paperback, Softcover reprint of hardcover 1st ed. 2002)
David J. Pym
R4,025 Discovery Miles 40 250 Ships in 18 - 22 working days

This is a monograph about logic. Specifically, it presents the mathematical theory of the logic of bunched implications, BI: I consider BI's proof theory, model theory and computation theory. However, the monograph is also about informatics in a sense which I explain. Specifically, it is about mathematical models of resources and logics for reasoning about resources. I begin with an introduction which presents my (background) view of logic from the point of view of informatics, paying particular attention to three logical topics which have arisen from the development of logic within informatics: * Resources as a basis for semantics; * Proof-search as a basis for reasoning; and * The theory of representation of object-logics in a meta-logic. The ensuing development represents a logical theory which draws upon the mathematical, philosophical and computational aspects of logic. Part I presents the logical theory of propositional BI, together with a computational interpretation. Part II presents a corresponding development for predicate BI. In both parts, I develop proof-, model- and type-theoretic analyses. I also provide semantically-motivated computational perspectives, so beginning a mathematical theory of resources. I have not included any analysis, beyond conjecture, of properties such as decidability, finite models, games or complexity. I prefer to leave these matters to other occasions, perhaps in broader contexts.

A Missing Link in Cybernetics - Logic and Continuity (Paperback, Softcover reprint of hardcover 1st ed. 2009)
Alex M. Andrew
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

In this book I argue that a reason for the limited success of various studies under the general heading of cybernetics is failure to appreciate the importance of continuity, in a simple metrical sense of the term. It is with particular, but certainly not exclusive, reference to the Artificial Intelligence (AI) effort that the shortcomings of established approaches are most easily seen. One reason for the relative failure of attempts to analyse and model intelligence is the customary assumption that the processing of continuous variables and the manipulation of discrete concepts should be considered separately, frequently with the assumption that continuous processing plays no part in thought. There is much evidence to the contrary, including the observation that the remarkable ability of people and animals to learn from experience finds similar expression in tasks of both discrete and continuous nature and in tasks that require intimate mixing of the two. Such tasks include everyday voluntary movement while preserving balance and posture, with competitive games and athletics offering extreme examples. Continuous measures enter into many tasks that are usually presented as discrete. In tasks of pattern recognition, for example, there is often a continuous measure of the similarity of an imposed pattern to each of a set of paradigms, of which the most similar is selected. The importance of continuity is also indicated by the fact that adjectives and adverbs in everyday verbal communication have comparative and superlative forms.
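The pattern-recognition point above, a continuous similarity score behind a discrete decision, can be sketched minimally. The paradigm vectors and the distance-based similarity measure below are illustrative assumptions, not constructions from the book.

```python
def similarity(x, y):
    """Continuous similarity: negative Euclidean distance between feature vectors."""
    return -(sum((a - b) ** 2 for a, b in zip(x, y)) ** 0.5)

# Hypothetical paradigm patterns as 2-D feature vectors.
paradigms = {"circle": (1.0, 0.0), "square": (0.0, 1.0)}
observed = (0.8, 0.3)

# The discrete classification is just the argmax of a continuous score.
best = max(paradigms, key=lambda k: similarity(observed, paradigms[k]))
print(best)  # circle (distance about 0.36, versus about 1.06 to square)
```

The discrete answer ("circle") hides the continuous quantity that produced it, which is precisely the kind of mixing of discrete and continuous processing the author argues for.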

The Foundational Debate - Complexity and Constructivity in Mathematics and Physics (Paperback, Softcover reprint of hardcover 1st ed. 1995)
Werner DePauli-Schimanovich, Eckehart Koehler, F. Stadler
R4,045 Discovery Miles 40 450 Ships in 18 - 22 working days

Constructibility and complexity play central roles in recent research in computer science, mathematics and physics. For example, scientists are investigating the complexity of computer programs, constructive proofs in mathematics and the randomness of physical processes. But there are different approaches to the explication of these concepts. This volume presents important research on the state of this discussion, especially as it refers to quantum mechanics. This 'foundational debate' in computer science, mathematics and physics was already fully developed in 1930 in the Vienna Circle. A special section is devoted to its real founder Hans Hahn, referring to his contribution to the history and philosophy of science. The documentation section presents articles on the early Philipp Frank and on the Vienna Circle in exile. Reviews cover important recent literature on logical empiricism and related topics.

Algebras and Orders (Paperback, Softcover reprint of hardcover 1st ed. 1993)
Ivo G. Rosenberg, Gert Sabidussi
R12,690 Discovery Miles 126 900 Ships in 18 - 22 working days

In the summer of 1991 the Department of Mathematics and Statistics of the Universite de Montreal was fortunate to host the NATO Advanced Study Institute "Algebras and Orders" as its 30th Seminaire de mathematiques superieures (SMS), a summer school with a long tradition and well-established reputation. This book contains the contributions of the invited speakers. Universal algebra, which established itself only in the 1930s, grew from traditional algebra (e.g., groups, modules, rings and lattices) and logic (e.g., propositional calculus, model theory and the theory of relations). It started by extending results from these fields but by now it is a well-established and dynamic discipline in its own right. One of the objectives of the ASI was to cover a broad spectrum of topics in this field, and to put in evidence the natural links to, and interactions with, boolean algebra, lattice theory, topology, graphs, relations, automata, theoretical computer science and (partial) orders. The theory of orders is a relatively young and vigorous discipline sharing certain topics as well as many researchers and meetings with universal algebra and lattice theory. W. Taylor surveyed abstract clone theory, which formalizes the process of composing operations (i.e., the formation of term operations) of an algebra as a special category with countably many objects, leading naturally to the interpretation and equivalence of varieties.

Proof Theory - History and Philosophical Significance (Paperback, Softcover reprint of hardcover 1st ed. 2000)
Vincent F Hendricks, Stig Andur Pedersen, Klaus Frovin Jorgensen
R2,647 Discovery Miles 26 470 Ships in 18 - 22 working days

This volume in the Synthese Library Series is the result of a conference held at the University of Roskilde, Denmark, October 31st-November 1st, 1997. The aim was to provide a forum within which philosophers, mathematicians, logicians and historians of mathematics could exchange ideas pertaining to the historical and philosophical development of proof theory. Hence the conference was called Proof Theory: History and Philosophical Significance. To quote from the conference abstract: Proof theory was developed as part of Hilbert's Programme. According to Hilbert's Programme one could provide mathematics with a firm and secure foundation by formalizing all of mathematics and subsequently proving the consistency of these formal systems by finitistic means. Hence proof theory was developed as a formal tool through which this goal should be fulfilled. It is well known that Hilbert's Programme in its original form was unfeasible, mainly due to Gödel's incompleteness theorems. Additionally it proved impossible to formalize all of mathematics and impossible to even prove the consistency of relatively simple formalized fragments of mathematics by finitistic methods. In spite of these problems, Gentzen showed that by extending Hilbert's proof theory it would be possible to prove the consistency of interesting formal systems, perhaps not by finitistic methods but still by methods of minimal strength. This generalization of Hilbert's original programme has fueled modern proof theory, which is a rich part of mathematical logic with many significant implications for the philosophy of mathematics.

Applications of Point Set Theory in Real Analysis (Paperback, Softcover reprint of hardcover 1st ed. 1998)
A.B. Kharazishvili
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

This book is devoted to some results from the classical Point Set Theory and their applications to certain problems in mathematical analysis of the real line. Notice that various topics from this theory are presented in several books and surveys. From among the most important works devoted to Point Set Theory, let us first of all mention the excellent book by Oxtoby [83] in which a deep analogy between measure and category is discussed in detail. Further, an interesting general approach to problems concerning measure and category is developed in the well-known monograph by Morgan [79] where a fundamental concept of a category base is introduced and investigated. We also wish to mention that the monograph by Cichon, Węglorz and the author [19] has recently been published. In that book, certain classes of subsets of the real line are studied and various cardinal valued functions (characteristics) closely connected with those classes are investigated. Obviously, the σ-ideal of all Lebesgue measure zero subsets of the real line and the σ-ideal of all first category subsets of the same line are extensively studied in [19], and several relatively new results concerning this topic are presented. Finally, it is reasonable to notice here that some special sets of points, the so-called singular spaces, are considered in the classi

Inconsistent Mathematics (Paperback, Softcover reprint of hardcover 1st ed. 1995)
C. E. Mortensen
R1,408 Discovery Miles 14 080 Ships in 18 - 22 working days

without a properly developed inconsistent calculus based on infinitesimals, then inconsistent claims from the history of the calculus might well simply be symptoms of confusion. This is addressed in Chapter 5. It is further argued that mathematics has a certain primacy over logic, in that paraconsistent or relevant logics have to be based on inconsistent mathematics. If the latter turns out to be reasonably rich then paraconsistentism is vindicated; while if inconsistent mathematics has serious restrictions then the case for being interested in inconsistency-tolerant logics is weakened. (On such restrictions, see this chapter, section 3.) It must be conceded that fault-tolerant computer programming (e.g. Chapter 8) finds a substantial and important use for paraconsistent logics, albeit with an epistemological motivation (see this chapter, section 3). But even here it should be noted that if inconsistent mathematics turned out to be functionally impoverished then so would inconsistent databases. 2. Summary In Chapter 2, Meyer's results on relevant arithmetic are set out, and his view that they have a bearing on Gödel's incompleteness theorems is discussed. Model theory for nonclassical logics is also set out so as to be able to show that the inconsistency of inconsistent theories can be controlled or limited, but in this book model theory is kept in the background as much as possible. This is then used to study the functional properties of various equational number theories.

Knowledge Discovery and Data Mining - The Info-Fuzzy Network (IFN) Methodology (Paperback, Softcover reprint of hardcover 1st ed. 2001)
O. Maimon, M. Last
R2,624 Discovery Miles 26 240 Ships in 18 - 22 working days

This book presents a specific and unified approach to Knowledge Discovery and Data Mining, termed IFN for Information Fuzzy Network methodology. Data Mining (DM) is the science of modelling and generalizing common patterns from large sets of multi-type data. DM is a part of KDD, which is the overall process for Knowledge Discovery in Databases. The accessibility and abundance of information today makes this a topic of particular importance and need. The book has three main parts complemented by appendices as well as software and project data that are accessible from the book's web site (http: //www.eng.tau.ac.iV-maimonlifn-kdg ). Part I (Chapters 1-4) starts with the topic of KDD and DM in general and makes reference to other works in the field, especially those related to the information theoretic approach. The remainder of the book presents our work, starting with the IFN theory and algorithms. Part II (Chapters 5-6) discusses the methodology of application and includes case studies. Then in Part III (Chapters 7-9) a comparative study is presented, concluding with some advanced methods and open problems. The IFN, being a generic methodology, applies to a variety of fields, such as manufacturing, finance, health care, medicine, insurance, and human resources. The appendices expand on the relevant theoretical background and present descriptions of sample projects (including detailed results).

Symmetries and Recursion Operators for Classical and Supersymmetric Differential Equations (Paperback, Softcover reprint of hardcover 1st ed. 2000)
I.S. Krasil'shchik, P. H. Kersten
R4,041 Discovery Miles 40 410 Ships in 18 - 22 working days

To our wives, Masha and Marian. Interest in the so-called completely integrable systems with an infinite number of degrees of freedom was aroused immediately after publication of the famous series of papers by Gardner, Greene, Kruskal, Miura, and Zabusky [75, 77, 96, 18, 66, 19] (see also [76]) on striking properties of the Korteweg-de Vries (KdV) equation. It soon became clear that systems of such a kind possess a number of characteristic properties, such as infinite series of symmetries and/or conservation laws, inverse scattering problem formulation, L-A pair representation, existence of prolongation structures, etc. And though no satisfactory definition of complete integrability was yet invented, a need of testing a particular system for these properties appeared. Probably one of the most efficient tests of this kind was first proposed by Lenard [19], who constructed a recursion operator for symmetries of the KdV equation. It was a strange operator, in a sense: being formally integro-differential, its action on the first classical symmetry (x-translation) was well-defined and produced the entire series of higher KdV equations; but applied to the scaling symmetry, it gave expressions containing terms of the type ∫ u dx which had no adequate interpretation in the framework of the existing theories. It is not surprising that P. Olver wrote "The deduction of the form of the recursion operator (if it exists) requires a certain amount of inspired guesswork..." [80, p.

Automated Deduction - A Basis for Applications Volume I Foundations - Calculi and Methods Volume II Systems and Implementation... Automated Deduction - A Basis for Applications Volume I Foundations - Calculi and Methods Volume II Systems and Implementation Techniques Volume III Applications (Paperback, Softcover reprint of hardcover 1st ed. 1998)
Wolfgang Bibel, P.H. Schmitt
R5,186 Discovery Miles 51 860 Ships in 18 - 22 working days

1. BASIC CONCEPTS OF INTERACTIVE THEOREM PROVING. Interactive Theorem Proving ultimately aims at the construction of powerful reasoning tools that let us (computer scientists) prove things we cannot prove without the tools, and the tools cannot prove without us. Interaction typically is needed, for example, to direct and control the reasoning, to speculate or generalize strategic lemmas, and sometimes simply because the conjecture to be proved does not hold. In software verification, for example, correct versions of specifications and programs typically are obtained only after a number of failed proof attempts and subsequent error corrections. Different interactive theorem provers may actually look quite different: they may support different logics (first- or higher-order, logics of programs, type theory, etc.), may be generic or special-purpose tools, or may be targeted to different applications. Nevertheless, they share common concepts and paradigms (e.g. architectural design, tactics, tactical reasoning, etc.). The aim of this chapter is to describe the common concepts, design principles, and basic requirements of interactive theorem provers, and to explore the bandwidth of variations. Having a 'person in the loop' strongly influences the design of the proof tool: proofs must remain comprehensible, proof rules must be high-level and human-oriented, and persistent proof presentation and visualization become very important.
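The tactics and tacticals mentioned above can be illustrated by a minimal LCF-style sketch: a tactic maps a goal to a list of subgoals (an empty list means the goal is solved), and tacticals combine tactics. All names here are illustrative toys, not the API of any particular prover:

```python
# Toy LCF-style tactics: a tactic maps a goal to a list of subgoals
# (empty list = solved) or raises ValueError if it does not apply.

def assumption(hyps):
    """Tactic: close a goal that literally appears among the hypotheses."""
    def tac(goal):
        if goal in hyps:
            return []          # solved, no subgoals remain
        raise ValueError("assumption does not apply")
    return tac

def split_conj(conjunctions):
    """Tactic: reduce a registered conjunction to its conjuncts."""
    def tac(goal):
        if goal in conjunctions:
            return list(conjunctions[goal])
        raise ValueError("not a registered conjunction")
    return tac

def then_(t1, t2):
    """Tactical: apply t1, then apply t2 to every resulting subgoal."""
    def tac(goal):
        return [g for sub in t1(goal) for g in t2(sub)]
    return tac

def orelse_(t1, t2):
    """Tactical: try t1; on failure fall back to t2."""
    def tac(goal):
        try:
            return t1(goal)
        except ValueError:
            return t2(goal)
    return tac

# Goal "A & B" splits into "A" and "B", both closed by assumption.
tac = then_(split_conj({"A & B": ("A", "B")}), assumption({"A", "B"}))
print(tac("A & B"))   # prints []
```

Real provers build elaborate proof scripts from exactly this kind of composition, which is why tactic failure (the raised exception above) is a first-class notion.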

Theoretical Numerical Analysis - A Functional Analysis Framework (Paperback, Softcover reprint of hardcover 3rd ed. 2009):... Theoretical Numerical Analysis - A Functional Analysis Framework (Paperback, Softcover reprint of hardcover 3rd ed. 2009)
Kendall Atkinson, Weimin Han
R1,841 Discovery Miles 18 410 Ships in 18 - 22 working days

This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solution, numerical methods for solving integral equations of the second kind, and boundary integral equations for planar regions. The presentation of each topic is meant to be an introduction with a certain degree of depth. Comprehensive references on each particular topic are listed at the end of each chapter for further reading and study.

Because of their relevance in solving real-world problems, multivariable polynomials are playing an ever more important role in research and applications. In this third edition, a new chapter on this topic has been included, and major changes have been made to two chapters from the previous edition. In addition, there are numerous minor changes throughout the text, and new exercises have been added.

Review of earlier edition:

"...the book is clearly written, quite pleasant to read, and contains a lot of important material; and the authors have done an excellent job at balancing theoretical developments, interesting examples and exercises, numerical experiments, and bibliographical references."

R. Glowinski, SIAM Review, 2003

Lattice-Valued Logic - An Alternative Approach to Treat Fuzziness and Incomparability (Paperback, Softcover reprint of... Lattice-Valued Logic - An Alternative Approach to Treat Fuzziness and Incomparability (Paperback, Softcover reprint of hardcover 1st ed. 2003)
Yang Xu, Da Ruan, Keyun Qin, Jun Liu
R4,043 Discovery Miles 40 430 Ships in 18 - 22 working days

Lattice-valued logic aims at establishing the logical foundation for the uncertain information processing routinely performed by humans and artificial intelligence systems. This textbook gives, for the first time, a general introduction to lattice-valued logic. It systematically summarizes research from the basic notions up to recent results on lattice implication algebras, lattice-valued logic systems based on lattice implication algebras, and the corresponding reasoning theories and methods. The book provides the theoretical logical background of lattice-valued logic systems and supports newly designed intelligent uncertain-information-processing systems and a wide spectrum of intelligent learning tasks.

Algebraic Model Theory (Paperback, Softcover reprint of hardcover 1st ed. 1997): Bradd T. Hart, A. Lachlan, Matthew A. Valeriote Algebraic Model Theory (Paperback, Softcover reprint of hardcover 1st ed. 1997)
Bradd T. Hart, A. Lachlan, Matthew A. Valeriote
R4,012 Discovery Miles 40 120 Ships in 18 - 22 working days

Recent major advances in model theory include connections between model theory and Diophantine and real analytic geometry, permutation groups, and finite algebras. The present book contains lectures on recent results in algebraic model theory, covering topics from the following areas: geometric model theory, the model theory of analytic structures, permutation groups in model theory, the spectra of countable theories, and the structure of finite algebras. Audience: graduate students in logic and others wishing to keep abreast of current trends in model theory. The lectures contain sufficient introductory material for readers to grasp the recent results presented.

Handbook of Defeasible Reasoning and Uncertainty Management Systems - Algorithms for Uncertainty and Defeasible Reasoning... Handbook of Defeasible Reasoning and Uncertainty Management Systems - Algorithms for Uncertainty and Defeasible Reasoning (Paperback, Softcover reprint of hardcover 1st ed. 2001)
Dov M. Gabbay, Philippe Smets
R5,206 Discovery Miles 52 060 Ships in 18 - 22 working days

Reasoning under uncertainty is always based on a specified language or formalism, including its particular syntax and semantics, but also on its associated inference mechanism. The present volume of the handbook treats the last aspect: the algorithmic aspects of uncertainty calculi. Theory has sufficiently advanced to unfold some generally applicable fundamental structures and methods. On the other hand, particular features of specific formalisms and approaches to uncertainty of course still strongly influence the computational methods to be used. Both general and specific methods are included in this volume. Broadly speaking, symbolic or logical approaches to uncertainty and numerical approaches are often distinguished. Although this distinction is somewhat misleading, it is used as a means to structure the present volume. This is even to some degree reflected in the first two chapters, which treat fundamental, general methods of computation in systems designed to represent uncertainty. It was noted early by Shenoy and Shafer that computations in different domains have an underlying common structure. Essentially, pieces of knowledge or information are combined together and then focused on some particular question or domain. This can be captured in an algebraic structure called a valuation algebra, which is described in the first chapter. Here the basic operations of combination and focusing (marginalization) of knowledge and information are modeled abstractly, subject to simple axioms.
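The combination and focusing operations described above can be sketched concretely for the special case of discrete probability potentials, where combination is pointwise multiplication and focusing sums out unwanted variables. The class and method names below are illustrative, not taken from any library:

```python
# Minimal sketch of a valuation algebra over discrete (binary-valued)
# probability potentials: combination is pointwise product on the joined
# domain, focusing (marginalization) sums out the unwanted variables.
from itertools import product

class Potential:
    def __init__(self, variables, table):
        # variables: tuple of names; table: dict mapping value tuples -> float
        self.variables = tuple(variables)
        self.table = dict(table)

    def combine(self, other):
        """Combination: pointwise product on the joined domain."""
        joint_vars = self.variables + tuple(
            v for v in other.variables if v not in self.variables)
        table = {}
        # enumerate all 0/1 assignments over the joined domain
        for vals in product((0, 1), repeat=len(joint_vars)):
            env = dict(zip(joint_vars, vals))
            a = self.table[tuple(env[v] for v in self.variables)]
            b = other.table[tuple(env[v] for v in other.variables)]
            table[vals] = a * b
        return Potential(joint_vars, table)

    def marginalize(self, keep):
        """Focusing: sum out every variable not listed in `keep`."""
        kept = tuple(v for v in self.variables if v in keep)
        idx = [self.variables.index(v) for v in kept]
        table = {}
        for vals, p in self.table.items():
            key = tuple(vals[i] for i in idx)
            table[key] = table.get(key, 0.0) + p
        return Potential(kept, table)

# Two binary potentials sharing the variable "b":
f = Potential(("a", "b"), {(0, 0): .1, (0, 1): .4, (1, 0): .3, (1, 1): .2})
g = Potential(("b", "c"), {(0, 0): .5, (0, 1): .5, (1, 0): .9, (1, 1): .1})
# Combine, then focus on "c":
h = f.combine(g).marginalize(("c",))
print(h.table)   # prints {(0,): 0.74, (1,): 0.26}
```

The Shenoy-Shafer insight is that local-computation schemes need only these two operations and their axioms (commutativity and associativity of combination, and consistency of marginalization), so the same propagation algorithm serves probability, belief functions, and other calculi.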
