The transition from school mathematics to university mathematics is seldom straightforward. Students are faced with a disconnect between the algorithmic and informal approach to mathematics at school and a new emphasis on proof, based on logic, and a more abstract development of general concepts, based on set theory. The authors have many years' experience of the potential difficulties involved, through teaching first-year undergraduates and researching the ways in which students and mathematicians think. The book explains the motivation behind abstract foundational material based on students' experiences of school mathematics, and explicitly suggests ways students can make sense of formal ideas. This second edition takes a significant step forward by not only making the transition from intuitive to formal methods, but also by reversing the process: using structure theorems to prove that formal systems have visual and symbolic interpretations that enhance mathematical thinking. This is exemplified by a new chapter on the theory of groups. While the first edition extended counting to infinite cardinal numbers, the second also extends the real numbers rigorously to larger ordered fields. This links intuitive ideas in calculus to the formal epsilon-delta methods of analysis. The approach here is not the conventional one of 'nonstandard analysis', but a simpler, graphically based treatment which makes the notion of an infinitesimal natural and straightforward. This allows a further vision of the wider world of mathematical thinking in which formal definitions and proof lead to amazing new ways of defining, proving, visualising and symbolising mathematics beyond previous expectations.
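For orientation, this is the kind of formal epsilon-delta statement the blurb contrasts with intuitive calculus (a standard definition, added here as an illustration rather than taken from the book):

```latex
% The formal definition of a limit, the style of statement the
% transition to university mathematics centres on:
\lim_{x \to a} f(x) = L
\iff
\forall \varepsilon > 0 \;\, \exists \delta > 0 \;\, \forall x \;
\bigl( 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon \bigr)
```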
Transactions are a concept related to the logical database as seen from the perspective of database application programmers: a transaction is a sequence of database actions that is to be executed as an atomic unit of work. The processing of transactions on databases is a well-established area with many of its foundations having already been laid in the late 1970s and early 1980s. The unique feature of this textbook is that it bridges the gap between the theory of transactions on the logical database and the implementation of the related actions on the underlying physical database. The authors relate the logical database, which is composed of a dynamically changing set of data items with unique keys, and the underlying physical database with a set of fixed-size data and index pages on disk. Their treatment of transaction processing builds on the "do-redo-undo" recovery paradigm, and all methods and algorithms presented are carefully designed to be compatible with this paradigm as well as with write-ahead logging, steal-and-no-force buffering, and fine-grained concurrency control. Chapters 1 to 6 address the basics needed to fully appreciate transaction processing on a centralized database system within the context of our transaction model, covering topics like ACID properties, database integrity, buffering, rollbacks, isolation, and the interplay of logical locks and physical latches. Chapters 7 and 8 present advanced features including deadlock-free algorithms for reading, inserting and deleting tuples, while the remaining chapters cover additional advanced topics extending the preceding foundational chapters, including multi-granular locking, bulk actions, versioning, distributed updates, and write-intensive transactions. This book is primarily intended as a text for advanced undergraduate or graduate courses on database management in general or transaction processing in particular.
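A minimal sketch of the atomic unit-of-work idea, using Python's standard sqlite3 module (an illustration added here; the book's own algorithms operate at the page-and-log level below this API):

```python
import sqlite3

# A transaction is a sequence of actions executed as an atomic unit:
# either every action takes effect (commit) or none does (rollback).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO account VALUES (1, 100), (2, 50)")
conn.commit()

try:
    # Transfer 30 from account 1 to account 2 as one unit of work.
    conn.execute("UPDATE account SET balance = balance - 30 WHERE id = 1")
    conn.execute("UPDATE account SET balance = balance + 30 WHERE id = 2")
    conn.commit()      # both updates become durable together
except sqlite3.Error:
    conn.rollback()    # on failure, neither update survives
```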
This book studies the universal constructions and properties in categories of commutative algebras, bringing out the specific properties that make commutative algebra and algebraic geometry work. Two universal constructions are presented and used here for the first time. The author shows that the concepts and constructions arising in commutative algebra and algebraic geometry are not bound so tightly to the absolute universe of rings, but possess a universality that is independent of them and can be interpreted in various categories of discourse. This brings new flexibility to classical commutative algebra and affords the possibility of extending the domain of validity and the application of the vast number of results obtained in classical commutative algebra. This innovative and original work will interest mathematicians in a range of specialities, including algebraists, category theorists, and algebraic geometers.
Clinical investigators initially defined mild cognitive impairment (MCI) as a transitional condition between normal aging and the early stages of Alzheimer's disease (AD). Because the prevalence of AD increases with age and very large numbers of older adults are affected worldwide, these clinicians saw a pressing need to identify AD as early as possible: it is at this very early stage in the disease course that treatments to slow the progress and control symptoms are likely to be most effective. Since the introduction of MCI, research interest has grown exponentially, and the utility of the concept has been investigated from a variety of perspectives in different populations of interest (e.g., clinical samples, volunteers, population-based screening) in many different countries. Much variability in findings has resulted. Although it has been acknowledged that the differences observed between samples may be 'legitimate variations', there has been no attempt to understand what we have learned about MCI (i.e., common features and differences) from each of these perspectives. This book brings together these differing perspectives on MCI for the first time, and will be an important resource for any clinician, researcher, or student involved in the study, detection, treatment, and rehabilitation of people with MCI.
The nature of truth in mathematics is a problem which has exercised the minds of thinkers from at least the time of the ancient Greeks. The great advances in mathematics and philosophy in the twentieth century--and in particular the proof of Gödel's theorem and the development of the notion of independence in mathematics--have led to new viewpoints on this question. This book is the result of the interaction of a number of outstanding mathematicians and philosophers--including Yurii Manin, Vaughan Jones, and Per Martin-Löf--and their discussions of this problem. It provides an overview of the forefront of current thinking, and is a valuable introduction and reference for researchers in the area.
The book is primarily intended as a textbook on modern algebra for undergraduate mathematics students. It is also useful for those who are interested in supplementary reading at a higher level. The text is designed in such a way that it encourages independent thinking and motivates students towards further study. The book covers all major topics in group, ring, vector space and module theory that are usually contained in a standard modern algebra text. In addition, it studies semigroups, group actions, Hopf's groups, topological groups and Lie groups with their actions; applications of ring theory to algebraic geometry, including a definition of the Zariski topology; and applications of module theory to the structure theory of rings and homological algebra. Algebraic aspects of classical number theory and algebraic number theory are also discussed with an eye to developing modern cryptography. Topics on applications to algebraic topology, category theory, algebraic geometry, algebraic number theory, cryptography and theoretical computer science interlink the subject with different areas. Each chapter discusses individual topics, starting from the basics, with the help of illustrative examples. This comprehensive text with a broad variety of concepts, applications, examples, exercises and historical notes represents a valuable and unique resource.
* The ELS model of enterprise security is endorsed by the Secretary of the Air Force for Air Force computing systems and is a candidate for DoD systems under the Joint Information Environment Program.
* The book is intended for enterprise IT architecture developers, application developers, and IT security professionals.
* This is a unique approach to end-to-end security and fills a niche in the market.
In this book the authors present an alternative set theory dealing with a more relaxed notion of infiniteness, called finitely supported mathematics (FSM). It has strong connections to the Fraenkel-Mostowski (FM) permutative model of Zermelo-Fraenkel (ZF) set theory with atoms and to the theory of (generalized) nominal sets. More exactly, FSM is ZF mathematics rephrased in terms of finitely supported structures, where the set of atoms is infinite (not necessarily countable as for nominal sets). In FSM, 'sets' are replaced either by 'invariant sets' (sets endowed with some group actions satisfying a finite support requirement) or by 'finitely supported sets' (finitely supported elements in the powerset of an invariant set). It is a theory of 'invariant algebraic structures' in which infinite algebraic structures are characterized by using their finite supports. After explaining the motivation for using invariant sets in the experimental sciences as well as the connections with the nominal approach, admissible sets and Gandy machines (Chapter 1), the authors present in Chapter 2 the basics of invariant sets and show that the principles of constructing FSM have historical roots both in Tarski's definition of 'logical notions' and in the Erlangen Program of Klein for the classification of various geometries according to invariants under suitable groups of transformations. Furthermore, the consistency of various choice principles is analyzed in FSM. Chapter 3 examines whether it is possible to obtain valid results by replacing the notion of infinite sets with the notion of invariant sets in the classical ZF results. The authors present techniques for reformulating ZF properties of algebraic structures in FSM. In Chapter 4 they generalize FM set theory by providing a new set of axioms inspired by the theory of amorphous sets, and so defining the extended Fraenkel-Mostowski (EFM) set theory. In Chapter 5 they define FSM semantics for certain process calculi (e.g., fusion calculus), and emphasize the links to the nominal techniques used in computer science. They demonstrate a complete equivalence between the new FSM semantics (defined by using binding operators instead of side conditions for presenting the transition rules) and the known semantics of these process calculi. The book is useful for researchers and graduate students in computer science and mathematics, particularly those engaged with logic and set theory.
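To make the finite support requirement concrete, here is the standard definition from the nominal-sets literature (added for orientation; the book's FSM development refines this):

```latex
% Let A be the set of atoms and let the group of finite permutations
% of A act on a set X. An element x \in X is finitely supported if
\exists S \subseteq A \text{ finite, such that }
\forall \pi \,\bigl( (\forall a \in S.\; \pi(a) = a)
  \implies \pi \cdot x = x \bigr).
% An invariant set is one all of whose elements are finitely supported.
```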
Greek, Indian and Arabic Logic marks the initial appearance of the multi-volume Handbook of the History of Logic. Additional volumes will be published when ready, rather than in strict chronological order. Soon to appear is The Rise of Modern Logic: From Leibniz to Frege. Also in preparation are Logic from Russell to Gödel, The Emergence of Classical Logic, Logic and the Modalities in the Twentieth Century, and The Many-Valued and Non-Monotonic Turn in Logic. Further volumes will follow, including Mediaeval and Renaissance Logic and Logic: A History of its Central Concepts.
Set theory is concerned with the foundation of mathematics. In the original formulations of set theory, there were paradoxes contained in the idea of the "set of all sets". Current standard theory (Zermelo-Fraenkel) avoids these paradoxes by restricting the way sets may be formed by other sets, specifically to disallow the possibility of forming the set of all sets. In the 1930s, Quine proposed a different form of set theory in which the set of all sets - the universal set - is allowed, but other restrictions are placed on the axioms of set formation. Since then, steady interest in these non-standard set theories has been boosted by their relevance to computer science. The second edition still concentrates largely on Quine's New Foundations, reflecting the author's belief that this provides the richest and most mysterious of the various systems dealing with set theories with a universal set. Also included is an expanded and completely revised account of the set theories of Church-Oswald and Mitchell, with descriptions of permutation models and extensions that preserve power sets. Dr Forster here presents the reader with a useful and readable introduction for those interested in this topic, and a reference work for those already involved in this area.
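To make the contrast concrete (a standard textbook example, added here): New Foundations replaces ZF's restricted set formation with stratified comprehension, which admits the universal set while still blocking Russell's paradox.

```latex
% NF's comprehension scheme: \{x : \varphi(x)\} exists whenever \varphi
% is stratified, i.e. variables can be assigned integer types with
% type(y) = type(x) + 1 in every subformula x \in y.
V = \{x : x = x\}
% stratified (both sides of = get the same type), so the universal set exists
R = \{x : x \notin x\}
% not stratified (x \in x would force type(x) = type(x) + 1), so Russell's
% paradoxical set is not formed
```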
This book questions the relevance of computation to the physical universe. Our theories deliver computational descriptions, but the gaps and discontinuities in our grasp suggest a need for continued discourse between researchers from different disciplines, and this book is unique in its focus on the mathematical theory of incomputability and its relevance for the real world. The core of the book consists of thirteen chapters in five parts on extended models of computation; the search for natural examples of incomputable objects; mind, matter, and computation; the nature of information, complexity, and randomness; and the mathematics of emergence and morphogenesis. This book will be of interest to researchers in the areas of theoretical computer science, mathematical logic, and philosophy.
Stephen Cole Kleene was one of the greatest logicians of the twentieth century, and this book is the influential textbook he wrote to teach the subject to the next generation. It was first published in 1952, some twenty years after the publication of Gödel's paper on the incompleteness of arithmetic, which marked, if not the beginning of modern logic, at least a turning point after which "nothing was ever the same". Kleene was an important figure in logic, and lived a long full life of scholarship and teaching. The 1930s was a time of creativity and ferment in the subject, when the notion of "computable" moved from the realm of philosophical speculation to the realm of science. This was accomplished by the work of Kurt Gödel, Alan Turing, and Alonzo Church, who gave three apparently different precise definitions of "computable". When they all turned out to be equivalent, there was a collective realization that this was indeed the "right notion". Kleene played a key role in this process. One could say that he was "there at the beginning" of modern logic. He showed the equivalence of lambda calculus with Turing machines and with Gödel's recursion equations, and developed the modern machinery of partial recursive functions. This textbook played an invaluable part in educating the logicians of the present.
In this monograph we introduce and examine four new temporal logic formalisms that can be used as specification languages for the automated verification of the reliability of hardware and software designs with respect to a desired behavior. The work is organized in two parts. In the first part, two logics for computations are discussed: graded computation tree logic and computation tree logic with minimal model quantifiers. These have proved useful in describing correct executions of monolithic closed systems. The second part focuses on logics for strategies, strategy logic and memoryful alternating-time temporal logic, which have been successfully applied to formalize several properties of interactive plays in multi-entity systems modeled as multi-agent games.
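For a flavour of what graded path quantifiers add, here is a small example in the standard graded-CTL notation (added as an illustration; the monograph develops the formalism in full):

```latex
% Plain CTL can say "some execution eventually reaches an error state":
\mathrm{EF}\,\mathit{error}
% Graded CTL attaches a count to the path quantifier: "more than k
% distinct executions (in the logic's sense of distinctness) eventually
% reach an error state":
\mathrm{E}^{>k}\mathrm{F}\,\mathit{error}
```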
This book provides a critical examination of how the choice of what to believe is represented in the standard model of belief change. In particular, the use of possible worlds and infinite remainders as objects of choice is critically examined. Descriptors are introduced as a versatile tool for expressing the success conditions of belief change, addressing both local and global descriptor revision. The book presents dynamic descriptors, such as Ramsey descriptors, that convey how an agent's beliefs tend to be changed in response to different inputs. It also explores sentential revision and demonstrates how local and global operations of revision by a sentence can be derived as special cases of descriptor revision. Lastly, the book examines revocation, a generalization of contraction in which a specified sentence is removed in a process that may also involve the addition of some new information to the belief set.
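As a pointer to how descriptors express success conditions (standard notation from the descriptor-revision literature, added here for orientation): the atomic descriptor 𝔅p reads "p is believed", and the classical operations become special cases of revision by a descriptor.

```latex
% Descriptor revision K \circ \Psi changes the belief set K so that the
% descriptor \Psi holds. The classical sentential operations fall out:
K \ast p = K \circ \mathfrak{B}p          % revision by p: come to believe p
K \div p = K \circ \neg\mathfrak{B}p      % contraction by p: cease to believe p
```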
This volume offers a wide range of both reconstructions of Nikolai Vasiliev's original logical ideas and their implementations in modern logic and philosophy. A collection of works put together through the international workshop "Nikolai Vasiliev's Logical Legacy and the Modern Logic," this book also covers foundations of logic in the light of Vasiliev's contradictory ontology. Chapters range from a look at the Heuristic and Conceptual Background of Vasiliev's Imaginary Logic to Generalized Vasiliev-style Propositions. It includes works which cover Imaginary and Non-Aristotelian Logics, Inconsistent Set Theory and the Expansion of Mathematical Thinking, Plurivalent Logic, and the Impact of Vasiliev's Imaginary Logic on Epistemic Logic. The Russian logician Vasiliev was widely recognized as one of the forerunners of modern non-classical logic. His "imaginary logic", developed in his work at the beginning of the 20th century, is often considered to be one of the first systems of paraconsistent and multi-valued logic. The novelty of his logical project has opened up prospects for modern logic as well as for non-classical science in general. This volume contains a selection of papers written by modern specialists in the field and deals with various aspects of Vasiliev's logical ideas. The logical legacy of Nikolai Vasiliev can serve as a promising source for developing an impressive range of philosophical interpretations, as it marries promising technical innovations with challenging philosophical insights.
The book has two parts. In the first, after a review of some seminal classical accounts of laws and explanations, a new account is proposed for distinguishing between laws and accidental generalizations (LAG). Among the new consequences of this proposal, it is proved that any explanation of a contingent generalization shows that the generalization is not accidental. The second part involves physical theories, their modality, and their explanatory power. In particular, it is shown that (1) each theory has a theoretical implication structure associated with it, such that there are new physical modal operators on these structures and also special modal entities that are in these structures; a special subset of the physical modals, the nomic modals, are associated with the laws of theories. (2) The familiar idea that theories always explain laws by deduction of them has to be seriously modified in light of the fact that there is a host of physical theories (including, for example, Newtonian classical mechanics, Hamiltonian and Lagrangian mechanics, and probability theory) that we believe are schematic (they do not have any truth value). Nevertheless, we think that there is a kind of non-deductive explanation and generality that they achieve by subsumption under a schema.
A comprehensive one-year graduate (or advanced undergraduate) course in mathematical logic and foundations of mathematics. No previous knowledge of logic is required; the book is suitable for self-study. Many exercises (with hints) are included.
This monograph provides a self-contained and easy-to-read introduction to non-commutative multiple-valued logic algebras, a subject which has attracted much interest in the past few years because of its impact on information science, artificial intelligence and other subjects.
The Handbook of the History of Logic is a multi-volume research instrument that brings to the development of logic the best in modern techniques of historical and interpretative scholarship. It is the first work in English in which the history of logic is presented so extensively. The volumes are numerous and large. Authors have been given considerable latitude to produce chapters of a length, and a level of detail, that would lay fair claim on the ambitions of the project to be a definitive research work. Authors have been carefully selected with this aim in mind. They and the Editors join in the conviction that a knowledge of the history of logic is nothing but beneficial to the subject's present-day research programmes. One of the attractions of the Handbook's several volumes is the emphasis they give to the enduring relevance of developments in logic throughout the ages, including some of the earliest manifestations of the subject.
In a fragment entitled Elementa Nova Matheseos Universalis (1683?) Leibniz writes "the mathesis [...] shall deliver the method through which things that are conceivable can be exactly determined"; in another fragment he takes the mathesis to be "the science of all things that are conceivable." Leibniz considers all mathematical disciplines as branches of the mathesis and conceives the mathesis as a general science of forms applicable not only to magnitudes but to every object that exists in our imagination, i.e. that is possible at least in principle. As a general science of forms, the mathesis investigates possible relations between "arbitrary objects" ("objets quelconques"). It is an abstract theory of combinations and relations among objects whatsoever. In 1810 the mathematician and philosopher Bernard Bolzano published a booklet entitled Contributions to a Better-Grounded Presentation of Mathematics. There is, according to him, a certain objective connection among the truths that are germane to a certain homogeneous field of objects: some truths are the "reasons" ("Gründe") of others, and the latter are "consequences" ("Folgen") of the former. The reason-consequence relation seems to be the counterpart of causality at the level of a relation between true propositions. A rigorous proof is characterized in this context as a proof that shows the reason of the proposition that is to be proven. Requirements imposed on rigorous proofs seem to anticipate normalization results in current proof theory. The contributors to Mathesis Universalis, Computability and Proof, leading experts in the fields of computer science, mathematics, logic and philosophy, show the evolution of these and related ideas, exploring topics in proof theory, computability theory, intuitionistic logic, constructivism and reverse mathematics, and delving deeply into a contextual examination of the relationship between mathematical rigor and demands for simplification.
You may like...
* Elementary Lessons in Logic - Deductive… by William Stanley Jevons (Paperback, R569, Discovery Miles 5 690)
* Fuzzy Systems - Concepts, Methodologies… by Information Reso Management Association (Hardcover, R9,420, Discovery Miles 94 200)
* Mathematics For Computation (M4c) by Marco Benini, Olaf Beyersdorff, … (Hardcover, R3,569, Discovery Miles 35 690)
* The High School Arithmetic - for Use in… by W. H. Ballard, A. C. McKay, … (Hardcover, R981, Discovery Miles 9 810)
* Logic from Russell to Church, Volume 5 by Dov M. Gabbay, John Woods (Hardcover, R5,271, Discovery Miles 52 710)