This text presents methods of modern set theory as tools that can be usefully applied to other areas of mathematics. The author describes numerous applications in abstract geometry and real analysis and, in some cases, in topology and algebra. The book begins with a tour of the basics of set theory, culminating in a proof of Zorn's Lemma and a discussion of some of its applications. The author then develops the notions of transfinite induction and descriptive set theory, with applications to the theory of real functions. The final part of the book presents the tools of "modern" set theory: Martin's Axiom, the Diamond Principle, and elements of forcing. Written primarily as a text for beginning graduate or advanced level undergraduate students, this book should also interest researchers wanting to learn more about set theoretical techniques applicable to their fields.
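For orientation, the statement that the opening tour builds toward can be given in a standard form (the book's own wording may differ in detail):

```latex
\textbf{Zorn's Lemma.} Let $(P,\le)$ be a nonempty partially ordered set in
which every chain $C \subseteq P$ (i.e., every totally ordered subset) has an
upper bound in $P$. Then $P$ has a maximal element.
```

A classical application of the kind such a course discusses: every vector space has a basis (take $P$ to be the set of linearly independent subsets, ordered by inclusion).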
This book constitutes the refereed proceedings of the 24th International Conference on Applications and Theory of Petri Nets, ICATPN 2003, held in Eindhoven, The Netherlands, in June 2003. The 25 revised full papers presented together with 6 invited contributions were carefully reviewed and selected from 77 submissions. All current issues on research and development in the area of Petri nets are addressed, in particular concurrent systems design and analysis, model checking, networking, business process modeling, formal methods in software engineering, agent systems, systems specification, systems validation, discrete event systems, protocols, and prototyping.
The study of stable groups connects model theory, algebraic geometry and group theory. It analyzes groups that possess a certain general dependence relation and tries to derive structural properties from this. These may be group-theoretic (nilpotency or solubility of a given group), algebro-geometric (identification of a group as an algebraic group), or model-theoretic (description of the definable sets). In this book, the general theory of stable groups is developed from the beginning (including a chapter on preliminaries in group theory and model theory), concentrating on the model- and group-theoretic aspects. It brings together the various extensions of the original finite rank theory under a unified perspective and provides a coherent exposition of the current knowledge in the field.
Type theory is one of the most important tools in the design of higher-level programming languages, such as ML. This book introduces and teaches its techniques by focusing on one particularly neat system and studying it in detail. By concentrating on the principles that make the theory work in practice, the author covers all the key ideas without getting involved in the complications of more advanced systems. This book takes a type-assignment approach to type theory, and the system considered is the simplest polymorphic one. The author covers all the basic ideas, including the system's relation to propositional logic, and gives a careful treatment of the type-checking algorithm that lies at the heart of every such system. Also featured are two other interesting algorithms that until now have been buried in inaccessible technical literature. The mathematical presentation is rigorous but clear, making it the first book at this level that can be used as an introduction to type theory for computer scientists.
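To illustrate the kind of algorithm the blurb refers to, here is a minimal sketch of unification-based type inference for a tiny lambda calculus. This is an illustration only, not the book's own system or presentation; all names are hypothetical, and the occurs check is omitted for brevity.

```python
# Types are either a type-variable name (str) or ('->', arg, result).
import itertools

fresh = (f"t{i}" for i in itertools.count())  # supply of fresh type variables

def apply(s, t):
    """Apply substitution s (a dict) to type t."""
    if isinstance(t, str):
        return apply(s, s[t]) if t in s else t
    return ('->', apply(s, t[1]), apply(s, t[2]))

def unify(a, b, s):
    """Extend substitution s so that types a and b become equal (no occurs check)."""
    a, b = apply(s, a), apply(s, b)
    if a == b:
        return s
    if isinstance(a, str):
        return {**s, a: b}
    if isinstance(b, str):
        return {**s, b: a}
    return unify(a[2], b[2], unify(a[1], b[1], s))

def infer(env, term, s):
    """Infer a type for term: a variable name, ('lam', x, body) or ('app', f, x)."""
    if isinstance(term, str):                      # variable: look up
        return apply(s, env[term]), s
    if term[0] == 'lam':                           # abstraction
        tv = next(fresh)
        body_t, s = infer({**env, term[1]: tv}, term[2], s)
        return ('->', apply(s, tv), body_t), s
    f_t, s = infer(env, term[1], s)                # application
    x_t, s = infer(env, term[2], s)
    tv = next(fresh)
    s = unify(f_t, ('->', x_t, tv), s)
    return apply(s, tv), s

# The identity function \x.x is assigned the shape a -> a:
identity_type, _ = infer({}, ('lam', 'x', 'x'), {})
print(identity_type)  # ('->', 't0', 't0')
```

The sketch shows why type checking reduces to solving equations between types: each application generates a constraint, and unification either solves it or reports a type error.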
This book constitutes the refereed proceedings of the 5th Italian Conference on Algorithms and Computation, CIAC 2003, held in Rome, Italy in May 2003. The 23 revised full papers presented were carefully reviewed and selected from 57 submissions. Among the topics addressed are complexity, complexity theory, geometric computing, matching, online algorithms, combinatorial optimization, computational graph theory, approximation algorithms, network algorithms, routing, and scheduling.
The completion of the first draft of the human genome has led to an explosion of interest in genetics and molecular biology. The view of the genome as a network of interacting computational components is well-established, but researchers are now trying to reverse the analogy, by using living organisms to construct logic circuits. The potential applications for such technologies are huge, ranging from bio-sensors through industrial applications to drug delivery and diagnostics. This book would be the first to deal with the implementation of this technology, describing several working experimental demonstrations using cells as components of logic circuits, building toward computers incorporating biological components in their functioning.
This is a softcover reprint of the 1968 English translation of N. Bourbaki's Théorie des Ensembles (1970).
This volume surveys recent interactions between model theory and other branches of mathematics, notably group theory. Beginning with an introductory chapter describing relevant background material, the book contains contributions from many leading international figures in this area. Topics described include automorphism groups of algebraically closed fields, the model theory of pseudo-finite fields and applications to the subgroup structure of finite Chevalley groups. Model theory of modules, and aspects of model theory of various classes of groups, including free groups, are also discussed. The book also contains the first comprehensive survey of finite covers. Many new proofs and simplifications of recent results are presented and the articles contain many open problems. This book will be a suitable guide for graduate students and a useful reference for researchers working in model theory and algebra.
This book gives an introduction to theories of computability from a mathematically sophisticated point of view. It treats not only 'the' theory of computability (created by Alan Turing and others in the 1930s), but also a variety of other theories (of Boolean functions, automata and formal languages). These are addressed from the classical perspective of their generation by grammars and from the modern perspective as rational cones. The treatment of the classical theory of computable functions and relations takes the form of a tour through basic recursive function theory, starting with an axiomatic foundation and developing the essential methods in order to survey the most memorable results of the field. This authoritative account by one of the leading lights of the subject will prove exceptionally useful reading for graduate students, and researchers in theoretical computer science and mathematics.
Biomolecular computing is an interdisciplinary field that draws together molecular biology, DNA nanotechnology, chemistry, physics, computer science and mathematics. The annual international meeting on DNA-based computation has been an exciting forum where scientists of different backgrounds who share a common interest in biomolecular computing can meet and discuss their latest results. The central goal of this conference is to bring together experimentalists and theoreticians whose insights can calibrate each other's approaches. The 9th Annual International Meeting on DNA Based Computers was held during June 1-4, 2003 at the University of Wisconsin, Madison, USA. The meeting had 106 registered participants from 12 countries around the world. On the first day of the meeting, we had three tutorials: the first was on self-assembly of DNA nanostructures, which focused on the basic techniques of using designed DNA nanomolecules to self-assemble into larger structures for computational purposes. This tutorial was given by Hao Yan of Duke University. The second tutorial was given by Chengde Mao of Purdue University, in which Dr. Mao presented basic DNA biochemistry designed for non-experimentalists. The third tutorial was given by Max Garzon of the University of Memphis. Dr. Garzon gave a lecture on computational complexity tailored for non-computer scientists. The next three days were for invited plenary lectures and regular oral and poster presentations. Invited plenary lectures were given by Helen Berman of Rutgers University (USA), Giancarlo Mauri of the University of Milan (Italy), Guenter von Kiedrowski of Ruhr University (Germany), and Sorin Istrail of Celera/Applied Biosystems. The organizers sought to attract the most significant recent research with the highest impact on the development of the discipline.
The automatic verification of large parts of mathematics has been an aim of many mathematicians from Leibniz to Hilbert. While Gödel's first incompleteness theorem showed that no computer program could automatically prove certain true theorems in mathematics, the advent of electronic computers and sophisticated software means in practice there are many quite effective systems for automated reasoning that can be used for checking mathematical proofs. This book describes the use of a computer program to check the proofs of several celebrated theorems in metamathematics including those of Gödel and Church-Rosser. The computer verification using the Boyer-Moore theorem prover yields precise and rigorous proofs of these difficult theorems. It also demonstrates the range and power of automated proof checking technology. The mechanization of metamathematics itself has important implications for automated reasoning, because metatheorems can be applied as labor-saving devices to simplify proof construction.
This book devotes significant attention to the doctrine of naturalism and its relevance to disputes within the philosophy of mathematics.
This book constitutes the refereed proceedings of the 6th International Conference on Typed Lambda Calculi and Applications, TLCA 2003, held in Valencia, Spain in June 2003. The 21 revised full papers presented were carefully reviewed and selected from 40 submissions. The volume reports research results on all current aspects of typed lambda calculi, ranging from theoretical and methodological issues to the application of proof assistants.
The approach to probability theory followed in this book (which differs radically from the usual one, based on a measure-theoretic framework) characterizes probability as a linear operator rather than as a measure, and is based on the concept of coherence, which can be framed in the most general view of conditional probability. It is a flexible and unifying tool suited for handling, e.g., partial probability assessments (not requiring that the set of all possible outcomes be endowed with a previously given algebraic structure, such as a Boolean algebra) and conditional independence, in a way that avoids all the inconsistencies related to logical dependence (so that a theory referring to graphical models more general than those usually considered in Bayesian networks can be derived). Moreover, it is possible to encompass other approaches to uncertain reasoning, such as fuzziness, possibility functions, and default reasoning.
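As a pointer to what "probability as a linear operator" means in this coherence-based setting, here is a standard de Finetti-style formulation (not quoted from the book): the prevision $P$ of bounded random quantities satisfies

```latex
% Prevision P on bounded random quantities X, Y; a, b real; H an event.
\begin{align*}
  P(aX + bY) &= a\,P(X) + b\,P(Y) && \text{(linearity)}\\
  \inf X \le P(X) &\le \sup X     && \text{(internality)}\\
  P(XH) &= P(X \mid H)\,P(H)      && \text{(conditional prevision)}
\end{align*}
```

An assessment is coherent when no finite combination of bets at the assessed values yields a uniform sure loss; a partial assessment is admissible precisely when it extends to such an operator.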
The subject of this book is reasoning under uncertainty based on statistical evidence, where the word reasoning is taken to mean searching for arguments in favor of or against particular hypotheses of interest. The kind of reasoning we are using is composed of two aspects. The first one is inspired by classical reasoning in formal logic, where deductions are made from a knowledge base of observed facts and formulas representing the domain-specific knowledge. In this book, the facts are the statistical observations and the general knowledge is represented by an instance of a special kind of statistical models called functional models. The second aspect deals with the uncertainty under which the formal reasoning takes place. For this aspect, the theory of hints [27] is the appropriate tool. Basically, we assume that some uncertain perturbation takes a specific value and then logically evaluate the consequences of this assumption. The original uncertainty about the perturbation is then transferred to the consequences of the assumption. This kind of reasoning is called assumption-based reasoning. Before going into more details about the content of this book, it might be interesting to look briefly at the roots and origins of assumption-based reasoning in the statistical context. In 1930, R. A. Fisher [17] defined the notion of fiducial distribution as the result of a new form of argument, as opposed to the result of the older Bayesian argument.
The theory of sets of multiples, a subject which lies at the intersection of analytic and probabilistic number theory, has seen much development since the publication of 'Sequences' by Halberstam and Roth nearly thirty years ago. The area is rich in problems, many of them still unsolved or arising from current work. The author sets out to give a coherent, essentially self-contained account of the existing theory and at the same time to bring the reader to the frontiers of research. One of the fascinations of the theory is the variety of methods applicable to it, which include Fourier analysis, group theory, high and ultra-low moments, probability and elementary inequalities, as well as several branches of number theory. This Tract is the first devoted to the subject, and will be of value to number theorists, whether they be research workers or graduate students.
This volume is based on the papers that were presented at the International Conference "Model-Based Reasoning: Scientific Discovery, Technological Innovation, Values" (MBR'01), held at the Collegio Ghislieri, University of Pavia, Pavia, Italy, in May 2001. The previous volume, Model-Based Reasoning in Scientific Discovery, edited by L. Magnani, N.J. Nersessian, and P. Thagard, was based on the papers presented at the first "model-based reasoning" international conference, held at the same venue in December 1998. The presentations given at the conference explore how scientific thinking uses models and exploratory reasoning to produce creative changes in theories and concepts. Some address the problem of model-based reasoning in ethics, especially pertaining to science and technology, and stress some aspects of model-based reasoning in technological innovation. The study of diagnostic, visual, spatial, analogical, and temporal reasoning has demonstrated that there are many ways of performing intelligent and creative reasoning that cannot be described with the help only of traditional notions of reasoning such as classical logic. Understanding the contribution of modeling practices to discovery and conceptual change in science requires expanding scientific reasoning to include complex forms of creative reasoning that are not always successful and can lead to incorrect solutions. The study of these heuristic ways of reasoning is situated at the crossroads of philosophy, artificial intelligence, cognitive psychology, and logic; that is, at the heart of cognitive science. There are several key ingredients common to the various forms of model-based reasoning. The term "model" comprises both internal and external representations. The models are intended as interpretations of target physical systems, processes, phenomena, or situations. The models are retrieved or constructed on the basis of potentially satisfying salient constraints of the target domain.
Moreover, in the modeling process, various forms of abstraction are used. Evaluation and adaptation take place in light of structural, causal, and/or functional constraints. Model simulation can be used to produce new states and enable evaluation of behaviors and other factors. The various contributions of the book are written by interdisciplinary researchers who are active in the area of creative reasoning in science and technology, and are logically and computationally oriented: the most recent results and achievements about the topics above are illustrated in detail in the papers.
This volume contains the proceedings of ICFCA 2004, the 2nd International Conference on Formal Concept Analysis. The ICFCA conference series aims to be the premier forum for the publication of advances in applied lattice and order theory and in particular scientific advances related to formal concept analysis. Formal concept analysis emerged in the 1980s from efforts to restructure lattice theory to promote better communication between lattice theorists and potential users of lattice theory. Since then, the field has developed into a growing research area in its own right with a thriving theoretical community and an increasing number of applications in data and knowledge processing, including data visualization, information retrieval, machine learning, data analysis and knowledge management. In terms of theory, formal concept analysis has been extended into attribute exploration, Boolean judgment, contextual logic and so on to create a powerful general framework for knowledge representation and reasoning. This conference aims to unify theoretical and applied practitioners who use formal concept analysis, drawing on the fields of mathematics, computer and library sciences and software engineering. The theme of the 2004 conference was "Concept Lattices", to acknowledge the colloquial term used for the line diagrams that appear in almost every paper in this volume. ICFCA 2004 included tutorial sessions demonstrating the practical benefits of formal concept analysis, and highlighted developments in the foundational theory and standards. The conference showcased the increasing variety of formal concept analysis software and included eight invited lectures from distinguished speakers in the field. Seven of the eight invited speakers submitted accompanying papers, and these were reviewed and appear in this volume.
Since its birth, Model Theory has been developing a number of methods and concepts that have their intrinsic relevance, but also provide fruitful and notable applications in various fields of Mathematics. It is a lively and fertile research area which deserves the attention of the mathematical world. This volume, A Guide to Classical and Modern Model Theory, is intended for trainees and professional model theorists, mathematicians working in Algebra and Geometry, and young people with a basic knowledge of logic.
Bob Hale and Crispin Wright draw together here the key writings in which they have worked out their distinctive approach to the fundamental questions: what is mathematics about, and how do we know it? The volume features much new material: introduction, postscript, bibliographies, and a new essay on a key problem. The Reason's Proper Study is the strongest presentation yet of the controversial neo-Fregean view that mathematical knowledge may be based a priori on logic and definitional abstraction principles. It will prove indispensable reading not just to philosophers of mathematics but to all who are interested in the fundamental metaphysical and epistemological issues which the programme raises.
This book constitutes the refereed proceedings of the 18th International Conference on Logic Programming, ICLP 2002, held in Copenhagen, Denmark, in July/August 2002. The 29 revised full papers presented together with two invited contributions and 13 posters were carefully reviewed and selected from 82 submissions. All current aspects of logic programming and computational logic are addressed.
This book constitutes the refereed proceedings of the 23rd International Conference on Application and Theory of Petri Nets, ICATPN 2002, held in Adelaide, Australia, in June 2002. The 18 regular papers and one tool presentation, presented together with six invited papers, were carefully reviewed and selected from 45 submissions. All current issues on research and development of Petri nets are addressed, in particular concurrent systems analysis, model validation, business process management, reactive systems, workflow processes, and wireless transaction protocols.
This book constitutes the thoroughly refereed post-proceedings of the 7th International Workshop on DNA-Based Computers, DNA7, held in Tampa, Florida, USA, in June 2001. The 26 revised full papers presented together with 9 poster papers were carefully reviewed and selected from 44 submissions. The papers are organized in topical sections on experimental tools, theoretical tools, probabilistic computational models, computer simulation and sequence design, algorithms, experimental solutions, nano-tech devices, biomimetic tools, new computing models, and splicing systems and membranes.
The fundamental ideas concerning computation and recursion naturally find their place at the interface between logic and theoretical computer science. The contributions in this book, by leaders in the field, provide a picture of current ideas and methods in the ongoing investigations into the pure mathematical foundations of computability theory. The topics range over computable functions, enumerable sets, degree structures, complexity, subrecursiveness, domains and inductive inference. A number of the articles contain introductory and background material which it is hoped will make this volume an invaluable resource.