This volume surveys recent interactions between model theory and other branches of mathematics, notably group theory. Beginning with an introductory chapter describing relevant background material, the book contains contributions from many leading international figures in this area. Topics described include automorphism groups of algebraically closed fields, the model theory of pseudo-finite fields and applications to the subgroup structure of finite Chevalley groups. Model theory of modules, and aspects of model theory of various classes of groups, including free groups, are also discussed. The book also contains the first comprehensive survey of finite covers. Many new proofs and simplifications of recent results are presented and the articles contain many open problems. This book will be a suitable guide for graduate students and a useful reference for researchers working in model theory and algebra.
Biomolecular computing is an interdisciplinary field that draws together molecular biology, DNA nanotechnology, chemistry, physics, computer science and mathematics. The annual international meeting on DNA-based computation has been an exciting forum where scientists of different backgrounds who share a common interest in biomolecular computing can meet and discuss their latest results. The central goal of this conference is to bring together experimentalists and theoreticians whose insights can calibrate each other's approaches. The 9th Annual International Meeting on DNA Based Computers was held during June 1-4, 2003 at the University of Wisconsin, Madison, USA. The meeting had 106 registered participants from 12 countries around the world. On the first day of the meeting, we had three tutorials: the first was on self-assembly of DNA nanostructures, which focused on the basic techniques of using designed DNA nanomolecules to self-assemble into larger structures for computational purposes. This tutorial was given by Hao Yan of Duke University. The second tutorial was given by Chengde Mao of Purdue University, in which Dr. Mao presented basic DNA biochemistry designed for non-experimentalists. The third tutorial was given by Max Garzon of the University of Memphis. Dr. Garzon gave a lecture on computational complexity tailored for non-computer scientists. The next three days were for invited plenary lectures and regular oral and poster presentations. Invited plenary lectures were given by Helen Berman of Rutgers University (USA), Giancarlo Mauri of the University of Milan (Italy), Guenter von Kiedrowski of Ruhr University (Germany), and Sorin Istrail of Celera/Applied Biosystems. The organizers sought to attract the most significant recent research with the highest impact on the development of the discipline.
This book gives an introduction to theories of computability from a mathematically sophisticated point of view. It treats not only 'the' theory of computability (created by Alan Turing and others in the 1930s), but also a variety of other theories (of Boolean functions, automata and formal languages). These are addressed from the classical perspective of their generation by grammars and from the modern perspective as rational cones. The treatment of the classical theory of computable functions and relations takes the form of a tour through basic recursive function theory, starting with an axiomatic foundation and developing the essential methods in order to survey the most memorable results of the field. This authoritative account by one of the leading lights of the subject will prove exceptionally useful reading for graduate students and researchers in theoretical computer science and mathematics.
This book devotes significant attention to the doctrine of naturalism and its relevance to disputes within the philosophy of mathematics.
This book constitutes the refereed proceedings of the 6th International Conference on Typed Lambda Calculi and Applications, TLCA 2003, held in Valencia, Spain in June 2003. The 21 revised full papers presented were carefully reviewed and selected from 40 submissions. The volume reports research results on all current aspects of typed lambda calculi, ranging from theoretical and methodological issues to the application of proof assistants.
The automatic verification of large parts of mathematics has been an aim of many mathematicians from Leibniz to Hilbert. While Gödel's first incompleteness theorem showed that no computer program could automatically prove certain true theorems in mathematics, the advent of electronic computers and sophisticated software means in practice there are many quite effective systems for automated reasoning that can be used for checking mathematical proofs. This book describes the use of a computer program to check the proofs of several celebrated theorems in metamathematics including those of Gödel and Church-Rosser. The computer verification using the Boyer-Moore theorem prover yields precise and rigorous proofs of these difficult theorems. It also demonstrates the range and power of automated proof checking technology. The mechanization of metamathematics itself has important implications for automated reasoning, because metatheorems can be applied as labor-saving devices to simplify proof construction.
The approach to probability theory followed in this book (which differs radically from the usual one, based on a measure-theoretic framework) characterizes probability as a linear operator rather than as a measure, and is based on the concept of coherence, which can be framed in the most general view of conditional probability. It is a flexible and unifying tool suited for handling, e.g., partial probability assessments (not requiring that the set of all possible outcomes be endowed with a previously given algebraic structure, such as a Boolean algebra) and conditional independence, in a way that avoids all the inconsistencies related to logical dependence (so that a theory referring to graphical models more general than those usually considered in Bayesian networks can be derived). Moreover, it is possible to encompass other approaches to uncertain reasoning, such as fuzziness, possibility functions, and default reasoning.
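As a rough illustration of the linear-operator view this blurb describes (a sketch under assumed de Finetti-style notation, not taken from the book), a coherent prevision P acts linearly on random quantities and recovers conditional probability through the product rule:

```latex
% Sketch (assumed notation, not from the book): a coherent prevision P
% is linear and internal on bounded random quantities X, Y:
P(aX + bY) = a\,P(X) + b\,P(Y), \qquad \inf X \le P(X) \le \sup X,
% and conditional probability enters via the product rule, with no
% previously given Boolean algebra of events required:
P(E \wedge H) = P(E \mid H)\,P(H).
```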
The subject of this book is reasoning under uncertainty based on statistical evidence, where the word reasoning is taken to mean searching for arguments in favor of or against particular hypotheses of interest. The kind of reasoning we are using is composed of two aspects. The first one is inspired by classical reasoning in formal logic, where deductions are made from a knowledge base of observed facts and formulas representing the domain-specific knowledge. In this book, the facts are the statistical observations and the general knowledge is represented by an instance of a special kind of statistical models called functional models. The second aspect deals with the uncertainty under which the formal reasoning takes place. For this aspect, the theory of hints [27] is the appropriate tool. Basically, we assume that some uncertain perturbation takes a specific value and then logically evaluate the consequences of this assumption. The original uncertainty about the perturbation is then transferred to the consequences of the assumption. This kind of reasoning is called assumption-based reasoning. Before going into more details about the content of this book, it might be interesting to look briefly at the roots and origins of assumption-based reasoning in the statistical context. In 1930, R. A. Fisher [17] defined the notion of fiducial distribution as the result of a new form of argument, as opposed to the result of the older Bayesian argument.
This volume is based on the papers that were presented at the International Conference "Model-Based Reasoning: Scientific Discovery, Technological Innovation, Values" (MBR'01), held at the Collegio Ghislieri, University of Pavia, Pavia, Italy, in May 2001. The previous volume, Model-Based Reasoning in Scientific Discovery, edited by L. Magnani, N.J. Nersessian, and P. Thagard, was based on the papers presented at the first "model-based reasoning" international conference, held at the same venue in December 1998. The presentations given at the conference explore how scientific thinking uses models and exploratory reasoning to produce creative changes in theories and concepts. Some address the problem of model-based reasoning in ethics, especially pertaining to science and technology, and stress some aspects of model-based reasoning in technological innovation. The study of diagnostic, visual, spatial, analogical, and temporal reasoning has demonstrated that there are many ways of performing intelligent and creative reasoning that cannot be described with the help only of traditional notions of reasoning such as classical logic. Understanding the contribution of modeling practices to discovery and conceptual change in science requires expanding scientific reasoning to include complex forms of creative reasoning that are not always successful and can lead to incorrect solutions. The study of these heuristic ways of reasoning is situated at the crossroads of philosophy, artificial intelligence, cognitive psychology, and logic; that is, at the heart of cognitive science. There are several key ingredients common to the various forms of model-based reasoning. The term "model" comprises both internal and external representations. The models are intended as interpretations of target physical systems, processes, phenomena, or situations. The models are retrieved or constructed on the basis of potentially satisfying salient constraints of the target domain.
Moreover, in the modeling process, various forms of abstraction are used. Evaluation and adaptation take place in light of structural, causal, and/or functional constraints. Model simulation can be used to produce new states and enable evaluation of behaviors and other factors. The various contributions of the book are written by interdisciplinary researchers who are active in the area of creative reasoning in science and technology, and are logically and computationally oriented: the most recent results and achievements about the topics above are illustrated in detail in the papers.
This volume contains the Proceedings of ICFCA 2004, the 2nd International Conference on Formal Concept Analysis. The ICFCA conference series aims to be the premier forum for the publication of advances in applied lattice and order theory, and in particular scientific advances related to formal concept analysis. Formal concept analysis emerged in the 1980s from efforts to restructure lattice theory to promote better communication between lattice theorists and potential users of lattice theory. Since then, the field has developed into a growing research area in its own right with a thriving theoretical community and an increasing number of applications in data and knowledge processing, including data visualization, information retrieval, machine learning, data analysis and knowledge management. In terms of theory, formal concept analysis has been extended into attribute exploration, Boolean judgment, contextual logic and so on to create a powerful general framework for knowledge representation and reasoning. This conference aims to unify theoretical and applied practitioners who use formal concept analysis, drawing on the fields of mathematics, computer and library sciences and software engineering. The theme of the 2004 conference was "Concept Lattices", to acknowledge the colloquial term used for the line diagrams that appear in almost every paper in this volume. ICFCA 2004 included tutorial sessions demonstrating the practical benefits of formal concept analysis, and highlighted developments in the foundational theory and standards. The conference showcased the increasing variety of formal concept analysis software and included eight invited lectures from distinguished speakers in the field. Seven of the eight invited speakers submitted accompanying papers, and these were reviewed and appear in this volume.
Since its birth, Model Theory has been developing a number of methods and concepts that have their intrinsic relevance, but also provide fruitful and notable applications in various fields of Mathematics. It is a lively and fertile research area which deserves the attention of the mathematical world. This volume, A Guide to Classical and Modern Model Theory, is intended for trainees and professional model theorists, mathematicians working in Algebra and Geometry, and young people with a basic knowledge of logic.
Bob Hale and Crispin Wright draw together here the key writings in which they have worked out their distinctive approach to the fundamental questions: what is mathematics about, and how do we know it? The volume features much new material: introduction, postscript, bibliographies, and a new essay on a key problem. The Reason's Proper Study is the strongest presentation yet of the controversial neo-Fregean view that mathematical knowledge may be based a priori on logic and definitional abstraction principles. It will prove indispensable reading not just to philosophers of mathematics but to all who are interested in the fundamental metaphysical and epistemological issues which the programme raises.
The theory of sets of multiples, a subject which lies at the intersection of analytic and probabilistic number theory, has seen much development since the publication of 'Sequences' by Halberstam and Roth nearly thirty years ago. The area is rich in problems, many of them still unsolved or arising from current work. The author sets out to give a coherent, essentially self-contained account of the existing theory and at the same time to bring the reader to the frontiers of research. One of the fascinations of the theory is the variety of methods applicable to it, which include Fourier analysis, group theory, high and ultra-low moments, probability and elementary inequalities, as well as several branches of number theory. This Tract is the first devoted to the subject, and will be of value to number theorists, whether they be research workers or graduate students.
This book constitutes the refereed proceedings of the 18th International Conference on Logic Programming, ICLP 2002, held in Copenhagen, Denmark, in July/August 2002. The 29 revised full papers presented together with two invited contributions and 13 posters were carefully reviewed and selected from 82 submissions. All current aspects of logic programming and computational logic are addressed.
Now in paperback, Topology via Logic is an advanced textbook on topology for computer scientists. Based on a course given by the author to postgraduate students of computer science at Imperial College, it has three unusual features. First, the introduction is from the locale viewpoint, motivated by the logic of finite observations: this provides a more direct approach than the traditional one based on abstracting properties of open sets in the real line. Second, the methods of locale theory are freely exploited. Third, there is substantial discussion of some computer science applications. Although books on topology aimed at mathematicians exist, no book has been written specifically for computer scientists. As computer scientists become more aware of the mathematical foundations of their discipline, it is appropriate that such topics are presented in a form of direct relevance and applicability. This book goes some way towards bridging the gap.
This book constitutes the refereed proceedings of the 23rd International Conference on Application and Theory of Petri Nets, ICATPN 2002, held in Adelaide, Australia, in June 2002. The 18 regular papers and one tool presentation presented together with six invited papers were carefully reviewed and selected from 45 submissions. All current issues in research and development of Petri nets are addressed, in particular concurrent systems analysis, model validation, business process management, reactive systems, workflow processes, and wireless transaction protocols.
This book constitutes the thoroughly refereed post-proceedings of the 7th International Workshop on DNA-Based Computers, DNA7, held in Tampa, Florida, USA, in June 2001. The 26 revised full papers presented together with 9 poster papers were carefully reviewed and selected from 44 submissions. The papers are organized in topical sections on experimental tools, theoretical tools, probabilistic computational models, computer simulation and sequence design, algorithms, experimental solutions, nano-tech devices, biomimetic tools, new computing models, and splicing systems and membranes.
This book constitutes the thoroughly refereed post-proceedings of the 5th International Conference on Developments in Language Theory, DLT 2001, held in Vienna, Austria, in July 2001. The 24 revised full papers presented together with 10 revised invited papers were carefully selected during two rounds of reviewing and revision from a total of 64 papers submitted. Among the topics covered are grammars and acceptors, efficient algorithms for languages, combinatorial and algebraic properties, decision problems, relations to complexity theory, logic, picture description and analysis, DNA computing, cryptography, and concurrency.
This book provides an accessible introduction to the most important features of formal languages and automata theory - core topics on computer science degree schemes worldwide. It focuses on the key concepts, illustrating potentially intimidating material through diagrams and pictorial representations, and this edition will include new and expanded coverage of topics such as: reduction and simplification of material on Turing machines; complexity and O notation; propositional logic and first order predicate logic. Aimed primarily at computer scientists rather than mathematicians, algorithms and proofs are presented informally through examples, and there are numerous exercises (many with solutions) and an extensive glossary. This book will be invaluable to students of computer science but it will also prove essential reading to all practitioners needing to know about formal methods.
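As a flavour of the kind of material such a course covers (an illustrative sketch, not taken from the book; the state names and language are hypothetical example choices), a deterministic finite automaton recognizing binary strings with an even number of 1s fits in a few lines of Python:

```python
# Illustrative DFA (not from the book): accepts binary strings
# containing an even number of 1s. The transition table maps
# (current state, input symbol) -> next state.
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def accepts(word: str, start: str = "even",
            accepting: frozenset = frozenset({"even"})) -> bool:
    """Run the DFA on `word`; accept iff the final state is accepting."""
    state = start
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state in accepting
```

Presenting the machine as a plain transition table like this mirrors the informal, example-driven style the blurb describes.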
This book contains a selection of papers presented at the first annual workshop of the TYPES Working Group (Computer-Assisted Reasoning Based on Type Theory, EU IST project 29001), which was held 8th-12th of December, 2000 at the University of Durham, Durham, UK. It was attended by about 80 researchers. The workshop follows a series of meetings organised in 1993, 1994, 1995, 1996, 1998, and 1999 under the auspices of the Esprit BRA 6453 and the Esprit Working Group 21900 for the previous TYPES projects. Those proceedings were also published in the LNCS series, edited by Henk Barendregt and Tobias Nipkow (Vol. 806, 1993), by Peter Dybjer, Bengt Nordström, and Jan Smith (Vol. 996, 1994), by Stefano Berardi and Mario Coppo (Vol. 1158, 1995), by Christine Paulin-Mohring and Eduardo Gimenez (Vol. 1512, 1996), by Thorsten Altenkirch, Wolfgang Naraschewski, and Bernhard Reus (Vol. 1657, 1998), and by Thierry Coquand, Peter Dybjer, Bengt Nordström, and Jan Smith (Vol. 1956, 1999). The Esprit BRA 6453 was itself a continuation of the former Esprit Action 3245, Logical Frameworks: Design, Implementation, and Experiments. The articles from the annual workshops under that Action were edited by Gerard Huet and Gordon Plotkin in the books Logical Frameworks and Logical Environments, both published by Cambridge University Press. Acknowledgements: We are very grateful to members of Durham's Computer Assisted Reasoning Group, especially Robert Kiessling, for helping to organise the workshop. Robert's contribution was key to the success of the meeting.
The fundamental ideas concerning computation and recursion naturally find their place at the interface between logic and theoretical computer science. The contributions in this book, by leaders in the field, provide a picture of current ideas and methods in the ongoing investigations into the pure mathematical foundations of computability theory. The topics range over computable functions, enumerable sets, degree structures, complexity, subrecursiveness, domains and inductive inference. A number of the articles contain introductory and background material which it is hoped will make this volume an invaluable resource.
This book presents an up-to-date, unified treatment of research in bounded arithmetic and complexity of propositional logic with emphasis on independence proofs and lower bound proofs. The author discusses the deep connections between logic and complexity theory and lists a number of intriguing open problems. An introduction to the basics of logic and complexity is followed by discussion of important results in propositional proof systems and systems of bounded arithmetic. Then more advanced topics are treated, including polynomial simulations and conservativity results, various witnessing theorems, the translation of bounded formulas (and their proofs) into propositional ones, the method of random partial restrictions and its applications, simple independence proofs, complete systems of partial relations, lower bounds to the size of constant-depth propositional proofs, the approximation method and the method of Boolean valuations, combinatorics and complexity theory within bounded arithmetic, and relations to complexity issues of predicate calculus. Students and researchers in mathematical logic and complexity theory will find this comprehensive treatment an excellent guide to this expanding interdisciplinary area.
This book constitutes the refereed proceedings of the Second International Conference of B and Z Users, ZB 2002, held in Grenoble, France in January 2002. The 24 papers presented together with three invited contributions were carefully reviewed and selected for inclusion in the book. The book documents the recent advances for the Z formal specification notation and for the B method; the full scope is covered, ranging from foundational and theoretical issues to advanced applications, tools, and case studies.
Stig Kanger (1924-1988) made important contributions to logic and formal philosophy. Kanger's most original achievements were in the areas of general proof theory, the semantics of modal and deontic logic, and the logical analysis of the concept of rights. But he contributed significantly to action theory, preference logic and the theory of measurement as well. This is the second of two volumes dedicated to the work of Stig Kanger. The first volume is a complete collection of Kanger's philosophical papers. The present volume contains critical essays on the various aspects of Kanger's work as well as some biographical sketches. Lennart Åqvist, Jan Berg, Brian Chellas, Anatoli Degtyarev, Lars Gustafsson, Sören Halldén, Kaj Børge Hansen, Sven Ove Hansson, Risto Hilpinen, Jaakko Hintikka, Ghita Holmström-Hintikka, Lars Lindahl, Sten Lindström, Ingmar Pörn, Dag Prawitz, Wlodek Rabinowicz, Krister Segerberg, Amartya Sen, Sören Stenlund, Göran Sundholm, and Andrei Voronkov have contributed to this volume.
Stig Kanger (1924-1988) made important contributions to logic and formal philosophy. Kanger's dissertation Provability in Logic, 1957, contained significant results in proof theory as well as the first fully worked out model-theoretic interpretation of quantified modal logic. It is generally accepted nowadays that Kanger was one of the originators of possible worlds semantics for modal logic. Kanger's most original achievements were in the areas of general proof theory, the semantics of modal and deontic logic, and the logical analysis of the concept of rights. He also contributed to action theory, preference logic, and the theory of measurement. This is the first of two volumes dedicated to the work of Stig Kanger. The present volume is a complete collection of Kanger's philosophical papers. The second volume contains critical essays on Kanger's work, as well as biographical essays on Kanger written by colleagues and friends.