This monograph details several important advances in the area known as the proofs-as-programs paradigm, a set of approaches to developing programs from proofs in constructive logic. It serves the dual purpose of providing a state-of-the-art overview of the field and detailing tools and techniques to stimulate further research. One of the book's central themes is a general, abstract framework for developing new systems of program synthesis by adapting proofs-as-programs to new contexts, which the authors call the Curry-Howard Protocol. This protocol is used to provide two novel applications for industrial-scale, complex software engineering: contractual imperative program synthesis and structured software synthesis. These applications constitute an exemplary justification for the applicability of the protocol to different contexts. The book is intended for graduate students in computer science or mathematics who wish to extend their background in logic and type theory as well as gain experience working with logical frameworks and practical proof systems. In addition, the proofs-as-programs research community and the wider computational logic, formal methods and software engineering communities will benefit. The applications given in the book should be of interest to researchers working in the target problem domains.
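The Curry-Howard correspondence behind proofs-as-programs can be glimpsed even in an ordinary typed language: conjunction corresponds to the pair type, implication to the function type, and a proof of A ∧ B → A is literally the first projection. A minimal illustrative sketch (the function names are ours, not the book's):

```python
from typing import TypeVar, Tuple, Callable

A = TypeVar("A")
B = TypeVar("B")

# A proof of A /\ B -> A is the first projection on pairs.
def and_elim_left(pair: Tuple[A, B]) -> A:
    return pair[0]

# Modus ponens (from A -> B and A, conclude B) is function application.
def modus_ponens(f: Callable[[A], B], a: A) -> B:
    return f(a)
```

Program synthesis in the proofs-as-programs style generalizes this observation: an extraction procedure turns a full constructive proof into a correct-by-construction program.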
This book gives a rigorous yet 'physics-focused' introduction to mathematical logic that is geared towards natural science majors. We present the science major with a robust introduction to logic, focusing on the specific knowledge and skills that will unavoidably be needed in calculus and in natural science topics in general, rather than taking the philosophically oriented, foundations-first approach commonly found in mathematical logic textbooks.
Unique selling point: * Industry-standard book for merchants, banks, and consulting firms looking to learn more about PCI DSS compliance. Core audience: * Retailers (both physical and electronic), firms who handle credit or debit cards (such as merchant banks and processors), and firms who deliver PCI DSS products and services. Place in the market: * Currently there are no PCI DSS 4.0 books.
If you are considering adopting this book for courses with over 50 students, please contact [email protected] for more information.
The last three chapters of the book provide an introduction to type theory (higher-order logic). It is shown how various mathematical concepts can be formalized in this very expressive formal language. This expressive notation facilitates proofs of the classical incompleteness and undecidability theorems which are very elegant and easy to understand. The discussion of semantics makes clear the important distinction between standard and nonstandard models, which is so important in understanding puzzling phenomena such as the incompleteness theorems and Skolem's paradox about countable models of set theory. Some of the numerous exercises require giving formal proofs. A computer program called ETPS, which is available from the web, facilitates doing and checking such exercises. Audience: This volume will be of interest to mathematicians, computer scientists, and philosophers in universities, as well as to computer scientists in industry who wish to use higher-order logic for hardware and software specification and verification.
This book contains leading survey papers on the various aspects of abduction, covering both logical and numerical approaches. Abduction is central to all areas of applied reasoning, including artificial intelligence, philosophy of science, machine learning, data mining and decision theory, as well as logic itself.
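Abductive inference, i.e. finding hypotheses that, if assumed, would explain an observation, can be sketched in a few lines. The toy rule base below is purely illustrative and not drawn from the book:

```python
# Rules map an observable effect to the candidate causes that would entail it.
RULES = {
    "wet_grass": {"rain", "sprinkler"},
    "traffic_jam": {"accident", "rush_hour"},
}

def abduce(observation, facts=frozenset()):
    """Return hypotheses that would explain the observation,
    excluding anything already known as a fact."""
    return RULES.get(observation, set()) - set(facts)
```

Logical treatments of abduction refine this picture with consistency and minimality requirements on the hypotheses; numerical treatments rank them, e.g. by probability.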
This book gives an account of the fundamental results in geometric stability theory, a subject that has grown out of categoricity and classification theory. This approach studies the fine structure of models of stable theories, using the geometry of forking; this often achieves global results relevant to classification theory. Topics range from the Zilber-Cherlin classification of infinite locally finite homogeneous geometries, to regular types, their geometries, and their role in superstable theories. The structure and existence of definable groups is featured prominently, as is work by Hrushovski. The book is unique in the range and depth of material covered and will be invaluable to anyone interested in modern model theory.
Our motivation for gathering the material for this book over a period of seven years has been to unify and simplify ideas which appeared in a sizable number of research articles during the past two decades. More specifically, it has been our aim to provide the categorical foundations for extensive work that was published on the epimorphism- and cowellpoweredness problem, predominantly for categories of topological spaces. In doing so we found the categorical notion of closure operators interesting enough to be studied for its own sake, as it unifies and describes other significant mathematical notions and since it leads to a never-ending stream of examples and applications in all areas of mathematics. These are somewhat arbitrarily restricted to topology, algebra and (a small part of) discrete mathematics in this book, although other areas, such as functional analysis, would provide an equally rich and interesting supply of examples. We also had to restrict the themes in our theoretical exposition. In spite of the fact that closure operators generalize the universal closure operations of abelian category theory and of topos and sheaf theory, we chose to mention these aspects only en passant, in favour of the presentation of new results more closely related to our original intentions. We also needed to refrain from studying topological concepts, such as compactness, in the setting of an arbitrary closure-equipped category, although this topic appears prominently in the published literature involving closure operators.
Fact finding in judicial proceedings is a dynamic process. This collection of papers considers whether computational methods or other formal logical methods developed in disciplines such as artificial intelligence, decision theory, and probability theory can facilitate the study and management of dynamic evidentiary and inferential processes in litigation. The papers gathered here have several epicenters, including (i) the dynamics of judicial proof, (ii) the relationship between artificial intelligence or formal analysis and "common sense," (iii) the logic of factual inference, including (a) the relationship between causality and inference and (b) the relationship between language and factual inference, (iv) the logic of discovery, including the role of abduction and serendipity in the process of investigation and proof of factual matters, and (v) the relationship between decision and inference.
One can distinguish, roughly speaking, two different approaches to the philosophy of mathematics. On the one hand, some philosophers (and some mathematicians) take the nature and the results of mathematicians' activities as given, and go on to ask what philosophical morals one might perhaps find in their story. On the other hand, some philosophers, logicians and mathematicians have tried or are trying to subject the very concepts which mathematicians are using in their work to critical scrutiny. In practice this usually means scrutinizing the logical and linguistic tools mathematicians wield. Such scrutiny can scarcely help relying on philosophical ideas and principles. In other words, it can scarcely help being literally a study of language, truth and logic in mathematics, albeit not necessarily in the spirit of A.J. Ayer. As its title indicates, the essays included in the present volume represent the latter approach. In most of them one of the fundamental concepts in the foundations of mathematics and logic is subjected to a scrutiny from a largely novel point of view. Typically, it turns out that the concept in question is in need of a revision or reconsideration, or at least can be given a new twist. The results of such a re-examination are not primarily critical, however, but typically open up new constructive possibilities. The consequences of such deconstructions and reconstructions are often quite sweeping, and are explored in the same paper or in others.
Parameterized complexity theory is a recent branch of computational complexity theory that provides a framework for a refined analysis of hard algorithmic problems. The central notion of the theory, fixed-parameter tractability, has led to the development of various new algorithmic techniques and a whole new theory of intractability. This book is a state-of-the-art introduction to both algorithmic techniques for fixed-parameter tractability and the structural theory of parameterized complexity classes, and it presents detailed proofs of recent advanced results that have not appeared in book form before. Individual chapters are devoted to intractability, to algorithmic techniques for designing fixed-parameter tractable algorithms, and to bounded fixed-parameter tractability and subexponential time complexity. The treatment is comprehensive, and the reader is supported with exercises, notes, a detailed index, and some background on complexity theory and logic. The book will be of interest to computer scientists, mathematicians and graduate students engaged with algorithms and problem complexity.
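Fixed-parameter tractability can be made concrete with the textbook example of vertex cover parameterized by the cover size k: the bounded-search-tree algorithm below runs in roughly O(2^k · |E|) time, so the exponential blow-up is confined to the parameter rather than the input size. A minimal sketch under our own naming, not code from the book:

```python
def vertex_cover(edges, k):
    """Return True iff the graph given by its edge list has a
    vertex cover of size <= k.

    Classic bounded-search-tree algorithm: pick any uncovered
    edge (u, v); at least one endpoint must be in the cover, so
    branch on both choices. The recursion depth is at most k,
    giving O(2^k * |E|) time per branch of the search tree.
    """
    if not edges:
        return True          # no edges left: the current cover suffices
    if k == 0:
        return False         # edges remain but no budget left
    u, v = edges[0]
    rest_u = [e for e in edges if u not in e]   # put u into the cover
    rest_v = [e for e in edges if v not in e]   # or put v in instead
    return vertex_cover(rest_u, k - 1) or vertex_cover(rest_v, k - 1)
```

For a triangle, `vertex_cover([(0, 1), (1, 2), (0, 2)], 2)` holds while the same graph with budget 1 does not, matching the fact that a triangle needs two cover vertices.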
Recent major advances in model theory include connections between model theory and Diophantine and real analytic geometry, permutation groups, and finite algebras. The present book contains lectures on recent results in algebraic model theory, covering topics from the following areas: geometric model theory, the model theory of analytic structures, permutation groups in model theory, the spectra of countable theories, and the structure of finite algebras. Audience: Graduate students in logic and others wishing to keep abreast of current trends in model theory. The lectures contain sufficient introductory material for readers to grasp the recent results presented.
On the history of the book: In the early 1990s several new methods and perspectives in automated deduction emerged. We just mention the superposition calculus, meta-term inference and schematization, deductive decision procedures, and automated model building. It was this last field which brought the authors of this book together. In 1994 they met at the Conference on Automated Deduction (CADE-12) in Nancy and agreed upon the general point of view that semantics and, in particular, construction of models should play a central role in the field of automated deduction. In the following years the deduction groups of the laboratory LEIBNIZ at IMAG Grenoble and the University of Technology in Vienna organized several bilateral projects promoting this topic. This book emerged as a main result of this cooperation. The authors are aware of the fact that the book does not cover all relevant methods of automated model building (also called model construction or model generation); instead the book focuses on deduction-based symbolic methods for the construction of Herbrand models developed in the last 12 years. Other methods of automated model building, in particular also finite model building, are mainly treated in the final chapter; this chapter is less formal and detailed but gives a broader view on the topic and a comparison of different approaches. How to read this book: In the introduction we give an overview of automated deduction in a historical context, taking into account its relationship with the human views on formal and informal proofs.
A practical introduction to the development of proofs and certified programs using Coq. An invaluable tool for researchers, students, and engineers interested in formal methods and the development of zero-fault software.
Henkin-Keisler models emanate from a modification of the Henkin construction introduced by Keisler to motivate the definition of ultraproducts. Keisler modified the Henkin construction at that point at which 'new' individual constants are introduced and did so in a way that illuminates a connection between Henkin-Keisler models and ultraproducts. The resulting construction can be viewed both as a specialization of the Henkin construction and as an alternative to the ultraproduct construction. These aspects of the Henkin-Keisler construction are utilized here to present a perspective on ultraproducts and their applications accessible to the reader familiar with Henkin's proof of the completeness of first order logic and naive set theory. This approach culminates in proofs of various forms of the Keisler-Shelah characterizations of elementary equivalence and elementary classes via Henkin-Keisler models. The presentation is self-contained and proofs of more advanced results from set theory are introduced as needed. Audience: Logicians in philosophy, computer science, linguistics and mathematics.
The theory of quasivarieties constitutes an independent direction in algebra and mathematical logic and specializes in a fragment of first-order logic: the so-called universal Horn logic. This treatise uniformly presents the principal directions of the theory from an effective algebraic approach developed by the author himself. A revolutionary exposition, this influential text contains a number of results never before published in book form, featuring in-depth commentary for applications of quasivarieties to graphs, convex geometries, and formal languages. Key features include coverage of the Birkhoff-Mal'tsev problem on the structure of lattices of quasivarieties, helpful exercises, and an extensive list of references.
This book contains the lectures given at the NATO ASI 910820 "Cellular Automata and Cooperative Systems" Meeting which was held at the Centre de Physique des Houches, France, from June 22 to July 2, 1992. This workshop brought together mathematical physicists, theoretical physicists and mathematicians working in fields related to local interacting systems, cellular and probabilistic automata, statistical physics, and complexity theory, as well as applications of these fields. We would like to thank our sponsors and supporters whose interest and help was essential for the success of the meeting: the NATO Scientific Affairs Division, the DRET (Direction des Recherches, Etudes et Techniques), the Ministere des Affaires Etrangeres, the National Science Foundation. We would also like to thank all the secretaries who helped us during the preparation of the meeting, in particular Maryse Cohen-Solal (CPT, Marseille) and Janice Nowinski (Courant Institute, New York). We are grateful for the fine work of Mrs. Gladys Cavallone in preparing this volume.
A partially ordered group is an algebraic object having the structure of a group and the structure of a partially ordered set which are connected in some natural way. These connections were established in the period between the end of the 19th and the beginning of the 20th century. It was realized that ordered algebraic systems occur in various branches of mathematics bound up with its fundamentals. For example, the classification of infinitesimals resulted in the discovery of non-archimedean ordered algebraic systems, the formalization of the notion of real number led to the definition of ordered groups and ordered fields, and the construction of non-archimedean geometries brought about the investigation of non-archimedean ordered groups and fields. The theory of partially ordered groups was developed by R. Dedekind, O. Hölder, D. Hilbert, B. Neumann, A. I. Mal'cev, P. Hall and G. Birkhoff. These connections between partial order and group operations allow us to investigate the properties of partially ordered groups. For example, partially ordered groups with the interpolation property were introduced in F. Riesz's fundamental paper [1] as a key to his investigations of partially ordered real vector spaces, and the study of ordered vector spaces with interpolation properties has been continued by many functional analysts since. The deepest and most developed part of the theory of partially ordered groups is the theory of lattice-ordered groups. In the 40s, following the publications of the works by G. Birkhoff, H. Nakano and P.
Belief change is an emerging field of artificial intelligence and information science dedicated to the dynamics of information and the present book provides a state-of-the-art picture of its formal foundations. It deals with the addition, deletion and combination of pieces of information and, more generally, with the revision, updating and fusion of knowledge bases. The book offers an extensive coverage of, and seeks to reconcile, two traditions in the kinematics of belief that often ignore each other - the symbolic and the numerical (often probabilistic) approaches. Moreover, the work encompasses both revision and fusion problems, even though these two are also commonly investigated by different communities. Finally, the book presents the numerical view of belief change, beyond the probabilistic framework, covering such approaches as possibility theory, belief functions and convex gambles. The work thus presents a unified view of belief change operators, drawing from a widely scattered literature embracing philosophical logic, artificial intelligence, uncertainty modelling and database systems. The material is a clearly organised guide to the literature on the dynamics of epistemic states, knowledge bases and uncertain information, suitable for scholars and graduate students familiar with applied logic, knowledge representation and uncertain reasoning.
This self-contained title demonstrates an important interplay between abstract and concrete operator theory. Key ideas are developed in a step-by-step approach, beginning with required background and historical material, and culminating in the final chapters with state-of-the-art topics. Good examples, bibliography and index make this text a valuable classroom or reference resource.
The 10th International Congress of Logic, Methodology and Philosophy of Science, which took place in Florence in August 1995, offered a vivid and comprehensive picture of the present state of research in all directions of Logic and Philosophy of Science. The final program counted 51 invited lectures and around 700 contributed papers, distributed in 15 sections. Following the tradition of previous LMPS meetings, some authors, whose papers aroused particular interest, were invited to submit their works for publication in a collection of selected contributed papers. Due to the large number of interesting contributions, it was decided to split the collection into two distinct volumes: one covering the areas of Logic, Foundations of Mathematics and Computer Science, the other focusing on the general Philosophy of Science and the Foundations of Physics. As a leading choice criterion for the present volume, we tried to combine papers containing relevant technical results in pure and applied logic with papers devoted to conceptual analyses, deeply rooted in advanced present-day research. After all, we believe this is part of the genuine spirit underlying the whole enterprise of LMPS studies.
This is the only monograph devoted to the expressibility of finitely axiomatizable theories, a classical subject in mathematical logic. The volume summarizes investigations in the field that have led to much of the current progress, treating systematically all positive results concerning expressibility. Also included in this unique text are solutions to both the Vaught-Morley problem and the Hanf problem, and a number of new natural questions that provide prospects for further development of the theory.
The parametric lambda calculus is a metamodel for reasoning about various kinds of computations. Its syntactic definition is based on the notion of "sets of input values," and different lambda calculi can be obtained from it by instantiating such sets in suitable ways. The parametric lambda calculus is used as a tool for presenting in a uniform way basic notions of programming languages, and for studying with a uniform approach some lambda calculi modeling different kinds of computations, such as call-by-name, both in its lazy and non-lazy versions, and call-by-value. The parametric presentation allows us both to prove in one step all the fundamental properties of different calculi, and to compare them with each other. The book includes some classical results in the field of lambda calculi, but completely rephrased using the parametric approach, together with some new results. The lambda calculi are presented from a computer science viewpoint, with particular emphasis on their semantics, both operational and denotational. This book is dedicated to researchers, and can be used as a textbook for masters or Ph.D. courses on the foundations of computer science.
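The operational difference between the call-by-value and call-by-name calculi obtainable from the parametric calculus can be seen already in a host language: call-by-value evaluates an argument before the call, while call-by-name delays it, so a function that discards a diverging argument terminates only under the latter. A small sketch using thunks (zero-argument functions) to model call-by-name; the names are ours, not the book's:

```python
def diverge():
    # Stands in for a term with no value (here it simply raises).
    raise RuntimeError("this argument never yields a value")

# The K combinator discards its second argument. Python is call-by-value,
# so K(3, diverge()) would raise before K's body ever runs.
def K(x, y):
    return x

# Call-by-name version: arguments arrive as thunks and are forced
# only when actually used, so the diverging thunk is never touched.
def K_by_name(x_thunk, y_thunk):
    return x_thunk()

result = K_by_name(lambda: 3, lambda: diverge())   # succeeds with 3
```

Lazy call-by-name evaluation additionally memoizes each thunk the first time it is forced; that refinement is omitted here for brevity.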
In opposition to the classical set theory of natural language, Novak's highly original monograph offers a theory based on alternative and fuzzy sets. This new approach is firmly grounded in semantics and pragmatics, and accounts for the vagueness inherent in natural language, filling a large gap in our current knowledge. The theory will foster fruitful debate among researchers in linguistics and artificial intelligence.
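The contrast with classical sets can be made concrete: a fuzzy set assigns each element a membership degree in [0, 1], and the usual connectives are commonly modeled by min and max (one choice among several t-norms). The sketch below is illustrative and not taken from the monograph:

```python
# A fuzzy set over a finite universe: element -> membership degree in [0, 1].
tall  = {"ann": 0.9, "bob": 0.5, "cid": 0.1}
young = {"ann": 0.3, "bob": 0.8, "cid": 0.6}

def fuzzy_and(s, t):
    """Intersection under the min t-norm."""
    return {x: min(s[x], t[x]) for x in s}

def fuzzy_or(s, t):
    """Union under the max t-conorm."""
    return {x: max(s[x], t[x]) for x in s}

def fuzzy_not(s):
    """Standard complement: degree d becomes 1 - d."""
    return {x: 1.0 - d for x, d in s.items()}

tall_and_young = fuzzy_and(tall, young)   # bob is tall-and-young to degree 0.5
```

A classical set is recovered as the special case where every membership degree is exactly 0 or 1; vague predicates such as "tall" occupy the interval in between.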