The interplay between computability and randomness has been an active area of research in recent years, reflected by ample funding in the USA, numerous workshops, and publications on the subject. The complexity and the randomness aspect of a set of natural numbers are closely related. Traditionally, computability theory is concerned with the complexity aspect. However, computability-theoretic tools can also be used to introduce mathematical counterparts for the intuitive notion of randomness of a set. Recent research shows that, conversely, concepts and methods originating from randomness enrich computability theory.
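The intuition behind the mathematical counterparts of randomness mentioned above is that a random sequence admits no description shorter than itself. This can be glimpsed, very roughly, with an off-the-shelf compressor; this is only a sketch of the intuition, since Kolmogorov complexity and Martin-Löf randomness are defined via an optimal universal machine, not via zlib:

```python
import hashlib
import zlib

def compressed_len(data: bytes) -> int:
    """Length of data after zlib compression: a crude stand-in
    for descriptive (Kolmogorov) complexity."""
    return len(zlib.compress(data, 9))

# A highly regular sequence: compressible, hence intuitively non-random.
regular = b"01" * 500  # 1000 bytes

# A pseudorandom-looking sequence of the same length, built
# deterministically by chaining SHA-256 digests.
chunks, seed = [], b"seed"
while sum(len(c) for c in chunks) < 1000:
    seed = hashlib.sha256(seed).digest()
    chunks.append(seed)
pseudorandom = b"".join(chunks)[:1000]

print(compressed_len(regular))       # far below 1000 bytes
print(compressed_len(pseudorandom))  # close to (or above) 1000 bytes
```

The compressor finds the pattern in the regular string but gains nothing on the pseudorandom one, mirroring the complexity/randomness connection the blurb describes.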
Imre Lakatos's Proofs and Refutations is an enduring classic, which has never lost its relevance. Taking the form of a dialogue between a teacher and some students, the book considers various solutions to mathematical problems and, in the process, raises important questions about the nature of mathematical discovery and methodology. Lakatos shows that mathematics grows through a process of improvement by attempts at proofs and critiques of these attempts, and his work continues to inspire mathematicians and philosophers aspiring to develop a philosophy of mathematics that accounts for both the static and the dynamic complexity of mathematical practice. With a specially commissioned Preface written by Paolo Mancosu, this book has been revived for a new generation of readers.
Intuitionistic logic is presented here as part of familiar classical logic which allows mechanical extraction of programs from proofs. To make the material more accessible, basic techniques are presented first for propositional logic; Part II contains extensions to predicate logic. This material provides an introduction and a safe background for reading research literature in logic and computer science as well as advanced monographs. Readers are assumed to be familiar with basic notions of first-order logic. One device for keeping this book short was the invention of new proofs of several theorems. The presentation is based on natural deduction. The topics include the programming interpretation of intuitionistic logic by simply typed lambda calculus (the Curry-Howard isomorphism), negative translation of classical into intuitionistic logic, normalization of natural deductions, applications to category theory, Kripke models, algebraic and topological semantics, proof-search methods, and the interpolation theorem. The text developed from material for several courses taught at Stanford University in 1992-1999.
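The Curry-Howard reading mentioned in the blurb, under which an intuitionistic proof is a program and the proposition it proves is the program's type, can be glimpsed even in an ordinary typed language. A minimal sketch (the function names below are illustrative, not the book's notation):

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# A proof of A -> (B -> A): the K combinator.  Read through Curry-Howard,
# any total program of this type is a proof of that proposition.
def weaken(a: A) -> Callable[[B], A]:
    return lambda b: a

# A proof of (A -> B) -> ((B -> C) -> (A -> C)): function composition
# proves the transitivity of implication.
def compose(f: Callable[[A], B]) -> Callable[[Callable[[B], C]], Callable[[A], C]]:
    return lambda g: lambda a: g(f(a))

# Running the proofs as programs:
inc = lambda n: n + 1
double = lambda n: 2 * n
print(compose(inc)(double)(5))  # (5 + 1) * 2 = 12
print(weaken("kept")(99))       # "kept"
```

The "mechanical extraction of programs from proofs" works in the other direction: a natural-deduction proof of the proposition yields the program.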
The information age owes its existence to a little-known but crucial development, the theoretical study of logic and the foundations of mathematics. The Great Formal Machinery Works draws on original sources and rare archival materials to trace the history of the theories of deduction and computation that laid the logical foundations for the digital revolution. Jan von Plato examines the contributions of figures such as Aristotle; the nineteenth-century German polymath Hermann Grassmann; George Boole, whose Boolean logic would prove essential to programming languages and computing; Ernst Schroder, best known for his work on algebraic logic; and Giuseppe Peano, cofounder of mathematical logic. Von Plato shows how the idea of a formal proof in mathematics emerged gradually in the second half of the nineteenth century, hand in hand with the notion of a formal process of computation. A turning point was reached by 1930, when Kurt Godel conceived his celebrated incompleteness theorems. They were an enormous boost to the study of formal languages and computability, which were brought to perfection by the end of the 1930s with precise theories of formal languages and formal deduction and parallel theories of algorithmic computability. Von Plato describes how the first theoretical ideas of a computer soon emerged in the work of Alan Turing in 1936 and John von Neumann some years later. Shedding new light on this crucial chapter in the history of science, The Great Formal Machinery Works is essential reading for students and researchers in logic, mathematics, and computer science.
Some of our earliest experiences of the conclusive force of an argument come from school mathematics: faced with a mathematical proof, we cannot deny the conclusion once the premises have been accepted. Behind such arguments lies a more general pattern of 'demonstrative arguments' that is studied in the science of logic. Logical reasoning is applied at all levels, from everyday life to advanced sciences, and a remarkable level of complexity is achieved in everyday logical reasoning, even if the principles behind it remain intuitive. Jan von Plato provides an accessible but rigorous introduction to an important aspect of contemporary logic: its deductive machinery. He shows that when the forms of logical reasoning are analysed, it turns out that a limited set of first principles can represent any logical argument. His book will be valuable for students of logic, mathematics and computer science.
Can computers do everything? If that were so, this book would not exist. It proves with compelling logic that even the largest, fastest, most intelligent, and most expensive computers in the world have only limited capabilities. No matter how much money, time, and know-how we invest, there are computational problems that will never be solved. An unsettling, provocative message, and yet: didn't we really know it all along, without ever wanting to believe it? The well-known computer scientist David Harel presents the mathematical facts in a gripping, entertaining, and generally accessible way. The limitations of the computer lead us to the limits of all knowledge: limits that spur people on to keep improving what is possible and even to draw benefit from the impossible. A brilliant tour de force with surprising aspects that captivates the reader, whether informed layperson or expert, from the first page to the last.
Topos Theory is an important branch of mathematical logic of interest to theoretical computer scientists, logicians and philosophers who study the foundations of mathematics, and to those working in differential geometry and continuum physics. This compendium contains material that was previously available only in specialist journals. This is likely to become the standard reference work for all those interested in the subject.
Descriptive set theory has been one of the main areas of research in set theory for almost a century. This text attempts to present a largely balanced approach, which combines many elements of the different traditions of the subject. It includes a wide variety of examples, exercises (over 400), and applications, in order to illustrate the general concepts and results of the theory. This text provides a first basic course in classical descriptive set theory and covers material with which mathematicians interested in the subject for its own sake, or those who wish to use it in their own field, should be familiar. Over the years, researchers in diverse areas of mathematics, such as logic and set theory, analysis, topology, and probability theory, have brought to the subject of descriptive set theory their own intuitions, concepts, terminology and notation.
This text presents topos theory as it has developed from the study of sheaves. Sheaves arose in geometry as coefficients for cohomology and as descriptions of the functions appropriate to various kinds of manifolds (algebraic, analytic, etc.). Sheaves also appear in logic as carriers for models of set theory as well as for the semantics of other types of logic. Grothendieck introduced a topos as a category of sheaves for algebraic geometry. Subsequently, Lawvere and Tierney obtained elementary axioms for such (more general) categories. This introduction to topos theory begins with a number of illustrative examples that explain the origin of these ideas and then describes the sheafification process and the properties of an elementary topos. The applications to axiomatic set theory and the use in forcing (the independence of the Continuum Hypothesis and of the Axiom of Choice) are then described. Geometric morphisms (like continuous maps of spaces) and the construction of classifying topoi, for example those related to local rings and simplicial sets, appear next, followed by the use of locales (pointless spaces) and the construction of topoi related to geometric languages and logic. This is the first text to address all of these varied aspects of topos theory at the graduate-student level.
Kurt Goedel (1906-1978) did groundbreaking work that transformed logic and other important aspects of our understanding of mathematics, especially his proof of the incompleteness of formalized arithmetic. This book on different aspects of his work and on subjects in which his ideas have contemporary resonance includes papers from a May 2006 symposium celebrating Goedel's centennial as well as papers from a 2004 symposium. Proof theory, set theory, philosophy of mathematics, and the editing of Goedel's writings are among the topics covered. Several chapters discuss his intellectual development and his relation to predecessors and contemporaries such as Hilbert, Carnap, and Herbrand. Others consider his views on justification in set theory in light of more recent work and contemporary echoes of his incompleteness theorems and the concept of constructible sets.
This handbook with exercises reveals unexpected mathematical beauty in formalisms hitherto mainly used for hardware and software design and verification. The lambda calculus forms a prototype universal programming language, which in its untyped version is related to Lisp, and was treated in the first author's classic The Lambda Calculus (1984). The formalism has since been extended with types and used in functional programming (Haskell, Clean) and in proof assistants (Coq, Isabelle, HOL), which are themselves used in designing and verifying IT products and mathematical proofs. In this book, the authors focus on three classes of typing for lambda terms: simple types, recursive types and intersection types. It is in these three formalisms of terms and types that the unexpected mathematical beauty is revealed. The treatment is authoritative and comprehensive, complemented by an exhaustive bibliography, and numerous exercises are provided to deepen the readers' understanding and increase their confidence in using types.
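A checker for the first of the three typing disciplines above, simple types, fits in a few lines. A sketch only; the constructor names below are illustrative and not the book's notation:

```python
from dataclasses import dataclass

# Types: base types and arrow (function) types.
@dataclass(frozen=True)
class Base:
    name: str

@dataclass(frozen=True)
class Arrow:
    src: object
    tgt: object

# Terms: variables, typed lambda-abstractions, applications.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    var: str
    var_type: object
    body: object

@dataclass(frozen=True)
class App:
    fn: object
    arg: object

def type_of(term, env=None):
    """Return the simple type of a Church-style term, or raise TypeError."""
    env = env or {}
    if isinstance(term, Var):
        return env[term.name]
    if isinstance(term, Lam):
        body_ty = type_of(term.body, {**env, term.var: term.var_type})
        return Arrow(term.var_type, body_ty)
    if isinstance(term, App):
        fn_ty = type_of(term.fn, env)
        arg_ty = type_of(term.arg, env)
        if isinstance(fn_ty, Arrow) and fn_ty.src == arg_ty:
            return fn_ty.tgt
        raise TypeError("ill-typed application")
    raise TypeError("unknown term")

a, b = Base("a"), Base("b")
# \x:a. x  has type  a -> a
identity = Lam("x", a, Var("x"))
print(type_of(identity) == Arrow(a, a))
# \f:a->b. \x:a. f x  has type  (a -> b) -> (a -> b)
apply_term = Lam("f", Arrow(a, b), Lam("x", a, App(Var("f"), Var("x"))))
print(type_of(apply_term) == Arrow(Arrow(a, b), Arrow(a, b)))
```

Recursive and intersection types, the other two systems the book treats, require richer type syntax and subtler checking rules than this sketch covers.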
Is college worth the cost? Should I worry about arsenic in my rice? Can we recycle pollution? Real questions of personal finance, public health, and social policy require sober, data-driven analyses. This unique text provides students with the tools of quantitative reasoning to answer such questions. The text models how to clarify the question, recognize and avoid bias, isolate relevant factors, gather data, and construct numerical analyses for interpretation. Themes and techniques are repeated across chapters, with a progression in mathematical sophistication over the course of the book, which helps the student get comfortable with the process of thinking in numbers. This textbook includes references to source materials and suggested further reading, making it user-friendly for motivated undergraduate students. The many detailed problems and worked solutions in the text and extensive appendices help the reader learn mathematical areas such as algebra, functions, graphs, and probability. End-of-chapter problem material provides practice for students, and suggested projects are provided with each chapter. A solutions manual is available online for instructors.
We welcome Volume 20, Formal Aspects of Context. Context has always been recognised as strongly relevant to models in language, philosophy, logic and artificial intelligence. In recent years theoretical advances in these areas, and especially in logic, have accelerated the study of context in the international community. An annual conference is held, and many researchers have come to realise that many of the old puzzles should be reconsidered with proper attention to context. The volume editors and contributors are from among the most active front-line researchers in the area, and the contents show how wide and vigorous this area is. There are strong scientific connections with earlier volumes in the series. I am confident that the appearance of this book in our series will help secure the study of context as an important area of applied logic. D. M. Gabbay. INTRODUCTION: This book is a result of the First International and Interdisciplinary Conference on Modelling and Using Context, which was organised in Rio de Janeiro in January 1997, and contains a selection of the papers presented there, refereed and revised through a process of anonymous peer review. The treatment of contexts as bona fide objects of logical formalisation has gained wide acceptance in recent years, following the seminal impetus given by McCarthy in his Turing Award address.
This book focuses on one of the major challenges of the newly created scientific domain known as data science: turning data into actionable knowledge in order to exploit increasing data volumes and deal with their inherent complexity. Actionable knowledge has been qualitatively and intensively studied in management, business, and the social sciences but in computer science and engineering, its connection has only recently been established to data mining and its evolution, 'Knowledge Discovery and Data Mining' (KDD). Data mining seeks to extract interesting patterns from data, but, until now, the patterns discovered from data have not always been 'actionable' for decision-makers in Socio-Technical Organizations (STO). With the evolution of the Internet and connectivity, STOs have evolved into Cyber-Physical and Social Systems (CPSS) that are known to describe our world today. In such complex and dynamic environments, the conventional KDD process is insufficient, and additional processes are required to transform complex data into actionable knowledge. Readers are presented with advanced knowledge concepts and the analytics and information fusion (AIF) processes aimed at delivering actionable knowledge. The authors provide an understanding of the concept of 'relation' and its exploitation, relational calculus, as well as the formalization of specific dimensions of knowledge that achieve a semantic growth along the AIF processes. This book serves as an important technical presentation of relational calculus and its application to processing chains in order to generate actionable knowledge. It is ideal for graduate students, researchers, or industry professionals interested in decision science and knowledge engineering.
This is a long-awaited new edition of one of the best known Oxford Logic Guides. The book gives an introduction to intuitionistic mathematics, leading the reader gently through the fundamental mathematical and philosophical concepts. The treatment of various topics, for example Brouwer's proof of the Bar Theorem, valuation systems, and the completeness of intuitionistic first-order logic, have been completely revised.
Logic is often perceived as having little to do with the rest of philosophy, and even less to do with real life. In this lively and accessible introduction, Graham Priest shows how wrong this conception is. He explores the philosophical roots of the subject, explaining how modern formal logic deals with issues ranging from the existence of God and the reality of time to paradoxes of probability and decision theory. Along the way, the basics of formal logic are explained in simple, non-technical terms, showing that logic is a powerful and exciting part of modern philosophy. In this new edition Graham Priest expands his discussion to cover the subjects of algorithms and axioms, and proofs in mathematics. ABOUT THE SERIES: The Very Short Introductions series from Oxford University Press contains hundreds of titles in almost every subject area. These pocket-sized books are the perfect way to get ahead in a new subject quickly. Our expert authors combine facts, analysis, perspective, new ideas, and enthusiasm to make interesting and challenging topics highly readable.
Driven by the question, 'What is the computational content of a (formal) proof?', this book studies fundamental interactions between proof theory and computability. It provides a unique self-contained text for advanced students and researchers in mathematical logic and computer science. Part I covers basic proof theory, computability and Godel's theorems. Part II studies and classifies provable recursion in classical systems, from fragments of Peano arithmetic up to Π¹₁-CA₀. Ordinal analysis and the (Schwichtenberg-Wainer) subrecursive hierarchies play a central role and are used in proving the 'modified finite Ramsey' and 'extended Kruskal' independence results for PA and Π¹₁-CA₀. Part III develops the theoretical underpinnings of the first author's proof assistant MINLOG. Three chapters cover higher-type computability via information systems, a constructive theory TCF of computable functionals, realizability, the Dialectica interpretation, computationally significant quantifiers and connectives, and polytime complexity in a two-sorted, higher-type arithmetic with linear logic.
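The subrecursive hierarchies mentioned above classify provably total functions by growth rate. At finite levels the fast-growing hierarchy can be computed directly; the sketch below uses the standard finite-level definition, not the book's ordinal-indexed version:

```python
def fast_growing(k: int, n: int) -> int:
    """Fast-growing hierarchy at finite levels:
    F_0(n) = n + 1, and F_{k+1}(n) = F_k iterated n times starting at n."""
    if k == 0:
        return n + 1
    result = n
    for _ in range(n):
        result = fast_growing(k - 1, result)
    return result

print(fast_growing(0, 10))  # F_0(10) = 11
print(fast_growing(1, 3))   # F_1(3)  = 2 * 3 = 6
print(fast_growing(2, 3))   # F_2(3)  = 3 * 2**3 = 24
```

Each level iterates the one below, so F_1 doubles, F_2 is exponential, F_3 is already iterated-exponential, and the ordinal-indexed extensions used for ordinal analysis grow far faster still.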
This intention was reinforced by the external circumstance that mathematics students at Munich University increasingly chose logic with me as a minor subject. Since these candidates usually had neither the time nor the opportunity to attend my courses, the understandable wish arose that I should "write something down" that they could take home. Added to this, finally, was an awareness of the didactic shortcomings of many logic books. Most of them treat only particular syntactic and semantic methods. If, for example, one work presents exclusively the axiomatic method, another only natural deduction, and a third only the calculus of positive/negative parts, then even an experienced mathematician finds it hard to see the equivalence of these calculi. If, in addition, the systematizations of the semantics diverge considerably from one another, a non-mathematician will probably even gain the impression that the books in question deal with different subjects altogether. Yet this is only one side of the coin. Conversely, more and more books bearing the word 'logic' in their titles include, in greater or lesser detail, areas that are certainly important for investigations in logic but that lead far beyond the scope of logic itself, such as recursion theory, axiomatic set theory, or Hilbert-style proof theory. Once the boundary is drawn that widely, it is hard to see why much more should not be included as well. Algebraic concepts, for example, play an increasingly important role in logical investigations.
This two-volume work bridges the gap between introductory expositions of logic or set theory on one hand, and the research literature on the other. It can be used as a text in an advanced undergraduate or beginning graduate course in mathematics, computer science, or philosophy. The volumes are written in a user-friendly conversational lecture style that makes them equally effective for self-study or class use. Volume II, on formal (ZFC) set theory, incorporates a self-contained 'chapter 0' on proof techniques so that it is based on formal logic, in the style of Bourbaki. The emphasis on basic techniques will provide the reader with a solid foundation in set theory and provides a context for the presentation of advanced topics such as absoluteness, relative consistency results, two expositions of Godel's constructible universe, numerous ways of viewing recursion, and a chapter on Cohen forcing.
Many systems of quantified modal logic cannot be characterised by Kripke's well-known possible worlds semantic analysis. This book shows how they can be characterised by a more general 'admissible semantics', using models in which there is a restriction on which sets of worlds count as propositions. This requires a new interpretation of quantifiers that takes into account the admissibility of propositions. The author sheds new light on the celebrated Barcan Formula, whose role becomes that of legitimising the Kripkean interpretation of quantification. The theory is worked out for systems with quantifiers ranging over actual objects, and over all possibilia, and for logics with existence and identity predicates and definite descriptions. The final chapter develops a new admissible 'cover semantics' for propositional and quantified relevant logic, adapting ideas from the Kripke Joyal semantics for intuitionistic logic in topos theory. This book is for mathematical or philosophical logicians, computer scientists and linguists.
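For contrast with the admissible semantics the book develops, the standard Kripke evaluation it generalizes takes only a few lines: a box formula holds at a world exactly when its argument holds at every accessible world. A sketch with an illustrative encoding and model, not taken from the book:

```python
def holds(world, formula, relation, valuation):
    """Evaluate a modal formula at a world of a Kripke model.
    Formulas: ("atom", p), ("not", f), ("and", f, g), ("box", f)."""
    op = formula[0]
    if op == "atom":
        return world in valuation.get(formula[1], set())
    if op == "not":
        return not holds(world, formula[1], relation, valuation)
    if op == "and":
        return (holds(world, formula[1], relation, valuation)
                and holds(world, formula[2], relation, valuation))
    if op == "box":
        # Box: true iff the subformula holds at every accessible world.
        return all(holds(v, formula[1], relation, valuation)
                   for v in relation.get(world, set()))
    raise ValueError(op)

# Two worlds: w sees u; the atom p is true only at u.
R = {"w": {"u"}, "u": set()}
V = {"p": {"u"}}
print(holds("w", ("box", ("atom", "p")), R, V))  # True: p holds at every successor of w
print(holds("u", ("box", ("atom", "p")), R, V))  # True vacuously: u sees no worlds
print(holds("w", ("atom", "p"), R, V))           # False: p fails at w
```

The book's admissible semantics restricts which sets of worlds count as propositions; in the plain Kripke setting above, every set of worlds is admissible.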
Written by prominent experts in the field, this monograph provides the first comprehensive and unified presentation of the structural, algorithmic, and applied aspects of the theory of Boolean functions. The book focuses on algebraic representations of Boolean functions, especially disjunctive and conjunctive normal form representations. It presents in this framework the fundamental elements of the theory (Boolean equations and satisfiability problems, prime implicants and associated short representations, dualization), an in-depth study of special classes of Boolean functions (quadratic, Horn, shellable, regular, threshold, read-once functions and their characterization by functional equations), and two fruitful generalizations of the concept of Boolean functions (partially defined functions and pseudo-Boolean functions). Several topics are presented here in book form for the first time. Because of the unique depth and breadth of the unified treatment that it provides and of its emphasis on algorithms and applications, this monograph will have special appeal for researchers and graduate students in discrete mathematics, operations research, computer science, engineering, and economics.
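The disjunctive-normal-form representations central to the book above can be recovered from any truth table by collecting minterms. A small sketch (the function names are illustrative):

```python
from itertools import product

def minterm_dnf(f, n):
    """List the minterms (as 0/1 tuples) of an n-variable Boolean
    function f, i.e. its canonical disjunctive normal form."""
    return [bits for bits in product((0, 1), repeat=n) if f(*bits)]

def eval_dnf(minterms, bits):
    """Evaluate the DNF: true iff the assignment matches some minterm."""
    return tuple(bits) in set(minterms)

# The majority function of three variables.
maj = lambda x, y, z: (x + y + z) >= 2
dnf = minterm_dnf(maj, 3)
print(dnf)  # four minterms: the assignments with at least two 1s
# The DNF agrees with the original function on every assignment.
print(all(eval_dnf(dnf, bits) == bool(maj(*bits))
          for bits in product((0, 1), repeat=3)))
```

The canonical DNF is exponentially larger than necessary in general; finding short representations, prime implicants, and dualizations is exactly where the book's algorithmic theory begins.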
This gentle introduction to logic and model theory is based on a systematic use of three important games in logic: the semantic game, the Ehrenfeucht-Fraïssé game, and the model existence game. The third game has not been isolated in the literature before, but it underlies the concepts of Beth tableaux and consistency properties. Jouko Väänänen shows that these games are closely related and in turn govern the three interrelated concepts of logic: truth, elementary equivalence and proof. All three methods are developed not only for first-order logic but also for infinitary logic and generalized quantifiers. Along the way, the author also proves completeness theorems for many logics, including the cofinality quantifier logic of Shelah, a fully compact extension of first-order logic. With over 500 exercises this book is ideal for graduate courses, covering the basic material as well as more advanced applications.
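The first of the three games, the semantic game, can be sketched for propositional formulas: at a disjunction the Verifier picks a disjunct, at a conjunction the Falsifier picks a conjunct, at a negation the players swap roles, and a formula is true exactly when the Verifier has a winning strategy. A sketch under those standard rules, with an illustrative encoding:

```python
def verifier_wins(formula, valuation):
    """True iff the Verifier has a winning strategy in the semantic game,
    which coincides with ordinary truth under the valuation.
    Formulas: ("atom", p), ("not", f), ("or", f, g), ("and", f, g)."""
    op = formula[0]
    if op == "atom":
        return valuation[formula[1]]
    if op == "not":
        # Role swap: Verifier wins iff she would lose on the subformula.
        return not verifier_wins(formula[1], valuation)
    if op == "or":
        # Verifier moves: she wins if some choice of disjunct is winning.
        return any(verifier_wins(f, valuation) for f in formula[1:])
    if op == "and":
        # Falsifier moves: Verifier wins only if every conjunct is winning.
        return all(verifier_wins(f, valuation) for f in formula[1:])
    raise ValueError(op)

v = {"p": True, "q": False}
print(verifier_wins(("or", ("atom", "p"), ("atom", "q")), v))   # True
print(verifier_wins(("and", ("atom", "p"), ("atom", "q")), v))  # False
print(verifier_wins(("not", ("atom", "q")), v))                 # True
```

For first-order logic the same game adds quantifier moves, with the Verifier choosing witnesses for existential quantifiers and the Falsifier for universal ones.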
Bringing together over twenty years of research, this book gives a complete overview of independence-friendly logic. It emphasizes the game-theoretical approach to logic, according to which logical concepts such as truth and falsity are best understood via the notion of semantic games. The book pushes the paradigm of game-theoretical semantics further than the current literature by showing how mixed strategies and equilibria can be used to analyze independence-friendly formulas on finite models. The book is suitable for graduate students and advanced undergraduates who have taken a course on first-order logic. It contains a primer of the necessary background in game theory, numerous examples and full proofs.
A N Prior has a special place in the history of postwar philosophy for his highly original work at the intersection of logic and metaphysics. His logical innovations have found many applications in the areas of philosophical logic, mathematics, linguistics, and, increasingly, computer science. In addition, he made seminal contributions to debates in metaphysics, particularly on modality and the nature of time. This volume presents a selection of current research in the areas that were of most interest to Prior: temporal and tense logic, modal logic, proof theory, quantification and individuation, and the logic of agency. Both title and contents reflect Prior's view that logic is 'about the real world', and the orientation of the volume is towards the application of logic, in philosophy, computer science, and elsewhere. Following Prior, modal syntax is now widely applied to the formalization of a variety of subject matters, and tense logic has found numerous applications in computing, for example in natural language processing, logical deduction involving time-dependent data, program-verification, and VLSI. A special feature of the volume is the inclusion of three hitherto unpublished pieces by Prior on modal logic and the philosophy of time, along with a complete bibliography of Prior's published philosophical writings.