Welcome to Loot.co.za!
This book is a collection of contributions honouring Arnon Avron's seminal work on the semantics and proof theory of non-classical logics. It includes presentations of advanced work by some of the most esteemed scholars working on semantic and proof-theoretical aspects of computer science logic. Topics in this book include frameworks for paraconsistent reasoning, foundations of relevance logics, analysis and characterizations of modal logics and fuzzy logics, hypersequent calculi and their properties, non-deterministic semantics, algebraic structures for many-valued logics, and representations of the mechanization of mathematics. Avron's foundational and pioneering contributions have been widely acknowledged and adopted by the scientific community. His research interests are very broad, spanning proof theory, automated reasoning, non-classical logics, foundations of mathematics, and applications of logic in computer science and artificial intelligence. This is clearly reflected by the diversity of topics discussed in the chapters included in this book, all of which directly relate to Avron's past and present work. This book is of interest to computer scientists and scholars of formal logic.
This open access book is the first ever collection of Karl Popper's writings on deductive logic. Karl R. Popper (1902-1994) was one of the most influential philosophers of the 20th century. His philosophy of science ("falsificationism") and his social and political philosophy ("open society") have been widely discussed far beyond academic philosophy. What is not so well known is that Popper also produced considerable work on the foundations of deductive logic, most of it published at the end of the 1940s as articles in scattered places. This little-known work deserves to be better known, as it is highly significant for modern proof-theoretic semantics. This collection assembles Popper's published writings on deductive logic in a single volume, together with all reviews of these papers. It also contains a large amount of unpublished material from the Popper Archives, including Popper's correspondence related to deductive logic and manuscripts that were (almost) finished but did not reach the publication stage. All of these items are critically edited with additional comments by the editors. A general introduction puts Popper's work into the context of current discussions on the foundations of logic. This book should be of interest to logicians, philosophers, and anybody concerned with Popper's work.
This book features more than 20 papers that celebrate the work of Hajnal Andreka and Istvan Nemeti. It illustrates an interaction between developing and applying mathematical logic. The papers offer new results as well as surveys in areas influenced by these two outstanding researchers. They also provide details on the after-life of some of their initiatives. Computer science connects the papers in the first part of the book. The second part concentrates on algebraic logic. It features a range of papers that hint at the intricate many-way connections between logic, algebra, and geometry. The third part explores novel applications of logic in relativity theory, philosophy of logic, philosophy of physics and spacetime, and methodology of science. These include such exciting subjects as time travelling in emergent spacetime. The short autobiographies of Hajnal Andreka and Istvan Nemeti at the end of the book describe an adventurous journey: from electrical engineering and Maxwell's equations, to a complex system of computer programs for designing Hungary's electric power system, to exploring Tarskian algebraic logic as the deepest core theory behind such questions and contributing deep results to it, and then on to applications of those results in exciting new areas such as relativity theory, in order to rejuvenate logic itself.
This textbook introduces enumerative combinatorics through the framework of formal languages and bijections. By starting with elementary operations on words and languages, the authors paint an insightful, unified picture for readers entering the field. Numerous concrete examples and illustrative metaphors motivate the theory throughout, while the overall approach illuminates the important connections between discrete mathematics and theoretical computer science. Beginning with the basics of formal languages, the first chapter quickly establishes a common setting for modeling and counting classical combinatorial objects and constructing bijective proofs. From here, topics are modular and offer substantial flexibility when designing a course. Chapters on generating functions and partitions build further fundamental tools for enumeration and include applications such as a combinatorial proof of the Lagrange inversion formula. Connections to linear algebra emerge in chapters studying Cayley trees, determinantal formulas, and the combinatorics that lies behind the classical Cayley-Hamilton theorem. The remaining chapters range across the Inclusion-Exclusion Principle, graph theory and coloring, exponential structures, and matching and distinct representatives, with each topic opening many doors to further study. Generous exercise sets complement all chapters, and miscellaneous sections explore additional applications. Lessons in Enumerative Combinatorics captures the authors' distinctive style and flair for introducing newcomers to combinatorics. The conversational yet rigorous presentation suits students in mathematics and computer science at the graduate or advanced undergraduate level. Knowledge of single-variable calculus and the basics of discrete mathematics is assumed; familiarity with linear algebra will enhance the study of certain chapters.
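The generating-function toolkit this blurb mentions can be tasted in a few lines. The sketch below is an illustrative example assumed by the editor, not material from the book: multiplying truncated power series (plain polynomial convolution) extracts the coefficient of x^n in 1/((1-x)(1-x^2)), which counts partitions of n into parts of size 1 and 2.

```python
def poly_mul(a, b, N):
    """Multiply two coefficient lists, truncating at degree N."""
    out = [0] * (N + 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j <= N:
                out[i + j] += ai * bj
    return out

N = 10
geom1 = [1] * (N + 1)                                    # 1/(1-x): parts of size 1
geom2 = [1 if k % 2 == 0 else 0 for k in range(N + 1)]   # 1/(1-x^2): parts of size 2
coeffs = poly_mul(geom1, geom2, N)

# The coefficient of x^n counts partitions of n into 1s and 2s,
# i.e. floor(n/2) + 1 choices for the number of 2s.
assert coeffs == [n // 2 + 1 for n in range(N + 1)]
```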
This textbook introduces the representation theory of algebras by focusing on two of its most important aspects: the Auslander-Reiten theory and the study of the radical of a module category. It starts by introducing and describing several characterisations of the radical of a module category, then presents the central concepts of irreducible morphisms and almost split sequences, before providing the definition of the Auslander-Reiten quiver, which encodes much of the information on the module category. It then turns to the study of endomorphism algebras, leading on one hand to the definition of the Auslander algebra and on the other to tilting theory. The book ends with selected properties of representation-finite algebras, which are now the best understood class of algebras. Intended for graduate students in representation theory, this book is also of interest to any mathematician wanting to learn the fundamentals of this rapidly growing field. A graduate course in non-commutative or homological algebra, which is standard in most universities, is a prerequisite for readers of this book.
This monograph (based very largely upon results original to its Czechoslovak authors) presents an abstract account of the theory of automata for sophisticated readers presumed to be already conversant with the language of category theory. The seven chapters are punctuated at frequent intervals by examples.
In this book, Paulo Guilherme Santos studies diagonalization in formal mathematics from logical aspects to everyday mathematics. He starts with a study of the diagonalization lemma and its relation to the strong diagonalization lemma. After that, Yablo's paradox is examined, and a self-referential interpretation is given. From that, a general structure of diagonalization with paradoxes is presented. Finally, the author studies a general theory of diagonalization with the help of examples from mathematics.
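The diagonal method at the heart of this book can be illustrated in miniature. The Cantor-style sketch below is the editor's illustration, not material from the text: given any enumeration of 0/1 sequences, it builds a sequence that disagrees with the n-th sequence at position n, and hence appears nowhere in the enumeration.

```python
def diagonalize(sequences):
    """Return a sequence d (as a function) with d(n) != sequences[n](n) for every n."""
    return lambda n: 1 - sequences[n](n)

# An example enumeration of five sequences: sequence i is constantly i mod 2.
# (The i=i default argument pins down the loop variable in each lambda.)
seqs = [lambda n, i=i: i % 2 for i in range(5)]

d = diagonalize(seqs)
# d differs from every enumerated sequence on the diagonal.
assert all(d(n) != seqs[n](n) for n in range(5))
```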
Parameterized Complexity in the Polynomial Hierarchy was co-recipient of the E.W. Beth Dissertation Prize 2017 for outstanding dissertations in the fields of logic, language, and information. This work extends the theory of parameterized complexity to higher levels of the Polynomial Hierarchy (PH). For problems at higher levels of the PH, a promising solving approach is to develop fixed-parameter tractable reductions to SAT, and to subsequently use a SAT solving algorithm to solve the problem. In this dissertation, a theoretical toolbox is developed that can be used to classify in which cases this is possible. The use of this toolbox is illustrated by applying it to analyze a wide range of problems from various areas of computer science and artificial intelligence.
This book is dedicated to the life and work of the mathematician Joachim Lambek (1922-2014). The editors gather together noted experts to discuss the state of the art in the various areas of Lambek's work in logic, category theory, and linguistics, and to celebrate his contributions to those areas over the course of his multifaceted career. After early work in combinatorics and elementary number theory, Lambek became a distinguished algebraist (notably in ring theory). In the 1960s, he began to work in category theory, categorical algebra, logic, proof theory, and foundations of computability. In a parallel development, beginning in the late 1950s and for the rest of his career, Lambek also worked extensively in mathematical linguistics and computational approaches to natural languages. He and his collaborators perfected production and type grammars for numerous natural languages. Lambek grammars form an early noncommutative precursor to Girard's linear logic. In a surprising development (2000), he introduced a novel and deeper algebraic framework (which he called pregroup grammars) for analyzing natural language, along with algebraic, higher category, and proof-theoretic semantics. This book is of interest to mathematicians, logicians, linguists, and computer scientists.
This book is a collection of selected papers presented at the International Conference on Semigroups and Applications, held at the Cochin University of Science and Technology, India, from December 9-12, 2019. It discusses recent developments in semigroup theory and category theory and their applications in various areas of research, including the structure theory of semigroups, lattices, rings, and partial algebras. It presents chapters on ordering orders and quotient rings, block groups and Hall's relations, quotients of the Booleanization of an inverse semigroup, Markov chains through semigroup graph expansions, polycyclic inverse monoids and the Thompson group, and balanced categories and bundle categories. This book will be of much value to researchers working in areas of semigroup and operator theory.
The aim of this book is to provide a unified exposition of the theory of symmetric designs with emphasis on recent developments. The authors cover the combinatorial aspects of the theory giving particular attention to the construction of symmetric designs and related objects. The last five chapters of the book are devoted to balanced generalized weighing matrices, decomposable symmetric designs, subdesigns of symmetric designs, non-embeddable quasi-residual designs, and Ryser designs. Most results in these chapters have never previously appeared in book form. The book concludes with a comprehensive bibliography of over 400 entries. Researchers in all areas of combinatorial designs, including coding theory and finite geometries, will find much of interest here. Detailed proofs and a large number of exercises make this book suitable as a text for an advanced course in combinatorial designs.
This book, the third in the four-volume series in algebra, deals with important topics in homological algebra, including the abstract theory of derived functors, sheaf cohomology, and an introduction to etale and l-adic cohomology. It contains four chapters which discuss homology theory in an abelian category together with some important and fundamental applications in geometry, topology, algebraic geometry (including basics in abstract algebraic geometry), and group theory. The book will be of value to graduate and advanced undergraduate students specializing in any branch of mathematics. The author has tried to make the book self-contained by introducing the relevant concepts and results required. Prerequisite knowledge of the basics of algebra, linear algebra, topology, and calculus of several variables will be useful.
Professor Atiyah is one of the greatest living mathematicians and is well known throughout the mathematical world. He is a recipient of the Fields Medal, the mathematical equivalent of the Nobel Prize, and is still at the peak of his career. His huge number of published papers, focusing on the areas of algebraic geometry and topology, have here been collected into six volumes, divided thematically for easy reference by individuals interested in a particular subject. Volumes III and IV cover papers written in 1963-84 and are the result of a long collaboration with I. M. Singer on the Index Theory of elliptic operators.
This exploration of a selection of fundamental topics and general-purpose tools provides a roadmap for undergraduate students who yearn for a deeper dive into many of the concepts and ideas they have been encountering in their classes, whether their motivation is pure curiosity or preparation for graduate studies. The topics intersect a wide range of areas encompassing both pure and applied mathematics. The emphasis and style of the book are motivated by the goal of developing self-reliance and independent mathematical thought. Mathematics requires both intuition and common sense as well as rigorous, formal argumentation. This book attempts to showcase both, simultaneously encouraging readers to develop their own insights and understanding and to adopt proof-writing skills. The most satisfying proofs/arguments are fully rigorous and completely intuitive at the same time.
This volume is a collection of essays in honour of Professor Mohammad Ardeshir. It examines topics which, in one way or another, are connected to the various aspects of his multidisciplinary research interests. Based on this criterion, the book is divided into three general categories. The first category includes papers on non-classical logics, including intuitionistic logic, constructive logic, basic logic, and substructural logic. The second category is made up of papers discussing issues in the contemporary philosophy of mathematics and logic. The third category contains papers on Avicenna's logic and philosophy. Mohammad Ardeshir is a full professor of mathematical logic at the Department of Mathematical Sciences, Sharif University of Technology, Tehran, Iran, where he has taught generations of students for around a quarter of a century. Mohammad Ardeshir is known first and foremost for his prominent works in basic logic and constructive mathematics. His areas of interest are, however, much broader and include topics in the intuitionistic philosophy of mathematics and the Arabic philosophy of logic and mathematics. In addition to numerous research articles in leading international journals, Ardeshir is the author of a highly praised Persian textbook in mathematical logic. Partly through his writings and translations, the school of mathematical intuitionism was introduced to the Iranian academic community.
This book is the first systematic treatment of an area whose results have so far been scattered across a vast number of articles. As in classical topology, concrete problems require restricting the (generalized, point-free) spaces by various conditions playing the roles of the classical separation axioms. These are typically formulated in the language of points; in the point-free context one has either suitable translations, parallels, or satisfactory replacements. The interrelations of separation-type conditions, their merits, advantages and disadvantages, and consequences are discussed. Highlights of the book include a treatment of the merits and consequences of subfitness, various approaches to the Hausdorff axiom, and normality-type axioms. The global treatment of the separation conditions puts them in a new perspective and, among other things, gives some of them unexpected importance. The text contains many quite recent results; the reader will see the directions the area is taking and may find inspiration for his or her further work. The book will be of use to researchers already active in the area, but also to those interested in this growing field (which sometimes even penetrates into parts of theoretical computer science), to graduate and PhD students, and to others. For the reader's convenience, the text is supplemented with an appendix containing the necessary background on posets, frames, and locales.
The purpose of the Reasoning Web Summer School is to disseminate recent advances on reasoning techniques and related issues that are of particular interest to Semantic Web and Linked Data applications. It is primarily intended for postgraduate students, postdocs, young researchers, and senior researchers wishing to deepen their knowledge. As in previous years, lectures in the summer school were given by a distinguished group of expert lecturers. The broad theme of this year's summer school was again "Declarative Artificial Intelligence" and it covered various aspects of ontological reasoning and related issues that are of particular interest to Semantic Web and Linked Data applications. The following eight lectures were presented during the school: Foundations of Graph Path Query Languages; On Combining Ontologies and Rules; Modelling Symbolic Knowledge Using Neural Representations; Mining the Semantic Web with Machine Learning: Main Issues That Need to Be Known; Temporal ASP: From Logical Foundations to Practical Use with telingo; A Review of SHACL: From Data Validation to Schema Reasoning for RDF Graphs; and Score-Based Explanations in Data Management and Machine Learning.
This book considers the important twentieth-century Austrian philosopher Ludwig Wittgenstein and his conception of certainty. In his work entitled On Certainty, Wittgenstein provides not only a brilliant solution to a previously intractable philosophical problem, but also the elements of an entirely new way of approaching this and similar longstanding, apparently unresolvable, problems. In On Certainty, he re-conceives the problem of radical skepticism (the claim that we can never really be certain of anything except the contents of our own minds) as a kind of philosophical "disease" of thought. His approach to the problem, which the book emphasizes is similar to the treatment of a disease, has two main goals: (1) to bring about an awareness in the philosopher that this kind of extreme skepticism is not a methodological approach to be taken seriously, and, with this awareness, (2) to attempt to replace this radical skepticism with a practical, Common Sense framework. Implicit in Wittgenstein's approach are a number of strategies found in a contemporary approach to psychotherapy known as Cognitive Behavioral Therapy (CBT). These strategies, along with philosophical methods and scientific practices rooted in the Scottish School of Common Sense, seek to diagnose and treat irrational thoughts and beliefs that often emerge (and re-emerge) in the discipline of philosophy. The aim of this book, then, is to provide students of philosophy with the tools necessary to adjust and reshape these irrational, self-defeating thoughts and beliefs into something new, something healthy.
This book constitutes the proceedings of the 23rd International Conference on Verification, Model Checking, and Abstract Interpretation, VMCAI 2022, which took place in Philadelphia, PA, USA, in January 2022. The 22 papers presented in this volume were carefully reviewed and selected from 48 submissions. VMCAI provides a forum for researchers working on verification, model checking, and abstract interpretation and facilitates interaction, cross-fertilization, and advancement of hybrid methods that combine these and related areas.
This monograph is devoted to a new class of non-commutative rings, skew Poincare-Birkhoff-Witt (PBW) extensions. Beginning with the basic definitions and ring-module theoretic/homological properties, it goes on to investigate finitely generated projective modules over skew PBW extensions from a matrix point of view. To make this theory constructive, the theory of Groebner bases of left (right) ideals and modules for bijective skew PBW extensions is developed. For example, syzygies and the Ext and Tor modules over these rings are computed. Finally, applications to some key topics in the noncommutative algebraic geometry of quantum algebras are given, including an investigation of semi-graded Koszul algebras and semi-graded Artin-Schelter regular algebras, and the noncommutative Zariski cancellation problem. The book is addressed to researchers in noncommutative algebra and algebraic geometry as well as to graduate students and advanced undergraduate students.
The core of Volume 3 consists of lecture notes for seven sets of lectures Hilbert gave (often in collaboration with Bernays) on the foundations of mathematics between 1917 and 1926. These texts make possible for the first time a detailed reconstruction of the rapid development of Hilbert's foundational thought during this period, and show the increasing dominance of the metamathematical perspective in his logical work: the emergence of modern mathematical logic; the explicit raising of questions of completeness, consistency and decidability for logical systems; the investigation of the relative strengths of various logical calculi; the birth and evolution of proof theory, and the parallel emergence of Hilbert's finitist standpoint. The lecture notes are accompanied by numerous supplementary documents, both published and unpublished, including a complete version of Bernays's "Habilitationsschrift" of 1918, the text of the first edition of Hilbert and Ackermann's "Grundzuge der theoretischen Logik" (1928), and several shorter lectures by Hilbert from the later 1920s. These documents, which provide the background to Hilbert and Bernays's monumental "Grundlagen der Mathematik" (1934, 1938), are essential for understanding the development of modern mathematical logic, and for reconstructing the interactions between Hilbert, Bernays, Brouwer, and Weyl in the philosophy of mathematics.
This research monograph develops the theory of relative nonhomogeneous Koszul duality. Koszul duality is a fundamental phenomenon in homological algebra and related areas of mathematics, such as algebraic topology, algebraic geometry, and representation theory, and is a popular subject of contemporary research. This book, written by one of the world's leading experts in the area, includes the homogeneous and nonhomogeneous quadratic duality theory over a nonsemisimple, noncommutative base ring, the Poincare-Birkhoff-Witt theorem generalized to this context, and triangulated equivalences between suitable exotic derived categories of modules, curved DG comodules, and curved DG contramodules. The thematic example, meaning the classical duality between the ring of differential operators and the de Rham DG algebra of differential forms, involves some of the most important objects of study in contemporary algebraic and differential geometry. For the first time in the history of Koszul duality, the derived D-Ω duality is included in a general framework. Examples highly relevant for algebraic and differential geometry are discussed in detail.
This book constitutes the refereed proceedings of the 4th International Workshop and Tutorial, FMTea 2021, held as part of the 4th World Congress on Formal Methods, FM 2021, as a virtual event in November 2021. The 8 full papers presented together with 2 short papers were carefully reviewed and selected from 12 submissions. The papers are organized in topical sections named: experiences and proposals related with online FM learning and teaching, integrating/embedding FM teaching/thinking within other computer science courses, teaching FM for industry, and innovative learning and teaching methods for FM.
This book offers an original and informative view of the development of fundamental concepts of computability theory. The treatment is put into historical context, emphasizing the motivation for ideas as well as their logical and formal development. In Part I the author introduces computability theory, with chapters on the foundational crisis of mathematics in the early twentieth century, and formalism. In Part II he explains classical computability theory, with chapters on the quest for formalization, the Turing Machine, and early successes such as defining incomputable problems, c.e. (computably enumerable) sets, and developing methods for proving incomputability. In Part III he explains relative computability, with chapters on computation with external help, degrees of unsolvability, the Turing hierarchy of unsolvability, the class of degrees of unsolvability, c.e. degrees and the priority method, and the arithmetical hierarchy. Finally, in the new Part IV the author revisits the computability (Church-Turing) thesis in greater detail. He offers a systematic and detailed account of its origins, evolution, and meaning, he describes more powerful, modern versions of the thesis, and he discusses recent speculative proposals for new computing paradigms such as hypercomputing. This is a gentle introduction from the origins of computability theory up to current research, and it will be of value as a textbook and guide for advanced undergraduate and graduate students and researchers in the domains of computability theory and theoretical computer science. This new edition is completely revised, with almost one hundred pages of new material. In particular, the author has applied more up-to-date, more consistent terminology and addressed some notational redundancies and minor errors. He has developed a glossary relating to computability theory, expanded the bibliographic references with new entries, and added the new part described above as well as other new sections.
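The Turing Machine model central to Part II can be sketched in a few lines. The simulator and the unary-successor machine below are the editor's illustration under simple assumptions (one-way-infinite behavior not needed, blank written as "_"), not material from the book: the transition table maps (state, symbol) to (new state, written symbol, head move).

```python
def run_tm(delta, tape, state="q0", blank="_", max_steps=1000):
    """Run a one-tape Turing machine until it enters the state 'halt'."""
    cells = dict(enumerate(tape))  # sparse tape: index -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        sym = cells.get(head, blank)
        state, cells[head], move = delta[(state, sym)]
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: append one '1' to a unary string (the successor function).
delta = {
    ("q0", "1"): ("q0", "1", "R"),    # scan right over the 1s
    ("q0", "_"): ("halt", "1", "R"),  # write a 1 on the first blank, then halt
}
assert run_tm(delta, "111") == "1111"
```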