This book introduces new models based on R-calculus and theories of belief revision for dealing with large and changing data. It extends R-calculus from first-order logic to propositional logic, description logics, modal logic and logic programming, and from minimal change semantics to subset minimal change, pseudo-subformula minimal change and deduction-based minimal change (the last two minimal changes are newly defined), and it proves soundness and completeness theorems with respect to these minimal changes in the corresponding logics. To make R-calculus computable, an approximate R-calculus is given which uses the finite injury priority method from recursion theory. Moreover, two applications of R-calculus are given, to default theory and to semantic inheritance networks. This book offers a rich blend of theory and practice. It is suitable for students, researchers and practitioners in the field of logic, and it will also be very useful to anyone interested in data, digitization, and the correctness and consistency of information, as well as in modal logics, non-monotonic logics, decidable/undecidable logics, logic programming, description logics, default logics and semantic inheritance networks.
Algebraic theories, introduced as a concept in the 1960s, have been a fundamental step towards a categorical view of general algebra. Moreover, they have proved very useful in various areas of mathematics and computer science. This carefully developed book gives a systematic introduction to algebra based on algebraic theories that is accessible to both graduate students and researchers. It will facilitate interactions of general algebra, category theory and computer science. A central concept is that of sifted colimits - that is, those commuting with finite products in sets. The authors prove the duality between algebraic categories and algebraic theories and discuss Morita equivalence between algebraic theories. They also pay special attention to one-sorted algebraic theories and the corresponding concrete algebraic categories over sets, and to S-sorted algebraic theories, which are important in program semantics. The final chapter is devoted to finitary localizations of algebraic categories, a recent research area.
This monograph presents a general theory of weakly implicative logics, a family covering a vast number of non-classical logics studied in the literature, concentrating mainly on the abstract study of the relationship between logics and their algebraic semantics. It can also serve as an introduction to (abstract) algebraic logic, both propositional and first-order, with special attention paid to the role of implication, lattice and residuated connectives, and generalized disjunctions. Based on their recent work, the authors develop a powerful uniform framework for the study of non-classical logics. In a self-contained and didactic style, starting from very elementary notions, they build a general theory with a substantial number of abstract results. The theory is then applied to obtain numerous results for prominent families of logics and their algebraic counterparts, in particular for superintuitionistic, modal, substructural, fuzzy, and relevant logics. The book may be of interest to a wide audience, especially students and scholars in the fields of mathematics, philosophy, computer science, or related areas, looking for an introduction to a general theory of non-classical logics and their algebraic semantics.
This book provides an introduction to some key subjects in algebra and topology. It consists of the comprehensive texts of several short courses on the preliminaries for advanced theories in (categorical) algebra and topology. Presentations of this kind are not so easy to find in the literature, where articles often begin by assuming a lot of knowledge of the field. This volume can help young researchers get into the subject quickly by offering a kind of "roadmap", and it can also help master's students become aware of the basics of other research directions in these fields before deciding to specialize in one of them. Furthermore, it can be used by established researchers who need a particular result for their own research and do not want to go through several research papers in order to understand a single proof. Although the chapters can be read as self-contained texts, the authors have tried to coordinate them so that they complement one another. The seven chapters of this volume correspond to the seven courses taught in two Summer Schools that took place in Louvain-la-Neuve within the framework of the project Fonds d'Appui a l'Internationalisation of the Universite catholique de Louvain, set up to strengthen collaborations with the universities of Coimbra, Padova and Poitiers within the Coimbra Group.
This book presents a new nominalistic philosophy of mathematics: semantic conventionalism. Its central thesis is that mathematics should be founded on the human ability to create language - and specifically, the ability to institute conventions for the truth conditions of sentences. This philosophical stance leads to an alternative way of practicing mathematics: instead of "building" objects out of sets, a mathematician should introduce new syntactical sentence types, together with their truth conditions, as he or she develops a theory. Semantic conventionalism is justified first through criticism of Cantorian set theory, intuitionism, logicism, and predicativism; then on its own terms; and finally, exemplified by a detailed reconstruction of arithmetic and real analysis. Also included is a simple solution to the liar paradox and the other paradoxes that have traditionally been recognized as semantic. And since it is argued that mathematics is semantics, this solution also applies to Russell's paradox and the other mathematical paradoxes of self-reference. In addition to philosophers who care about the metaphysics and epistemology of mathematics or the paradoxes of self-reference, this book should appeal to mathematicians interested in alternative approaches.
This book is dedicated to the work of Alasdair Urquhart. It starts with an introduction to and overview of Urquhart's work, and an autobiographical essay by Urquhart. This introductory section is followed by papers on algebraic logic and lattice theory, papers on the complexity of proofs, and papers on philosophical logic and the history of logic. The final section of the book contains a response to the papers by Urquhart. Alasdair Urquhart has made extremely important contributions to a variety of fields in logic. He produced some of the earliest work on the semantics of relevant logic. He proved the undecidability of the logics R (of relevant implication) and E (of relevant entailment), as well as of some of their close neighbors, and he proved that interpolation fails in some of those systems. Urquhart has also done very important work in complexity theory, concerning the complexity of proofs in both classical and some nonclassical logics. In pure algebra, he has produced a representation theorem for lattices and some rather beautiful duality theorems. In addition, he has done important work in the history of logic, especially on Bertrand Russell, including editing Volume 4 of Russell's Collected Papers.
If you have ever wondered what quaternions are - then look no further: John Vince will show you how simple and useful they are. This 2nd edition has been completely revised and includes extra detail on the invention of quaternions, a complete review of the text and equations, colour versions of all figures, extra worked examples, an expanded index, and a bibliography arranged by chapter. Quaternions for Computer Graphics includes chapters on number sets and algebra, imaginary and complex numbers, the complex plane, rotation transforms, and a comprehensive description of quaternions in the context of rotation. The book will appeal to students of computer graphics, computer science and mathematics, as well as programmers, researchers, academics and professional practitioners interested in learning about quaternions. John Vince explains, in easy-to-understand language and with the aid of useful figures, how quaternions emerged, gave birth to modern vector analysis, disappeared, and re-emerged to be adopted by the flight simulation industry and computer graphics. This book will give you the confidence to use quaternions within your everyday mathematics and to explore more advanced texts.
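As a taste of the kind of material the book covers (a minimal sketch of my own, not code from the book; the helper names and the axis-angle example are illustrative assumptions), rotating a 3D point with a unit quaternion looks like this in Python:

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    ax, ay, az = axis
    n = math.sqrt(ax * ax + ay * ay + az * az)
    s = math.sin(angle / 2) / n
    return (math.cos(angle / 2), ax * s, ay * s, az * s)

def quat_mul(q, r):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(point, q):
    """Rotate a 3D point by the unit quaternion q, computing q * p * q_conjugate."""
    w, x, y, z = q
    q_conj = (w, -x, -y, -z)
    _, rx, ry, rz = quat_mul(quat_mul(q, (0.0, *point)), q_conj)
    return (rx, ry, rz)

# Rotating (1, 0, 0) by 90 degrees about the z-axis gives approximately (0, 1, 0).
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(rotate((1, 0, 0), q))
```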
This book describes some basic principles that allow developers of computer programs (computer scientists, software engineers, programmers) to think clearly about the artifacts they deal with in their daily work: data types, programming languages, programs written in these languages that compute wanted outputs from given inputs, and programs that describe continuously executing systems. The core message is that clear thinking about programs can be expressed in a single universal language, the formal language of logic. Apart from its universal elegance and expressiveness, this "logical" approach to the formal modeling of and reasoning about computer programs has another advantage: due to advances in computational logic (automated theorem proving, satisfiability solving, model checking), much of this process can nowadays be supported by software. The book therefore accompanies its theoretical elaborations with practical demonstrations of various systems and tools that are based on, or make use of, the presented logical underpinnings.
Automata theory lies at the foundation of computer science and is vital to a theoretical understanding of how computers work and what constitutes formal methods. This treatise gives a rigorous account of the topic and illuminates its real meaning by looking at the subject in a variety of ways. The first part of the book is organised around the notions of rationality and recognisability. The second part deals with relations between words realised by finite automata, which not only exemplifies automata theory but also illustrates the variety of its methods and its fields of application. Many exercises are included, ranging from those that test the reader, to those that establish technical results, to those that extend ideas presented in the text. Solutions or answers to many of these are included in the book.
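To give a flavour of recognisability, the central notion of the first part, here is a minimal sketch (my own illustration, not drawn from the book) of a deterministic finite automaton recognising the words over {a, b} that contain an even number of a's:

```python
# A deterministic finite automaton given by its transition table, start state and
# accepting states; it recognises words over {'a', 'b'} with an even number of 'a's.
TRANSITIONS = {
    ("even", "a"): "odd",  ("even", "b"): "even",
    ("odd",  "a"): "even", ("odd",  "b"): "odd",
}

def accepts(word, start="even", accepting=frozenset({"even"})):
    state = start
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state in accepting

print(accepts("abba"))  # True: two 'a's
print(accepts("ab"))    # False: one 'a'
```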
This book is a concise, self-contained, up-to-date introduction to extremal combinatorics for nonspecialists. There is a strong emphasis on theorems with particularly elegant and informative proofs; they may be called gems of the theory. The author presents a wide spectrum of the most powerful combinatorial tools together with impressive applications in computer science: methods of extremal set theory, the linear algebra method, the probabilistic method, and fragments of Ramsey theory. No special knowledge of combinatorics or computer science is assumed - the text is self-contained and the proofs can be enjoyed by undergraduate students in mathematics and computer science. Over 300 exercises of varying difficulty, with hints to their solution, complete the text. This second edition has been extended with substantial new material, and has been revised and updated throughout. It offers three new chapters on expander graphs and eigenvalues, the polynomial method, and error-correcting codes. Most of the remaining chapters also include new material, such as the Kruskal-Katona theorem on shadows, the Lovász-Stein theorem on coverings, large cliques in dense graphs without induced 4-cycles, a new lower-bound argument for monotone formulas, Dvir's solution of the finite field Kakeya conjecture, Moser's algorithmic version of the Lovász Local Lemma, Schöning's algorithm for 3-SAT, the Szemerédi-Trotter theorem on the number of point-line incidences, surprising applications of expander graphs in extremal number theory, and some other new results.
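One of the newly added topics, Schöning's randomised algorithm for 3-SAT, is simple enough to sketch here. The following toy Python version (my own illustration with a DIMACS-style clause encoding, not the book's presentation) restarts from random assignments and, within each restart, performs up to 3n steps that each flip a variable chosen from an unsatisfied clause:

```python
import random

def schoening_3sat(clauses, n_vars, tries=200):
    """Toy version of Schoening's random-walk algorithm for 3-SAT.
    A clause is a list of non-zero ints: literal v means variable |v| is True,
    -v means it is False. Returns a satisfying assignment or None."""
    def unsatisfied(assignment):
        return [c for c in clauses
                if not any((lit > 0) == assignment[abs(lit)] for lit in c)]

    for _ in range(tries):
        assignment = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(3 * n_vars):                     # 3n local steps per restart
            unsat = unsatisfied(assignment)
            if not unsat:
                return assignment
            lit = random.choice(random.choice(unsat))   # pick a variable from a bad clause
            assignment[abs(lit)] = not assignment[abs(lit)]
        if not unsatisfied(assignment):
            return assignment
    return None

# (x1 or x2 or not x3) and (not x1 or x3 or x2) and (not x2 or x3 or x1)
print(schoening_3sat([[1, 2, -3], [-1, 3, 2], [-2, 3, 1]], n_vars=3))
```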
Logic programming has emerged over the last five years as one of the most promising new programming paradigms and as a very active research area. The PROLOG experience has shown that relevant problems in areas such as expert systems, deductive databases, knowledge representation, and rapid prototyping can profitably be tackled by logic programming technology. It has also shown that the performance of PROLOG systems can compare with that of more traditional programming languages, thanks to sophisticated optimization and to the implementation of a new class of languages: the concurrent logic languages. Many recent advances in the theory of logic programs are related to extensions of the basic positive logic language and the related semantic problems. The original non-monotonic negation-as-failure rule has been extended in various ways and provided with new declarative characterizations. Other new language constructs are constraints (which lead to a very important extension of the paradigm, allowing computation over new domains), concurrency, and modules and objects. This book, written by a team of international experts, goes beyond the classical theory to discuss these recent advances for the first time in a systematic form. The work is intended for advanced students of computer science, logic programming and artificial intelligence.
This book provides a hands-on introduction to runtime verification which guides the reader from zero to the practical knowledge required to consider and apply it in industry. It starts with almost no assumptions about the knowledge of the reader and provides exercises throughout, through which the reader builds their own runtime verification tool. All that is required are basic programming skills and a good working knowledge of the object-oriented paradigm, ideally Java. Drawing on years of the authors' real-world experience, the reader progresses from manually writing runtime verification code to instrumenting monitoring code using aspect-oriented programming, after which they explore increasing levels of specification abstraction: automata, regular expressions, and linear-time temporal logic. A range of other topics is also explored in the book, including real-time properties, concerns of efficiency and persistence, integration with testing, and architectural considerations. The book is written for graduate students specializing in software engineering as well as for industry professionals who need an introduction to the topic of runtime verification. While the book focuses on underlying foundations and practical techniques, it additionally provides, for each chapter, a reading list in the appendix for the interested reader who would like to deepen their knowledge of a particular area.
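To make the idea concrete, a runtime monitor is essentially a small state machine fed one program event at a time. The sketch below (my own Python illustration with an invented "no read after close" property; the book itself works in a Java setting) shows the general shape of such a monitor:

```python
# A tiny runtime monitor: a state machine that observes a stream of events and
# flags violations of the property "no 'read' event may occur after a 'close' event".
class NoReadAfterCloseMonitor:
    def __init__(self):
        self.closed = False
        self.violated = False

    def step(self, event):
        """Feed one observed event ('open', 'read' or 'close') to the monitor."""
        if event == "close":
            self.closed = True
        elif event == "read" and self.closed:
            self.violated = True
        return not self.violated    # False signals that the property has been violated

monitor = NoReadAfterCloseMonitor()
trace = ["open", "read", "close", "read"]   # the final 'read' violates the property
print([monitor.step(e) for e in trace])     # [True, True, True, False]
```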
This open access book is the first ever collection of Karl Popper's writings on deductive logic. Karl R. Popper (1902-1994) was one of the most influential philosophers of the 20th century. His philosophy of science ("falsificationism") and his social and political philosophy ("open society") have been widely discussed far beyond academic philosophy. What is not so well known is that Popper also produced a considerable body of work on the foundations of deductive logic, most of it published at the end of the 1940s as articles in scattered places. This little-known work deserves to be better known, as it is highly significant for modern proof-theoretic semantics. This collection assembles Popper's published writings on deductive logic in a single volume, together with all reviews of these papers. It also contains a large amount of unpublished material from the Popper Archives, including Popper's correspondence related to deductive logic and manuscripts that were (almost) finished but did not reach the publication stage. All of these items are critically edited, with additional comments by the editors. A general introduction puts Popper's work into the context of current discussions on the foundations of logic. This book should be of interest to logicians, philosophers, and anybody concerned with Popper's work.
This book creates a conceptual schema that establishes a correlation between Epistemology and Epistemic Logic. It connects both fields and offers a proper theoretical foundation for the contemporary developments of Epistemic Logic regarding the dynamics of information. It builds a bridge between the view of Awareness Justification Internalism and a dynamic approach to Awareness Logic. The book starts with an introduction to the main topics in Epistemic Logic and Epistemology and reviews the disconnection between the two fields. It analyses three core notions representing the basic structure of the conceptual schema: "Epistemic Awareness", "Knowledge" and "Justification". Next, it presents the Explicit Aware Knowledge (EAK) Schema, using a diagram of three ellipses to illustrate the schema, and a formal model based on a neighbourhood-model structure, which shows one concrete application of the EAK-Schema in a logical structure. The book ends with conclusions and final remarks about the uses and applications of the EAK-Schema. It shows that the most important feature of the schema is that it serves both as a theoretical correlate to the dynamic extensions of Awareness Logic, providing it with a philosophical background, and as an abstract conceptual structure for a re-interpretation of Epistemology.
A significant number of works have, over the past decades, set forth the emphasis laid by seventeenth-century mathematicians and philosophers on motion and kinematic notions in geometry. These works demonstrated the crucial role attributed in this context to genetic definitions, which state the mode of generation of geometrical objects instead of their essential properties. While the growing importance of genetic definitions in sixteenth-century commentaries on Euclid's Elements has been underlined, the place, uses and status of motion in this geometrical tradition have, however, never been thoroughly and comprehensively studied. This book therefore undertakes to fill a gap in the history of early modern geometry and philosophy of mathematics by investigating the different treatments of motion and genetic definitions by seven major sixteenth-century commentators on Euclid's Elements, from Oronce Fine (1494-1555) to Christoph Clavius (1538-1612), including Jacques Peletier (1517-1582), John Dee (1527-1608/1609) and Henry Billingsley (d. 1606), among others. By investigating the ontological and epistemological conceptions underlying the introduction and uses of kinematic notions in their interpretation of Euclidean geometry, this study displays the richness of the conceptual framework, philosophical and mathematical, inherent in the sixteenth-century Euclidean tradition, and shows how it contributed to a more generalised acceptance and promotion of kinematic approaches to geometry in the early modern period.
This volume provides a unified and accessible account of recent developments regarding the real homotopy type of configuration spaces of manifolds. Configuration spaces consist of collections of pairwise distinct points in a given manifold, the study of which is a classical topic in algebraic topology. One of this theory's most important questions concerns homotopy invariance: if a manifold can be continuously deformed into another one, then can the configuration spaces of the first manifold be continuously deformed into the configuration spaces of the second? This conjecture remains open for simply connected closed manifolds. Here, it is proved in characteristic zero (i.e. restricted to algebrotopological invariants with real coefficients), using ideas from the theory of operads. A generalization to manifolds with boundary is then considered. Based on the work of Campos, Ducoulombier, Lambrechts, Willwacher, and the author, the book covers a vast array of topics, including rational homotopy theory, compactifications, PA forms, propagators, Kontsevich integrals, and graph complexes, and will be of interest to a wide audience.
This book presents a collection of recent research on topics related to Pythagorean fuzzy sets, dealing with dynamic and complex decision-making problems. It provides a wide range of theoretical and practical information on the latest research on Pythagorean fuzzy sets, allowing readers to gain an extensive understanding of both fundamentals and applications. It aims at solving various decision-making problems, such as medical diagnosis, pattern recognition, construction problems, technology selection, and more, under the Pythagorean fuzzy environment, making it of much value to students, researchers, and professionals associated with the field.
This book is a collection of contributions honouring Arnon Avron's seminal work on the semantics and proof theory of non-classical logics. It includes presentations of advanced work by some of the most esteemed scholars working on semantic and proof-theoretical aspects of computer science logic. Topics in this book include frameworks for paraconsistent reasoning, foundations of relevance logics, analysis and characterizations of modal logics and fuzzy logics, hypersequent calculi and their properties, non-deterministic semantics, algebraic structures for many-valued logics, and representations of the mechanization of mathematics. Avron's foundational and pioneering contributions have been widely acknowledged and adopted by the scientific community. His research interests are very broad, spanning proof theory, automated reasoning, non-classical logics, foundations of mathematics, and applications of logic in computer science and artificial intelligence. This is clearly reflected in the diversity of topics discussed in the chapters included in this book, all of which directly relate to Avron's past and present work. This book is of interest to computer scientists and scholars of formal logic.
For a brief time in history, it was possible to imagine that a sufficiently advanced intellect could, given sufficient time and resources, in principle understand how to mathematically prove everything that was true. Such an intellect could discern which mathematics corresponds to physical laws, and use those laws to predict anything that happens before it happens. That time has passed. Goedel's undecidability results (the incompleteness theorems), Turing's proof of the existence of non-computable values, the formulation of quantum theory, chaos, and other developments over the past century have shown that there are rigorous arguments limiting what we can prove, compute, and predict. While some connections between these results have come to light, many remain obscure, and the implications are unclear. Are there, for example, real consequences for physics - including quantum mechanics - of undecidability and non-computability? Are there implications for our understanding of the relations between agency, intelligence, mind, and the physical world? This book, based on the winning essays from the annual FQXi competition, contains ten explorations of Undecidability, Uncomputability, and Unpredictability. The contributions abound with connections, implications, and speculations while undertaking rigorous but bold and open-minded investigation of the meaning of these constraints for the physical world, and for us as humans.
This easy-to-understand textbook introduces the mathematical language and problem-solving tools essential to anyone wishing to enter the world of computer and information sciences. Specifically designed for the student who is intimidated by mathematics, the book offers a concise treatment in an engaging style. The thoroughly revised third edition features a new chapter on relevance-sensitivity in logical reasoning and many additional explanations on points that students find puzzling, including the rationale for various shorthand ways of speaking and 'abuses of language' that are convenient but can give rise to misunderstandings. Solutions are now also provided for all exercises. Topics and features: presents an intuitive approach, emphasizing how finite mathematics supplies a valuable language for thinking about computation; discusses sets and the mathematical objects built with them, such as relations and functions, as well as recursion and induction; introduces core topics of mathematics, including combinatorics and finite probability, along with the structures known as trees; examines propositional and quantificational logic, how to build complex proofs from simple ones, and how to ensure relevance in logic; addresses questions that students find puzzling but may have difficulty articulating, through entertaining conversations between Alice and the Mad Hatter; provides an extensive set of solved exercises throughout the text. This clearly-written textbook offers invaluable guidance to students beginning an undergraduate degree in computer science. The coverage is also suitable for courses on formal methods offered to those studying mathematics, philosophy, linguistics, economics, and political science. Assuming only minimal mathematical background, it is ideal for both the classroom and independent study.
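As a small illustration of the propositional-logic material such a course covers (my own example, not taken from the book), a tautology can be checked mechanically by enumerating all truth assignments:

```python
from itertools import product

def is_tautology(formula, n_vars):
    """Brute-force truth-table check: `formula` is a function of n_vars booleans."""
    return all(formula(*values) for values in product([False, True], repeat=n_vars))

# Modus ponens as a formula: ((p -> q) and p) -> q, writing p -> q as (not p) or q.
print(is_tautology(lambda p, q: (not ((not p or q) and p)) or q, 2))  # True
```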
This book features more than 20 papers that celebrate the work of Hajnal Andreka and Istvan Nemeti. It illustrates the interaction between developing and applying mathematical logic. The papers offer new results as well as surveys in areas influenced by these two outstanding researchers, and they also provide details on the afterlife of some of their initiatives. Computer science connects the papers in the first part of the book. The second part concentrates on algebraic logic, featuring a range of papers that hint at the intricate many-way connections between logic, algebra, and geometry. The third part explores novel applications of logic in relativity theory, philosophy of logic, philosophy of physics and spacetime, and methodology of science; these include such exciting subjects as time travelling in emergent spacetime. The short autobiographies of Hajnal Andreka and Istvan Nemeti at the end of the book describe an adventurous journey: from electrical engineering and Maxwell's equations, to a complex system of computer programs for designing Hungary's electric power system, to exploring and contributing deep results to Tarskian algebraic logic as the deepest core theory of such questions, and then on to applications of the results in exciting new areas such as relativity theory, in order to rejuvenate logic itself.
This textbook introduces enumerative combinatorics through the framework of formal languages and bijections. By starting with elementary operations on words and languages, the authors paint an insightful, unified picture for readers entering the field. Numerous concrete examples and illustrative metaphors motivate the theory throughout, while the overall approach illuminates the important connections between discrete mathematics and theoretical computer science. Beginning with the basics of formal languages, the first chapter quickly establishes a common setting for modeling and counting classical combinatorial objects and constructing bijective proofs. From here, topics are modular and offer substantial flexibility when designing a course. Chapters on generating functions and partitions build further fundamental tools for enumeration and include applications such as a combinatorial proof of the Lagrange inversion formula. Connections to linear algebra emerge in chapters studying Cayley trees, determinantal formulas, and the combinatorics that lies behind the classical Cayley-Hamilton theorem. The remaining chapters range across the Inclusion-Exclusion Principle, graph theory and coloring, exponential structures, matching and distinct representatives, with each topic opening many doors to further study. Generous exercise sets complement all chapters, and miscellaneous sections explore additional applications. Lessons in Enumerative Combinatorics captures the authors' distinctive style and flair for introducing newcomers to combinatorics. The conversational yet rigorous presentation suits students in mathematics and computer science at the graduate or advanced undergraduate level. Knowledge of single-variable calculus and the basics of discrete mathematics is assumed; familiarity with linear algebra will enhance the study of certain chapters.
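For instance (a toy computation of my own, in the spirit of the generating-function chapters rather than an excerpt from them), the number of integer partitions of n can be read off as a coefficient of the product of the geometric series 1/(1 - x^k), truncated at degree n:

```python
def partition_counts(n):
    """Numbers of integer partitions of 0..n, computed from the generating function
    prod_{k >= 1} 1/(1 - x^k) by truncated polynomial arithmetic."""
    coeffs = [1] + [0] * n              # the constant series 1
    for k in range(1, n + 1):           # multiply by 1/(1 - x^k) = 1 + x^k + x^(2k) + ...
        for i in range(k, n + 1):
            coeffs[i] += coeffs[i - k]  # coeffs now counts partitions with parts <= k
    return coeffs

print(partition_counts(10))  # [1, 1, 2, 3, 5, 7, 11, 15, 22, 30, 42]
```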
This book gives a proof of Cherlin's conjecture for finite binary primitive permutation groups. Motivated by the part of model theory concerned with Lachlan's theory of finite homogeneous relational structures, this conjecture proposes a classification of those finite primitive permutation groups that have relational complexity equal to 2. The first part gives a full introduction to Cherlin's conjecture, including all the key ideas that have been used in the literature to prove some of its special cases. The second part completes the proof by dealing with primitive permutation groups that are almost simple with socle a group of Lie type. A great deal of material concerning properties of primitive permutation groups and almost simple groups is included, and new ideas are introduced. Addressing a hot topic which cuts across the disciplines of group theory, model theory and logic, this book will be of interest to a wide range of readers. It will be particularly useful for graduate students and researchers who need to work with simple groups of Lie type.
This textbook introduces the representation theory of algebras by focusing on two of its most important aspects: the Auslander-Reiten theory and the study of the radical of a module category. It starts by introducing and describing several characterisations of the radical of a module category, then presents the central concepts of irreducible morphisms and almost split sequences, before providing the definition of the Auslander-Reiten quiver, which encodes much of the information on the module category. It then turns to the study of endomorphism algebras, leading on one hand to the definition of the Auslander algebra and on the other to tilting theory. The book ends with selected properties of representation-finite algebras, which are now the best understood class of algebras. Intended for graduate students in representation theory, this book is also of interest to any mathematician wanting to learn the fundamentals of this rapidly growing field. A graduate course in non-commutative or homological algebra, which is standard in most universities, is a prerequisite for readers of this book.
This monograph (based very largely upon results original to the Czechoslovakian authors) presents an abstract account of the theory of automata for sophisticated readers presumed to be already conversant in the language of category theory. The seven chapters are punctuated at frequent intervals by examples.