This book is dedicated to the life and work of the mathematician Joachim Lambek (1922-2014). The editors gather noted experts to discuss the state of the art in the various areas of Lambek's work in logic, category theory, and linguistics, and to celebrate his contributions to those areas over the course of his multifaceted career. After early work in combinatorics and elementary number theory, Lambek became a distinguished algebraist (notably in ring theory). In the 1960s, he began to work in category theory, categorical algebra, logic, proof theory, and foundations of computability. In a parallel development, beginning in the late 1950s and continuing for the rest of his career, Lambek also worked extensively in mathematical linguistics and computational approaches to natural languages. He and his collaborators perfected production and type grammars for numerous natural languages. Lambek grammars form an early noncommutative precursor to Girard's linear logic. In a surprising development (2000), he introduced a novel and deeper algebraic framework, which he called pregroup grammars, for analyzing natural language, along with algebraic, higher category, and proof-theoretic semantics. This book is of interest to mathematicians, logicians, linguists, and computer scientists.
General concepts and methods that occur throughout mathematics, and now also in theoretical computer science, are the subject of this book. It is a thorough introduction to Categories, emphasizing the geometric nature of the subject and explaining its connections to mathematical logic. The book should appeal to the inquisitive reader who has seen some basic topology and algebra and would like to learn and explore further.
This book gathers selected contributions presented at the 3rd Moroccan Andalusian Meeting on Algebras and their Applications, held in Chefchaouen, Morocco, on April 12-14, 2018. The contributions reflect the mathematical collaboration between southern European and North African countries, mainly France, Spain, Morocco, Tunisia and Senegal. The book is divided into three parts and features contributions from the following fields: algebraic and analytic methods in associative and non-associative structures; homological and categorical methods in algebra; and the history of mathematics. Covering topics such as rings and algebras, representation theory, number theory, operator algebras, category theory, group theory and information theory, it opens up new avenues of study for graduate students and young researchers. The findings presented also appeal to anyone interested in the fields of algebra and mathematical analysis.
This edited volume presents a fascinating collection of lecture notes focusing on differential equations from two viewpoints: formal calculus (through the theory of Groebner bases) and geometry (via quiver theory). Groebner bases serve as effective models for computation in algebras of various types. Although the theory of Groebner bases was developed in the second half of the 20th century, many works on computational methods in algebra were published well before the introduction of the modern algebraic language. Since then, new algorithms have been developed and the theory itself has greatly expanded. In comparison, diagrammatic methods in representation theory are relatively new, with the quiver varieties only being introduced - with big impact - in the 1990s. Divided into two parts, the book first discusses the theory of Groebner bases in their commutative and noncommutative contexts, with a focus on algorithmic aspects and applications of Groebner bases to analysis on systems of partial differential equations, effective analysis on rings of differential operators, and homological algebra. It then introduces representations of quivers, quiver varieties and their applications to the moduli spaces of meromorphic connections on the complex projective line. While no particular reader background is assumed, the book is intended for graduate students in mathematics, engineering and related fields, as well as researchers and scholars.
These two volumes cover the principal approaches to constructivism in mathematics. They present a thorough, up-to-date introduction to the metamathematics of constructive mathematics, paying special attention to Intuitionism, Markov's constructivism and Martin-Löf's type theory with its operational semantics. A detailed exposition of the basic features of constructive mathematics, with illustrations from analysis, algebra and topology, is provided, with due attention to the metamathematical aspects. Volume 1 is a self-contained introduction to the practice and foundations of constructivism, and does not require specialized knowledge beyond basic mathematical logic. Volume 2 contains mainly advanced topics of a proof-theoretical and semantical nature.
This book outlines a vast array of techniques and methods regarding model categories, without focussing on the intricacies of the proofs. Quillen model categories are a fundamental tool for the understanding of homotopy theory. While many introductions to model categories fall back on the same handful of canonical examples, the present book highlights a large, self-contained collection of other examples which appear throughout the literature. In particular, it collects a highly scattered literature into a single volume. The book is aimed at anyone who uses, or is interested in using, model categories to study homotopy theory. It is written in such a way that it can be used as a reference guide for those who are already experts in the field. However, it can also be used as an introduction to the theory for novices.
The philosophy of computer science is concerned with issues that arise from reflection upon the nature and practice of the discipline of computer science. This book presents an approach to the subject that is centered upon the notion of computational artefact. It provides an analysis of the things of computer science as technical artefacts. Seeing them in this way enables the application of the analytical tools and concepts from the philosophy of technology to the technical artefacts of computer science. With this conceptual framework the author examines some of the central philosophical concerns of computer science including the foundations of semantics, the logical role of specification, the nature of correctness, computational ontology and abstraction, formal methods, computational epistemology and explanation, the methodology of computer science, and the nature of computation. The book will be of value to philosophers and computer scientists.
This volume is the first ever collection devoted to the field of proof-theoretic semantics. Contributions address topics including the systematics of introduction and elimination rules and proofs of normalization, the categorial characterization of deductions, the relation between Heyting's and Gentzen's approaches to meaning, knowability paradoxes, proof-theoretic foundations of set theory, Dummett's justification of logical laws, Kreisel's theory of constructions, paradoxical reasoning, and the defence of model theory. The field of proof-theoretic semantics has existed for almost 50 years, but the term itself was proposed by Schroeder-Heister in the 1980s. Proof-theoretic semantics explains the meaning of linguistic expressions in general and of logical constants in particular in terms of the notion of proof. This volume emerges from presentations at the Second International Conference on Proof-Theoretic Semantics in Tübingen in 2013, where contributing authors were asked to provide a self-contained description and analysis of a significant research question in this area. The contributions are representative of the field and should be of interest to logicians, philosophers, and mathematicians alike.
The book is primarily intended as a textbook on modern algebra for undergraduate mathematics students. It is also useful for those who are interested in supplementary reading at a higher level. The text is designed in such a way that it encourages independent thinking and motivates students towards further study. The book covers all major topics in group, ring, vector space and module theory that are usually contained in a standard modern algebra text. In addition, it studies semigroup, group action, Hopf's group, topological groups and Lie groups with their actions, applications of ring theory to algebraic geometry, and defines Zariski topology, as well as applications of module theory to structure theory of rings and homological algebra. Algebraic aspects of classical number theory and algebraic number theory are also discussed with an eye to developing modern cryptography. Topics on applications to algebraic topology, category theory, algebraic geometry, algebraic number theory, cryptography and theoretical computer science interlink the subject with different areas. Each chapter discusses individual topics, starting from the basics, with the help of illustrative examples. This comprehensive text with a broad variety of concepts, applications, examples, exercises and historical notes represents a valuable and unique resource.
This book focuses on the game-theoretical semantics and epistemic logic of Jaakko Hintikka. Hintikka was a prodigious and esteemed philosopher and logician, and his death in August 2015 was a huge loss to the philosophical community. This book, whose chapters have been in preparation for several years, is dedicated to the work of Jaakko Hintikka and to his memory. This edited volume consists of 23 contributions from leading logicians and philosophers, who discuss themes that span the entire range of Hintikka's career. Semantic Representationalism, Logical Dialogues, Knowledge and Epistemic Logic are among the topics covered in the book's chapters. The book should appeal to students, scholars and teachers who wish to explore the philosophy of Jaakko Hintikka.
This book examines the philosophical conception of abductive reasoning as developed by Charles S. Peirce, the founder of American pragmatism. It explores the historical and systematic connections of Peirce's original ideas and debates about their interpretations. Abduction is understood in a broad sense which covers the discovery and pursuit of hypotheses and inference to the best explanation. The analysis presents fresh insights into this notion of reasoning, which proceeds from effects to causes or from surprising observations to explanatory theories. The author outlines some logical and AI approaches to abduction, and studies various kinds of inverse problems in astronomy, physics, medicine, biology, and the human sciences to provide examples of retroductions and abductions. The discussion also covers everyday examples, including the role of this notion in detective stories, one of Peirce's own favorite themes. The author uses Bayesian probabilities to argue that explanatory abduction is a method of confirmation. He uses his own account of truth approximation to reformulate abduction as an inference that leads to the truthlikeness of its conclusion. This allows a powerful abductive defense of scientific realism. This up-to-date survey and defense of the Peircean view of abduction may very well help researchers, students, and philosophers better understand the logic of truth-seeking.
This book questions the relevance of computation to the physical universe. Our theories deliver computational descriptions, but the gaps and discontinuities in our grasp suggest a need for continued discourse between researchers from different disciplines, and this book is unique in its focus on the mathematical theory of incomputability and its relevance for the real world. The core of the book consists of thirteen chapters in five parts on extended models of computation; the search for natural examples of incomputable objects; mind, matter, and computation; the nature of information, complexity, and randomness; and the mathematics of emergence and morphogenesis. This book will be of interest to researchers in the areas of theoretical computer science, mathematical logic, and philosophy.
Stephen Cole Kleene was one of the greatest logicians of the twentieth century, and this book is the influential textbook he wrote to teach the subject to the next generation. It was first published in 1952, some twenty years after the publication of Gödel's paper on the incompleteness of arithmetic, which marked, if not the beginning of modern logic, at least a turning point after which "nothing was ever the same." Kleene was an important figure in logic, and lived a long, full life of scholarship and teaching. The 1930s was a time of creativity and ferment in the subject, when the notion of "computable" moved from the realm of philosophical speculation to the realm of science. This was accomplished by the work of Kurt Gödel, Alan Turing, and Alonzo Church, who gave three apparently different precise definitions of "computable". When they all turned out to be equivalent, there was a collective realization that this was indeed the "right" notion. Kleene played a key role in this process. One could say that he was "there at the beginning" of modern logic. He showed the equivalence of lambda calculus with Turing machines and with Gödel's recursion equations, and developed the modern machinery of partial recursive functions. This textbook played an invaluable part in the logical education of the present generation of logicians.
This book provides a critical examination of how the choice of what to believe is represented in the standard model of belief change. In particular the use of possible worlds and infinite remainders as objects of choice is critically examined. Descriptors are introduced as a versatile tool for expressing the success conditions of belief change, addressing both local and global descriptor revision. The book presents dynamic descriptors such as Ramsey descriptors that convey how an agent's beliefs tend to be changed in response to different inputs. It also explores sentential revision and demonstrates how local and global operations of revision by a sentence can be derived as a special case of descriptor revision. Lastly, the book examines revocation, a generalization of contraction in which a specified sentence is removed in a process that may possibly also involve the addition of some new information to the belief set.
In a fragment entitled Elementa Nova Matheseos Universalis (1683?) Leibniz writes "the mathesis [...] shall deliver the method through which things that are conceivable can be exactly determined"; in another fragment he takes the mathesis to be "the science of all things that are conceivable." Leibniz considers all mathematical disciplines as branches of the mathesis and conceives the mathesis as a general science of forms applicable not only to magnitudes but to every object that exists in our imagination, i.e. that is possible at least in principle. As a general science of forms, the mathesis investigates possible relations between "arbitrary objects" ("objets quelconques"). It is an abstract theory of combinations and relations among objects whatsoever. In 1810 the mathematician and philosopher Bernard Bolzano published a booklet entitled Contributions to a Better-Grounded Presentation of Mathematics. There is, according to him, a certain objective connection among the truths that are germane to a certain homogeneous field of objects: some truths are the "reasons" ("Gründe") of others, and the latter are "consequences" ("Folgen") of the former. The reason-consequence relation seems to be the counterpart of causality at the level of a relation between true propositions. A rigorous proof is characterized in this context as a proof that shows the reason of the proposition that is to be proven. The requirements imposed on rigorous proofs seem to anticipate normalization results in current proof theory. The contributors to Mathesis Universalis, Computability and Proof, leading experts in the fields of computer science, mathematics, logic and philosophy, show the evolution of these and related ideas, exploring topics in proof theory, computability theory, intuitionistic logic, constructivism and reverse mathematics, and delving deeply into a contextual examination of the relationship between mathematical rigor and demands for simplification.
This volume offers a wide range of both reconstructions of Nikolai Vasiliev's original logical ideas and their implementations in modern logic and philosophy. A collection of works put together through the international workshop "Nikolai Vasiliev's Logical Legacy and the Modern Logic," this book also covers the foundations of logic in the light of Vasiliev's contradictory ontology. Chapters range from a look at the Heuristic and Conceptual Background of Vasiliev's Imaginary Logic to Generalized Vasiliev-style Propositions. It includes works which cover Imaginary and Non-Aristotelian Logics, Inconsistent Set Theory and the Expansion of Mathematical Thinking, Plurivalent Logic, and the Impact of Vasiliev's Imaginary Logic on Epistemic Logic. The Russian logician Vasiliev was widely recognized as one of the forerunners of modern non-classical logic. His "imaginary logic", developed in some of his work at the beginning of the 20th century, is often considered to be one of the first systems of paraconsistent and multi-valued logic. The novelty of his logical project has opened up prospects for modern logic as well as for non-classical science in general. This volume contains a selection of papers written by modern specialists in the field and deals with various aspects of Vasiliev's logical ideas. The logical legacy of Nikolai Vasiliev can serve as a promising source for developing an impressive range of philosophical interpretations, as it marries promising technical innovations with challenging philosophical insights.
The book has two parts. In the first, after a review of some seminal classical accounts of laws and explanations, a new account is proposed for distinguishing between laws and accidental generalizations (LAG). Among the new consequences of this proposal, it is proved that any explanation of a contingent generalization shows that the generalization is not accidental. The second part involves physical theories, their modality, and their explanatory power. In particular, it is shown that (1) each theory has a theoretical implication structure associated with it, such that there are new physical modal operators on these structures and also special modal entities that are in these structures; a special subset of the physical modals, the nomic modals, are associated with the laws of theories. (2) The familiar idea that theories always explain laws by deduction of them has to be seriously modified in light of the fact that there is a host of physical theories (including, for example, Newtonian classical mechanics, Hamiltonian and Lagrangian theory, and probability theory) that we believe are schematic (they do not have any truth value). Nevertheless, we think that there is a kind of non-deductive explanation and generality that they achieve by subsumption under a schema.
A comprehensive one-year graduate (or advanced undergraduate) course in mathematical logic and foundations of mathematics. No previous knowledge of logic is required; the book is suitable for self-study. Many exercises (with hints) are included.
This monograph provides a self-contained and easy-to-read introduction to non-commutative multiple-valued logic algebras, a subject which has attracted much interest in the past few years because of its impact on information science, artificial intelligence and other subjects.
This book is a source of valuable and useful information on the topics of dynamics of number systems and scientific computation with arbitrary precision. It is addressed to scholars, scientists and engineers, and graduate students. The treatment is elementary and self-contained with relevance both for theory and applications. The basic prerequisite of the book is linear algebra and matrix calculus.
This book features survey and research papers from The Abel Symposium 2011: Algebras, quivers and representations, held in Balestrand, Norway, in 2011. It examines a very active research area that has had a growing influence and profound impact on many other areas of mathematics, such as commutative algebra, algebraic geometry, algebraic groups and combinatorics. This volume illustrates and extends such connections with algebraic geometry, cluster algebra theory, commutative algebra, dynamical systems and triangulated categories. In addition, it includes contributions on further developments in representation theory of quivers and algebras. "Algebras, Quivers and Representations" is targeted at researchers and graduate students in algebra, representation theory and triangulated categories.
This book deals with the problem of finding suitable languages that can represent specific classes of Petri nets, the most studied and widely accepted model for distributed systems. Hence, the contribution of this book amounts to the alphabetization of some classes of distributed systems. The book also suggests the need for a generalization of Turing computability theory. It is important for graduate students and researchers engaged with the concurrent semantics of distributed communicating systems. The author assumes some prior knowledge of formal languages and theoretical computer science.