This book provides a general survey of the main concepts, questions and results that have been developed in the recent interactions between quantum information, quantum computation and logic. Divided into 10 chapters, the book starts with an introduction to the main concepts of the quantum-theoretic formalism used in quantum information. It then gives a synthetic presentation of the main "mathematical characters" of the quantum computational game: qubits, quregisters, mixtures of quregisters, and quantum logical gates. Next, the book investigates the puzzling entanglement phenomena, logically analyses the Einstein-Podolsky-Rosen paradox, and introduces the reader to quantum computational logics and new forms of quantum logic. The middle chapters investigate the possibility of a quantum computational semantics for a language that can express sentences like "Alice knows that everybody knows that she is pretty", explore the mathematical concept of the quantum Turing machine, and illustrate some characteristic examples that arise in the framework of musical languages. The book concludes with an analysis of recent discussions and contains a Mathematical Appendix surveying the definitions of all the main mathematical concepts used in the book.
This book collects a series of contributions addressing the various contexts in which the theory of Lie groups is applied. A preliminary chapter serves the reader both as a basic reference source and as an ongoing thread that runs through the subsequent chapters. From representation theory and Gerstenhaber algebras to control theory, from differential equations to Finsler geometry and Lepage manifolds, the book introduces young researchers in Mathematics to a wealth of different topics, encouraging a multidisciplinary approach to research. As such, it is suitable for students in doctoral courses, and will also benefit researchers who want to expand their field of interest.
This book addresses mechanisms for reducing model heterogeneity induced by the absence of explicit semantics expression in the formal techniques used to specify design models. More precisely, it highlights the advances in handling both implicit and explicit semantics in formal system developments, and discusses different contributions expressing different views and perceptions of implicit and explicit semantics. The book is based on the discussions at the Shonan meeting on this topic held in 2016, and includes contributions from the participants summarising their perspectives on the problem and offering solutions. The book is divided into five parts: domain modelling, knowledge-based modelling, proof-based modelling, assurance cases, and refinement-based modelling. It offers inspiration for researchers and practitioners in the fields of formal methods, system and software engineering, domain knowledge modelling, requirement analysis, and explicit and implicit semantics of modelling languages.
This book presents an intuitive picture-oriented approach to the formative processes technique and to its applications. In the first part the authors introduce basic set-theoretic terminology and properties, the decision problem in set theory, and formative processes. The second part of the book is devoted to applications of the technique of formative processes to decision problems. All chapters contain exercises and the book is appropriate for researchers and graduate students in the area of computer science logic.
This volume presents essays by pioneering thinkers including Tyler Burge, Gregory Chaitin, Daniel Dennett, Barry Mazur, Nicholas Humphrey, John Searle and Ian Stewart. Together they illuminate the Map/Territory Distinction that lies at the foundation of the scientific method, of thought, and of reality itself. It is imperative to distinguish the map from the territory when analyzing any subject, yet we often mistake the map for the territory: meaning for reference, the computational tool for what it computes. Representations are so handy and tempting that we often end up committing the category error of conflating the representation with what is represented, so much so that the distinction between the two is lost. This error, which has its roots in pedagogy, generates a plethora of paradoxes and confusions that hinder a proper understanding of the subject. What are wave functions? Fields? Forces? Numbers? Sets? Classes? Operators? Functions? Alphabets and sentences? Are they part of our map (theory/representation)? Or do they actually belong to the territory (reality)? The researcher, like a cartographer, clothes (or creates?) reality by stitching together multitudes of maps that simultaneously co-exist. A simple apple, for example, can be analyzed from several viewpoints, beginning with evolution and biology and going all the way down to its microscopic quantum mechanical components. Is there a reality (or a real apple) out there apart from these maps? How do these various maps interact and intermingle with each other to produce the coherent reality that we interact with? Or do they not? Does our brain use its own internal maps to enable the "physicist/mathematician" in us to construct maps of the external territories in turn? If so, what is the nature of these internal maps? Are there meta-maps? Evolution certainly fences in our perception, and thereby our ability to construct maps, revealing to us only those aspects beneficial for our survival.
But the question is, to what extent? Is there a way out of the metaphorical Platonic cave that nature has erected around us? While "the map is not the territory", as Alfred Korzybski remarked, join us in this journey as we inquire into the nature and reality of the maps that try to map the reality out there. The book also includes a foreword by Sir Roger Penrose and an afterword by Dagfinn Follesdal.
This book is dedicated to the life and work of the mathematician Joachim Lambek (1922-2014). The editors gather together noted experts to discuss the state of the art in the various areas of Lambek's work in logic, category theory, and linguistics, and to celebrate his contributions to those areas over the course of his multifaceted career. After early work in combinatorics and elementary number theory, Lambek became a distinguished algebraist (notably in ring theory). In the 1960s, he began to work in category theory, categorical algebra, logic, proof theory, and the foundations of computability. In a parallel development, beginning in the late 1950s and continuing for the rest of his career, Lambek also worked extensively in mathematical linguistics and computational approaches to natural languages. He and his collaborators perfected production and type grammars for numerous natural languages. Lambek grammars form an early noncommutative precursor to Girard's linear logic. In a surprising development (2000), he introduced a novel and deeper algebraic framework (which he called pregroup grammars) for analyzing natural language, along with algebraic, higher category, and proof-theoretic semantics. This book is of interest to mathematicians, logicians, linguists, and computer scientists.
This book gathers together selected contributions presented at the 3rd Moroccan Andalusian Meeting on Algebras and their Applications, held in Chefchaouen, Morocco, on April 12-14, 2018, and reflects the mathematical collaboration between southern European and northern African countries, mainly France, Spain, Morocco, Tunisia and Senegal. The book is divided into three parts and features contributions from the following fields: algebraic and analytic methods in associative and non-associative structures; homological and categorical methods in algebra; and history of mathematics. Covering topics such as rings and algebras, representation theory, number theory, operator algebras, category theory, group theory and information theory, it opens up new avenues of study for graduate students and young researchers. The findings presented also appeal to anyone interested in the fields of algebra and mathematical analysis.
This edited volume presents a fascinating collection of lecture notes focusing on differential equations from two viewpoints: formal calculus (through the theory of Groebner bases) and geometry (via quiver theory). Groebner bases serve as effective models for computation in algebras of various types. Although the theory of Groebner bases was developed in the second half of the 20th century, many works on computational methods in algebra were published well before the introduction of the modern algebraic language. Since then, new algorithms have been developed and the theory itself has greatly expanded. In comparison, diagrammatic methods in representation theory are relatively new, with the quiver varieties only being introduced - with big impact - in the 1990s. Divided into two parts, the book first discusses the theory of Groebner bases in their commutative and noncommutative contexts, with a focus on algorithmic aspects and applications of Groebner bases to analysis on systems of partial differential equations, effective analysis on rings of differential operators, and homological algebra. It then introduces representations of quivers, quiver varieties and their applications to the moduli spaces of meromorphic connections on the complex projective line. While no particular reader background is assumed, the book is intended for graduate students in mathematics, engineering and related fields, as well as researchers and scholars.
This book outlines a vast array of techniques and methods regarding model categories, without focussing on the intricacies of the proofs. Quillen model categories are a fundamental tool for the understanding of homotopy theory. While many introductions to model categories fall back on the same handful of canonical examples, the present book highlights a large, self-contained collection of other examples which appear throughout the literature. In particular, it collects a highly scattered literature into a single volume. The book is aimed at anyone who uses, or is interested in using, model categories to study homotopy theory. It is written in such a way that it can be used as a reference guide for those who are already experts in the field. However, it can also be used as an introduction to the theory for novices.
The collected works of Turing, including a substantial amount of unpublished material, will comprise four volumes: Mechanical Intelligence, Pure Mathematics, Morphogenesis and Mathematical Logic. Alan Mathison Turing (1912-1954) was a brilliant man who made major contributions in several areas of science. Today his name is mentioned frequently in philosophical discussions about the nature of artificial intelligence. He was in fact a pioneering researcher in computer architecture and software engineering; his work in pure mathematics and mathematical logic extended considerably further, and his last work, on morphogenesis in plants, is also acknowledged as being of the greatest originality and of permanent importance. He was one of the leading figures of twentieth-century science, a fact which would have been known to the general public sooner but for the British Official Secrets Act, which prevented discussion of his wartime work. What is perhaps surprising about these papers is that, although they were written decades ago, they address major issues which concern researchers today.
The philosophy of computer science is concerned with issues that arise from reflection upon the nature and practice of the discipline of computer science. This book presents an approach to the subject that is centered upon the notion of computational artefact. It provides an analysis of the things of computer science as technical artefacts. Seeing them in this way enables the application of the analytical tools and concepts from the philosophy of technology to the technical artefacts of computer science. With this conceptual framework the author examines some of the central philosophical concerns of computer science including the foundations of semantics, the logical role of specification, the nature of correctness, computational ontology and abstraction, formal methods, computational epistemology and explanation, the methodology of computer science, and the nature of computation. The book will be of value to philosophers and computer scientists.
This volume is the first ever collection devoted to the field of proof-theoretic semantics. Contributions address topics including the systematics of introduction and elimination rules and proofs of normalization, the categorial characterization of deductions, the relation between Heyting's and Gentzen's approaches to meaning, knowability paradoxes, proof-theoretic foundations of set theory, Dummett's justification of logical laws, Kreisel's theory of constructions, paradoxical reasoning, and the defence of model theory. The field of proof-theoretic semantics has existed for almost 50 years, but the term itself was proposed by Schroeder-Heister in the 1980s. Proof-theoretic semantics explains the meaning of linguistic expressions in general and of logical constants in particular in terms of the notion of proof. This volume emerges from presentations at the Second International Conference on Proof-Theoretic Semantics in Tübingen in 2013, where contributing authors were asked to provide a self-contained description and analysis of a significant research question in this area. The contributions are representative of the field and should be of interest to logicians, philosophers, and mathematicians alike.
The book is primarily intended as a textbook on modern algebra for undergraduate mathematics students. It is also useful for those who are interested in supplementary reading at a higher level. The text is designed in such a way that it encourages independent thinking and motivates students towards further study. The book covers all major topics in group, ring, vector space and module theory that are usually contained in a standard modern algebra text. In addition, it studies semigroup, group action, Hopf's group, topological groups and Lie groups with their actions, applications of ring theory to algebraic geometry, and defines Zariski topology, as well as applications of module theory to structure theory of rings and homological algebra. Algebraic aspects of classical number theory and algebraic number theory are also discussed with an eye to developing modern cryptography. Topics on applications to algebraic topology, category theory, algebraic geometry, algebraic number theory, cryptography and theoretical computer science interlink the subject with different areas. Each chapter discusses individual topics, starting from the basics, with the help of illustrative examples. This comprehensive text with a broad variety of concepts, applications, examples, exercises and historical notes represents a valuable and unique resource.
This book focuses on the game-theoretical semantics and epistemic logic of Jaakko Hintikka. Hintikka was a prodigious and esteemed philosopher and logician, and his death in August 2015 was a huge loss to the philosophical community. This book, whose chapters have been in preparation for several years, is dedicated to the work of Jaakko Hintikka and to his memory. This edited volume consists of 23 contributions from leading logicians and philosophers, who discuss themes that span the entire range of Hintikka's career. Semantic representationalism, logical dialogues, knowledge, and epistemic logic are among the topics covered in the book's chapters. The book should appeal to students, scholars and teachers who wish to explore the philosophy of Jaakko Hintikka.
Our much-valued mathematical knowledge rests on two supports: the logic of proof and the axioms from which those proofs begin. Naturalism in Mathematics investigates the status of the latter, the fundamental assumptions of mathematics. These were once held to be self-evident, but progress in work on the foundations of mathematics, especially in set theory, has rendered that comforting notion obsolete. Given that candidates for axiomatic status cannot be proved, what sorts of considerations can be offered for or against them? That is the central question addressed in this book. One answer is that mathematics aims to describe an objective world of mathematical objects, and that axiom candidates should be judged by their truth or falsity in that world. This promising view-realism-is assessed and finally rejected in favour of another-naturalism-which attends less to metaphysical considerations of objective truth and falsity, and more to practical considerations drawn from within mathematics itself. Penelope Maddy defines this naturalism, explains the motivation for it, and shows how it can be helpfully applied in the assessment of candidates for axiomatic status in set theory. Maddy's clear, original treatment of this fundamental issue is informed by current work in both philosophy and mathematics, and will be accessible and enlightening to readers from both disciplines.
This book examines the philosophical conception of abductive reasoning as developed by Charles S. Peirce, the founder of American pragmatism. It explores the historical and systematic connections of Peirce's original ideas and debates about their interpretations. Abduction is understood in a broad sense which covers the discovery and pursuit of hypotheses and inference to the best explanation. The analysis presents fresh insights into this notion of reasoning, which derives from effects to causes or from surprising observations to explanatory theories. The author outlines some logical and AI approaches to abduction, and studies various kinds of inverse problems in astronomy, physics, medicine, biology, and the human sciences to provide examples of retroductions and abductions. The discussion also covers everyday examples, notably the use of this notion in detective stories, one of Peirce's own favorite themes. The author uses Bayesian probabilities to argue that explanatory abduction is a method of confirmation. He uses his own account of truth approximation to reformulate abduction as an inference which leads to the truthlikeness of its conclusion. This allows a powerful abductive defense of scientific realism. This up-to-date survey and defense of the Peircean view of abduction may very well help researchers, students, and philosophers better understand the logic of truth-seeking.
This book questions the relevance of computation to the physical universe. Our theories deliver computational descriptions, but the gaps and discontinuities in our grasp suggest a need for continued discourse between researchers from different disciplines, and this book is unique in its focus on the mathematical theory of incomputability and its relevance for the real world. The core of the book consists of thirteen chapters in five parts on extended models of computation; the search for natural examples of incomputable objects; mind, matter, and computation; the nature of information, complexity, and randomness; and the mathematics of emergence and morphogenesis. This book will be of interest to researchers in the areas of theoretical computer science, mathematical logic, and philosophy.
This book provides a critical examination of how the choice of what to believe is represented in the standard model of belief change. In particular the use of possible worlds and infinite remainders as objects of choice is critically examined. Descriptors are introduced as a versatile tool for expressing the success conditions of belief change, addressing both local and global descriptor revision. The book presents dynamic descriptors such as Ramsey descriptors that convey how an agent's beliefs tend to be changed in response to different inputs. It also explores sentential revision and demonstrates how local and global operations of revision by a sentence can be derived as a special case of descriptor revision. Lastly, the book examines revocation, a generalization of contraction in which a specified sentence is removed in a process that may possibly also involve the addition of some new information to the belief set.
In a fragment entitled Elementa Nova Matheseos Universalis (1683?) Leibniz writes "the mathesis [...] shall deliver the method through which things that are conceivable can be exactly determined"; in another fragment he takes the mathesis to be "the science of all things that are conceivable." Leibniz considers all mathematical disciplines as branches of the mathesis and conceives the mathesis as a general science of forms applicable not only to magnitudes but to every object that exists in our imagination, i.e. that is possible at least in principle. As a general science of forms, the mathesis investigates possible relations between "arbitrary objects" ("objets quelconques"). It is an abstract theory of combinations and relations among objects whatsoever. In 1810 the mathematician and philosopher Bernard Bolzano published a booklet entitled Contributions to a Better-Grounded Presentation of Mathematics. There is, according to him, a certain objective connection among the truths that are germane to a certain homogeneous field of objects: some truths are the "reasons" ("Gründe") of others, and the latter are "consequences" ("Folgen") of the former. The reason-consequence relation seems to be the counterpart of causality at the level of a relation between true propositions. A rigorous proof is characterized in this context as a proof that shows the reason of the proposition that is to be proven. Requirements imposed on rigorous proofs seem to anticipate normalization results in current proof theory. The contributors to Mathesis Universalis, Computability and Proof, leading experts in the fields of computer science, mathematics, logic and philosophy, show the evolution of these and related ideas, exploring topics in proof theory, computability theory, intuitionistic logic, constructivism and reverse mathematics, and delving deeply into a contextual examination of the relationship between mathematical rigor and demands for simplification.
This volume offers a wide range of both reconstructions of Nikolai Vasiliev's original logical ideas and their implementations in modern logic and philosophy. A collection of works put together through the international workshop "Nikolai Vasiliev's Logical Legacy and the Modern Logic," this book also covers the foundations of logic in the light of Vasiliev's contradictory ontology. Chapters range from a look at the Heuristic and Conceptual Background of Vasiliev's Imaginary Logic to Generalized Vasiliev-style Propositions. It includes works which cover Imaginary and Non-Aristotelian Logics, Inconsistent Set Theory and the Expansion of Mathematical Thinking, Plurivalent Logic, and the Impact of Vasiliev's Imaginary Logic on Epistemic Logic. The Russian logician Vasiliev is widely recognized as one of the forerunners of modern non-classical logic. His "imaginary logic," developed in work from the beginning of the 20th century, is often considered to be one of the first systems of paraconsistent and multi-valued logic. The novelty of his logical project has opened up prospects for modern logic as well as for non-classical science in general. This volume contains a selection of papers written by modern specialists in the field and deals with various aspects of Vasiliev's logical ideas. The logical legacy of Nikolai Vasiliev can serve as a promising source for developing an impressive range of philosophical interpretations, as it marries promising technical innovations with challenging philosophical insights.
The book has two parts. In the first, after a review of some seminal classical accounts of laws and explanations, a new account is proposed for distinguishing between laws and accidental generalizations (LAG). Among the new consequences of this proposal, it is proved that any explanation of a contingent generalization shows that the generalization is not accidental. The second part involves physical theories, their modality, and their explanatory power. In particular, it is shown that (1) each theory has a theoretical implication structure associated with it, such that there are new physical modal operators on these structures and also special modal entities in these structures; a special subset of the physical modals, the nomic modals, is associated with the laws of theories. (2) The familiar idea that theories always explain laws by deduction of them has to be seriously modified in light of the fact that there is a host of physical theories (including, for example, Newtonian classical mechanics, Hamiltonian and Lagrangian theory, and probability theory) that we believe are schematic (they do not have any truth value). Nevertheless, we think that there is a kind of non-deductive explanation and generality that they achieve by subsumption under a schema.
This monograph provides a self-contained and easy-to-read introduction to non-commutative multiple-valued logic algebras, a subject which has attracted much interest in the past few years because of its impact on information science, artificial intelligence and other subjects.
This book is a source of valuable and useful information on the topics of dynamics of number systems and scientific computation with arbitrary precision. It is addressed to scholars, scientists and engineers, and graduate students. The treatment is elementary and self-contained, with relevance both for theory and applications. The basic prerequisites of the book are linear algebra and matrix calculus.