Logical Foundations of Computer Science: International Symposium, LFCS 2020, Deerfield Beach, FL, USA, January 4-7, 2020, Proceedings (Paperback, 1st ed. 2020)
Sergei Artemov, Anil Nerode

This book constitutes the refereed proceedings of the International
Symposium on Logical Foundations of Computer Science, LFCS 2020,
held in Deerfield Beach, FL, USA, in January 2020. The 17 revised
full papers were carefully reviewed and selected from 30
submissions. The scope of the Symposium is broad and includes
constructive mathematics and type theory; homotopy type theory;
logic, automata, and automatic structures; computability and
randomness; logical foundations of programming; logical aspects of
computational complexity; parameterized complexity; logic
programming and constraints; automated deduction and interactive
theorem proving; logical methods in protocol and program
verification; logical methods in program specification and
extraction; domain theory logics; logical foundations of database
theory; equational logic and term rewriting; lambda and combinatory
calculi; categorical logic and topological semantics; linear logic;
epistemic and temporal logics; intelligent and multiple-agent
system logics; logics of proof and justification; non-monotonic
reasoning; logic in game theory and social software; logic of
hybrid systems; distributed system logics; mathematical fuzzy
logic; system design logics; other logics in computer science.

This book offers a review of the vibrant areas of geometric
representation theory and gauge theory, which are characterized by
a merging of traditional techniques in representation theory with
the use of powerful tools from algebraic geometry, and with strong
inputs from physics. The notes are based on lectures delivered at
the CIME school "Geometric Representation Theory and Gauge Theory"
held in Cetraro, Italy, in June 2018. They comprise three
contributions, due to Alexander Braverman and Michael Finkelberg,
Andrei Negut, and Alexei Oblomkov, respectively. Braverman and
Finkelberg's notes review the mathematical theory of the Coulomb
branch of 3D N=4 quantum gauge theories. The purpose of Negut's
notes is to study moduli spaces of sheaves on a surface, as well as
Hecke correspondences between them. Oblomkov's notes concern matrix
factorizations and knot homology. This book will appeal to both
mathematicians and theoretical physicists and will be a source of
inspiration for PhD students and researchers.

This monograph initiates a theory of new categorical structures
that generalize the simplicial Segal property to higher dimensions.
The authors introduce the notion of a d-Segal space, which is a
simplicial space satisfying locality conditions related to
triangulations of d-dimensional cyclic polytopes. The focus here is on
the 2-dimensional case. Many important constructions are shown to
exhibit the 2-Segal property, including Waldhausen's
S-construction, Hecke-Waldhausen constructions, and configuration
spaces of flags. The relevance of 2-Segal spaces in the study of
Hall and Hecke algebras is discussed. Higher Segal Spaces marks the
beginning of a program to systematically study d-Segal spaces in
all dimensions d. The elementary formulation of 2-Segal spaces in
the opening chapters is accessible to readers with a basic
background in homotopy theory. A chapter on Bousfield localizations
provides a transition to the general theory, formulated in terms of
combinatorial model categories, that features in the main part of
the book. Numerous examples throughout assist readers entering this
exciting field to move toward active research; established
researchers in the area will appreciate this work as a reference.

This edited volume focuses on the work of Professor Larisa
Maksimova, providing a comprehensive account of her outstanding
contributions to different branches of non-classical logic. The
book covers themes ranging from rigorous implication, relevance and
algebraic logic, to interpolation, definability and recognizability
in superintuitionistic and modal logics. It features both her
scientific autobiography and original contributions from experts in
the field of non-classical logics. Professor Larisa Maksimova's
influential work involved combining methods of algebraic and
relational semantics. Readers will be able to trace both influences
on her work, and the ways in which her work has influenced other
logicians. In the historical part of this book, it is possible to
trace important milestones in Maksimova's career. Early on, she
developed an algebraic semantics for relevance logics and
relational semantics for the logic of entailment. Later, Maksimova
discovered that among the continuum of superintuitionistic logics
there are exactly three pretabular logics. She went on to obtain
results on the decidability of tabularity and local tabularity
problems for superintuitionistic logics and for extensions of S4.
Further investigations by Maksimova were aimed at the study of
fundamental properties of logical systems (different versions of
interpolation and definability, the disjunction property, etc.) in large
classes of logics, and on decidability and recognizability of such
properties. To this end she developed a powerful combination of
algebraic and semantic methods, which essentially determines the
modern state of investigations in the area, as can be seen in the
later chapters of this book authored by leading experts in
non-classical logics. These original contributions bring the reader
up to date on the very latest work in this field.

This book focuses on the game-theoretical semantics and epistemic
logic of Jaakko Hintikka. Hintikka was a prodigious and esteemed
philosopher and logician, and his death in August 2015 was a huge
loss to the philosophical community. This book, whose chapters have
been in preparation for several years, is dedicated to the work of
Jaakko Hintikka, and to his memory. This edited volume consists of
23 contributions from leading logicians and philosophers, who
discuss themes that span across the entire range of Hintikka's
career. Semantic Representationalism, Logical Dialogues, Knowledge
and Epistemic Logic are among the topics covered in this
book's chapters. The book should appeal to students, scholars and
teachers who wish to explore the philosophy of Jaakko Hintikka.

This book provides an accessible introduction to the state of the
art of representation theory of finite groups. Starting from a
basic level that is summarized at the start, the book proceeds to
cover topics of current research interest, including open problems
and conjectures. The central themes of the book are block theory
and module theory of group representations, which are
comprehensively surveyed with a full bibliography. The individual
chapters cover a range of topics within the subject, from blocks
with cyclic defect groups to representations of symmetric groups.
Assuming only modest background knowledge at the level of a first
graduate course in algebra, this guidebook, intended for students
taking first steps in the field, will also provide a reference for
more experienced researchers. Although no proofs are included,
end-of-chapter exercises make it suitable for student seminars.

This book is a comprehensive explanation of graph and model
transformation. It contains a detailed introduction, including
basic results and applications of the algebraic theory of graph
transformations, and references to the historical context. Then in
the main part the book contains detailed chapters on M-adhesive
categories, M-adhesive transformation systems, and
multi-amalgamated transformations, and model transformation based
on triple graph grammars. In the final part of the book the authors
examine application of the techniques in various domains, including
chapters on case studies and tool support. The book will be of
interest to researchers and practitioners in the areas of
theoretical computer science, software engineering, concurrent and
distributed systems, and visual modelling.

This textbook offers an introduction to the philosophy of science.
It helps undergraduate students from the natural, the human and
social sciences to gain an understanding of what science is, how it
has developed, what its core traits are, how to distinguish between
science and pseudo-science and to discover what a scientific
attitude is. It argues against the common assumption that there is a
fundamental difference between the natural and human sciences, with
natural science being concerned with testing hypotheses and
discovering natural laws, and the aim of the human and some social
sciences being to understand the meanings of individual and social
group actions. Instead, it examines the similarities between the
sciences and shows how the testing of hypotheses and doing
interpretation/hermeneutics are similar activities. The book makes
clear that lessons from natural scientists are relevant to students
and scholars within the social and human sciences, and vice versa.
It teaches its readers how to effectively demarcate between science
and pseudo-science and sets criteria for true scientific thinking.
Divided into three parts, the book first examines the question "What
is Science?" It describes the evolution of science, defines
knowledge, and explains the use of and need for hypotheses and
hypothesis testing. The second half of part I deals with scientific
data and observation, qualitative data and methods, and ends with a
discussion of theories on the development of science. Part II
offers philosophical reflections on four of the most important
concepts in science: causes, explanations, laws and models. Part III
presents discussions on philosophy of mind, the relation between
mind and body, value-free and value-related science, and
reflections on actual trends in science.

This book contains selected papers based on talks given at the
"Representation Theory, Number Theory, and Invariant Theory"
conference held at Yale University from June 1 to June 5, 2015. The
meeting and this resulting volume are in honor of Professor Roger
Howe, on the occasion of his 70th birthday, whose work and insights
have been deeply influential in the development of these fields.
The speakers who contributed to this work include Roger Howe's
doctoral students, Roger Howe himself, and other world renowned
mathematicians. Topics covered include automorphic forms, invariant
theory, representation theory of reductive groups over local
fields, and related subjects.

This collection documents the work of the Hyperuniverse Project,
which is a new approach to set-theoretic truth based on justifiable
principles and which leads to the resolution of many questions
independent from ZFC. The contributions give an overview of the
program, illustrate its mathematical content and implications, and
also discuss its philosophical assumptions. It will thus be of wide
appeal among mathematicians and philosophers with an interest in
the foundations of set theory. The Hyperuniverse Project was
supported by the John Templeton Foundation from January 2013 until
September 2015.

This monograph proposes a new way of implementing interaction in
logic. It also provides an elementary introduction to Constructive
Type Theory (CTT). The authors equally emphasize basic ideas and
finer technical details. In addition, many worked out exercises and
examples will help readers to better understand the concepts under
discussion. One of the chief ideas animating this study is that the
dialogical understanding of definitional equality and its execution
provide both a simple and a direct way of implementing the CTT
approach within a game-theoretical conception of meaning. In
addition, the importance of the play level over the strategy level
is stressed, binding together the matter of execution with that of
equality and the finitary perspective on games constituting
meaning. According to this perspective, the games in which concepts
emerge are not only games of giving and asking for reasons (games
involving Why-questions); they are also games that include moves
establishing how it is that the reasons brought forward accomplish
their explicative task. Thus, immanent reasoning games are
dialogical games of Why and How.

This book explains exactly what human knowledge is. The key
concepts in this book are structures and algorithms, i.e., what the
readers "see" and how they make use of what they see. Thus in
comparison with some other books on the philosophy (or methodology)
of science, which employ a syntactic approach, the author's
approach is model theoretic or structural. Properly understood, it
extends the current art and science of mathematical modeling to all
fields of knowledge. The link between structure and algorithms is
mathematics. But viewing "mathematics" as such a link is not
exactly what readers most likely learned in school; thus, the task
of this book is to explain what "mathematics" should actually mean.
Chapter 1, an introductory essay, presents a general analysis of
structures, algorithms and how they are to be linked. Several
examples from the natural and social sciences, and from the history
of knowledge, are provided in Chapters 2-6. In turn, Chapters 7 and
8 extend the analysis to include language and the mind. Structures
are what the readers see. And, as abstract cultural objects, they
can almost always be seen in many different ways. But certain
structures, such as natural numbers and the basic theory of
grammar, seem to have an absolute character. Any theory of
knowledge grounded in human culture must explain how this is
possible. The author's analysis of this cultural invariance,
combining insights from evolutionary theory and neuroscience, is
presented in the book's closing chapter. The book will be of
interest to researchers, students and those outside academia who
seek a deeper understanding of knowledge in our present-day
society.

This book celebrates the work of Don Pigozzi on the occasion of his
80th birthday. In addition to articles written by leading
specialists and his disciples, it presents Pigozzi's scientific
output and discusses his impact on the development of science. The
book both catalogues his works and offers an extensive profile of
Pigozzi as a person, sketching the most important events, not only
related to his scientific activity, but also from his personal
life. It reflects Pigozzi's contribution to the rise and
development of areas such as abstract algebraic logic (AAL),
universal algebra and computer science, and introduces new
scientific results. Some of the papers also present chronologically
ordered facts relating to the development of the disciplines he
contributed to, especially abstract algebraic logic. The book
offers valuable source material for historians of science,
especially those interested in history of mathematics and logic.

This collection of prize-winning essays addresses the controversial
question of how meaning and goals can emerge in a physical world
governed by mathematical laws. What are the prerequisites for a
system to have goals? What makes a physical process into a signal?
Does eliminating the homunculus solve the problem? The three
first-prize winners, Larissa Albantakis, Carlo Rovelli and Jochen
Szangolies tackle exactly these challenges, while many other
aspects (agency, the role of the observer, causality versus
teleology, ghosts in the machine etc.) feature in the other award
winning contributions. All contributions are accessible to
non-specialists. These seventeen stimulating and often entertaining
essays are enhanced versions of the prize-winning entries to the
FQXi essay competition in 2017. The Foundational Questions
Institute, FQXi, catalyzes, supports, and disseminates research on
questions at the foundations of physics and cosmology, particularly
new frontiers and innovative ideas integral to a deep understanding
of reality, but unlikely to be supported by conventional funding
sources.

This book, presented in two parts, offers a slow introduction to
mathematical logic, and several basic concepts of model theory,
such as first-order definability, types, symmetries, and elementary
extensions. Its first part, Logic, Sets, and Numbers, shows how
mathematical logic is used to develop the number structures of
classical mathematics. The exposition does not assume any
prerequisites; it is rigorous, but as informal as possible. All
necessary concepts are introduced exactly as they would be in a
course in mathematical logic; but are accompanied by more extensive
introductory remarks and examples to motivate formal developments.
The second part, Relations, Structures, Geometry, introduces
several basic concepts of model theory, such as first-order
definability, types, symmetries, and elementary extensions, and
shows how they are used to study and classify mathematical
structures. Although more advanced, this second part is accessible
to the reader who is either already familiar with basic
mathematical logic, or has carefully read the first part of the
book. Classical developments in model theory, including the
Compactness Theorem and its uses, are discussed. Other topics
include tameness, minimality, and order minimality of structures.
The book can be used as an introduction to model theory, but unlike
standard texts, it does not require familiarity with abstract
algebra. This book will also be of interest to mathematicians who
know the technical aspects of the subject, but are not familiar
with its history and philosophical background.
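
For orientation, the Compactness Theorem discussed in the second part can be stated in one line; the formulation below is the standard one for first-order logic, not quoted from the book:

```latex
% Compactness Theorem for first-order logic.
\begin{theorem}[Compactness]
  A set $\Sigma$ of first-order sentences has a model if and only if
  every finite subset of $\Sigma$ has a model.
\end{theorem}
```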

Written by one of the subject's foremost experts, this book focuses
on the central developments and modern methods of the advanced
theory of abelian groups, while remaining accessible, as an
introduction and reference, to the non-specialist. It provides a
coherent source for results scattered throughout the research
literature with lots of new proofs. The presentation highlights
major trends that have radically changed the modern character of
the subject, in particular, the use of homological methods in the
structure theory of various classes of abelian groups, and the use
of advanced set-theoretical methods in the study of undecidability
problems. The treatment of the latter trend includes Shelah's
seminal work on the undecidability in ZFC of Whitehead's Problem;
while the treatment of the former trend includes an extensive (but
non-exhaustive) study of p-groups, torsion-free groups, mixed
groups and important classes of groups arising from ring theory. To
prepare the reader to tackle these topics, the book reviews the
fundamentals of abelian group theory and provides some background
material from category theory, set theory, topology and homological
algebra. An abundance of exercises are included to test the
reader's comprehension, and to explore noteworthy extensions and
related sidelines of the main topics. Lists of open problems and
questions in each chapter invite the reader to take an active
part in the subject's further development.

This book features a series of lectures that explores three
different fields in which functor homology (short for homological
algebra in functor categories) has recently played a significant
role. For each of these applications, the functor viewpoint
provides both essential insights and new methods for tackling
difficult mathematical problems. In the lectures by Aurelien
Djament, polynomial functors appear as coefficients in the homology
of infinite families of classical groups, e.g. general linear
groups or symplectic groups, and their stabilization. Djament's
theorem states that this stable homology can be computed using only
the homology with trivial coefficients and the manageable functor
homology. The series includes an intriguing development of
Scorichenko's unpublished results. The lectures by Wilberd van der
Kallen lead to the solution of the general cohomological finite
generation problem, extending Hilbert's fourteenth problem and its
solution to the context of cohomology. The focus here is on the
cohomology of algebraic groups, or rational cohomology, and the
coefficients are Friedlander and Suslin's strict polynomial
functors, a conceptual form of modules over the Schur algebra.
Roman Mikhailov's lectures highlight topological invariants:
homotopy and homology of topological spaces, through derived functors of
polynomial functors. In this regard the functor framework makes
better use of naturality, allowing it to reach calculations that
remain beyond the grasp of classical algebraic topology. Lastly,
Antoine Touze's introductory course on homological algebra makes
the book accessible to graduate students new to the field. The
links between functor homology and the three fields mentioned above
offer compelling arguments for pushing the development of the
functor viewpoint. The lectures in this book will provide readers
with a feel for functors, and a valuable new perspective to apply
to their favourite problems.

This undergraduate textbook introduces key methods and examines the
major areas of philosophy in which formal methods play pivotal
roles. Coverage begins with a thorough introduction to
formalization and to the advantages and pitfalls of formal methods
in philosophy. The ensuing chapters show how to use formal methods
in a wide range of areas. Throughout, the contributors clarify the
relationships and interdependencies between formal and informal
notions and constructions. Their main focus is to show how formal
treatments of philosophical problems may help us understand them
better. Formal methods can be used to solve problems but also to
express new philosophical problems that would never have seen the
light of day without the expressive power of the formal apparatus.
Formal philosophy merges work in different areas of philosophy as
well as logic, mathematics, computer science, linguistics, physics,
psychology, biology, economics, political theory, and sociology.
This title offers an accessible introduction to this new
interdisciplinary research area to a wide academic audience.

Cyber-physical systems (CPSs) combine cyber capabilities, such as
computation or communication, with physical capabilities, such as
motion or other physical processes. Cars, aircraft, and robots are
prime examples, because they move physically in space in a way that
is determined by discrete computerized control algorithms.
Designing these algorithms is challenging due to their tight
coupling with physical behavior, while it is vital that these
algorithms be correct because we rely on them for safety-critical
tasks. This textbook teaches undergraduate students the core
principles behind CPSs. It shows them how to develop models and
controls; identify safety specifications and critical properties;
reason rigorously about CPS models; leverage multi-dynamical
systems compositionality to tame CPS complexity; identify required
control constraints; verify CPS models of appropriate scale in
logic; and develop an intuition for operational effects. The book
is supported with homework exercises, lecture videos, and slides.
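
A minimal sketch of the style of model the book treats (the dynamics, constants, and controller below are hypothetical illustrations, not taken from the textbook): a time-triggered controller may accelerate only while one further acceleration cycle still leaves room for the braking distance v^2/(2B), and a simulation checks the safety property "never reach the obstacle" at every step.

```python
# Hypothetical cyber-physical toy model: a car approaching an obstacle.
# The controller (cyber) picks an acceleration each cycle; the physics
# is integrated with a trapezoidal update. Safety: never reach OBSTACLE.
DT, A, B = 0.1, 2.0, 5.0           # control cycle [s], max accel, max braking
OBSTACLE, MARGIN = 100.0, 0.5      # obstacle position, safety margin

def may_accelerate(x, v):
    """May we accelerate for one more cycle and still brake in time?
    Braking from speed v needs distance v**2 / (2 * B)."""
    v1 = v + A * DT
    x1 = x + (v + v1) / 2 * DT
    return x1 + v1 ** 2 / (2 * B) < OBSTACLE - MARGIN

x, v = 0.0, 0.0
for _ in range(1000):
    a = A if may_accelerate(x, v) else -B
    v1 = max(0.0, v + a * DT)      # speed never goes negative
    x += (v + v1) / 2 * DT         # trapezoidal position update
    v = v1
    assert x < OBSTACLE            # the safety invariant holds at each step
```

The pattern illustrated here is control-envelope reasoning: an action is permitted only if the worst-case future it enables is still safe.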

This book begins with the fundamentals of the generalized inverses,
then moves to more advanced topics. It presents a theoretical study
of the generalization of Cramer's rule, determinant representations
of the generalized inverses, reverse order law of the generalized
inverses of a matrix product, structures of the generalized
inverses of structured matrices, parallel computation of the
generalized inverses, perturbation analysis of the generalized
inverses, an algorithmic study of the computational methods for the
full-rank factorization of a generalized inverse, generalized
singular value decomposition, imbedding method, finite method,
generalized inverses of polynomial matrices, and generalized
inverses of linear operators. This book is intended for
researchers, postdocs, and graduate students in the area of the
generalized inverses with an undergraduate-level understanding of
linear algebra.
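
For readers meeting the subject for the first time: the Moore-Penrose inverse A+, the most common generalized inverse, is characterized by four Penrose conditions. The toy check below (an illustration assumed for this note, not an example from the book) verifies them for a rank-deficient 2x2 matrix that has no ordinary inverse.

```python
# The four Penrose conditions defining the Moore-Penrose inverse A+ of A:
#   A A+ A = A,  A+ A A+ = A+,  (A A+)^T = A A+,  (A+ A)^T = A+ A.

def matmul(X, Y):
    """Plain-Python matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

A  = [[2.0, 0.0], [0.0, 0.0]]   # rank-deficient, hence not invertible
Ap = [[0.5, 0.0], [0.0, 0.0]]   # its Moore-Penrose inverse

assert matmul(matmul(A, Ap), A) == A               # A A+ A = A
assert matmul(matmul(Ap, A), Ap) == Ap             # A+ A A+ = A+
assert transpose(matmul(A, Ap)) == matmul(A, Ap)   # A A+ is symmetric
assert transpose(matmul(Ap, A)) == matmul(Ap, A)   # A+ A is symmetric
```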

Corina Keller studies non-perturbative facets of abelian
Chern-Simons theories. This is a refinement of the entirely
perturbative approach to classical Chern-Simons theory via homotopy
factorization algebras of observables that arise from the
associated formal moduli problem describing deformations of flat
principal bundles with connections over the spacetime manifold. The
author shows that for theories with abelian group structure, this
factorization algebra of classical observables comes naturally
equipped with an action of the gauge group, which makes it possible to encode
non-perturbative effects in the classical observables. About the
Author: Corina Keller is currently a doctoral student in the
research group of Prof. Dr. Damien Calaque at the Universite
Montpellier, France. She is mostly interested in the mathematical
study of field theories. Her master's thesis was supervised by PD
Dr. Alessandro Valentino and Prof. Dr. Alberto Cattaneo at Zurich
University, Switzerland.

This volume investigates what is beyond the Principle of
Non-Contradiction. It features 14 papers on the foundations of
reasoning, including logical systems and philosophical
considerations. Coverage brings together a cluster of issues
centered upon the variety of meanings of consistency,
contradiction, and related notions. Most of the papers, but not
all, are developed around the subtle distinctions between
consistency and non-contradiction, as well as among contradiction,
inconsistency, and triviality, and concern one of the above
mentioned threads of the broadly understood non-contradiction
principle and the related principle of explosion. Some others take
a perspective that is not too far away from such themes, but with
the freedom to tread new paths. Readers should understand the title
of this book in a broad way, because it is not so obvious how to deal
with notions like contradiction, consistency, inconsistency, and
triviality. The papers collected here present groundbreaking ideas
related to consistency and inconsistency.
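
The principle of explosion mentioned above says that a contradiction entails everything; the classical derivation (due in this form to C. I. Lewis) takes four lines, and paraconsistent logics typically block it by rejecting the final step:

```latex
% Explosion (ex contradictione quodlibet): from A and \neg A, derive any B.
\begin{align*}
  &1.\; A        && \text{assumption}\\
  &2.\; \neg A   && \text{assumption}\\
  &3.\; A \lor B && \text{$\lor$-introduction from 1}\\
  &4.\; B        && \text{disjunctive syllogism from 2 and 3}
\end{align*}
```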

While it is well known that the Delian problems are impossible to
solve with a straightedge and compass - for example, it is
impossible to construct a segment whose length is cube root of 2
with these instruments - the discovery of the Italian mathematician
Margherita Beloch Piazzolla in 1934 that one can in fact construct
a segment of length cube root of 2 with a single paper fold was
completely ignored (till the end of the 1980s). This comes as no
surprise, since with few exceptions paper folding was seldom
considered as a mathematical practice, let alone as a mathematical
procedure of inference or proof that could prompt novel
mathematical discoveries. A few questions immediately arise: Why
did paper folding become a non-instrument? What caused the
marginalisation of this technique? And how was the mathematical
knowledge, which was nevertheless transmitted and prompted by paper
folding, later treated and conceptualised? Aiming to answer these
questions, this volume provides, for the first time, an extensive
historical study on the history of folding in mathematics, spanning
from the 16th century to the 20th century, and offers a general
study on the ways mathematical knowledge is marginalised,
disappears, is ignored or becomes obsolete. In doing so, it makes a
valuable contribution to the field of history and philosophy of
science, particularly the history and philosophy of mathematics and
is highly recommended for anyone interested in these topics.
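
Beloch's fold yields a segment of length t with t^3 = 2, a number of algebraic degree 3 and hence beyond straightedge and compass, which reach only degrees that are powers of 2. The value itself is easy to verify numerically; the sketch below is a plain root-finding check, unrelated to the folding construction.

```python
# Newton iteration for the real root of f(t) = t**3 - 2, the length
# that Beloch's single paper fold constructs geometrically.
def cube_root_of_2(steps=50):
    t = 1.0
    for _ in range(steps):
        t -= (t ** 3 - 2) / (3 * t ** 2)  # Newton step: t - f(t)/f'(t)
    return t

t = cube_root_of_2()
assert abs(t ** 3 - 2) < 1e-12   # t is (numerically) a root of t^3 = 2
```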