The book presents theory and algorithms for secure networked inference in the presence of Byzantines. It derives fundamental limits of networked inference in the presence of Byzantine data and designs robust strategies to ensure reliable performance for several practical network architectures. In particular, it addresses inference (or learning) processes such as detection, estimation or classification, and parallel, hierarchical, and fully decentralized (peer-to-peer) system architectures. Furthermore, it discusses a number of new directions and heuristics to tackle the problem of design complexity in these practical network architectures for inference.
* Offers a well-balanced mathematical analysis of modelling physical systems. * Summarizes the basic principles of differential geometry and convex analysis as needed. * Covers a wide range of industrial and social applications, bridging the gap between core theory and costly experiments through simulation and modelling. * The focus of the book is manifold, ranging from the stability of fluid flows, nanofluids, drug delivery, and the security of image data to pandemic modelling.
This unique book gives a comprehensive account of new mathematical tools used to solve polygon problems. In the 20th and 21st centuries, many problems in mathematics, theoretical physics and theoretical chemistry - and more recently in molecular biology and bio-informatics - can be expressed as counting problems, in which specified graphs, or shapes, are counted. One very special class of shapes is that of polygons. These are closed, connected paths in space. We usually sketch them in two dimensions, but they can exist in any dimension. The typical questions asked include "how many are there of a given perimeter?" and "how big is the average polygon of given perimeter?", with corresponding questions about the area or volume enclosed: "how many enclose a given area?" and "how large is an average polygon of given area?" Simple though these questions are to pose, they are extraordinarily difficult to answer. They are important questions because of the application of polygons, and the related problems of polyomino and polycube counting, to phenomena occurring in the natural world, and also because the study of these problems has been responsible for the development of powerful new techniques in mathematics and mathematical physics, as well as in computer science. These new techniques then find application more broadly. The book brings together chapters from many of the major contributors in the field. An introductory chapter giving the history of the problem is followed by fourteen further chapters describing particular aspects of the problem, and applications to biology, to surface phenomena and to computer enumeration methods.
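The flavour of these counting questions can be seen in a small brute-force enumeration (a toy sketch for illustration only, far removed from the powerful techniques the book develops): counting self-avoiding polygons on the square lattice by perimeter. Each polygon of perimeter n corresponds to 2n rooted, directed closed self-avoiding walks from the origin (n starting vertices, 2 traversal directions), so we count those walks and divide.

```python
def count_polygons(perimeter):
    """Count self-avoiding polygons of a given perimeter on the square
    lattice, up to translation, by enumerating closed self-avoiding
    walks from the origin and dividing out the 2n-fold root/direction
    overcounting."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    closed = 0

    def dfs(x, y, visited, remaining):
        nonlocal closed
        for dx, dy in steps:
            nx, ny = x + dx, y + dy
            if remaining == 1:
                if (nx, ny) == (0, 0):  # final step closes the loop
                    closed += 1
            elif (nx, ny) not in visited:
                visited.add((nx, ny))
                dfs(nx, ny, visited, remaining - 1)
                visited.remove((nx, ny))

    dfs(0, 0, {(0, 0)}, perimeter)
    return closed // (2 * perimeter)
```

For example, perimeter 6 yields 2 (the two orientations of the 1x2 rectangle) and perimeter 8 yields 7 (two 1x3 rectangles, one 2x2 square, four L-shapes); the exhaustive search grows exponentially with perimeter, which is precisely why the specialised methods surveyed in the book are needed.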
In this thesis, the author develops for the first time an implementation methodology for arbitrary Gaussian operations using temporal-mode cluster states. The author also presents three experiments involving continuous-variable one-way quantum computations, whose non-classical nature is shown by observing entanglement at the outputs. The basic experimental structure of one-way quantum computation over a two-mode input state is demonstrated by the controlled-Z gate and the optimum nonlocal gate experiments. Furthermore, the author shows that the operation can be controlled, by means of the gain-tunable entangling gate experiment.
With considerations such as complex geometries and nonlinearity, the computational solution of partial differential systems has become so involved that it is important to automate decisions that have normally been left to the individual. This book covers two such decisions: 1) mesh generation, with links to the software generating the domain geometry; 2) solution accuracy and reliability, with mesh selection linked to solution generation. The book is suited to mathematicians, computer scientists and engineers, and is intended to encourage interdisciplinary interaction between these diverse groups.
This volume, the 7th volume in the DRUMS Handbook series, is part of the aftermath of the successful ESPRIT project DRUMS (Defeasible Reasoning and Uncertainty Management Systems), which took place in two stages from 1989 to 1996. In the second stage (1993-1996) a work package was introduced devoted to the topics Reasoning and Dynamics, covering both "Dynamics of Reasoning", where reasoning is viewed as a process, and "Reasoning about Dynamics", which must be understood as pertaining to how both designers of and agents within dynamic systems may reason about these systems. The present volume presents work done in this context, extended with some work done by outstanding researchers outside the project on related issues. While the previous volume in this series had its focus on the dynamics of reasoning processes, the present volume is more focused on "reasoning about dynamics", viz. how (human and artificial) agents reason about (systems in) dynamic environments in order to control them. In particular, we consider modelling frameworks and generic agent models for modelling these dynamic systems, and formal approaches to these systems such as logics for agents and formal means to reason about agent-based and compositional systems, and action and change more generally. We take this opportunity to mention that we have very pleasant recollections of the project, with its lively workshops and other meetings, and with the many sites and researchers involved, both within and outside our own work package.
This book gathers the proceedings of the 2018 Abel Symposium, which was held in Geiranger, Norway, on June 4-8, 2018. The symposium offered an overview of the emerging field of "Topological Data Analysis". This volume presents papers on various research directions, notably including applications in neuroscience, materials science, cancer biology, and immune response. Providing an essential snapshot of the status quo, it represents a valuable asset for practitioners and those considering entering the field.
The development of information processing systems requires models, calculi, and theories for the analysis of computations. Complex software systems are best constructed through a careful, systematic, and disciplined structuring of the development process. Starting from basic requirement specifications in which all the relevant details are formalized, the envisaged solution should be developed step by step by adding more and more details and giving evidence or formal proofs to show the correctness of the steps, until a description of a solution is obtained that has all the required properties. The Marktoberdorf Advanced Study Institute 1992 presented scientific highlights in approaches to the systematic study of reliable software and hardware systems using functional, algebraic, and logical calculi. Leading scientists treated the specification, development, verification, and implementation of complex time-sensitive systems, such as signal processing systems, process control systems, and general software systems. The mathematical foundations of specification and refinement were carefully treated, and several formalisms for describing processes were introduced. Emphasis was put on application-oriented descriptions of signal processing systems with real-time dependencies. Formalisms for reasoning about distributed causality-based computations were presented, and new styles of programming leading to shorter and more expressive notations were demonstrated. This book is based on the Institute, and gives an impressive demonstration of the state of the art and the essential progress in our formal abilities to specify, refine, verify, develop, and implement complex software systems, including embedded systems and hard real-time dependent systems.
Domains are mathematical structures for information and approximation; they combine order-theoretic, logical, and topological ideas and provide a natural framework for modelling and reasoning about computation. The theory of domains has proved to be a useful tool for programming languages and other areas of computer science, and for applications in mathematics.
This is the second volume in a series of innovative proceedings entirely devoted to the connections between mathematics and computer science. Here mathematics and computer science are directly confronted and joined to tackle intricate problems in computer science with deep and innovative mathematical approaches. The book serves as an outstanding tool and a main information source for a large public in applied mathematics, discrete mathematics and computer science, including researchers, teachers, graduate students and engineers. It provides an overview of the current questions in computer science and the related modern and powerful mathematical methods. The range of applications is very wide and reaches beyond computer science.
This book presents a mathematically based introduction to the fascinating topic of Fuzzy Sets and Fuzzy Logic. It can be used as a textbook at both undergraduate and graduate levels, and also as a reference guide for mathematicians, scientists or engineers who would like to gain insight into Fuzzy Logic. Fuzzy Sets were introduced by Lotfi Zadeh in 1965 and have since been used in many applications. As a consequence, there is a vast literature on the practical applications of fuzzy sets, while theory has a more modest coverage. The main purpose of the present book is to reduce this gap by providing a theoretical introduction to Fuzzy Sets based on Mathematical Analysis and Approximation Theory. Well-known applications, such as fuzzy control, are also discussed in this book and placed on new ground, a theoretical foundation. Moreover, a few advanced chapters and several new results are included. These comprise, among others, a new systematic and constructive approach to fuzzy inference systems of Mamdani and Takagi-Sugeno types that investigates their approximation capability by providing new error estimates.
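To give a concrete feel for the Takagi-Sugeno style of inference mentioned in the blurb, here is a minimal zero-order sketch with triangular membership functions. All the names, rules, and numbers are hypothetical illustrations, not material from the book:

```python
def tri(a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Hypothetical fuzzy sets over temperature (degrees Celsius)
cold = tri(-10.0, 0.0, 10.0)
warm = tri(5.0, 15.0, 25.0)

def heater_power(temp):
    """Zero-order Takagi-Sugeno inference with two rules:
       R1: if temp is cold then power = 80
       R2: if temp is warm then power = 20
    Output is the firing-strength-weighted average of the consequents."""
    w1, w2 = cold(temp), warm(temp)
    if w1 + w2 == 0.0:
        return 0.0  # no rule fires
    return (w1 * 80.0 + w2 * 20.0) / (w1 + w2)
```

At 7.5 degrees both rules fire equally (membership 0.25 each), so the output interpolates smoothly to 50; this interpolation behaviour is what the approximation-capability results described above make precise.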
In January 2012 an Oberwolfach workshop took place on the topic of recent developments in the numerics of partial differential equations. Focus was laid on methods of high order and on applications in Computational Fluid Dynamics. The book covers most of the talks presented at this workshop.
This book offers a brief but effective introduction to quantum machine learning (QML). QML is not merely a translation of classical machine learning techniques into the language of quantum computing, but rather a new approach to data representation and processing. Accordingly, the content is not divided into a "classical part" that describes standard machine learning schemes and a "quantum part" that addresses their quantum counterparts. Instead, to immerse the reader in the quantum realm from the outset, the book starts from fundamental notions of quantum mechanics and quantum computing. Avoiding unnecessary details, it presents the concepts and mathematical tools that are essential for the required quantum formalism. In turn, it reviews those quantum algorithms most relevant to machine learning. Later chapters highlight the latest advances in this field and discuss the most promising directions for future research. To gain the most from this book, a basic grasp of statistics and linear algebra is sufficient; no previous experience with quantum computing or machine learning is needed. The book is aimed at researchers and students with no background in quantum physics and is also suitable for physicists looking to enter the field of QML.
Thirty years ago mathematical computation, as opposed to applied numerical computation, was difficult to perform and so relatively little used. Three threads changed that: the emergence of the personal computer; the discovery of fiber-optics and the consequent development of the modern internet; and the building of the three Ms: Maple, Mathematica and Matlab. We intend to persuade the reader that Maple and other like tools are worth knowing, assuming only that one wishes to be a mathematician, a mathematics educator, a computer scientist, an engineer or scientist, or anyone else who wishes or needs to use mathematics better. We also hope to explain how to become an 'experimental mathematician' while learning to be better at proving things. To accomplish this, our material is divided into three main chapters followed by a postscript. These cover elementary number theory, calculus of one and several variables, introductory linear algebra, and visualization and interactive geometric computation.
The classical restricted three-body problem is of fundamental importance because of its applications in astronomy and space navigation, and also as a simple model of a non-integrable Hamiltonian dynamical system. A central role is played by periodic orbits, of which many have been computed numerically. This is the second volume of an attempt to explain and organize the material through a systematic study of generating families, the limits of families of periodic orbits when the mass ratio of the two main bodies becomes vanishingly small. We use quantitative analysis in the vicinity of bifurcations of types 1 and 2. In most cases the junctions between branches can now be determined. A first-order approximation of families of periodic orbits in the vicinity of a bifurcation is also obtained. This book is intended for scientists and students interested in the restricted problem, in its applications to astronomy and space research, and in the theory of dynamical systems.
Poisson Point Processes provides an overview of non-homogeneous and multidimensional Poisson point processes and their numerous applications. Readers will find constructive mathematical tools and applications ranging from emission and transmission computed tomography to multiple target tracking and distributed sensor detection, written from an engineering perspective. A valuable discussion of the basic properties of finite random sets is included. Maximum likelihood estimation techniques are discussed for several parametric forms of the intensity function, including Gaussian sums, together with their Cramer-Rao bounds. These methods are then used to investigate: several medical imaging techniques, including positron emission tomography (PET), single photon emission computed tomography (SPECT), and transmission tomography (CT scans); various multi-target and multi-sensor tracking applications; practical applications in areas like distributed sensing and detection; and related finite point processes such as marked processes, hard core processes, cluster processes, and doubly stochastic processes. Perfect for researchers, engineers and graduate students working in electrical engineering and computer science, Poisson Point Processes will prove to be an extremely valuable volume for those seeking insight into the nature of these processes and their diverse applications.
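A standard way to simulate the non-homogeneous Poisson processes the book studies is Lewis-Shedler thinning: sample candidates from a homogeneous process at a dominating rate, then keep each with probability proportional to the true intensity. A minimal sketch (the example intensity is an arbitrary choice, not from the book):

```python
import random

def sample_nhpp(intensity, lam_max, t_end, rng):
    """Lewis-Shedler thinning: sample event times of a non-homogeneous
    Poisson process with intensity(t) <= lam_max on [0, t_end)."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)          # candidate from rate-lam_max process
        if t >= t_end:
            return events
        if rng.random() * lam_max < intensity(t):  # keep w.p. intensity(t)/lam_max
            events.append(t)

# Usage: linearly increasing intensity 0.5*t on [0, 10), dominated by lam_max = 5
rng = random.Random(0)
events = sample_nhpp(lambda t: 0.5 * t, 5.0, 10.0, rng)
```

The expected number of events equals the integral of the intensity (here 25), and because later times have higher intensity, the accepted events cluster toward the end of the interval.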
The analysis, processing, evolution, optimization and/or regulation, and control of shapes and images appear naturally in engineering (shape optimization, image processing, visual control), numerical analysis (interval analysis), physics (front propagation), biological morphogenesis, population dynamics (migrations), and dynamic economic theory. These problems are currently studied with tools forged out of differential geometry and functional analysis, thus requiring shapes and images to be smooth. However, shapes and images are basically sets, most often not smooth. J.-P. Aubin thus constructs another vision, where shapes and images are just any compact set. Hence their evolution -- which requires a kind of differential calculus -- must be studied in the metric space of compact subsets. Despite the loss of linearity, one can transfer most of the basic results of differential calculus and differential equations in vector spaces to mutational calculus and mutational equations in any mutational space, including naturally the space of nonempty compact subsets. "Mutational and Morphological Analysis" offers a structure that embraces and integrates the various approaches, including shape optimization and mathematical morphology. Scientists and graduate students will find here other powerful mathematical tools for studying problems dealing with shapes and images arising in so many fields.
This book is an up-to-date documentation of the state of the art in combinatorial optimization, presenting approximate solutions of virtually all relevant classes of NP-hard optimization problems. The well-structured wealth of problems, algorithms, results, and techniques introduced systematically will make the book an indispensable source of reference for professionals. The smooth integration of numerous illustrations, examples, and exercises makes this monograph an ideal textbook.
This book is about reasoning with causal associations during diagnostic problem-solving. It formalizes several currently vague notions of abductive inference in the context of diagnosis. The result is a mathematical model of diagnostic reasoning called parsimonious covering theory. Within this theory, diagnostic problems and important relevant concepts are formally defined, properties of diagnostic problem-solving are identified and analyzed, and algorithms for finding plausible explanations in different situations are given along with proofs of their correctness. Another feature of this book is the integration of parsimonious covering theory and probability theory. Based on underlying cause-effect relations, the resulting probabilistic causal model generalizes Bayesian classification to diagnostic problems where multiple disorders (faults) may occur simultaneously. Both sequential best-first search algorithms and parallel connectionist (neural network) algorithms for finding the most probable hypothesis are provided. This book should appeal to both theoretical researchers and practitioners. For researchers in artificial intelligence and cognitive science, it provides a coherent presentation of a new theory of diagnostic inference. For engineers and developers of automated diagnostic systems or systems solving other abductive tasks, the book may provide useful insights, guidance, or even directly workable algorithms.
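The core "parsimonious covering" idea, choosing a minimum set of disorders whose known effects jointly cover all observed manifestations, can be sketched in a few lines of brute-force search. This toy illustration (the disorder/manifestation names are made up) conveys the flavour, not the book's actual algorithms:

```python
from itertools import combinations

def minimal_covers(effects, observed):
    """Return all minimum-cardinality sets of disorders whose combined
    known effects cover every observed manifestation.

    effects:  dict mapping disorder name -> set of manifestations it causes
    observed: set of manifestations actually observed
    """
    disorders = list(effects)
    for size in range(1, len(disorders) + 1):  # smallest explanations first
        covers = [set(combo) for combo in combinations(disorders, size)
                  if observed <= set().union(*(effects[d] for d in combo))]
        if covers:
            return covers  # parsimony: stop at the first (minimal) size
    return []

# Hypothetical knowledge base: three disorders and their manifestations
effects = {"d1": {"m1", "m2"}, "d2": {"m2", "m3"}, "d3": {"m1", "m3"}}
```

Here `minimal_covers(effects, {"m1", "m3"})` returns the single-disorder explanation `[{"d3"}]` rather than the two-disorder cover `{"d1", "d2"}`, which is the parsimony principle in action; the book replaces this exponential enumeration with principled search and probabilistic ranking.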
The papers presented here describe research to improve the general understanding of the application of SAMR to practical problems, to identify issues critical to efficient and effective implementation on high performance computers and to stimulate the development of a community code repository for software including benchmarks to assist in the evaluation of software and compiler technologies. The ten chapters have been divided into two parts reflecting two major issues in the topic: programming complexity of SAMR algorithms and the applicability and numerical challenges of SAMR methods.
This monograph gives a thorough treatment of the celebrated compositions of signature and encryption that allow for verifiability, that is, to efficiently prove properties about the encrypted data. This study is provided in the context of two cryptographic primitives: (1) designated confirmer signatures, an opaque signature which was introduced to control the proliferation of certified copies of documents, and (2) signcryption, a primitive that offers privacy and authenticity at once in an efficient way. This book is a useful resource to researchers in cryptology and information security, graduate and PhD students, and security professionals.
Mathematics and Computer Science III contains invited and contributed papers on combinatorics, random graphs and networks, analysis of algorithms and trees, and branching processes, constituting the Proceedings of the Third International Colloquium on Mathematics and Computer Science, held in Vienna in September 2004. It addresses a large public in applied mathematics, discrete mathematics and computer science, including researchers, teachers, graduate students and engineers.
The title of this book contains the words ALGORITHMIC LANGUAGE, in the singular. This is meant to convey the idea that it deals not so much with the diversity of programming languages, but rather with their commonalities. The task of formal program development proved to be the ideal frame for demonstrating this unity. It allows classifying concepts and distinguishing fundamental notions from notational features, and it leads immediately to a systematic disposition. This approach is supported by didactic, practical, and theoretical considerations. The clarity of the structure of a programming language designed according to the principles of program transformation is remarkable. Of course there are various notations for such a language. The notation used in this book is mainly oriented towards ALGOL 68, but is also strongly influenced by PASCAL - it could equally well have been the other way round. In the appendices there are occasional references to the styles used in ALGOL, PASCAL, LISP, and elsewhere.
This book covers recent developments in the understanding, quantification, and exploitation of entanglement in spin chain models from both condensed matter and quantum information perspectives. Spin chain models are at the foundation of condensed matter physics and quantum information technologies and elucidate many fundamental phenomena such as information scrambling, quantum phase transitions, and many-body localization. Moreover, many quantum materials and emerging quantum devices are well described by spin chains. Comprising accessible, self-contained chapters written by leading researchers, this book is essential reading for graduate students and researchers in quantum materials and quantum information. The coverage is comprehensive, from the fundamental entanglement aspects of quantum criticality, non-equilibrium dynamics, classical and quantum simulation of spin chains through to their experimental realizations, and beyond into machine learning applications.
Computer Science and Operations Research continue to have a synergistic relationship and this book represents the results of the cross-fertilization between OR/MS and CS/AI. It is this interface of OR/CS that makes possible advances that could not have been achieved in isolation. Taken collectively, these articles are indicative of the state of the art in the interface between OR/MS and CS/AI and of the high-caliber research being conducted by members of the INFORMS Computing Society. |