This timely text presents a comprehensive overview of fault tolerance techniques for high-performance computing (HPC). The text opens with a detailed introduction to the concepts of checkpoint protocols and scheduling algorithms, prediction, replication, silent error detection and correction, together with some application-specific techniques such as ABFT. Emphasis is placed on analytical performance models. This is then followed by a review of general-purpose techniques, including several checkpoint and rollback recovery protocols. Relevant execution scenarios are also evaluated and compared through quantitative models. Features: provides a survey of resilience methods and performance models; examines the various sources for errors and faults in large-scale systems; reviews the spectrum of techniques that can be applied to design a fault-tolerant MPI; investigates different approaches to replication; discusses the challenge of energy consumption of fault-tolerance methods in extreme-scale systems.
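The analytical performance models emphasized here can be illustrated by the classic first-order result for the optimal checkpointing period (usually attributed to Young and Daly); the following Python sketch is an illustration under simplified assumptions, not code from the book:

```python
import math

def daly_period(checkpoint_cost: float, mtbf: float) -> float:
    """First-order optimal checkpointing period: W ~ sqrt(2 * C * MTBF),
    valid when the checkpoint cost C is much smaller than the MTBF."""
    return math.sqrt(2.0 * checkpoint_cost * mtbf)

def waste_fraction(period: float, checkpoint_cost: float, mtbf: float) -> float:
    """Expected fraction of time wasted: checkpoint overhead C/W plus,
    after a failure, re-execution of half a period on average (W / (2 * MTBF))."""
    return checkpoint_cost / period + period / (2.0 * mtbf)
```

For example, with a 60-second checkpoint cost and a one-day MTBF, the optimal period comes out at roughly 54 minutes; checkpointing much more or much less often increases the wasted fraction.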
This collection of papers, celebrating the contributions of Swedish logician Dag Prawitz to Proof Theory, has been assembled from those presented at the Natural Deduction conference organized in Rio de Janeiro to honour his seminal research. Dag Prawitz's work forms the basis of intuitionistic type theory and his inversion principle constitutes the foundation of most modern accounts of proof-theoretic semantics in Logic, Linguistics and Theoretical Computer Science. The range of contributions includes material on the extension of
natural deduction with higher-order rules, as opposed to
higher-order connectives, and a paper discussing the application of
natural deduction rules to dealing with equality in predicate
calculus. The volume continues with a key chapter summarizing work
on the extension of the Curry-Howard isomorphism (itself a
by-product of the work on natural deduction), via methods of
category theory that have been successfully applied to linear
logic, as well as many other contributions from highly regarded
authorities. With an illustrious group of contributors addressing a
wealth of topics and applications, this volume is a valuable
addition to the libraries of academics in the multiple disciplines
whose development has been given added scope by the methodologies
supplied by natural deduction. The volume is representative of the
rich and varied directions that Prawitz's work has inspired in the
area of natural deduction.
Automatic Graph Drawing is concerned with the layout of relational structures as they occur in Computer Science (Database Design, Data Mining, Web Mining), Bioinformatics (Metabolic Networks), Business Informatics (Organization Diagrams, Event-Driven Process Chains), and the Social Sciences (Social Networks). In mathematical terms, such relational structures are modeled as graphs or more general objects such as hypergraphs, clustered graphs, or compound graphs. A variety of layout algorithms based on graph-theoretical foundations have been developed over the last two decades and implemented in software systems. After an introduction to the subject area and a concise treatment of the technical foundations for the subsequent chapters, this book features 14 chapters on state-of-the-art graph drawing software systems, ranging from general "tool boxes" to customized software for various applications. These chapters are written by leading experts; they follow a uniform scheme and can be read independently of each other.
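As a flavour of the layout algorithms such systems implement, here is a minimal force-directed (spring-embedder) sketch in Python; the function name and parameters are illustrative, not taken from any of the systems described in the book:

```python
import math
import random

def spring_layout(nodes, edges, iters=200, k=1.0, step=0.05, seed=0):
    """Minimal force-directed layout: every pair of nodes repels with
    force k^2/d, adjacent nodes attract with d^2/k, and displacements
    are capped each iteration (Fruchterman-Reingold style sketch)."""
    rng = random.Random(seed)
    pos = {v: [rng.random(), rng.random()] for v in nodes}
    for _ in range(iters):
        disp = {v: [0.0, 0.0] for v in nodes}
        # repulsion between all pairs
        for i, u in enumerate(nodes):
            for v in nodes[i + 1:]:
                dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
                d = math.hypot(dx, dy) or 1e-9
                f = k * k / d
                disp[u][0] += f * dx / d; disp[u][1] += f * dy / d
                disp[v][0] -= f * dx / d; disp[v][1] -= f * dy / d
        # attraction along edges
        for u, v in edges:
            dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
            d = math.hypot(dx, dy) or 1e-9
            f = d * d / k
            disp[u][0] -= f * dx / d; disp[u][1] -= f * dy / d
            disp[v][0] += f * dx / d; disp[v][1] += f * dy / d
        # move each node a bounded step in the direction of the net force
        for v in nodes:
            dx, dy = disp[v]
            d = math.hypot(dx, dy) or 1e-9
            pos[v][0] += step * (dx / d) * min(d, 1.0)
            pos[v][1] += step * (dy / d) * min(d, 1.0)
    return pos
```

Real systems add many refinements (multilevel coarsening, edge routing, constraint handling), but this captures the core energy-minimization idea.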
Providing a wide variety of technologies for ensuring the safety and dependability of cyber-physical systems (CPS), this book offers a comprehensive introduction to the architecture-centric modeling, analysis, and verification of CPS. In particular, it focuses on model driven engineering methods including architecture description languages, virtual prototyping, and formal analysis methods. CPS are based on a new design paradigm intended to enable emerging software-intensive systems. Embedded computers and networks monitor and control the physical processes, usually with the help of feedback loops where physical processes affect computations and vice versa. The principal challenges in system design lie in this constant interaction of software, hardware and physics. Developing reliable CPS has become a critical issue for the industry and society, because many applications such as transportation, power distribution, medical equipment and tele-medicine are dependent on CPS. Safety and security requirements must be ensured by means of powerful validation tools. Satisfying such requirements, including quality of service, implies having formally proven the required properties of the system before it is deployed. The book is concerned with internationally standardized modeling languages such as AADL, SysML, and MARTE. As the effectiveness of the technologies is demonstrated with industrial sample cases from the automotive and aerospace sectors, links between the methods presented and industrial problems are clearly understandable. Each chapter is self-contained, addressing specific scientific or engineering problems, and identifying further issues. In closing, it includes perspectives on future directions in CPS design from an architecture analysis viewpoint.
This book provides an alternative approach to studying the pre-kernel solution of transferable utility games, based on a generalized conjugation theory from convex analysis. Although the pre-kernel solution possesses an appealing axiomatic foundation that lets one consider this solution concept as a standard of fairness, the pre-kernel and its related solutions are regarded as obscure and too technically complex to be treated as a real alternative to the Shapley value. Comprehensible and efficient computability is widely regarded as a desirable feature to qualify a solution concept, apart from its axiomatic foundation as a standard of fairness. We review and then improve an approach to compute the pre-kernel of a cooperative game by the indirect function. The indirect function is known as the Fenchel-Moreau conjugation of the characteristic function. Extending the approach with the indirect function, we are able to characterize the pre-kernel of the grand coalition simply by the solution sets of a family of quadratic objective functions.
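In standard notation (the definitions below are the textbook ones, not the book's specific derivation), the indirect function and the pre-kernel it is used to characterize are:

```latex
% Indirect (Fenchel--Moreau conjugate) function of a TU game (N, v):
\pi(x) = \max_{S \subseteq N} \Big( v(S) - \sum_{k \in S} x_k \Big)

% Maximum surplus of player i over player j at payoff vector x:
s_{ij}(x) = \max_{\substack{S \subseteq N \\ i \in S,\; j \notin S}}
            \Big( v(S) - \sum_{k \in S} x_k \Big)

% Pre-kernel of the grand coalition: efficient payoffs with balanced surpluses
\mathcal{PK}(N, v) = \Big\{\, x \;\Big|\; \sum_{k \in N} x_k = v(N),
  \ \ s_{ij}(x) = s_{ji}(x) \ \ \forall\, i \neq j \,\Big\}
```

The book's contribution is to express the surpluses $s_{ij}$ through $\pi$ and thereby reduce pre-kernel computation to minimizing a family of quadratic objective functions.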
Cyberspace is increasingly important to people in their everyday lives, from purchasing goods on the Internet to energy supplies that are increasingly managed remotely using Internet protocols. Unfortunately, this dependence makes us susceptible to attacks from nation states, terrorists, criminals and hacktivists. We therefore need a better understanding of cyberspace, in which patterns (predictable regularities) may help us detect, understand and respond to incidents more effectively. The inspiration for the workshop came from existing work on formalising design patterns applied to cybersecurity, but we also need to understand the many other types of patterns that arise in cyberspace.
Walter Gautschi has written extensively on topics ranging from special functions, quadrature and orthogonal polynomials to difference and differential equations, software implementations, and the history of mathematics. He is world renowned for his pioneering work in numerical analysis and constructive orthogonal polynomials, including a definitive textbook in the former area and a monograph in the latter. This three-volume set, Walter Gautschi: Selected Works with Commentaries, is a compilation of Gautschi's most influential papers and includes commentaries by leading experts. The work begins with a detailed biographical section and ends with a section commemorating Walter's prematurely deceased twin brother. This title will appeal to graduate students and researchers in numerical analysis, as well as to historians of science. Vol. 1: Numerical Conditioning; Special Functions; Interpolation and Approximation. Vol. 2: Orthogonal Polynomials on the Real Line; Orthogonal Polynomials on the Semicircle; Chebyshev Quadrature; Kronrod and Other Quadratures; Gauss-type Quadrature. Vol. 3: Linear Difference Equations; Ordinary Differential Equations; Software; History and Biography; Miscellanea; Works of Werner Gautschi.
Thirty years ago mathematical, as opposed to applied numerical, computation was difficult to perform and so relatively little used. Three threads changed that: the emergence of the personal computer; the discovery of fiber optics and the consequent development of the modern internet; and the building of the three "M's": Maple, Mathematica and Matlab. We intend to persuade the reader that Mathematica and other similar tools are worth knowing, assuming only that one wishes to be a mathematician, a mathematics educator, a computer scientist, an engineer or scientist, or anyone else who wishes or needs to use mathematics better. We also hope to explain how to become an "experimental mathematician" while learning to be better at proving things. To accomplish this our material is divided into three main chapters followed by a postscript. These cover elementary number theory, calculus of one and several variables, introductory linear algebra, and visualization and interactive geometric computation.
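A tiny probe in the "experimental mathematics" spirit the authors describe (the book itself works in Mathematica; this Python equivalent is ours): computing partial sums of a series to guess its limit before attempting a proof.

```python
def partial_zeta2(n: int) -> float:
    """Partial sums of sum_{k>=1} 1/k^2. The numerical evidence points to
    the limit pi^2/6 (Euler's famous result) long before one proves it --
    a classic first exercise in experimental mathematics."""
    return sum(1.0 / (k * k) for k in range(1, n + 1))
```

Comparing `partial_zeta2(10**5)` against `math.pi**2 / 6` already agrees to four decimal places, which is the kind of evidence that motivates seeking a proof.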
A modern information retrieval system must have the capability to find, organize and present very different manifestations of information - such as text, pictures, videos or database records - any of which may be of relevance to the user. However, the concept of relevance, while seemingly intuitive, is actually hard to define, and it's even harder to model in a formal way. Lavrenko does not attempt to bring forth a new definition of relevance, nor provide arguments as to why any particular definition might be theoretically superior or more complete. Instead, he takes a widely accepted, albeit somewhat conservative definition, makes several assumptions, and from them develops a new probabilistic model that explicitly captures that notion of relevance. With this book, he makes two major contributions to the field of information retrieval: first, a new way to look at topical relevance, complementing the two dominant models, i.e., the classical probabilistic model and the language modeling approach, and which explicitly combines documents, queries, and relevance in a single formalism; second, a new method for modeling exchangeable sequences of discrete random variables which does not make any structural assumptions about the data and which can also handle rare events. Thus his book is of major interest to researchers and graduate students in information retrieval who specialize in relevance modeling, ranking algorithms, and language modeling.
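Lavrenko's relevance-model idea can be sketched as follows. This is a bare-bones RM1-style estimate with maximum-likelihood document language models and a uniform document prior; it is an illustrative simplification, not the book's full formalism (which notably handles smoothing and rare events):

```python
from collections import Counter

def relevance_model(query, docs):
    """RM1-style sketch: P(w|R) ~ sum_D P(w|D) * P(Q|D), i.e. each
    document's word distribution contributes in proportion to how
    likely it is to have generated the query."""
    def p_w_given_d(w, tf, dlen):
        return tf[w] / dlen if dlen else 0.0

    models = [(Counter(d), len(d)) for d in docs]
    # query likelihood under each document language model
    weights = []
    for tf, dlen in models:
        p = 1.0
        for q in query:
            p *= p_w_given_d(q, tf, dlen)
        weights.append(p)
    z = sum(weights) or 1.0  # normalizer (guard against all-zero weights)
    vocab = {w for d in docs for w in d}
    return {w: sum((wt / z) * p_w_given_d(w, tf, dlen)
                   for wt, (tf, dlen) in zip(weights, models))
            for w in vocab}
```

The resulting distribution over the vocabulary can then be used to rank documents, e.g. by cross-entropy against each document model.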
This book discusses the latest advances in algorithms for symbolic summation, factorization, symbolic-numeric linear algebra and linear functional equations. It presents a collection of papers on original research topics from the Waterloo Workshop on Computer Algebra (WWCA-2016), a satellite workshop of the International Symposium on Symbolic and Algebraic Computation (ISSAC'2016), which was held at Wilfrid Laurier University (Waterloo, Ontario, Canada) on July 23-24, 2016. This workshop and the resulting book celebrate the 70th birthday of Sergei Abramov (Dorodnicyn Computing Centre of the Russian Academy of Sciences, Moscow), whose highly regarded and inspirational contributions to symbolic methods have become a crucial benchmark of computer algebra and have been broadly adopted by many Computer Algebra systems.
In recent years, IT application scenarios have evolved in very
innovative ways. Highly distributed networks have now become a
common platform for large-scale distributed programming, high
bandwidth communications are inexpensive and widespread, and most
of our work tools are equipped with processors enabling us to
perform a multitude of tasks. In addition, mobile computing
(referring specifically to wireless devices and, more broadly, to
dynamically configured systems) has made it possible to exploit
interaction in novel ways.
This book describes a broad research program on quantum communication. Here, a cryptographic key is exchanged by two parties using quantum states of light and the security of the system arises from the fundamental properties of quantum mechanics. The author developed new communication protocols using high-dimensional quantum states so that more than one classical bit is transferred by each photon. This approach helps circumvent some of the non-ideal properties of the experimental system, enabling record key rates on metropolitan distance scales. Another important aspect of the work is the encoding of the key on high-dimensional phase-randomized weak coherent states, combined with so-called decoy states to thwart a class of possible attacks on the system. The experiments are backed up by a rigorous security analysis of the system, which accounts for all known device non-idealities. The author goes on to demonstrate a scalable approach for increasing the dimension of the quantum states, and considers attacks on the system that use optimal quantum cloning techniques. This thesis captures the current state-of-the-art of the field of quantum communication in laboratory systems, and demonstrates that phase-randomized weak coherent states have application beyond quantum communication.
- Feller Semigroups, Bernstein Type Operators and Generalized Convexity Associated with Positive Projections
- Gregory's Rational Cubic Splines in Interpolation Subject to Derivative Obstacles
- Interpolation by Splines on Triangulations (Oleg Davydov)
- On the Use of Quasi-Newton Methods in DAE-Codes
- On the Regularity of Some Differential Operators
- Some Inequalities for Trigonometric Polynomials and their Derivatives
- Inf-Convolution and Radial Basis Functions
- On a Special Property of the Averaged Modulus for Functions of Bounded Variation
- A Simple Approach to the Variational Theory for Interpolation on Spheres
- Constants in Comonotone Polynomial Approximation: A Survey
- Will Ramanujan Kill Baker-Gammel-Wills? (A Selective Survey of Padé Approximation)
- Approximation Operators of Binomial Type
- Certain Results Involving Gamma Operators
- Recent Research at Cambridge on Radial Basis Functions
- Representation of Quasi-Interpolants as Differential Operators and Applications
- Native Hilbert Spaces for Radial Basis Functions I
- Adaptive Approximation with Walsh-Similar Functions
- Dual Recurrence and Christoffel-Darboux-Type Formulas for Orthogonal Polynomials
- On Some Problems of Weighted Polynomial Approximation and Interpolation
- Asymptotics of Derivatives of Orthogonal Polynomials Based on Generalized Jacobi Weights: Some New Theorems and Applications
- List of Participants
This volume is the first diverse and comprehensive treatment of algorithms and architectures for the realization of neural network systems. It presents techniques and methods from numerous areas of this broad subject. The book covers major neural network system structures for achieving effective systems, and illustrates them with examples.
This text explains how advances in wavelet analysis provide new means for multiresolution analysis and describes its wide array of powerful tools. The book covers such topics as: the variations of the windowed Fourier transform; constructions of special waveforms suitable for specific tasks; the use of redundant representations in reconstruction and enhancement; applications of efficient numerical compression as a tool for fast numerical analysis; and approximation properties of various waveforms in different contexts.
Recent developments in computer science clearly show the need for a
better theoretical foundation for some central issues. Methods and
results from mathematical logic, in particular proof theory and
model theory, are of great help here and will be used much more in
the future than previously. This book provides an excellent
introduction to the interplay of mathematical logic and computer
science. It contains extensively reworked versions of the lectures
given at the 1997 Marktoberdorf Summer School by leading
researchers in the field.
Robust Technology with Analysis of Interference in Signal Processing discusses, for the first time, the theoretical fundamentals and algorithms of the analysis of noise as an information carrier. On this basis, a robust technology for processing noisy signals is developed. This technology can be applied to problems of control, identification, diagnostics, and pattern recognition in petrochemistry, the energy industry, geophysics, medicine, physics, aviation, and other sciences and industries. The text explores the emerging possibility of forecasting failures of various objects, exploiting the fact that failures follow hidden microchanges revealed via interference estimates. This monograph is of interest to students, postgraduates, engineers, scientific associates and others who are concerned with the processing of measurement information on computers.
This book provides a snapshot of the state of the art of the rapidly evolving field of integration of geometric data in finite element computations. The contributions to this volume, based on research presented at the UCL workshop on the topic in January 2016, include three review papers on core topics such as fictitious domain methods for elasticity, trace finite element methods for partial differential equations defined on surfaces, and Nitsche's method for contact problems. Five chapters present original research articles on related theoretical topics, including Lagrange multiplier methods, interface problems, bulk-surface coupling, and approximation of partial differential equations on moving domains. Finally, two chapters discuss advanced applications such as crack propagation or flow in fractured poroelastic media. This is the first volume that provides a comprehensive overview of the field of unfitted finite element methods, including recent techniques such as cutFEM, traceFEM, ghost penalty, and augmented Lagrangian techniques. It is aimed at researchers in applied mathematics, scientific computing or computational engineering.
This book concerns non-linguistic knowledge required to perform computational natural language understanding (NLU). The main objective of the book is to show that inference-based NLU has the potential for practical large-scale applications. First, an introduction to research areas relevant for NLU is given. We review approaches to linguistic meaning, explore knowledge resources, describe semantic parsers, and compare two main forms of inference: deduction and abduction. In the main part of the book, we propose an integrative knowledge base combining lexical-semantic, ontological, and distributional knowledge. Particular attention is paid to ensuring its consistency. We then design a reasoning procedure able to make use of the large-scale knowledge base. We experiment both with a deduction-based NLU system and with an abductive reasoner. For evaluation, we use three different NLU tasks: recognizing textual entailment, semantic role labeling, and interpretation of noun dependencies.
This is the first comprehensive treatment of subjective logic and all its operations. The author developed the approach, and in this book he first explains subjective opinions, opinion representation, and decision-making under vagueness and uncertainty, and he then offers a full definition of subjective logic, harmonising the key notations and formalisms, concluding with chapters on trust networks and subjective Bayesian networks, which when combined form general subjective networks. The author shows how real-world situations can be realistically modelled with regard to how situations are perceived, with conclusions that more correctly reflect the ignorance and uncertainties that result from partially uncertain input arguments. The book will help researchers and practitioners to advance, improve and apply subjective logic to build powerful artificial reasoning models and tools for solving real-world problems. A good grounding in discrete mathematics is a prerequisite.
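A minimal sketch of the binomial-opinion representation and cumulative fusion central to subjective logic; the base-rate handling below is a deliberate simplification (the book gives the exact fusion rules, including dogmatic-opinion cases):

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    """Binomial opinion: belief b, disbelief d, uncertainty u (b + d + u = 1),
    and base rate a (the prior probability in the absence of evidence)."""
    b: float
    d: float
    u: float
    a: float = 0.5

    def projected_probability(self) -> float:
        # P(x) = b + a * u: uncertainty mass is projected onto the base rate
        return self.b + self.a * self.u

def cumulative_fuse(o1: Opinion, o2: Opinion) -> Opinion:
    """Cumulative belief fusion of two non-dogmatic opinions (u1, u2 > 0):
    combines independent evidence, reducing overall uncertainty."""
    k = o1.u + o2.u - o1.u * o2.u
    return Opinion(
        b=(o1.b * o2.u + o2.b * o1.u) / k,
        d=(o1.d * o2.u + o2.d * o1.u) / k,
        u=(o1.u * o2.u) / k,
        a=(o1.a + o2.a) / 2,  # simplification; the exact base-rate rule differs
    )
```

Fusing two opinions with uncertainties 0.2 and 0.4, for instance, yields a fused uncertainty of about 0.15, reflecting the accumulation of evidence.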
This textbook provides a first introduction to mathematical logic which is closely attuned to the applications of logic in computer science. In it the authors emphasize the notion that deduction is a form of computation. While all the traditional subjects of logic are covered thoroughly (syntax, semantics, completeness, and compactness), much of the book deals with less traditional topics such as resolution theorem proving, logic programming and non-classical logics (modal and intuitionistic) which are becoming increasingly important in computer science. No previous exposure to logic is assumed, so the book is suitable for upper-level undergraduates or beginning graduate students in computer science or mathematics. From reviews of the first edition: "... must surely rank as one of the most fruitful textbooks introduced into computer science ... We strongly suggest it as a textbook ..." SIGACT News
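The idea behind resolution theorem proving can be shown, at the propositional level, with a naive saturation procedure; this is an illustrative sketch, not the book's presentation:

```python
def resolvents(c1, c2):
    """All resolvents of two clauses. A clause is a frozenset of literals;
    literals are nonzero ints, where -x denotes the negation of atom x."""
    out = set()
    for lit in c1:
        if -lit in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {-lit})))
    return out

def refutes(clauses):
    """Saturation-based propositional resolution: returns True iff the
    clause set is unsatisfiable, i.e. the empty clause is derivable."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolvents(c1, c2):
                    if not r:          # empty clause: contradiction found
                        return True
                    new.add(r)
        if new <= clauses:             # fixpoint: nothing new derivable
            return False
        clauses |= new
```

For example, {p}, {not p or q}, {not q} is refuted (the set is unsatisfiable), whereas {p}, {q} is not. Practical provers add subsumption, ordering restrictions, and first-order unification on top of this skeleton.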
The book introduces new techniques which imply rigorous lower bounds on the complexity of some number-theoretic and cryptographic problems. These methods and techniques are based on bounds of character sums and on the numbers of solutions of some polynomial equations over finite fields and residue rings. It also contains a number of open problems and proposals for further research. We obtain several lower bounds, exponential in terms of log p, on the degrees and orders of polynomials, algebraic functions, Boolean functions, and linear recurring sequences coinciding with values of the discrete logarithm modulo a prime p at sufficiently many points (the number of points can be as small as p^(1/2+ε)). These functions are considered over the residue ring modulo p and over the residue ring modulo an arbitrary divisor d of p - 1. The case d = 2 is of special interest, since it corresponds to the representation of the rightmost bit of the discrete logarithm and determines whether the argument is a quadratic residue. We also obtain non-trivial upper bounds on the degree, sensitivity and Fourier coefficients of Boolean functions on the bits of x deciding whether x is a quadratic residue. These results are used to obtain lower bounds on the parallel arithmetic and Boolean complexity of computing the discrete logarithm. For example, we prove that any unbounded fan-in Boolean circuit of sublogarithmic depth computing the discrete logarithm modulo p must be of superpolynomial size.
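Two of the objects discussed, the rightmost bit of the discrete logarithm (quadratic residuosity) and the discrete logarithm itself, can be computed with standard textbook algorithms. The sketch below uses Euler's criterion and baby-step/giant-step; these are generic methods for context, not the character-sum techniques of the book:

```python
import math

def is_quadratic_residue(x: int, p: int) -> bool:
    """Euler's criterion: for an odd prime p and gcd(x, p) = 1,
    x is a quadratic residue mod p iff x^((p-1)/2) == 1 (mod p).
    Equivalently, the rightmost bit of the discrete log of x is 0."""
    return pow(x, (p - 1) // 2, p) == 1

def discrete_log(g: int, h: int, p: int) -> int:
    """Baby-step/giant-step: finds x with g^x == h (mod p) in
    O(sqrt(p)) time and space (a generic-group method)."""
    m = math.isqrt(p - 1) + 1
    baby = {pow(g, j, p): j for j in range(m)}     # baby steps g^j
    g_inv_m = pow(g, -m, p)                        # g^(-m); Python 3.8+
    gamma = h
    for i in range(m):                             # giant steps h * g^(-im)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = gamma * g_inv_m % p
    raise ValueError("no discrete logarithm found")
```

The contrast is the point of the book: while these generic algorithms run in exponential time in log p, the results above show that even approximating the discrete logarithm by low-degree or shallow-circuit objects is provably hard.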
Part I of this book is a practical introduction to working with the Isabelle proof assistant. It teaches you how to write functional programs and inductive definitions and how to prove properties about them in Isabelle's structured proof language. Part II is an introduction to the semantics of imperative languages with an emphasis on applications like compilers and program analysers. The distinguishing feature is that all the mathematics has been formalised in Isabelle and much of it is executable. Part I focusses on the details of proofs in Isabelle; Part II can be read even without familiarity with Isabelle's proof language, as all proofs are described in detail but informally. The book teaches the reader the art of precise logical reasoning and the practical use of a proof assistant as a surgical tool for formal proofs about computer science artefacts. In this sense it represents a formal approach to computer science, not just semantics. The Isabelle formalisation, including the proofs and accompanying slides, is freely available online, and the book is suitable for graduate students, advanced undergraduate students, and researchers in theoretical computer science and logic.
This book publishes a collection of original scientific research articles that address the state-of-art in using partial differential equations for image and signal processing. Coverage includes: level set methods for image segmentation and construction, denoising techniques, digital image inpainting, image dejittering, image registration, and fast numerical algorithms for solving these problems.
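The simplest member of the PDE family surveyed here is linear (heat-equation) diffusion as a denoiser; the following dependency-free sketch uses an explicit finite-difference scheme and is illustrative only (the volume's methods, such as level sets and inpainting, are far more sophisticated):

```python
def heat_diffuse(img, steps=20, dt=0.2):
    """Explicit finite-difference solution of u_t = u_xx + u_yy,
    the simplest PDE denoiser (linear diffusion). `img` is a list of
    lists of floats; dt <= 0.25 is required for stability, and the
    boundary is held fixed."""
    h, w = len(img), len(img[0])
    u = [row[:] for row in img]
    for _ in range(steps):
        v = [row[:] for row in u]
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                lap = (u[i + 1][j] + u[i - 1][j] + u[i][j + 1] + u[i][j - 1]
                       - 4.0 * u[i][j])          # 5-point discrete Laplacian
                v[i][j] = u[i][j] + dt * lap
        u = v
    return u
```

Linear diffusion blurs edges along with noise, which is precisely what motivates the nonlinear and level-set formulations collected in this volume.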
This contributed volume offers a collection of papers presented at the 2016 Network Games, Control, and Optimization conference (NETGCOOP), held at the University of Avignon in France, November 23-25, 2016. These papers highlight the increasing importance of network control and optimization in many networking application domains, such as mobile and fixed access networks, computer networks, social networks, transportation networks, and, more recently, electricity grids and biological networks. Covering a wide variety of both theoretical and applied topics in the areas listed above, the authors explore several conceptual and algorithmic tools that are needed for efficient and robust control operation, performance optimization, and better understanding the relationships between entities that may be acting cooperatively or selfishly in uncertain and possibly adversarial environments. As such, this volume will be of interest to applied mathematicians, computer scientists, engineers, and researchers in other related fields.