Geared primarily to an audience consisting of mathematically advanced undergraduate or beginning graduate students, this text may additionally be used by engineering students interested in a rigorous, proof-oriented systems course that goes beyond the classical frequency-domain material and more applied courses. The minimal mathematical background required is a working knowledge of linear algebra and differential equations. The book covers what constitutes the common core of control theory and is unique in its emphasis on foundational aspects. While covering a wide range of topics written in a standard theorem/proof style, it also develops the necessary techniques from scratch. In this second edition, new chapters and sections have been added, dealing with time optimal control of linear systems, variational and numerical approaches to nonlinear control, nonlinear controllability via Lie-algebraic methods, and controllability of recurrent nets and of linear systems with bounded controls.
This contributed volume offers a collection of papers presented at the 2016 Network Games, Control, and Optimization conference (NETGCOOP), held at the University of Avignon in France, November 23-25, 2016. These papers highlight the increasing importance of network control and optimization in many networking application domains, such as mobile and fixed access networks, computer networks, social networks, transportation networks, and, more recently, electricity grids and biological networks. Covering a wide variety of both theoretical and applied topics in the areas listed above, the authors explore several conceptual and algorithmic tools that are needed for efficient and robust control operation, performance optimization, and better understanding the relationships between entities that may be acting cooperatively or selfishly in uncertain and possibly adversarial environments. As such, this volume will be of interest to applied mathematicians, computer scientists, engineers, and researchers in other related fields.
This book presents research in an interdisciplinary field, resulting from the vigorous and fruitful cross-pollination between traditional deontic logic and computer science. AI researchers have used deontic logic as one of the tools in modelling legal reasoning. Computer scientists have discovered that computer systems (including their interaction with other computer systems and with human agents) can often be productively modelled as norm-governed. So, for example, deontic logic has been applied by computer scientists for specifying bureaucratic systems, access and security policies, and soft design or integrity constraints, and for modelling fault tolerance. In turn, computer scientists and AI researchers have also discovered (and made it clear to the rest of us) that various formal tools (e.g. nonmonotonic, temporal and dynamic logics) developed in computer science and artificial intelligence have interesting applications to traditional issues in deontic logic. This volume presents some of the best work done in this area, with the selection at once reflecting the general interdisciplinary (and international) character that this area of research has taken on, as well as reflecting the more specific recent inter-disciplinary developments between traditional deontic logic and computer science.
This book illustrates the broad range of Jerry Marsden's mathematical legacy in areas of geometry, mechanics, and dynamics, from very pure mathematics to very applied, but always with a geometric perspective. Each contribution develops its material from the viewpoint of geometric mechanics beginning at the very foundations, introducing readers to modern issues via illustrations in a wide range of topics. The twenty refereed papers contained in this volume are based on lectures and research performed during the month of July 2012 at the Fields Institute for Research in Mathematical Sciences, in a program in honor of Marsden's legacy. The unified treatment of the wide breadth of topics treated in this book will be of interest to both experts and novices in geometric mechanics. Experts will recognize applications of their own familiar concepts and methods in a wide variety of fields, some of which they may never have approached from a geometric viewpoint. Novices may choose topics that interest them among the various fields and learn about geometric approaches and perspectives toward those topics that will be new for them as well.
The approximation of a continuous function by either an algebraic polynomial, a trigonometric polynomial, or a spline, is an important issue in application areas like computer-aided geometric design and signal analysis. This book is an introduction to the mathematical analysis of such approximation, and, with the prerequisites of only calculus and linear algebra, the material is targeted at senior undergraduate level, with a treatment that is both rigorous and self-contained. The topics include polynomial interpolation; Bernstein polynomials and the Weierstrass theorem; best approximations in the general setting of normed linear spaces and inner product spaces; best uniform polynomial approximation; orthogonal polynomials; Newton-Cotes, Gauss and Clenshaw-Curtis quadrature; the Euler-Maclaurin formula; approximation of periodic functions; the uniform convergence of Fourier series; spline approximation, with an extensive treatment of local spline interpolation, and its application in quadrature. Exercises are provided at the end of each chapter.
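To give a flavor of the Bernstein-polynomial material mentioned above (this sketch is illustrative and not drawn from the book), the degree-n Bernstein approximation of a function f on [0, 1] evaluates f at the points k/n and blends those values with binomial weights; by the Weierstrass theorem it converges uniformly to f:

```python
from math import comb

def bernstein_approx(f, n):
    """Return B_n(f), the degree-n Bernstein approximation of f on [0, 1]."""
    def B(x):
        # B_n(f, x) = sum_{k=0}^{n} f(k/n) * C(n, k) * x^k * (1-x)^(n-k)
        return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
                   for k in range(n + 1))
    return B

# Example: approximate f(x) = x^2; the uniform error shrinks as n grows
# (for this f it is exactly x(1-x)/n, at most 1/(4n)).
f = lambda x: x * x
B20 = bernstein_approx(f, 20)
err = max(abs(B20(i / 100) - f(i / 100)) for i in range(101))
```

With n = 20 the worst-case error on the grid is about 0.0125, matching the 1/(4n) bound.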
In "Distributed Algorithms," Nancy Lynch provides a blueprint for designing, implementing, and analyzing distributed algorithms. She directs her book at a wide audience, including students, programmers, system designers, and researchers. "Distributed Algorithms" contains the most significant algorithms and impossibility results in the area, all in a simple automata-theoretic setting. The algorithms are proved correct, and their complexity is analyzed according to precisely defined complexity measures. The problems covered include resource allocation, communication, consensus among distributed processes, data consistency, deadlock detection, leader election, global snapshots, and many others. The material is organized according to the system model: first by the timing model and then by the interprocess communication mechanism. The material on system models is isolated in separate chapters for easy reference. The presentation is completely rigorous, yet intuitive enough for immediate comprehension. This book familiarizes readers with important problems, algorithms, and impossibility results in the area: readers can then recognize the problems when they arise in practice, apply the algorithms to solve them, and use the impossibility results to determine whether problems are unsolvable. The book also provides readers with the basic mathematical tools for designing new algorithms and proving new impossibility results. In addition, it teaches readers how to reason carefully about distributed algorithms: to model them formally, devise precise specifications for their required behavior, prove their correctness, and evaluate their performance with realistic measures.
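As a taste of the leader-election problem mentioned above, here is a minimal synchronous simulation of the classic LeLann-Chang-Roberts (LCR) ring algorithm, one of the algorithms treated in this area (the simulation style is an illustrative sketch, not the book's own presentation):

```python
def lcr_leader_election(uids):
    """Simulate LCR leader election on a unidirectional ring.

    Each process sends its UID clockwise; it forwards an incoming UID only
    if it exceeds its own, and declares itself leader when its own UID
    comes back around. Returns the elected leader's UID (the maximum).
    """
    n = len(uids)
    pending = list(uids)              # message each process sends next round
    for _ in range(n):                # the max UID returns after n hops
        incoming = [pending[(i - 1) % n] for i in range(n)]
        for i, msg in enumerate(incoming):
            if msg is None:
                pending[i] = None     # nothing received
            elif msg == uids[i]:
                return uids[i]        # own UID came back: elected leader
            elif msg > uids[i]:
                pending[i] = msg      # forward the larger UID
            else:
                pending[i] = None     # discard the smaller UID
    return None
```

For example, `lcr_leader_election([3, 7, 2, 9, 4])` elects 9, using O(n) rounds and O(n^2) messages in the worst case.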
This book explains music's comprehensive ontology, its way of existence and processing, as specified in its compact characterization: music embodies meaningful communication and mediates physically between its emotional and mental layers. The book unfolds in a basic discourse in everyday language that is accessible to everybody who wants to understand what this topic is about. Musical ontology is presented in its fundamental dimensions: its realities, its meaningful communication, and its embodied utterance from musical creators to an interested audience. Although the authors' approach is applicable to every musical genre and is scientific, the book is suitable for non-musicians and non-scientists alike.
This book is designed to make accessible to nonspecialists the still evolving concepts of quantum mechanics and the terminology in which these are expressed. The opening chapters summarize elementary concepts of twentieth century quantum mechanics and describe the mathematical methods employed in the field, with clear explanation of, for example, Hilbert space, complex variables, complex vector spaces and Dirac notation, and the Heisenberg uncertainty principle. After detailed discussion of the Schroedinger equation, subsequent chapters focus on isotropic vectors, used to construct spinors, and on conceptual problems associated with measurement, superposition, and decoherence in quantum systems. Here, due attention is paid to Bell's inequality and the possible existence of hidden variables. Finally, progression toward quantum computation is examined in detail: if quantum computers can be made practicable, enormous enhancements in computing power, artificial intelligence, and secure communication will result. This book will be of interest to a wide readership seeking to understand modern quantum mechanics and its potential applications.
This book offers an overview of the main modern topics in random variables, random processes, and decision theory for solving real-world problems. After an introduction to concepts of statistics and signals, the book introduces many essential applications to signal processing like denoising, texture classification, histogram equalization, deep learning, or feature extraction. The book uses MATLAB algorithms to demonstrate the implementation of the theory in real systems. This makes the contents of the book relevant to students and professionals who need a quick but practical introduction to dealing with random signals and processes.
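To illustrate one of the applications named above, here is a minimal sketch of histogram equalization on a flat list of gray levels (in Python rather than the book's MATLAB; the function name and the classic CDF-scaling mapping are illustrative assumptions, not taken from the book):

```python
def histogram_equalize(pixels, levels=256):
    """Histogram-equalize a flat list of integer gray levels in [0, levels).

    Maps each level through the normalized cumulative histogram so the
    output spreads over the available dynamic range more evenly.
    """
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:                    # cumulative histogram
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # Classic mapping: rescale the CDF onto [0, levels-1].
    lut = [round((c - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
           for c in cdf]
    return [lut[p] for p in pixels]

# A low-contrast image confined to levels 100..103 gets stretched to 0..255.
flat = [100] * 4 + [101] * 4 + [102] * 4 + [103] * 4
out = histogram_equalize(flat)
```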
The main purpose of the present volume is to advance our understanding of the notions of knowledge and context, the connections between them, and the ways in which they can be modeled, in particular formalized - a question of prime importance and utmost relevance to such diverse disciplines as philosophy, linguistics, computer science, artificial intelligence, and cognitive science. Bringing together essays written by world-leading experts and emerging researchers in epistemology, logic, philosophy of language, linguistics, and theoretical computer science, the book examines the formal modeling of knowledge and the knowledge-context link at one or more of three intersections - context and epistemology, epistemology and formalism, and formalism and context - and presents a novel range of approaches to the current discussions that the connections between knowledge, language, action, reasoning, and context continually enliven. It develops powerful ideas that will push the relevant fields forward and gives a sense of the new directions in which mainstream and formal research on knowledge and context is heading.
The Białowieża workshops on Geometric Methods in Physics, taking place in the unique environment of the Białowieża natural forest in Poland, are among the important meetings in the field. Every year some 80 to 100 participants from both mathematics and physics join to discuss new developments and to interchange ideas. The current volume was produced on the occasion of the XXXI meeting in 2012. For the first time the workshop was followed by a School on Geometry and Physics, which consisted of advanced lectures for graduate students and young researchers. Selected speakers of the workshop were asked to contribute, and additional review articles were added. The selection shows that, despite its now long tradition, the workshop remains at the cutting edge of ongoing research. The XXXI workshop had as a special topic the works of the late Boris Vasilievich Fedosov (1938-2011), who is best known for a simple and very natural construction of a deformation quantization for any symplectic manifold, and for his contributions to index theory.
This book presents a selection of peer-reviewed contributions on the latest advances in time series analysis, presented at the International Conference on Time Series and Forecasting (ITISE 2019), held in Granada, Spain, on September 25-27, 2019. The first two parts of the book present theoretical contributions on statistical and advanced mathematical methods, and on econometric models, financial forecasting and risk analysis. The remaining four parts include practical contributions on time series analysis in energy; complex/big data time series and forecasting; time series analysis with computational intelligence; and time series analysis and prediction for other real-world problems. Given this mix of topics, readers will acquire a more comprehensive perspective on the field of time series analysis and forecasting. The ITISE conference series provides a forum for scientists, engineers, educators and students to discuss the latest advances and implementations in the foundations, theory, models and applications of time series analysis and forecasting. It focuses on interdisciplinary research encompassing computer science, mathematics, statistics and econometrics.
The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.
The aim of this book is to present the mathematical theory and the know-how to make computer programs for the numerical approximation of Optimal Control of PDE's. The computer programs are presented in a straightforward generic language. As a consequence they are well structured, clearly explained and can be translated easily into any high level programming language. Applications and corresponding numerical tests are also given and discussed. To our knowledge, this is the first book to put together mathematics and computer programs for Optimal Control in order to bridge the gap between mathematical abstract algorithms and concrete numerical ones. The text is addressed to students and graduates in Mathematics, Mechanics, Applied Mathematics, Numerical Software, Information Technology and Engineering. It can also be used for Master and Ph.D. programs.
When we learn from books or daily experience, we make associations and draw inferences on the basis of information that is insufficient for understanding. One example of insufficient information may be a small sample derived from observing experiments. With this perspective, the need for developing a better understanding of the behavior of a small sample presents a problem that is far beyond purely academic importance. During the past 15 years considerable progress has been achieved in the study of this issue in China. One distinguished result is the principle of information diffusion. According to this principle, it is possible to partly fill gaps caused by incomplete information by changing crisp observations into fuzzy sets so that one can improve the recognition of relationships between input and output. The principle of information diffusion has been proven successful for the estimation of a probability density function. Many successful applications reflect the advantages of this new approach. It also supports an argument that fuzzy set theory can be used not only in "soft" science where some subjective adjustment is necessary, but also in "hard" science where all data are recorded.
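To make the density-estimation use concrete: spreading each crisp observation into a normal diffusion function of width h and averaging the results yields a smooth density estimate from a small sample (a minimal sketch; with a Gaussian diffusion function this coincides formally with kernel density estimation, and the function names here are illustrative, not the book's):

```python
from math import exp, pi, sqrt

def diffusion_density(sample, h):
    """Estimate a density from a small sample by 'diffusing' each crisp
    observation into a normal function of width h and averaging them."""
    n = len(sample)
    def f(x):
        # Each observation xi contributes a Gaussian bump centered at xi.
        return sum(exp(-(x - xi) ** 2 / (2 * h * h))
                   for xi in sample) / (n * h * sqrt(2 * pi))
    return f

# Three observations produce a smooth, properly normalized density.
f = diffusion_density([-1.0, 0.0, 1.0], 0.5)
```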
This book presents recent research on Advanced Computing in Industrial Mathematics, which is one of the most prominent interdisciplinary areas, bringing together mathematics, computer science, scientific computations, engineering, physics, chemistry, medicine, etc. Further, the book presents the major tools used in Industrial Mathematics, which are based on mathematical models, and the corresponding computer codes, which are used to perform virtual experiments to obtain new data or to better understand previous experimental findings. The book gathers the peer-reviewed papers presented at the 11th Annual Meeting of the Bulgarian Section of SIAM (BGSIAM), from December 20 to 22, 2016 in Sofia, Bulgaria.
The book contains the methods and bases of functional analysis that are directly adjacent to the problems of numerical mathematics and its applications; they are what one needs for understanding, from a general viewpoint, the ideas and methods of computational mathematics and of optimization problems for numerical algorithms. Functional analysis in mathematics is now just the small visible part of the iceberg; its relief and summit were formed under the influence of this author's personal experience and tastes. This edition in English contains some additions and changes as compared to the second edition in Russian; discovered errors and misprints have been corrected again here; to the author's distress, they jump incomprehensibly from one edition to another like fleas. The list of literature is far from complete; just a number of textbooks and monographs published in Russian have been included. The author is grateful to S. Gerasimova for her help and patience in the complex process of typing the mathematical manuscript while the author corrected, rearranged, supplemented, simplified, generalized, and improved, as it seemed to him, the book's contents. The author thanks G. Kontarev for the difficult job of translation and V. Klyachin for the excellent figures.
The present volume is a tribute to Gian-Carlo Rota. It is an anthology of the production of a unique collaboration among leading researchers who were greatly influenced by Gian-Carlo Rota's mathematical thought. The book begins with an essay in mathematical biography by H. Crapo, in which the prospects for research opened up by Rota's work are outlined. The subsequent section is devoted to the prestigious Fubini lectures delivered by Gian-Carlo Rota at the Institute for Scientific Interchange in 1998, with a preface by E. Vesentini. These lectures provide the only published documentation of Rota's plans for a fundamental reform of probability theory, a program interrupted by his untimely demise. The lectures by M. Aigner and D. Perrin, specially conceived for this volume, provide self-contained surveys of central topics in combinatorics and theoretical computer science; they will also be of great use to both undergraduate and graduate students. The essays and research papers that appear in the final section present recent developments of some of the mathematical themes promoted by Gian-Carlo Rota. These will be of particular interest as they propose many new problems for research.
Signal processing applications have burgeoned in the past decade. During the same time, signal processing techniques have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This trend will continue as many new signal processing applications are opening up in consumer products and communications systems.
An approach to complexity theory which offers a means of analysing algorithms in terms of their tractability. The authors consider the problem in terms of parameterized languages and, by taking "k-slices" of the language, introduce readers to new classes of algorithms which may be analysed more precisely than was possible until now. The book is as self-contained as possible and includes a great deal of background material. As a result, computer scientists, mathematicians, and graduate students interested in the design and analysis of algorithms will find much of interest.
Advances in microelectronic technology have made massively parallel computing a reality and triggered an outburst of research activity in parallel processing architectures and algorithms. Distributed memory multiprocessors - parallel computers that consist of microprocessors connected in a regular topology - are increasingly being used to solve large problems in many application areas. In order to use these computers for a specific application, existing algorithms need to be restructured for the architecture and new algorithms developed. The performance of a computation on a distributed memory multiprocessor is affected by the node and communication architecture, the interconnection network topology, the I/O subsystem, and the parallel algorithm and communication protocols. Each of these parameters is a complex problem in itself, and solutions require an understanding of the interactions among them. This book is based on the papers presented at the NATO Advanced Study Institute held at Bilkent University, Turkey, in July 1991. The book is organized in five parts: Parallel computing structures and communication, Parallel numerical algorithms, Parallel programming, Fault tolerance, and Applications and algorithms.
Strategies for Quasi-Monte Carlo builds a framework to design and analyze strategies for randomized quasi-Monte Carlo (RQMC). One key to efficient simulation using RQMC is to structure problems to reveal a small set of important variables, their number being the effective dimension, while the other variables collectively are relatively insignificant. Another is smoothing. The book provides many illustrations of both keys, in particular for problems involving Poisson processes or Gaussian processes. RQMC beats grids by a huge margin. With low effective dimension, RQMC is an order of magnitude more efficient than standard Monte Carlo. With, in addition, certain smoothness - perhaps induced - RQMC is an order of magnitude more efficient than deterministic QMC. Unlike the latter, RQMC permits error estimation via the central limit theorem. For random-dimensional problems, such as occur with discrete-event simulation, RQMC is judiciously combined with standard Monte Carlo to keep memory requirements bounded. This monograph has been designed to appeal to a diverse audience, including those with applications in queueing, operations research, computational finance, mathematical programming, partial differential equations (both deterministic and stochastic), and particle transport, as well as to probabilists and statisticians wanting to know how to apply effectively a powerful tool, and to those interested in numerical integration or optimization in their own right. It recognizes that the heart of practical application is algorithms, so pseudocodes appear throughout the book. While not primarily a textbook, it is suitable as a supplementary text for certain graduate courses. As a reference, it belongs on the shelf of everyone with a serious interest in achieving more than incremental improvements in simulation efficiency.
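The error-estimation point above can be sketched concretely: randomizing a deterministic rank-1 lattice rule with independent random shifts gives independent unbiased estimates, so a standard error follows from the central limit theorem (a minimal illustration; the generating vector below is an arbitrary example, not one recommended by the book):

```python
import random
from math import sqrt

def rqmc_estimate(f, dim, n_points, n_shifts):
    """Randomized QMC via randomly shifted rank-1 lattice points.

    Each independent random shift yields one unbiased estimate of the
    integral of f over [0,1]^dim; their mean and standard error then
    follow from the central limit theorem.
    """
    gen = [1, 182667, 469891][:dim]   # illustrative generating vector
    estimates = []
    for _ in range(n_shifts):
        shift = [random.random() for _ in range(dim)]
        total = 0.0
        for k in range(n_points):
            # k-th lattice point, shifted modulo 1 in each coordinate.
            x = [((k * g) / n_points + s) % 1.0 for g, s in zip(gen, shift)]
            total += f(x)
        estimates.append(total / n_points)
    mean = sum(estimates) / n_shifts
    var = sum((e - mean) ** 2 for e in estimates) / (n_shifts - 1)
    return mean, sqrt(var / n_shifts)  # point estimate and standard error

random.seed(0)
# Integral of x1 + x2 over the unit square is exactly 1.
mean, se = rqmc_estimate(lambda x: x[0] + x[1], 2, 512, 8)
```

The returned standard error is what deterministic QMC cannot provide directly.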
This book takes a unique approach to information retrieval by laying down the foundations for a modern algebra of information retrieval based on lattice theory. All major retrieval methods developed so far are described in detail - Boolean, Vector Space and probabilistic methods, but also Web retrieval algorithms like PageRank, HITS, and SALSA - and the author shows that they can all be treated elegantly in a unified formal way, using lattice theory as the one basic concept. Further, he also demonstrates that the lattice-based approach to information retrieval allows us to formulate new retrieval methods. Sándor Dominich's presentation is characterized by an engineering-like approach, describing all methods and technologies with as much mathematics as needed for clarity and exactness. His readers in both computer science and mathematics will learn how one single concept can be used to understand the most important retrieval methods, to propose new ones, and also to gain new insights into retrieval modeling in general. Thus, his book is appropriate for researchers and graduate students, who will additionally benefit from the many exercises at the end of each chapter.
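Of the Web retrieval algorithms named above, PageRank is the easiest to sketch: it is the stationary distribution of a random surfer who follows links with probability d and teleports uniformly otherwise, computed by power iteration (a standard textbook formulation, not the book's lattice-theoretic treatment; the dangling-node handling is one common convention):

```python
def pagerank(links, d=0.85, tol=1e-10):
    """Power-iteration PageRank over a dict mapping node -> outgoing links.

    Dangling nodes (no outlinks) spread their mass uniformly, so the
    ranks always sum to 1.
    """
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    while True:
        dangling = sum(rank[u] for u in nodes if not links[u])
        # Teleportation term plus the redistributed dangling mass.
        new = {u: (1 - d) / n + d * dangling / n for u in nodes}
        for u in nodes:
            for v in links[u]:
                new[v] += d * rank[u] / len(links[u])
        if sum(abs(new[u] - rank[u]) for u in nodes) < tol:
            return new
        rank = new

# Tiny example: 'a' and 'b' cite each other; 'c' only cites 'a'.
r = pagerank({'a': ['b'], 'b': ['a'], 'c': ['a']})
```

As expected, the mutually linked pages outrank the uncited one.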
This volume gathers selected papers presented at the Fourth Asian Workshop on Philosophical Logic, held in Beijing in October 2018. The contributions cover a wide variety of topics in modal logic (epistemic logic, temporal logic and dynamic logic), proof theory, algebraic logic, game logics, and philosophical foundations of logic. They also reflect the interdisciplinary nature of logic - a subject that has been studied in fields as diverse as philosophy, linguistics, mathematics, computer science and artificial intelligence. The book also presents the latest developments in logic both in Asia and beyond.
Automated and semi-automated manipulation of so-called labelled transition systems has become an important means of discovering flaws in software and hardware systems. Process algebra has been developed to express such labelled transition systems algebraically, which enhances the ways of manipulation by means of equational logic and term rewriting. The theory of process algebra has developed rapidly over the last twenty years, and verification tools have been developed on the basis of process algebra, often in cooperation with techniques related to model checking. This textbook gives a thorough introduction to the basics of process algebra and its applications.