This monograph proposes a new way of implementing interaction in logic. It also provides an elementary introduction to Constructive Type Theory (CTT). The authors equally emphasize basic ideas and finer technical details. In addition, many worked-out exercises and examples will help readers to better understand the concepts under discussion. One of the chief ideas animating this study is that the dialogical understanding of definitional equality and its execution provide both a simple and a direct way of implementing the CTT approach within a game-theoretical conception of meaning. In addition, the importance of the play level over the strategy level is stressed, binding together the matter of execution with that of equality and the finitary perspective on games constituting meaning. On this perspective, the games from which concepts emerge are not only games of giving and asking for reasons (games involving Why-questions); they are also games that include moves establishing how the reasons brought forward accomplish their explicative task. Thus, immanent reasoning games are dialogical games of Why and How.
This book constitutes the refereed proceedings of the 4th EAI International Conference on Industrial Networks and Intelligent Systems, INISCOM 2018, held in Da Nang, Vietnam, in August 2018. The 26 full papers were selected from 38 submissions and are organized thematically in tracks: Telecommunications Systems and Networks; Industrial Networks and Applications; Hardware and Software Design and Development; Information Processing and Data Analysis; Signal Processing; Security and Privacy.
The revised edition of this book offers an extended overview of quantum walks and explains their role in building quantum algorithms, in particular search algorithms. Updated throughout, the book focuses on core topics including Grover's algorithm and the most important quantum walk models, such as the coined, continuous-time, and Szegedy's quantum walk models. There is a new chapter describing the staggered quantum walk model. The chapter on spatial search algorithms has been rewritten to offer a more comprehensive approach and a new chapter describing the element distinctness algorithm has been added. There is also a new appendix on graph theory, highlighting its importance to quantum walks. As before, the reader will benefit from the pedagogical elements of the book, which include exercises and references to deepen understanding, and guidelines for the use of computer programs to simulate the evolution of quantum walks. Review of the first edition: "The book is nicely written, the concepts are introduced naturally, and many meaningful connections between them are highlighted. The author proposes a series of exercises that help the reader get some working experience with the presented concepts, facilitating a better understanding. Each chapter ends with a discussion of further references, pointing the reader to major results on the topics presented in the respective chapter." - Florin Manea, zbMATH.
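For readers curious about the simulations the book recommends, here is a minimal sketch (in Python with NumPy, not code from the book) of a discrete-time coined quantum walk on a line; the Hadamard coin, lattice size, and step count are this sketch's assumptions.

```python
import numpy as np

# A minimal discrete-time coined quantum walk on a line. The Hadamard
# coin, lattice size, and step count are illustrative assumptions.
N, steps = 101, 40
psi = np.zeros((2, N), dtype=complex)   # amplitudes psi[coin, position]
psi[0, N // 2] = 1.0                    # walker starts at the center

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard coin

for _ in range(steps):
    psi = H @ psi                  # apply the coin at every position
    psi[0] = np.roll(psi[0], -1)   # coin |0> shifts one site left
    psi[1] = np.roll(psi[1], +1)   # coin |1> shifts one site right

prob = (np.abs(psi) ** 2).sum(axis=0)   # position distribution
x = np.arange(N) - N // 2
print("total probability:", prob.sum())            # ~1 (unitary evolution)
print("standard deviation:", np.sqrt((prob * x**2).sum()))
```

The printed standard deviation grows linearly with the number of steps, the ballistic spreading that distinguishes quantum walks from classical random walks.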
This unique text/reference presents a fresh look at nonlinear processing through nonlinear eigenvalue analysis, highlighting how one-homogeneous convex functionals can induce nonlinear operators that can be analyzed within an eigenvalue framework. The text opens with an introduction to the mathematical background, together with a summary of classical variational algorithms for vision. This is followed by a focus on the foundations and applications of the new multi-scale representation based on nonlinear eigenproblems. The book then concludes with a discussion of new numerical techniques for finding nonlinear eigenfunctions, and promising research directions beyond the convex case. Topics and features: introduces the classical Fourier transform and its associated operator and energy, and asks how these concepts can be generalized in the nonlinear case; reviews the basic mathematical notions, briefly outlining the use of variational and flow-based methods to solve image-processing and computer vision problems; describes the properties of the total variation (TV) functional, and how the concept of nonlinear eigenfunctions relates to convex functionals; provides a spectral framework for one-homogeneous functionals, and applies this framework to denoising, texture processing and image fusion; proposes novel ways to solve the nonlinear eigenvalue problem using special flows that converge to eigenfunctions; examines graph-based and nonlocal methods, for which a TV eigenvalue analysis gives rise to strong segmentation, clustering and classification algorithms; presents an approach to generalizing the nonlinear spectral concept beyond the convex case, based on pixel decay analysis; and discusses relations to other branches of image processing, such as wavelets and dictionary-based methods. This original work offers fascinating new insights into established signal processing techniques, integrating deep mathematical concepts from a range of different fields, and will be of great interest to all researchers involved with image processing and computer vision applications, as well as computations for more general scientific problems.
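The central object of this framework can be stated compactly. For a convex one-homogeneous functional J, the linear eigenvalue equation is replaced by a subdifferential inclusion; a sketch in standard notation (the symbols are this summary's choice, not the book's):

```latex
% Nonlinear eigenproblem for a one-homogeneous convex functional J
% (standard formulation; the notation is this sketch's choice):
\[
  \lambda u \,\in\, \partial J(u),
  \qquad
  J(su) = s\,J(u) \ \ \text{for all } s \ge 0,
\]
% e.g. total variation: J(u) = \int_\Omega |\nabla u|\,dx. Because the
% subdifferential \partial J(u) is set-valued, the familiar linear
% eigenvalue equation A u = \lambda u becomes an inclusion.
```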
Succinct representation and fast access to large amounts of data are challenges of our time. This unique book suggests general approaches based on the 'complexity of descriptions'. It deals with a variety of concrete topics and builds bridges between them, opening new perspectives and providing promising avenues for the 'complexity puzzle'.
This book presents the refereed proceedings of the Twelfth International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at Stanford University (California) in August 2016. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics, computer graphics and the solution of PDEs.
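As a toy illustration of the two families of methods these proceedings cover, the following sketch compares a plain Monte Carlo estimate with a quasi-Monte Carlo (scrambled Sobol) estimate of a smooth two-dimensional integral; the integrand and the use of SciPy's qmc module (SciPy 1.7 or later) are this example's choices, not material from the proceedings.

```python
import numpy as np
from scipy.stats import qmc  # quasi-Monte Carlo samplers, SciPy >= 1.7

# Estimate a smooth integral over [0,1]^2 with plain Monte Carlo versus
# a scrambled Sobol sequence. The integrand is an arbitrary choice with
# known exact value (2/pi)^2.
f = lambda x: np.prod(np.sin(np.pi * x), axis=1)
exact = (2 / np.pi) ** 2

m = 12                                   # 2**12 = 4096 points per method
rng = np.random.default_rng(0)
mc_est = f(rng.random((2**m, 2))).mean()            # error ~ O(n^{-1/2})
sobol = qmc.Sobol(d=2, scramble=True, seed=0)
qmc_est = f(sobol.random_base2(m=m)).mean()         # error close to O(n^{-1})

print(f"MC error:  {abs(mc_est - exact):.2e}")
print(f"QMC error: {abs(qmc_est - exact):.2e}")
```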
This book presents the latest developments regarding a detailed mobile agent-enabled anomaly detection and verification system for resource constrained sensor networks; a number of algorithms on multi-aspect anomaly detection in sensor networks; several algorithms on mobile agent transmission optimization in resource constrained sensor networks; an algorithm on mobile agent-enabled in situ verification of anomalous sensor nodes; a detailed Petri Net-based formal modeling and analysis of the proposed system, and an algorithm on fuzzy logic-based cross-layer anomaly detection and mobile agent transmission optimization. As such, it offers a comprehensive text for interested readers from academia and industry alike.
This book presents two new methods for decomposing a time series into intrinsic components of low and high frequency. The methods are based on the Singular Value Decomposition (SVD) of a Hankel matrix (HSVD). The proposed decomposition is used to improve the accuracy of linear and nonlinear auto-regressive models. Linear auto-regressive models (AR, ARMA and ARIMA) and Auto-regressive Neural Networks (ANNs) have been found insufficient for some time series because of their highly complicated nature. Hybrid models, which combine pre-processing techniques with conventional forecasters, are a recent approach to dealing with non-stationary processes; two widely implemented pre-processing techniques are Singular Spectrum Analysis (SSA) and the Stationary Wavelet Transform (SWT). Although the flexibility of SSA and SWT allows their use in a wide range of forecasting problems, there is a lack of standard methods for selecting their parameters. The proposed HSVD and Multilevel SVD decompositions are described in detail using time series from the transport and fishery sectors. For comparison purposes, the forecast accuracy reached by SSA and SWT, each combined with AR-based models and ANNs, is also evaluated.
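A minimal sketch of the Hankel-SVD idea, under SSA-style assumptions about grouping and reconstruction; the window length, rank, and toy series are illustrative choices, not the authors' settings.

```python
import numpy as np
from scipy.linalg import hankel

# Sketch of a Hankel-SVD (HSVD/SSA-style) split: embed the series in a
# Hankel trajectory matrix, keep the leading singular components, and
# map back to a series by anti-diagonal averaging.

def hsvd_split(series, window, rank=1):
    L, N = window, len(series)
    X = hankel(series[:L], series[L - 1:])          # L x (N - L + 1)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    low = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(rank))
    # Anti-diagonal averaging recovers a length-N series from the matrix.
    flipped = low[::-1]
    smooth = np.array([flipped.diagonal(k).mean()
                       for k in range(-(L - 1), N - L + 1)])
    return smooth, series - smooth      # low- and high-frequency parts

t = np.arange(200)
y = np.sin(2 * np.pi * t / 50) + 0.3 * np.random.default_rng(1).normal(size=t.size)
low_freq, high_freq = hsvd_split(y, window=40, rank=2)
```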
This book presents selected peer-reviewed contributions from the International Work-Conference on Time Series, ITISE 2017, held in Granada, Spain, September 18-20, 2017. It discusses topics in time series analysis and forecasting, including advanced mathematical methodology, computational intelligence methods for time series, dimensionality reduction and similarity measures, econometric models, energy time series forecasting, forecasting in real problems, online learning in time series as well as high-dimensional and complex/big data time series. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
Cyber-physical systems (CPSs) combine cyber capabilities, such as computation or communication, with physical capabilities, such as motion or other physical processes. Cars, aircraft, and robots are prime examples, because they move physically in space in a way that is determined by discrete computerized control algorithms. Designing these algorithms is challenging due to their tight coupling with physical behavior, while it is vital that these algorithms be correct because we rely on them for safety-critical tasks. This textbook teaches undergraduate students the core principles behind CPSs. It shows them how to develop models and controls; identify safety specifications and critical properties; reason rigorously about CPS models; leverage multi-dynamical systems compositionality to tame CPS complexity; identify required control constraints; verify CPS models of appropriate scale in logic; and develop an intuition for operational effects. The book is supported with homework exercises, lecture videos, and slides.
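To give a flavor of the safety specifications such a course verifies, here is a schematic formula in the style of differential dynamic logic; the braking model and constants are an illustration, not an excerpt from the book.

```latex
% Schematic CPS safety specification for a braking car, in the style of
% differential dynamic logic (the model and constants are illustrative):
\[
  v^2 \le 2b\,(m - x) \;\wedge\; b > 0
  \;\rightarrow\;
  [\; a := -b;\ \{ x' = v,\ v' = a \ \&\ v \ge 0 \} \;]\; x \le m
\]
% Read: if the car at position x with speed v can still brake to a stop
% before position m, then after setting a := -b and following the ODE it
% never passes m. Proving such box-modality formulas is the kind of
% rigorous reasoning about CPS models described above.
```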
The focus of these conference proceedings is on research, development, and applications in the fields of numerical geometry, scientific computing and numerical simulation, particularly in mesh generation and related problems. In addition, this year's special focus is on Voronoi diagrams and their applications, marking the 150th anniversary of the birth of G.F. Voronoi. In terms of content, the book strikes a balance between engineering algorithms and mathematical foundations. It presents an overview of recent advances in numerical geometry, grid generation and adaptation in terms of mathematical foundations, algorithm and software development and applications. The specific topics covered include: quasi-conformal and quasi-isometric mappings, hyperelastic deformations, multidimensional generalisations of the equidistribution principle, discrete differential geometry, spatial and metric encodings, Voronoi-Delaunay theory for tilings and partitions, duality in mathematical programming and numerical geometry, mesh-based optimisation and optimal control methods. Further aspects examined include iterative solvers for variational problems and algorithm and software development. The applications of the methods discussed are multidisciplinary and include problems from mathematics, physics, biology, chemistry, material science, and engineering.
This volume contains papers based on presentations at the "Nagoya Winter Workshop 2015: Reality and Measurement in Algebraic Quantum Theory (NWW 2015)", held in Nagoya, Japan, in March 2015. The foundations of quantum theory have been a source of mysteries, puzzles, and confusions, and have encouraged innovations in mathematical languages to describe, analyze, and delineate this wonderland. Both ontological and epistemological questions about quantum reality and measurement have been placed in the center of the mysteries explored originally by Bohr, Heisenberg, Einstein, and Schroedinger. This volume describes how those traditional problems are nowadays explored from the most advanced perspectives. It includes new research results in quantum information theory, quantum measurement theory, information thermodynamics, operator algebraic and category theoretical foundations of quantum theory, and the interplay between experimental and theoretical investigations on the uncertainty principle. This book is suitable for a broad audience of mathematicians, theoretical and experimental physicists, and philosophers of science.
This book discusses the elementary ideas and tools needed for open quantum systems in a comprehensive manner. Emphasis is given both to the traditional master-equation approach and to functional (path) integral methods. It discusses the basic paradigms of open systems, the harmonic oscillator and the two-level system, in detail. The traditional topics of dissipation and tunneling, as well as the modern field of quantum information, find a prominent place in the book. Assuming a basic background in quantum and statistical mechanics, this book will help readers familiarize themselves with the basic tools of open quantum systems. The study of open quantum systems concerns the quantum dynamics of a system of interest, taking into account the effects of the ambient environment; the setting is ubiquitous, since any system can be regarded as surrounded by an environment that naturally exerts its influence on it. Open quantum systems allow for a systematic understanding of irreversible processes such as decoherence and dissipation, which is essential for a correct understanding of realistic quantum dynamics as well as for possible implementations, and hence for the development of quantum technologies.
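For orientation, one standard Markovian form of the master equation referred to here is the Lindblad (GKSL) equation; the notation below is this summary's choice.

```latex
% Lindblad (GKSL) form of the Markovian master equation:
\[
  \frac{d\rho}{dt}
  \;=\; -\frac{i}{\hbar}\,[H,\rho]
  \;+\; \sum_k \gamma_k \Big( L_k \rho L_k^\dagger
        - \tfrac{1}{2}\,\big\{ L_k^\dagger L_k,\ \rho \big\} \Big)
\]
% Here \rho is the reduced density matrix of the system, H its
% Hamiltonian, and the jump operators L_k (with rates \gamma_k) encode
% the environment's influence, producing decoherence and dissipation.
```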
This book brings together the latest findings on efficient solutions of multi/many-objective optimization problems from the leading researchers in the field. The focus is on solving real-world optimization problems using strategies ranging from evolutionary to hybrid frameworks, and involving various computation platforms. The topics covered include solution frameworks ranging from evolutionary to hybrid models, in application areas such as Analytics, Cancer Research, Traffic Management, Networks and Communications, E-Governance, Quantum Technology, and Image Processing. As such, the book offers a valuable resource for all postgraduate students and researchers interested in exploring solution frameworks for multi/many-objective optimization problems.
This book contains more than 15 essays that explore issues in truth, existence, and explanation. It features cutting-edge research in the philosophy of mathematics and logic. Renowned philosophers, mathematicians, and younger scholars provide an insightful contribution to the lively debate in this interdisciplinary field of inquiry. The essays look at realism vs. anti-realism as well as inflationary vs. deflationary theories of truth. The contributors also consider mathematical fictionalism, structuralism, the nature and role of axioms, constructive existence, and generality. In addition, coverage looks at the explanatory role of mathematics and the philosophical relevance of mathematical explanation. The book will appeal to a broad mathematical and philosophical audience. It contains work from FilMat, the Italian Network for the Philosophy of Mathematics; the papers collected here were also presented at the network's second international conference, held at the University of Chieti-Pescara in May 2016.
This book explains exactly what human knowledge is. The key concepts in this book are structures and algorithms, i.e., what the readers "see" and how they make use of what they see. Thus, in comparison with some other books on the philosophy (or methodology) of science, which employ a syntactic approach, the author's approach is model-theoretic or structural. Properly understood, it extends the current art and science of mathematical modeling to all fields of knowledge. The link between structure and algorithms is mathematics. But viewing "mathematics" as such a link is not exactly what readers most likely learned in school; thus, the task of this book is to explain what "mathematics" should actually mean. Chapter 1, an introductory essay, presents a general analysis of structures, algorithms and how they are to be linked. Several examples from the natural and social sciences, and from the history of knowledge, are provided in Chapters 2-6. In turn, Chapters 7 and 8 extend the analysis to include language and the mind. Structures are what the readers see. And, as abstract cultural objects, they can almost always be seen in many different ways. But certain structures, such as natural numbers and the basic theory of grammar, seem to have an absolute character. Any theory of knowledge grounded in human culture must explain how this is possible. The author's analysis of this cultural invariance, combining insights from evolutionary theory and neuroscience, is presented in the book's closing chapter. The book will be of interest to researchers, students and those outside academia who seek a deeper understanding of knowledge in our present-day society.
This book discusses recent advances and research in applied mathematics, statistics and their applications in computing. It features papers presented at the fourth conference in the series, organized at the Indian Institute of Technology (Banaras Hindu University), Varanasi, India, on 9-11 January 2018, on areas of current interest, including operations research, soft computing, applied mathematical modelling, cryptology, and security analysis. The conference has emerged as a powerful forum, bringing together leading academic scientists, experts from industry, and researchers, and offering a venue to discuss, interact and collaborate to stimulate the advancement of mathematics and its applications in computer science. The education of future consumers, users, producers, developers and researchers of mathematics and its applications is an important challenge in modern society, and as such, mathematics and its application in computer science are of vital significance to all sectors of the community, as well as to mathematicians and computing professionals across different educational levels and disciplines. With contributions by leading international experts, this book will motivate and create interest among young researchers.
The book presents theory and algorithms for secure networked inference in the presence of Byzantines. It derives fundamental limits of networked inference in the presence of Byzantine data and designs robust strategies to ensure reliable performance for several practical network architectures. In particular, it addresses inference (or learning) processes such as detection, estimation or classification, and parallel, hierarchical, and fully decentralized (peer-to-peer) system architectures. Furthermore, it discusses a number of new directions and heuristics to tackle the problem of design complexity in these practical network architectures for inference.
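A toy simulation conveys the basic setting for the parallel architecture: honest sensors report noisy binary decisions, Byzantine sensors flip theirs, and the fusion center votes. All parameters below are illustrative assumptions, not the book's design.

```python
import numpy as np

# Toy parallel-fusion detection with Byzantine sensors: honest sensors
# report a noisy binary decision about the true hypothesis, Byzantine
# sensors flip their report, and the fusion center takes a majority vote.

rng = np.random.default_rng(7)
n_sensors, p_correct, frac_byzantine = 25, 0.8, 0.2
truth = 1                                      # true binary hypothesis

reports = np.where(rng.random(n_sensors) < p_correct, truth, 1 - truth)
byz = rng.random(n_sensors) < frac_byzantine
reports[byz] = 1 - reports[byz]                # Byzantine nodes flip decisions

decision = int(reports.sum() > n_sensors / 2)  # majority-vote fusion rule
print("fusion decision:", decision, "| truth:", truth)
```

With a fixed flipping strategy, majority fusion degrades gracefully until the Byzantine fraction overwhelms the honest sensors' accuracy, which is the kind of fundamental limit the book characterizes precisely.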
This open access book covers all facets of entity-oriented search, where "search" can be interpreted in the broadest sense of information access, from a unified point of view, and provides a coherent and comprehensive overview of the state of the art. It represents the first synthesis of research in this broad and rapidly developing area. Selected topics are discussed in depth, the goal being to establish fundamental techniques and methods as a basis for future research and development. Additional topics are treated at a survey level only, with numerous pointers to the relevant literature. A roadmap for future research, based on open issues and challenges identified along the way, rounds out the book. The book is divided into three main parts, sandwiched between introductory and concluding chapters. The first two chapters introduce readers to the basic concepts, provide an overview of entity-oriented search tasks, and present the various types and sources of data that will be used throughout the book. Part I deals with the core task of entity ranking: given a textual query, possibly enriched with additional elements or structural hints, return a ranked list of entities. This core task is examined in a number of different variants, using both structured and unstructured data collections, and numerous query formulations. In turn, Part II is devoted to the role of entities in bridging unstructured and structured data. Part III explores how entities can enable search engines to understand the concepts, meaning, and intent behind the query that the user enters into the search box, and how they can provide rich and focused responses (as opposed to merely a list of documents), a process known as semantic search. The final chapter concludes the book by discussing the limitations of current approaches and suggesting directions for future research. Researchers and graduate students are the primary target audience of this book. A general background in information retrieval is sufficient to follow the material, including an understanding of basic probability and statistics concepts as well as a basic knowledge of machine learning concepts and supervised learning algorithms.
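A toy version of the core entity-ranking task described above: score entities against a keyword query with TF-IDF-style matching over short term-based descriptions. The tiny catalog, scoring function, and names are invented for illustration and stand in for the retrieval models the book develops.

```python
import math
from collections import Counter

# Rank entities for a keyword query by a simple TF-IDF-style match
# against short term-based entity descriptions (all data invented).

entities = {
    "Ada Lovelace": "mathematician first computer program analytical engine",
    "Alan Turing": "mathematician computability machine cryptanalysis",
    "Grace Hopper": "computer scientist compiler cobol navy",
}

def rank(query):
    docs = {e: Counter(text.split()) for e, text in entities.items()}
    df = Counter(t for counts in docs.values() for t in counts)  # doc frequency
    n = len(docs)
    def score(counts):
        return sum(counts[t] * math.log(1 + n / df[t])
                   for t in query.split() if t in df)
    return sorted(docs, key=lambda e: score(docs[e]), reverse=True)

print(rank("computer mathematician"))  # Ada Lovelace matches both terms
```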
This book constitutes the refereed proceedings of the 11th International Workshop on Hybrid Metaheuristics, HM 2019, held in Concepcion, Chile, in January 2019. The 11 revised full papers and 5 short papers presented were carefully reviewed and selected from 23 submissions. The papers present hybridization strategies and explore the integration of new techniques coming from other areas of expertise. They cover a variety of topics such as low-level hybridization, high-level hybridization, portfolio techniques, cooperative search, and theoretical aspects of hybridization.
The books in this trilogy capture the foundational core of advanced informatics. The authors make the foundations accessible, enabling students to become effective problem solvers. This first volume establishes the inductive approach as a fundamental principle for system and domain analysis. After a brief introduction to the elementary mathematical structures, such as sets, propositional logic, relations, and functions, the authors focus on the separation between syntax (representation) and semantics (meaning), and on the advantages of the consistent and persistent use of inductive definitions. They identify compositionality as a feature that not only acts as a foundation for algebraic proofs but also as a key for more general scalability of modeling and analysis. A core principle throughout is invariance, which the authors consider a key for the mastery of change, whether in the form of extensions, transformations, or abstractions. This textbook is suitable for undergraduate and graduate courses in computer science and for self-study. Most chapters contain exercises and the content has been class-tested over many years in various universities.
This monograph provides a modern introduction to the theory of quantales. Since the term was coined by C.J. Mulvey in 1986, quantales have developed into a significant topic at the crossroads of algebra and logic, of notable interest to theoretical computer science. This book recasts the subject within the powerful framework of categorical algebra, showcasing its versatility through applications to C*- and MV-algebras, fuzzy sets and automata. With exercises and historical remarks at the end of each chapter, this self-contained book provides readers with a valuable source of references and hints for future research. It will appeal to researchers across mathematics and computer science with an interest in category theory, lattice theory, and many-valued logic.
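For orientation: a quantale is a complete lattice equipped with an associative multiplication that distributes over arbitrary joins on both sides; in standard notation (the symbols are this summary's choice, not the book's):

```latex
% Defining distributivity laws of a quantale (Q, \le, \otimes):
\[
  a \otimes \Big( \bigvee_{i \in I} b_i \Big)
    \;=\; \bigvee_{i \in I} \, (a \otimes b_i),
  \qquad
  \Big( \bigvee_{i \in I} b_i \Big) \otimes a
    \;=\; \bigvee_{i \in I} \, (b_i \otimes a).
\]
```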
This book explores Probabilistic Cellular Automata (PCA) from the perspectives of statistical mechanics, probability theory, computational biology and computer science. PCA are extensions of the well-known Cellular Automata models of complex systems, characterized by random updating rules. Thanks to their probabilistic component, PCA offer flexible computing tools for complex numerical constructions, and realistic simulation tools for phenomena driven by interactions among a large number of neighboring structures. PCA are currently being used in various fields, ranging from pure probability to the social sciences and including a wealth of scientific and technological applications. This situation has produced a highly diversified pool of theoreticians, developers and practitioners whose interaction is highly desirable but can be hampered by differences in jargon and focus. This book - just as the workshop on which it is based - is an attempt to overcome these differences and foster interest among newcomers and interaction between practitioners from different fields. It is not intended as a treatise, but rather as a gentle introduction to the role and relevance of PCA technology, illustrated with a number of applications in probability, statistical mechanics, computer science, the natural sciences and dynamical systems. As such, it will be of interest to students and non-specialists looking to enter the field and to explore its challenges and open issues.
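The defining feature, random updating rules, is easy to see in a minimal sketch: a synchronous noisy-majority rule in one dimension, where each cell adopts its neighborhood majority and then flips with a small probability. The rule and parameters are illustrative assumptions.

```python
import numpy as np

# One-dimensional probabilistic cellular automaton: every cell adopts the
# majority of its three-cell neighborhood (deterministic CA part), then
# flips with probability eps (the random updating rule).

rng = np.random.default_rng(42)
n_cells, n_steps, eps = 200, 100, 0.05

state = rng.integers(0, 2, n_cells)
history = [state.copy()]
for _ in range(n_steps):
    left, right = np.roll(state, 1), np.roll(state, -1)
    majority = ((left + state + right) >= 2).astype(int)  # deterministic part
    flip = rng.random(n_cells) < eps                      # probabilistic part
    state = np.where(flip, 1 - majority, majority)
    history.append(state.copy())

print("fraction of ones at the end:", state.mean())
```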
The book provides: 1) an in-depth explanation of rough set theory, with examples of its concepts; 2) a detailed discussion of the idea of feature selection; 3) details of representative and state-of-the-art feature selection techniques, with algorithmic explanations; 4) a critical review of state-of-the-art rough-set-based feature selection methods, covering the strengths and weaknesses of each; 5) an in-depth investigation of various application areas using rough-set-based feature selection; 6) a complete library of rough set APIs, with complexity analysis and a detailed usage manual; and 7) program files of representative feature selection algorithms, each with an explanation. The book is a complete and self-sufficient source for both its primary and secondary audiences. Ranging from basic concepts to state-of-the-art implementation, it is a constant source of help for practitioners and researchers alike: the in-depth explanation of concepts is supplemented with working examples to aid practical implementation, so that researchers and practitioners can concentrate fully on their own work without worrying about implementing basic RST functionality. Providing complexity analysis along with full working programs further simplifies the analysis and comparison of algorithms.
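As a minimal illustration of the rough-set machinery behind such feature selection, the sketch below computes the dependency degree gamma_B(D), the quantity that reduct-search algorithms such as QuickReduct maximize; the toy decision table and attribute names are invented for illustration.

```python
from collections import defaultdict

# Rough-set dependency degree gamma_B(D): the fraction of objects whose
# B-indiscernibility class is consistent on the decision attribute d
# (i.e. the size of the positive region relative to the universe).

def dependency(rows, B, d):
    classes = defaultdict(list)
    for row in rows:
        classes[tuple(row[a] for a in B)].append(row[d])
    pos = sum(len(v) for v in classes.values() if len(set(v)) == 1)
    return pos / len(rows)

table = [  # toy decision table: conditional attributes a, b; decision d
    {"a": 0, "b": 0, "d": "no"},
    {"a": 0, "b": 1, "d": "yes"},
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 0, "d": "no"},   # indiscernible from the row above on {a, b}
]
print(dependency(table, ["a", "b"], "d"))  # 0.5: two objects conflict
print(dependency(table, ["b"], "d"))       # 0.25: only the b=1 class is consistent
```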
This book provides practical applications of doubly classified models by using R syntax to generate the models. It also presents these models in symbolic tables so as to cater to those who are not mathematically inclined, while numerous examples throughout the book illustrate the concepts and their applications. For those who are not aware of this modeling approach, it serves as a good starting point to acquire a basic understanding of doubly classified models. It is also a valuable resource for academics, postgraduate students, undergraduates, data analysts and researchers who are interested in examining square contingency tables.
You may like...
Hazardous Waste Management - Rajesh Banu Jeyakumar, Kavitha Sankarapandian, … (Hardcover): R3,458 (Discovery Miles 34 580)
Handbook of the Birds of the World, v. 2… - Josep del Hoyo, Andrew Elliott, … (Hardcover): R4,208 (Discovery Miles 42 080)
Best Practices in Green Supply Chain… - Sadia Samar Ali, Rajbir Kaur, … (Hardcover): R2,724 (Discovery Miles 27 240)
Computational Hydrodynamics of Capsules… - Constantine Pozrikidis (Paperback): R2,042 (Discovery Miles 20 420)
Morphogenesis - An Analysis of the… - Edward F. Rossomando, Stephen Alexander (Hardcover): R9,876 (Discovery Miles 98 760)