This two-volume set constitutes the refereed post-conference proceedings of the 12th International Conference on Simulation Tools and Techniques, SIMUTools 2020, held in Guiyang, China, in August 2020. Due to the COVID-19 pandemic, the conference was held virtually. The 125 revised full papers were carefully selected from 354 submissions. The papers focus on simulation methods, simulation techniques, simulation software, simulation performance, modeling formalisms, simulation verification, and widely used frameworks.
This text provides deep and comprehensive coverage of the mathematical background for data science, including machine learning, optimal recovery, compressed sensing, optimization, and neural networks. In the past few decades, heuristic methods adopted by big tech companies have complemented existing scientific disciplines to form the new field of Data Science. This text takes readers on an engaging itinerary through the theory supporting the field. Altogether, twenty-seven lecture-length chapters with exercises provide all the details necessary for a solid understanding of key topics in data science. While the book covers standard material on machine learning and optimization, it also includes distinctive presentations of topics such as reproducing kernel Hilbert spaces, spectral clustering, optimal recovery, compressed sensing, group testing, and applications of semidefinite programming. Students and data scientists with less mathematical background will appreciate the appendices, which provide more background on some of the more abstract concepts.
This book on optimization includes forewords by Michael I. Jordan, Zongben Xu and Zhi-Quan Luo. Machine learning relies heavily on optimization to solve problems with its learning models, and first-order optimization algorithms are the mainstream approaches. The acceleration of first-order optimization algorithms is crucial for the efficiency of machine learning. Written by leading experts in the field, this book provides a comprehensive introduction to, and state-of-the-art review of, accelerated first-order optimization algorithms for machine learning. It discusses a variety of methods, including deterministic and stochastic algorithms, where the algorithms can be synchronous or asynchronous, for unconstrained and constrained problems, which can be convex or non-convex. Offering a rich blend of ideas, theories and proofs, the book is up-to-date and self-contained. It is an excellent reference resource for users seeking faster optimization algorithms, as well as for graduate students and researchers wanting to grasp the frontiers of optimization in machine learning in a short time.
Inverse problems such as imaging or parameter identification deal with the recovery of unknown quantities from indirect observations, connected via a model describing the underlying context. While inverse problems have traditionally been formulated and investigated in a static setting, interest in time-dependence has grown significantly in a number of important applications over the last few years. Here, time-dependence may affect a) the unknown function to be recovered, b) the observed data, and/or c) the underlying process. Challenging applications in the field of imaging and parameter identification include photoacoustic tomography, elastography, dynamic computerized or emission tomography, dynamic magnetic resonance imaging, super-resolution in image sequences and videos, health monitoring of elastic structures, optical flow problems, and magnetic particle imaging, to name only a few. Such problems demand innovation in their mathematical description and analysis, as well as in computational approaches for their solution.
This is an introduction to mathematical computation that examines a broad array of techniques in the area, from scientific to symbolic to graphical. The book also introduces topics such as implicit and parametric representations, interpolation, approximation, sampling, filtering, reconstruction and integration. Although it is targeted to students in the pure and applied sciences, mathematics and computer science, the book is designed to be accessible to undergraduates at all levels, having a minimum of prerequisites (essentially single-variable calculus).
MATRIX is Australia's international and residential mathematical research institute. It facilitates new collaborations and mathematical advances through intensive residential research programs, each 1-4 weeks in duration. This book is a scientific record of the eight programs held at MATRIX in 2018:
- Non-Equilibrium Systems and Special Functions
- Algebraic Geometry, Approximation and Optimisation
- On the Frontiers of High Dimensional Computation
- Month of Mathematical Biology
- Dynamics, Foliations, and Geometry in Dimension 3
- Recent Trends on Nonlinear PDEs of Elliptic and Parabolic Type
- Functional Data Analysis and Beyond
- Geometric and Categorical Representation Theory
The articles are grouped into peer-reviewed contributions and other contributions. The peer-reviewed articles present original results or reviews on a topic related to the MATRIX program; the remaining contributions are predominantly lecture notes or short articles based on talks or activities at MATRIX.
A compact and easily accessible book, it guides the reader in unravelling the apparent mysteries found in doing mathematical proofs. Simply written, it introduces the art and science of proving mathematical theorems and propositions and equips students with the skill required to tackle the task of proving mathematical assertions. Theoremus - A Student's Guide to Mathematical Proofs is divided into two parts. Part 1 provides a grounding in the notion of mathematical assertions, arguments and fallacies, and Part 2 presents these lessons in action by applying them to the study of logic itself. The book supplies plenty of examples and figures, gives some historical background on the personalities who gave rise to the topic, and provides reflective problems to try and solve. The author aims to give the reader the confidence to take a deep dive into more advanced work in mathematics or logic.
This book presents the latest developments in both qualitative and quantitative computational methods for reliability and statistics, as well as their applications. Consisting of contributions from active researchers and experienced practitioners in the field, it fills the gap between theory and practice and explores new research challenges in reliability and statistical computing. The book consists of 18 chapters. It covers (1) modeling in and methods for reliability computing, with chapters dedicated to predicted reliability modeling, optimal maintenance models, and mechanical reliability and safety analysis; (2) statistical computing methods, including machine learning techniques and deep learning approaches for sentiment analysis and recommendation systems; and (3) applications and case studies, such as modeling innovation paths of European firms, aircraft components, bus safety analysis, performance prediction in textile finishing processes, and movie recommendation systems. Given its scope, the book will appeal to postgraduates, researchers, professors, scientists, and practitioners in a range of fields, including reliability engineering and management, maintenance engineering, quality management, statistics, computer science and engineering, mechanical engineering, business analytics, and data science.
Models and simulations of all kinds are tools for dealing with reality. Humans have always used mental models to better understand the world around them: to make plans, to consider different possibilities, to share ideas with others, to test changes, and to determine whether or not the development of an idea is feasible. The book Modeling and Simulation uses exactly the same approach, except that the traditional mental model is translated into a computer model, and the simulations of alternative outcomes under varying conditions are programmed on the computer. The advantage of this method is that the computer can track the multitude of implications and consequences in complex relationships much more quickly and reliably than the human mind. This unique interdisciplinary text not only provides a self-contained and complete guide to the methods and mathematical background of modeling and simulation, but also includes simulation software (SIMPAS) and a collection of 50 systems models on an accompanying diskette. Students from fields as diverse as ecology and economics will find this clear interactive package an instructive and engaging guide.
This new work is an introduction to the numerical solution of the initial value problem for a system of ordinary differential equations. The first three chapters are general in nature, and chapters 4 through 8 derive the basic numerical methods, prove their convergence, study their stability and consider how to implement them effectively. The book focuses on the most important methods in practice and develops them fully, uses examples throughout, and emphasizes practical problem-solving methods.
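As a minimal illustration of the kind of method such a book derives first, here is a sketch of the forward Euler scheme for an initial value problem. This is my own toy example, not code from the book; the function names are arbitrary.

```python
# Toy sketch (not from the book): the forward Euler method, the simplest
# numerical scheme for the initial value problem y'(t) = f(t, y), y(t0) = y0.

def euler(f, t0, y0, h, n_steps):
    """Advance y' = f(t, y) from (t0, y0) with fixed step size h."""
    t, y = t0, y0
    for _ in range(n_steps):
        y = y + h * f(t, y)  # first-order Taylor step
        t = t + h
    return y

# Example: y' = y with y(0) = 1, integrated to t = 1; exact answer is e.
approx = euler(lambda t, y: y, 0.0, 1.0, 0.001, 1000)
```

Convergence and stability analyses of this and higher-order schemes are exactly the material the book's middle chapters develop.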
This book analyzes techniques that use the direct and inverse fuzzy transform for image processing and data analysis. The book is divided into two parts, the first of which describes methods and techniques that use the bi-dimensional fuzzy transform method in image analysis. In turn, the second describes approaches that use the multidimensional fuzzy transform method in data analysis. An F-transform in one variable is defined as an operator which transforms a continuous function f on the real interval [a,b] into an n-dimensional vector by using n assigned fuzzy sets A1, ..., An which constitute a fuzzy partition of [a,b]. Then, an inverse F-transform is defined in order to convert the n-dimensional vector output into a continuous function that approximates f to within an arbitrarily small quantity. We may limit this concept to the finite case by defining the discrete F-transform of a function f in one variable, even if f is known only at a finite set of points. A simple extension of this concept to functions in two variables allows it to be used for the coding/decoding and processing of images. Moreover, an extended version with multidimensional functions can be used to address a host of topics in data analysis, including the analysis of large and very large datasets. Over the past decade, many researchers have proposed applications of fuzzy transform techniques for various image processing topics, such as image coding/decoding, image reduction, image segmentation, image watermarking and image fusion; and for such data analysis problems as regression analysis, classification, association rule extraction, time series analysis, forecasting, and spatial data analysis. The robustness, ease of use, and low computational complexity of fuzzy transforms make them a powerful fuzzy approximation tool suitable for many computer science applications.
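The definition above can be made concrete with a small sketch of the discrete F-transform in one variable. This is an illustrative example of mine, not code from the book; it assumes the common special case of a uniform triangular fuzzy partition of [a, b].

```python
# Illustrative sketch (assumed setup: uniform triangular fuzzy partition;
# not the book's code). The direct discrete F-transform maps samples of f
# to an n-vector of weighted local averages; the inverse F-transform blends
# those components back into an approximation of f.

def triangular_partition(a, b, n):
    """Return n triangular basic functions forming a fuzzy partition of [a, b]."""
    h = (b - a) / (n - 1)
    nodes = [a + k * h for k in range(n)]
    def make(k):
        return lambda x: max(0.0, 1.0 - abs(x - nodes[k]) / h)
    return [make(k) for k in range(n)]

def f_transform(xs, ys, basis):
    """Direct discrete F-transform: component k is a weighted mean of ys."""
    comps = []
    for Ak in basis:
        w = [Ak(x) for x in xs]
        comps.append(sum(wi * yi for wi, yi in zip(w, ys)) / sum(w))
    return comps

def inverse_f_transform(comps, basis, x):
    """Inverse F-transform: blend components with the same basic functions."""
    return sum(Fk * Ak(x) for Fk, Ak in zip(comps, basis))

# Example: compress 101 samples of f(x) = x^2 on [0, 1] into 11 components.
xs = [i / 100 for i in range(101)]
ys = [x * x for x in xs]
basis = triangular_partition(0.0, 1.0, 11)
F = f_transform(xs, ys, basis)
approx = inverse_f_transform(F, basis, 0.5)  # close to f(0.5) = 0.25
```

The two-variable extension mentioned in the text works the same way with a product of two such partitions, which is what makes the method usable for image coding and reduction.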
This book presents methods and techniques based on the use of fuzzy transforms in various applications of image processing and data analysis, including image segmentation, image tamper detection, forecasting, and classification, highlighting the benefits they offer compared with traditional methods. Emphasis is placed on applications of fuzzy transforms to innovative problems, such as massive data mining, and image and video security in social networks based on the application of advanced fragile watermarking systems. This book is aimed at researchers, students, computer scientists and IT developers who wish to acquire the knowledge and skills necessary to apply and implement fuzzy transform-based techniques in image and data analysis applications.
Type theory is one of the most important tools in the design of higher-level programming languages, such as ML. This book introduces and teaches its techniques by focusing on one particularly neat system and studying it in detail. By concentrating on the principles that make the theory work in practice, the author covers all the key ideas without getting involved in the complications of more advanced systems. This book takes a type-assignment approach to type theory, and the system considered is the simplest polymorphic one. The author covers all the basic ideas, including the system's relation to propositional logic, and gives a careful treatment of the type-checking algorithm that lies at the heart of every such system. Also featured are two other interesting algorithms that until now have been buried in inaccessible technical literature. The mathematical presentation is rigorous but clear, making it the first book at this level that can be used as an introduction to type theory for computer scientists.
Quantum logic gates are the crucial information-processing operation of quantum computers. Two key performance metrics for logic gates are their precision and speed. Quantum processors based on trapped ions have always been the touchstone for gate precision, but have suffered from slow speed relative to other quantum logic platforms such as solid state systems. This thesis shows that it is possible to accelerate the logic "clock speed" from kHz to MHz speeds, whilst maintaining a precision of 99.8%. This is almost as high as the world record for conventional trapped-ion gates, but more than 20 times faster. It also demonstrates entanglement generation in a time (480 ns) shorter than the natural timescale of the ions' motion in the trap, which starts to probe an interesting new regime of ion trap physics. In separate experiments, some of the first "mixed-species" quantum logic gates are performed, both between two different elements, and between different isotopes. The mixed-isotope gate is used to make the first test of the quantum-mechanical Bell inequality between two different species of isolated atoms.
Solving nonsmooth optimization (NSO) problems is critical in many practical applications and real-world modeling systems. The aim of this book is to survey various numerical methods for solving NSO problems and to provide an overview of the latest developments in the field. Experts from around the world share their perspectives on specific aspects of numerical NSO. The book is divided into four parts, the first of which considers general methods including subgradient, bundle and gradient sampling methods. In turn, the second focuses on methods that exploit the problem's special structure, e.g. algorithms for nonsmooth DC programming, VU decomposition techniques, and algorithms for minimax and piecewise differentiable problems. The third part considers methods for special problems like multiobjective and mixed integer NSO, and problems involving inexact data, while the last part highlights the latest advancements in derivative-free NSO. Given its scope, the book is ideal for students attending courses on numerical nonsmooth optimization, for lecturers who teach optimization courses, and for practitioners who apply nonsmooth optimization methods in engineering, artificial intelligence, machine learning, and business. Furthermore, it can serve as a reference text for experts dealing with nonsmooth optimization.
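The subgradient method named in the first part is simple enough to sketch in a few lines. This is my own toy example, not from the book, applied to the classic nonsmooth convex function f(x) = |x - 3|, whose kink at the minimizer defeats ordinary gradient descent.

```python
# Minimal sketch (my example, not the book's): the subgradient method with
# diminishing step sizes t_k = 1/(k+1), applied to f(x) = |x - 3|.

def subgradient_descent(sub_g, x0, steps):
    """Iterate x_{k+1} = x_k - t_k * g_k, where g_k is any subgradient at x_k."""
    x = x0
    for k in range(steps):
        g = sub_g(x)          # a subgradient of f at x
        x = x - g / (k + 1)   # diminishing, non-summable step size
    return x

# A subgradient of |x - 3| is sign(x - 3); 0 is a valid choice at the kink.
sub_g = lambda x: (x > 3) - (x < 3)
x_star = subgradient_descent(sub_g, 10.0, 2000)  # approaches the minimizer 3
```

The diminishing step size is what guarantees convergence here; bundle and gradient-sampling methods, covered in the same part of the book, trade this simplicity for much faster practical behaviour.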
This book presents the basics of quantum information: the foundations of quantum theory, quantum algorithms, quantum entanglement, quantum entropies, quantum coding, quantum error correction, and quantum cryptography. The only required knowledge is elementary calculus and linear algebra, so the book can be understood by undergraduate students. To study quantum information, one usually has to study the foundations of quantum theory; this book describes them from a more operational viewpoint, one suitable for quantum information, which traditional textbooks of quantum theory lack. The book treats Shor's algorithm, Grover's algorithm, and the Deutsch-Jozsa algorithm as its basic algorithms. To treat several topics in quantum information, it covers several kinds of information quantities in quantum systems, including von Neumann entropy, and gives the limits of several kinds of quantum information processing. As important quantum protocols, the book covers quantum teleportation, quantum dense coding, and quantum data compression. In particular, the conversion theory of entanglement via local operations and classical communication is treated; this theory provides a quantification of entanglement, which coincides with von Neumann entropy. The next part treats quantum hypothesis testing: the problem of deciding between two candidates for an unknown state, whose asymptotic performance is characterized by information quantities. Using this result, the optimal performance of classical information transmission via a noisy quantum channel is derived. Quantum information transmission via a noisy quantum channel with quantum error correction is also discussed. Building on this topic, secure quantum communication is explained; in particular, the book explains the quantification of quantum security, which has not been treated in existing books, and treats quantum cryptography from a more practical viewpoint.
Splines are the fundamental tools for fitting curves and surfaces in computer-aided design and computer graphics. This volume presents a practical introduction to computing spline functions and takes the elementary and directly available approach of using explicit and easily evaluated forms of the spline interpolants. Spath outlines the conditions under which splines can be best applied and integrates into his presentation numerous formulas and algorithms to emphasize his concepts. He also includes FORTRAN-77 subroutines which can be applied to the abundant problems illustrated and treated in the book, which in turn allows the reader to assess the performance of various spline interpolants based on the configuration of the data. A program disc is available to supplement the text, and there is also a companion volume, One Dimensional Spline Interpolation Algorithms.
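The "explicit, easily evaluated" style the book advocates can be illustrated with the simplest member of the spline family, the degree-one (piecewise-linear) interpolant. This sketch is mine, written in Python rather than the book's FORTRAN-77.

```python
# Toy sketch (my example, not the book's FORTRAN-77 subroutines): a
# degree-one spline interpolant in explicit, directly evaluated form.
import bisect

def linear_spline(xs, ys):
    """Return s(x) interpolating the knots (xs[i], ys[i]); xs must be sorted."""
    def s(x):
        # Locate the knot interval containing x, clamping at the ends.
        i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
        t = (x - xs[i]) / (xs[i + 1] - xs[i])   # local parameter in [0, 1]
        return (1 - t) * ys[i] + t * ys[i + 1]  # explicit linear form
    return s

s = linear_spline([0.0, 1.0, 2.0], [0.0, 1.0, 4.0])
mid = s(1.5)  # halfway between the knot values 1 and 4, i.e. 2.5
```

Cubic splines replace the two-term linear blend with an explicit four-term cubic per interval; the shape of the code, with its interval search followed by a closed-form evaluation, stays the same.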
This book constitutes the post-conference proceedings of the Second EAI International Conference on Artificial Intelligence for Communications and Networks, AICON 2020, held in December 2020. Due to the COVID-19 pandemic, the conference was held virtually. The 52 full papers were carefully reviewed and selected from 112 submissions. The papers are organized in topical sections on Deep Learning/Machine Learning on Information and Signal Processing; AI in Ubiquitous Mobile Wireless Communications; AI in UAV-Assisted Wireless Communications; Smart Education: Educational Change in the Age of Artificial Intelligence; AI in SAR/ISAR Target Detection; and Recent Advances in AI and Their Applications in Future Electronic and Information Fields.
This book discusses all the major nature-inspired algorithms with a focus on their application in the context of solving navigation and routing problems. It also reviews the approximation methods and recent nature-inspired approaches for practical navigation, and compares these methods with traditional algorithms to validate the approach for the case studies discussed. Further, it examines the design of alternative solutions using nature-inspired techniques, and explores the challenges of navigation and routing problems and nature-inspired metaheuristic approaches.
Written for researchers and developers applying Iterated Function Systems (IFS) in the creation of fractal images, this book presents a modification of a widely used probabilistic algorithm for generating IFS-encoded images. The book also includes a discussion of how IFS techniques can be applied to produce animated motion pictures.
This book presents the best papers from the 1st International Conference on Mathematical Research for Blockchain Economy (MARBLE) 2019, held in Santorini, Greece. While most blockchain conferences and forums are dedicated to business applications, product development or Initial Coin Offering (ICO) launches, this conference focused on the mathematics behind blockchain to bridge the gap between practice and theory. Every year, thousands of blockchain projects are launched and circulated in the market, and there is a tremendous wealth of blockchain applications, from finance to healthcare, education, media, logistics and more. However, due to theoretical and technical barriers, most of these applications are impractical for use in a real-world business context. The papers in this book reveal the challenges and limitations, such as scalability, latency, privacy and security, and showcase solutions and developments to overcome them.
This collection of peer-reviewed workshop papers provides comprehensive coverage of cutting-edge research into topological approaches to data analysis and visualization. It encompasses the full range of new algorithms and insights, including fast homology computation, comparative analysis of simplification techniques, and key applications in materials and medical science. The book also addresses core research challenges such as the representation of large and complex datasets, and integrating numerical methods with robust combinatorial algorithms. In keeping with the focus of the TopoInVis 2017 Workshop, the contributions reflect the latest advances in finding experimental solutions to open problems in the sector. They provide an essential snapshot of state-of-the-art research, helping researchers to keep abreast of the latest developments and providing a basis for future work. Gathering papers by some of the world's leading experts on topological techniques, the book represents a valuable contribution to a field of growing importance, with applications in disciplines ranging from engineering to medicine.
This volume constitutes the refereed proceedings of the 15th EAI International Conference on Communications and Networking, ChinaCom 2020, held in November 2020 in Shanghai, China. Due to the COVID-19 pandemic, the conference was held virtually. The 54 papers presented were carefully selected from 143 submissions. The papers are organized in topical sections on Transmission Optimization in Edge Computing; Performance and Scheduling Optimization in Edge Computing; Mobile Edge Network System; Communication Routing and Control; Transmission and Load Balancing; Edge Computing and Distributed Machine Learning; and Deep Learning.
The volume contains original research papers from the Proceedings of the International Conference on Advances in Mathematics and Computing, held at Veer Surendra Sai University of Technology, Odisha, India, on 7-8 February 2020. It focuses on new trends in applied analysis, computational mathematics and related areas. It also includes new models, image analysis techniques, fluid flow problems, and other applications of mathematical analysis and computational mathematics. The volume brings forward new and emerging topics of mathematics and computing with potential applications in other areas of science. It can serve as a valuable resource for graduate students, researchers and educators interested in mathematical tools and techniques for solving various problems arising in science and engineering.
This open access book systematically explores the statistical characteristics of cryptographic systems, the computational complexity theory of cryptographic algorithms, and the mathematical principles behind various encryption and decryption algorithms; the theory stems from the technology. Based on Shannon's information theory, the book systematically introduces the information theory, statistical characteristics and computational complexity theory of public key cryptography, focusing on its three main algorithm families: RSA, discrete logarithm and elliptic curve cryptosystems. It aims to explain both what these systems are and why they work. The book's greatest feature is its systematic simplification and organization of the theory and technology of lattice cryptography. Readers will need a good knowledge of algebra, number theory, and probability and statistics. Senior students majoring in mathematics, students in compulsory cryptography courses, and science and engineering postgraduates will find this book helpful. It can also serve as the main reference book for researchers in cryptography and cryptographic engineering.
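The RSA system named above fits in a few lines of arithmetic. This is a toy numeric illustration of mine with tiny, wholly insecure primes, not an example from the book; it uses Python 3.8+'s modular-inverse form of `pow`.

```python
# Toy RSA cycle (my illustration, insecure parameters): key generation,
# encryption c = m^e mod n, and decryption m = c^d mod n.

p, q = 61, 53
n = p * q                  # public modulus, 3233
phi = (p - 1) * (q - 1)    # Euler's totient of n, 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: e * d ≡ 1 (mod phi)

m = 42                     # message, an integer with 0 <= m < n
c = pow(m, e, n)           # encryption
m2 = pow(c, d, n)          # decryption recovers m
```

The book's statistical and complexity-theoretic analysis concerns why recovering d from (n, e) alone, i.e. factoring n, is computationally hard at realistic key sizes, which is exactly what this toy example does not have.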
Term rewriting systems, which developed out of mathematical logic, consist of sequences of discrete steps where one term is replaced with another. Their many applications range from automatic theorem proving systems to computer algebra. This book begins with several examples, followed by a chapter on basic notions that provides a foundation for the rest of the work. First-order and higher-order theories are presented, with much of the latter material appearing for the first time in book form. Subjects treated include orthogonality, termination, lambda calculus and term graph rewriting. There is also a chapter detailing the required mathematical background.