This book starts with the basic concepts of fuzzy sets and progresses through a normative view on possibility distributions and OWA operators in multiple criteria decisions. Five applications of possibility distributions (all building on experience from solving complex real-world problems) are presented and carefully discussed: strategic decisions about closing or not closing a production plant using fuzzy real options, portfolio selection with imprecise future data, predictive probabilities and possibilities for risk assessment in grid computing, fuzzy ontologies for process industry, and the design and implementation of mobile value services. It will be useful for researchers and students working in soft computing, real options, fuzzy decision making, grid computing, knowledge mobilization and mobile value services.
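The OWA operators mentioned above have a compact standard definition: sort the arguments in descending order and take a weighted sum with a fixed weight vector, so that max, min, and the arithmetic mean all arise as special cases of the weights. A minimal sketch, with a made-up score vector purely for illustration:

```python
def owa(values, weights):
    """Ordered weighted averaging: sort the inputs in descending order,
    then take the dot product with a normalised weight vector."""
    assert len(values) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# hypothetical criteria scores for one decision alternative
scores = [0.7, 0.2, 0.9, 0.4]

owa(scores, [1.0, 0.0, 0.0, 0.0])  # weight on the top slot: the maximum
owa(scores, [0.25] * 4)            # uniform weights: the arithmetic mean
```

Choosing weights between these extremes tunes how optimistic or pessimistic the aggregation is, which is what makes OWA useful in multiple criteria decisions.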
This book contains leading survey papers on the various aspects of Abduction, both logical and numerical approaches. Abduction is central to all areas of applied reasoning, including artificial intelligence, philosophy of science, machine learning, data mining and decision theory, as well as logic itself.
The European Conference on Numerical Mathematics and Advanced Applications (ENUMATH), held every 2 years, provides a forum for discussing recent advances in and aspects of numerical mathematics and scientific and industrial applications. The previous ENUMATH meetings took place in Paris (1995), Heidelberg (1997), Jyvaskyla (1999), Ischia (2001), Prague (2003), Santiago de Compostela (2005), Graz (2007), Uppsala (2009), Leicester (2011) and Lausanne (2013). This book presents a selection of invited and contributed lectures from the ENUMATH 2015 conference, which was organised by the Institute of Applied Mathematics (IAM), Middle East Technical University, Ankara, Turkey, from September 14 to 18, 2015. It offers an overview of central recent developments in numerical analysis, computational mathematics, and applications in the form of contributions by leading experts in the field.
The correlations between physical systems provide significant information about their collective behaviour - information that is used as a resource in many applications, e.g. communication protocols. However, when it comes to the exploitation of such correlations in the quantum world, identification of the associated 'resource' is extremely challenging and a matter of debate in the quantum community. This dissertation describes three key results on the identification, detection, and quantification of quantum correlations. It starts with an extensive and accessible introduction to the mathematical and physical grounds for the various definitions of quantum correlations. It subsequently focusses on introducing a novel unified picture of quantum correlations by taking a modern resource-theoretic position. The results show that this novel concept plays a crucial role in the performance of collaborative quantum computations that is not captured by the standard textbook approaches. Further, this new perspective provides a deeper understanding of the quantum-classical boundary and paves the way towards establishing a resource theory of quantum computations.
The fields of image analysis, computer vision, and artificial intelligence all make use of descriptions of shape in grey-level images. Most existing algorithms for the automatic recognition and classification of particular shapes have been developed for specific purposes, with the result that these methods are often restricted in their application. The use of advanced and theoretically well-founded mathematical methods should lead to the construction of robust shape descriptors having more general application. Shape description can be regarded as a meeting point of vision research, mathematics, computing science, and the application fields of image analysis, computer vision, and artificial intelligence. The NATO Advanced Research Workshop "Shape in Picture" was organised with a twofold objective: first, it should provide all participants with an overview of relevant developments in these different disciplines; second, it should stimulate researchers to exchange original results and ideas across the boundaries of these disciplines. This book comprises a widely drawn selection of papers presented at the workshop, and many contributions have been revised to reflect further progress in the field. The focus of this collection is on mathematical approaches to the construction of shape descriptions from grey-level images. The book is divided into five parts, each devoted to a different discipline. Each part contains papers that have tutorial sections; these are intended to assist the reader in becoming acquainted with the variety of approaches to the problem.
Graph Separators with Applications is devoted to techniques for obtaining upper and lower bounds on the sizes of graph separators - upper bounds being obtained via decomposition algorithms. The book surveys the main approaches to obtaining good graph separations, while the main focus of the book is on techniques for deriving lower bounds on the sizes of graph separators. This asymmetry in focus reflects our perception that the work on upper bounds, or algorithms, for graph separation is much better represented in the standard theory literature than is the work on lower bounds, which we perceive as being much more scattered throughout the literature on application areas. Given the multitude of notions of graph separator that have been developed and studied over the past (roughly) three decades, there is a need for a central, theory-oriented repository for the mass of results. The need is absolutely critical in the area of lower-bound techniques for graph separators, since these techniques have virtually never appeared in articles having the word 'separator' or any of its near-synonyms in the title. Graph Separators with Applications fills this need.
This volume contains the text of papers presented at the NATO Advanced Research Workshop on Emergent Computing Methods in Engineering Design, held in Nafplio, Greece, August 25-27, 1994. The workshop brought together some thirty researchers from Canada, France, Germany, Greece, Israel, Taiwan, The Netherlands, the United Kingdom and the United States of America, to address issues related to the application of such emergent computing methods as genetic algorithms, neural networks and simulated annealing in problems of engineering design. The volume is essentially organized into three parts, with each part having some theoretical papers and other papers of a more practical nature. The first part, which comprises the largest number of papers, deals with genetic algorithms and evolutionary computing and presents subject matter ranging from proposed improvements to the computing methodology to specific applications in engineering design. The second part deals with neural networks and considers such topics as their application as approximation tools in design, their adaptation in control system design and theoretical issues of interpretation. The third part of the volume presents a collection of papers that examine such diverse topics as the combined use of genetic algorithms and neural networks, the application of simulated annealing techniques, problem decomposition techniques and the computer recognition and interpretation of emerging objects in engineering design.
The 14 contributed chapters in this book survey the most recent developments in high-performance algorithms for NGS data, offering fundamental insights and technical information specifically on indexing, compression and storage; error correction; alignment; and assembly. The book will be of value to researchers, practitioners and students engaged with bioinformatics, computer science, mathematics, statistics and life sciences.
During the past few years, data mining has grown rapidly in visibility and importance within information processing and decision analysis. This is particularly true in the realm of e-commerce, where data mining is moving from a "nice-to-have" to a "must-have" status. In a different though related context, a new computing methodology called granular computing is emerging as a powerful tool for the conception, analysis and design of information/intelligent systems. In essence, data mining deals with summarization of information which is resident in large data sets, while granular computing plays a key role in the summarization process by drawing together points (objects) which are related through similarity, proximity or functionality. In this perspective, granular computing has a position of centrality in data mining. Another methodology which has high relevance to data mining and plays a central role in this volume is that of rough set theory. Basically, rough set theory may be viewed as a branch of granular computing. However, its applications to data mining have predated that of granular computing.
In recent years, new algorithms for dealing with rings of differential operators have been discovered and implemented. A main tool is the theory of Gröbner bases, which is reexamined here from the point of view of geometric deformations. Perturbation techniques have a long tradition in analysis; Gröbner deformations of left ideals in the Weyl algebra are the algebraic analogue to classical perturbation techniques. The algorithmic methods introduced here are particularly useful for studying the systems of multidimensional hypergeometric PDEs introduced by Gelfand, Kapranov and Zelevinsky. The Gröbner deformation of these GKZ hypergeometric systems reduces problems concerning hypergeometric functions to questions about commutative monomial ideals, and leads to an unexpected interplay between analysis and combinatorics. This book contains a number of original research results on holonomic systems and hypergeometric functions, and raises many open problems for future research in this area.
This book introduces readers to the basic concepts of and latest findings in the area of differential equations with uncertain factors. It covers the analytic method and numerical method for solving uncertain differential equations, as well as their applications in the field of finance. Furthermore, the book provides a number of new potential research directions for uncertain differential equation. It will be of interest to researchers, engineers and students in the fields of mathematics, information science, operations research, industrial engineering, computer science, artificial intelligence, automation, economics, and management science.
Special functions are pervasive in all fields of science and industry. The best-known application areas are in physics, engineering, chemistry, computer science and statistics. Because of their importance, several books and websites (see for instance http://functions.wolfram.com) and a large collection of papers have been devoted to these functions. Of the standard work on the subject, namely the Handbook of Mathematical Functions with Formulas, Graphs and Mathematical Tables edited by Milton Abramowitz and Irene Stegun, the American National Institute of Standards claims to have sold over 700,000 copies. But so far no project has been devoted to the systematic study of continued fraction representations for these functions. This handbook is the result of such an endeavour. We emphasise that only 10% of the continued fractions contained in this book can also be found in the Abramowitz and Stegun project or at the Wolfram website.
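As an illustration of the kind of representation such a handbook catalogues, a simple continued fraction can be evaluated numerically by backward recurrence over a finite list of its partial quotients. A minimal sketch (the function name is made up for illustration), using the classical expansion of e, [2; 1, 2, 1, 1, 4, 1, 1, 6, ...]:

```python
import math

def eval_cf(a0, terms):
    """Evaluate the simple continued fraction a0 + 1/(a1 + 1/(a2 + ...))
    by backward recurrence over a finite list of partial quotients."""
    val = float(terms[-1])
    for a in reversed(terms[:-1]):
        val = a + 1.0 / val
    return a0 + 1.0 / val

# Euler's continued fraction for e: [2; 1, 2, 1, 1, 4, 1, 1, 6, 1, 1, 8, ...]
approx = eval_cf(2, [1, 2, 1, 1, 4, 1, 1, 6, 1, 1, 8])
error = abs(approx - math.e)  # already tiny after a dozen partial quotients
```

Truncating a continued fraction often converges far faster than truncating the corresponding power series, which is one reason such representations are valued for numerical evaluation of special functions.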
The authors focus on the mathematical models and methods that support most data mining applications and solution techniques.
Parsing technology traditionally consists of two branches, which correspond to the two main application areas of context-free grammars and their generalizations. Efficient deterministic parsing algorithms have been developed for parsing programming languages, and quite different algorithms are employed for analyzing natural language. The Functional Treatment of Parsing provides a functional framework within which the different traditional techniques are restated and unified. The resulting theory provides new recursive implementations of parsers for context-free grammars. The new implementations, called recursive ascent parsers, avoid explicit manipulation of parse stacks and parse matrices, and are in many ways superior to conventional implementations. They are applicable to grammars for programming languages as well as natural languages. The book has been written primarily for students and practitioners of parsing technology. With its emphasis on modern functional methods, however, the book will also be of benefit to scientists interested in functional programming. The Functional Treatment of Parsing is an excellent reference and can be used as a text for a course on the subject.
This book comprises nine selected works on numerical and computational methods for solving multiobjective optimization, game theory, and machine learning problems. It provides extended versions of selected papers from various fields of science such as computer science, mathematics and engineering that were presented at EVOLVE 2013 held in July 2013 at Leiden University in the Netherlands. The internationally peer-reviewed papers include original work on important topics in both theory and applications, such as the role of diversity in optimization, statistical approaches to combinatorial optimization, computational game theory, and cell mapping techniques for numerical landscape exploration. Applications focus on aspects including robustness, handling multiple objectives, and complex search spaces in engineering design and computational biology.
N.G. de Bruijn was a well-established mathematician before deciding in 1967, at the age of 49, to work in a new direction related to Automating Mathematics. In the 1960s he became fascinated by the new computer technology and decided to start the Automath project, where he could check, with the help of the computer, the correctness of books on mathematics. Through his work on Automath, de Bruijn started a revolution in using the computer for verification, and since then we have seen more and more proof-checking and theorem-proving systems.
The book focuses on three related areas in the theory of computation: modern cryptography, the study of probabilistic proof systems, and the theory of computational pseudorandomness. The common theme is the interplay between randomness and computation. The book offers an introduction and extensive survey of each of these areas, presenting both the basic notions and the most important (sometimes advanced) results. The presentation is focused on the essentials and does not elaborate on details. In some cases it offers a novel and illuminating perspective. The reader may obtain from the book: (1) a clear view of what each of these areas is all about; (2) knowledge of the basic important notions and results in each area; and (3) new insights into each of these areas. It is believed that the book may thus be useful both to a beginner (who has only some background in the theory of computing) and to an expert in any of these areas.
Aimed at mathematicians and computer scientists who will only be exposed to one course in this area, Computability: A Mathematical Sketchbook provides a brief but rigorous introduction to the abstract theory of computation, sometimes also referred to as recursion theory. It develops major themes in computability theory, such as Rice's theorem and the recursion theorem, and provides a systematic account of Blum's complexity theory as well as an introduction to the theory of computable real numbers and functions. The book is intended as a university text, but it may also be used for self-study; appropriate exercises and solutions are included.
In recent years statistical physics has made significant progress as a result of advances in numerical techniques. While good textbooks exist on the general aspects of statistical physics, the numerical methods and the new developments based on large-scale computing are not usually adequately presented. In this book 16 experts describe the application of methods of statistical physics to various areas in physics such as disordered materials, quasicrystals, semiconductors, and also to other areas beyond physics, such as financial markets, game theory, evolution, and traffic planning, in which statistical physics has recently become significant. In this way the universality of the underlying concepts and methods such as fractals, random matrix theory, time series, neural networks, evolutionary algorithms, becomes clear. The topics are covered by introductory, tutorial presentations.
The book presented to the reader is devoted to time-dependent scheduling. Scheduling problems, in general, consist in the allocation of resources over time in order to perform a set of jobs. Any allocation that meets all requirements concerning the jobs and resources is called a feasible schedule. The quality of a schedule is measured by a criterion function. The aim of scheduling is to find, among all feasible schedules, a schedule that optimizes the criterion function. A solution to an arbitrary scheduling problem consists in giving a polynomial-time algorithm generating either an optimal schedule or a schedule that is close to the optimal one, if the given scheduling problem has been proved to be computationally intractable. Scheduling problems are the subject of scheduling theory, which originated in the mid-fifties of the twentieth century. The theory has been developing dynamically and new research areas constantly come into existence. The subject of this book, time-dependent scheduling, is one such area. In time-dependent scheduling, the processing time of a job is variable and depends on the starting time of the job. This crucial assumption allows us to apply scheduling theory to a broader spectrum of problems. For example, in the framework of time-dependent scheduling theory we may consider problems of repayment of multiple loans, fire fighting and maintenance assignments. In this book, we will discuss algorithms and complexity issues concerning various time-dependent scheduling problems.
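To make the core assumption concrete: in the widely studied linear-deterioration model (an illustrative special case, not necessarily this book's notation), a job j started at time t takes a_j + b_j * t time units, and sequencing jobs in nondecreasing order of a_j / b_j is known to minimize the makespan on a single machine. A minimal sketch that checks this rule against brute force on a small made-up instance:

```python
from itertools import permutations

def makespan(sequence):
    """Completion time of the last job on one machine when a job with
    parameters (a, b), started at time t, takes a + b * t time units."""
    t = 0.0
    for a, b in sequence:
        t = a + (1.0 + b) * t
    return t

# hypothetical jobs as (a_j, b_j) pairs
jobs = [(3.0, 0.2), (1.0, 0.5), (4.0, 0.1), (2.0, 0.3)]

brute_force = min(permutations(jobs), key=makespan)
ratio_rule = sorted(jobs, key=lambda j: j[0] / j[1])  # nondecreasing a_j/b_j
# both sequences achieve the same (optimal) makespan
```

Checking the two-job case shows why the rule works: swapping adjacent jobs changes the makespan by a_1*b_2 - a_2*b_1, so the job with the smaller a/b ratio should go first.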
The book presents new clustering schemes, dynamical systems and pattern recognition algorithms in geophysical, geodynamical and natural hazard applications. The original mathematical technique is based on both classical and fuzzy sets models. Geophysical and natural hazard applications are mostly original. However, the artificial intelligence technique described in the book can be applied far beyond the limits of Earth science applications. The book is intended for research scientists, tutors, graduate students, scientists in geophysics, and engineers.
Neural networks provide a powerful new technology to model and control nonlinear and complex systems. In this book, the authors present a detailed formulation of neural networks from the information-theoretic viewpoint. They show how this perspective provides new insights into the design theory of neural networks. In particular they show how these methods may be applied to the topics of supervised and unsupervised learning including feature extraction, linear and non-linear independent component analysis, and Boltzmann machines. Readers are assumed to have a basic understanding of neural networks, but all the relevant concepts from information theory are carefully introduced and explained. Consequently, readers from several different scientific disciplines, notably cognitive scientists, engineers, physicists, statisticians, and computer scientists, will find this to be a very valuable introduction to this topic.
A collection of surveys and research papers on mathematical software and algorithms. The common thread is that the field of mathematical applications lies on the border between algebra and geometry. Topics include polyhedral geometry, elimination theory, algebraic surfaces, Gröbner bases, triangulations of point sets and the mutual relationship. This diversity is accompanied by the abundance of available software systems which often handle only special mathematical aspects. This is why the volume also focuses on solutions to the integration of mathematical software systems. This includes low-level and XML based high-level communication channels as well as general frameworks for modular systems.
Walter Gautschi has written extensively on topics ranging from special functions, quadrature and orthogonal polynomials to difference and differential equations, software implementations, and the history of mathematics. He is world renowned for his pioneering work in numerical analysis and constructive orthogonal polynomials, including a definitive textbook in the former area and a monograph in the latter. This three-volume set, Walter Gautschi: Selected Works with Commentaries, is a compilation of Gautschi's most influential papers and includes commentaries by leading experts. The work begins with a detailed biographical section and ends with a section commemorating Walter's prematurely deceased twin brother. This title will appeal to graduate students and researchers in numerical analysis, as well as to historians of science. Vol. 1: Numerical Conditioning; Special Functions; Interpolation and Approximation. Vol. 2: Orthogonal Polynomials on the Real Line; Orthogonal Polynomials on the Semicircle; Chebyshev Quadrature; Kronrod and Other Quadratures; Gauss-type Quadrature. Vol. 3: Linear Difference Equations; Ordinary Differential Equations; Software; History and Biography; Miscellanea; Works of Werner Gautschi.
This book features 13 papers presented at the Fifth International Symposium on Recurrence Plots, held August 2013 in Chicago, IL. It examines recent applications and developments in recurrence plots and recurrence quantification analysis (RQA) with special emphasis on biological and cognitive systems and the analysis of coupled systems using cross-recurrence methods. Readers will discover new applications and insights into a range of systems provided by recurrence plot analysis and new theoretical and mathematical developments in recurrence plots. Recurrence plot based analysis is a powerful tool that operates on real-world complex systems that are nonlinear, non-stationary, noisy, of any statistical distribution, free of any particular model type and not particularly long. Quantitative analyses promote the detection of system state changes, synchronized dynamical regimes or classification of system states.The book will be of interest to an interdisciplinary audience of recurrence plot users and researchers interested in time series analysis of complex systems in general.
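The recurrence plot itself has a simple standard definition: a binary matrix R(i, j) = 1 iff the trajectory at time i returns within a threshold eps of its state at time j, i.e. ||x_i - x_j|| < eps. A minimal sketch for a scalar series (the series and threshold are made up for illustration):

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix R[i, j] = 1 iff |x_i - x_j| < eps.
    Scalar series here; for embedded vectors, use a norm of the difference."""
    x = np.asarray(x, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])  # pairwise distances via broadcasting
    return (dist < eps).astype(int)

# a periodic series revisits earlier states, producing diagonal line structures
t = np.linspace(0, 4 * np.pi, 200)
R = recurrence_plot(np.sin(t), eps=0.1)
```

By construction R is symmetric with an all-ones main diagonal; RQA measures are then statistics of the diagonal and vertical line structures in R.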