C++ is capable of tackling a whole range of programming tasks. The purpose of this book is to give breadth and depth to C++ programmers’ existing experience of the language by presenting a large number of algorithms, most of them implemented as ready-to-run (and standalone) programs. The programs are as readable as possible without unduly sacrificing efficiency, generality, portability and robustness. Both the classes and the programs are designed to demonstrate major programming principles. Two key language features, templates and exception handling, are covered explicitly; apart from these, the reader is assumed to have a working knowledge of C++. Besides traditional subjects, such as quicksort and binary trees, this book also covers some less well-known topics, including multi-precision arithmetic, route planning and external sorting. Demonstration programs for these and many other exciting applications are based on C++ classes which you can also use in programs of your own.
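To illustrate the sort of ready-to-run, standalone program the description mentions, here is a minimal quicksort in C++. It is a generic sketch for orientation only, not code taken from the book.

```cpp
#include <iostream>
#include <utility>
#include <vector>

// Lomuto partition around the last element; returns the pivot's final index.
int partition(std::vector<int>& a, int lo, int hi) {
    int pivot = a[hi];
    int i = lo;
    for (int j = lo; j < hi; ++j) {
        if (a[j] < pivot) std::swap(a[i++], a[j]);
    }
    std::swap(a[i], a[hi]);
    return i;
}

// Classic recursive quicksort on the inclusive range [lo, hi].
void quicksort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;
    int p = partition(a, lo, hi);
    quicksort(a, lo, p - 1);
    quicksort(a, p + 1, hi);
}

int main() {
    std::vector<int> v{5, 2, 9, 1, 7, 3};
    quicksort(v, 0, static_cast<int>(v.size()) - 1);
    for (int x : v) std::cout << x << ' ';   // prints: 1 2 3 5 7 9
    std::cout << '\n';
}
```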
"The NCL Natural Constraint Language"" "presents the NCL language which is a description language in conventional mathematical logic for modeling and solving constraint satisfaction problems. NCL differs from other declarative languages: It models problems naturally in a simplified form of first-order logic with quantifiers, Boolean logic, numeric constraints, set operations and logical functions; it solves problems by mixed set programming over the mixed domain of real numbers, integers, Booleans, dates/times, references, and in particular sets. The book uses plenty of examples and tutorials to illustrate NCL and its applications. It is intended for researchers and developers in the fields of logic programming, constraint programming, optimization, modeling, operations research and artificial intelligence, who will learn from a new programming language and theoretical foundations for industrial applications. Dr. Jianyang Zhou is the inventor of NCL and has worked for its industrialization for more than 10 years.
This two-volume set on Mathematical Principles of the Internet provides a comprehensive overview of the mathematical principles of Internet engineering. The books do not aim to provide all of the mathematical foundations upon which the Internet is based. Instead, they cover a partial panorama and the key principles. Volume 1 explores Internet engineering, while the supporting mathematics is covered in Volume 2. The chapters on mathematics complement those on the engineering episodes, and an effort has been made to make this work succinct, yet self-contained. Elements of information theory, algebraic coding theory, cryptography, Internet traffic, dynamics and control of Internet congestion, and queueing theory are discussed. In addition, stochastic networks, graph-theoretic algorithms, application of game theory to the Internet, Internet economics, data mining and knowledge discovery, and quantum computation, communication, and cryptography are also discussed. In order to study the structure and function of the Internet, only a basic knowledge of number theory, abstract algebra, matrices and determinants, graph theory, geometry, analysis, optimization theory, probability theory, and stochastic processes, is required. These mathematical disciplines are defined and developed in the books to the extent that is needed to develop and justify their application to Internet engineering.
Automata and Computability is a class-tested textbook which provides a comprehensive and accessible introduction to the theory of automata and computation. The author uses illustrations, engaging examples, and historical remarks to make the material interesting and relevant for students. It incorporates modern, handy ideas, such as derivative-based parsing and a lambda reducer showing the universality of the lambda calculus. The book also shows how to "sculpt" automata by making the regular-language conversion pipeline available through a simple command interface. A Jupyter notebook accompanies the book, featuring code, YouTube videos, and other supplements to assist instructors and students. Features:
- Uses illustrations, engaging examples, and historical remarks to make the material accessible
- Incorporates modern, handy ideas, such as derivative-based parsing and a lambda reducer showing the universality of the lambda calculus
- Shows how to "sculpt" automata by making the regular-language conversion pipeline available through a simple command interface
- Uses a mini functional programming (FP) notation consisting of lambdas, maps, filters, and set comprehensions (supported in Python) to convey mathematics through programming-language constructs that are succinct and resemble the math
- Encodes all concepts in compact functional programming code that tessellates with LaTeX markup and Jupyter widgets in a document accompanying the book, so students can run the code effortlessly. All the code can be accessed here.
The aim of this volume is to present modern developments in semantics and logics of computation in a way that is accessible to graduate students. The book is based on a summer school at the Isaac Newton Institute and consists of a sequence of linked lecture courses by international authorities in the area. The whole set has been edited to form a coherent introduction to these topics, most of which have not been presented pedagogically before.
From the Foreword: "While large-scale machine learning and data mining have greatly impacted a range of commercial applications, their use in the field of Earth sciences is still in the early stages. This book, edited by Ashok Srivastava, Ramakrishna Nemani, and Karsten Steinhaeuser, serves as an outstanding resource for anyone interested in the opportunities and challenges for the machine learning community in analyzing these data sets to answer questions of urgent societal interest...I hope that this book will inspire more computer scientists to focus on environmental applications, and Earth scientists to seek collaborations with researchers in machine learning and data mining to advance the frontiers in Earth sciences." --Vipin Kumar, University of Minnesota
Large-Scale Machine Learning in the Earth Sciences provides researchers and practitioners with a broad overview of some of the key challenges at the intersection of Earth science, computer science, statistics, and related fields. It explores a wide range of topics and provides a compilation of recent research in the application of machine learning in the field of Earth science. Making predictions based on observational data is a theme of the book, and the book includes chapters on the use of network science to understand and discover teleconnections in extreme climate and weather events, as well as on structured estimation in high dimensions. The use of ensemble machine learning models to combine predictions of global climate models using information from spatial and temporal patterns is also explored. The second part of the book features a discussion of statistical downscaling in climate with state-of-the-art scalable machine learning, as well as an overview of methods to understand and predict the proliferation of biological species due to changes in environmental conditions. The problem of using large-scale machine learning to study the formation of tornadoes is also explored in depth. The last part of the book covers the use of deep learning algorithms to classify images that have very high resolution, as well as the unmixing of spectral signals in remote sensing images of land cover. In the final chapter, the authors apply long-tail distributions to geoscience resources.
Praise for the First Edition: "A very useful book for self study and reference." "Very well written. It is concise and really packs a lot of material in a valuable reference book." "An informative and well-written book . . . presented in an easy-to-understand style with many illustrative numerical examples taken from engineering and scientific studies." Practicing engineers and scientists often have a need to utilize statistical approaches to solving problems in an experimental setting. Yet many have little formal training in statistics. Statistical Design and Analysis of Experiments gives such readers a carefully selected, practical background in the statistical techniques that are most useful to experimenters and data analysts who collect, analyze, and interpret data. The First Edition of this now-classic book garnered praise in the field. Now its authors update and revise their text, incorporating readers’ suggestions as well as a number of new developments. Statistical Design and Analysis of Experiments, Second Edition emphasizes the strategy of experimentation, data analysis, and the interpretation of experimental results, presenting statistics as an integral component of experimentation from the planning stage to the presentation of conclusions, and gives an overview of the conceptual foundations of modern statistical practice.
Ideal for both students and professionals, this focused and cogent reference has proven to be an excellent classroom textbook with numerous examples. It deserves a place among the tools of every engineer and scientist working in an experimental setting.
This established and authoritative text focuses on the design and analysis of nonlinear control systems. The author considers the latest research results and techniques in this updated and extended edition. Examples are given from mechanical, electrical and aerospace engineering. The approach consists of a rigorous mathematical formulation of control problems and respective methods of solution. The two appendices outline the most important concepts of differential geometry and present some specific findings not often found in other standard works. The book is, therefore, suitable both as a graduate and undergraduate text and as a source for reference.
This textbook covers the mathematical foundations of the analysis of algorithms. The gist of the book is how to argue, without the burden of excessive formalism, that a given algorithm does what it is supposed to do. The two key ideas of a proof of correctness, induction and invariance, are employed in the framework of pre/post-conditions and loop invariants. The algorithms considered are the basic and traditional algorithms of computer science, such as greedy, dynamic-programming and divide-and-conquer algorithms. In addition, two classes of algorithms that rarely make it into introductory textbooks are discussed: randomized algorithms, which are now ubiquitous because of their applications to cryptography, and online algorithms, which are essential in fields as diverse as operating systems (caching, in particular) and stock-market prediction. This self-contained book is intended for undergraduate students in computer science and mathematics.
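As a hedged illustration of the pre/post-condition and loop-invariant style of correctness argument the description refers to (a generic sketch, not material from the book), consider a small C++ function annotated with its contract:

```cpp
#include <cassert>
#include <vector>

// Pre-condition:  a is non-empty.
// Post-condition: the returned value equals some element of a and is
//                 greater than or equal to every element of a.
int maxElement(const std::vector<int>& a) {
    assert(!a.empty());                       // check the pre-condition
    int m = a[0];
    for (std::size_t i = 1; i < a.size(); ++i) {
        // Loop invariant: m is the maximum of a[0..i-1].
        if (a[i] > m) m = a[i];
        // The invariant now holds for a[0..i], before i is incremented.
    }
    // The invariant with i == a.size() establishes the post-condition.
    return m;
}
```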
In recent years there has been an explosion of research into linear programming, as well as further steady advances in integer programming. This research has been reported in the research literature, but little has been done to present it as a "combined whole". This book aims to fill that gap. With contributions from an international team of acknowledged experts in their fields, it provides a clear exposition of topics such as simplex algorithms and interior point algorithms, from both a theoretical and a computational viewpoint. Surveying recent research that is currently only available in journals, this topical book will be of interest not only in the field of mathematics, but also in computer science and operations research.
Visual Tracking in Conventional Minimally Invasive Surgery introduces the various tools and methodologies that can be used to enhance a conventional surgical setup with some degree of automation. The main focus of this book is on methods for tracking surgical tools and how they can be used to assist the surgeon during the surgical operation. Various notions associated with surgeon-computer interfaces and image-guided navigation are explored, with a range of experimental results. The book starts with some basic motivations for minimally invasive surgery and states the various distinctions between robotic and non-robotic (conventional) versions of this procedure. Common components of this type of operation are presented with a review of the literature addressing the automation aspects of such a setup. Examples of tracking results are shown for both motion and gesture recognition of surgical tools, which can be used as part of the surgeon-computer interface. In the case of marker-less tracking, where no special visual markers can be added to the surgical tools, the tracking results are divided into two types of methodology, depending on the nature and the estimate of the visual noise. Details of the tracking methods are presented using standard Kalman filters and particle filters. The last part of the book provides approaches for tracking a region on the surgical scene defined by the surgeon. Examples of how these tracking approaches can be used as part of image-guided navigation are demonstrated. This book is designed for control engineers interested in visual tracking, computer vision researchers and system designers involved with surgical automation, as well as surgeons, biomedical engineers, and robotic researchers.
Essentially there are two variational theories of liquid crystals explained in this book. The theory put forward by Zocher, Oseen and Frank is classical, while that proposed by Ericksen is newer in its mathematical formulation although it has been postulated in the physical literature for the past two decades. The newer theory provides a better explanation of defects in liquid crystals, especially of those concentrated on lines and surfaces, which escape the scope of the classical theory. The book opens the way to the wealth of applications that will follow.
Although there are hundreds of books about MATLAB, there are no books that fully explore its value in the field of business economics. Few books describe how geographic information can be explicitly incorporated in business decisions, or explain how sophisticated MATLAB applications can be provided to users via the Internet using a remote-hosted, thin client environment.
This book presents state-of-the-art methodology and detailed analytical models and methods used to assess the reliability of complex systems, along with related applications in statistical reliability engineering. It is a textbook based mainly on the author's recent research and publications, as well as over 30 years of experience in this field. The book covers a wide range of methods and models in reliability and their applications, including: statistical methods and model selection for machine learning; models for maintenance and software reliability; statistical reliability estimation of complex systems; and statistical reliability analysis of k-out-of-n systems, standby systems and repairable systems. Offering numerous examples and solved problems within each chapter, this comprehensive text provides an introduction to reliability engineering for graduate students, a reference for data scientists and reliability engineers, and a thorough guide for researchers and instructors in the field.
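One of the listed topics, k-out-of-n systems, has a standard closed-form reliability under the usual textbook assumption of independent, identical components. The short C++ sketch below evaluates that formula; it is a generic illustration and not code or notation taken from the book.

```cpp
#include <cmath>
#include <iostream>

// Binomial coefficient C(n, k), computed iteratively in floating point.
double binomial(int n, int k) {
    double c = 1.0;
    for (int i = 1; i <= k; ++i) c *= static_cast<double>(n - k + i) / i;
    return c;
}

// Reliability of a k-out-of-n system with independent, identical
// components, each working with probability p (textbook assumption):
//   R = sum_{i=k}^{n} C(n, i) * p^i * (1 - p)^(n - i)
double kOutOfNReliability(int k, int n, double p) {
    double r = 0.0;
    for (int i = k; i <= n; ++i)
        r += binomial(n, i) * std::pow(p, i) * std::pow(1.0 - p, n - i);
    return r;
}

int main() {
    // Example: a 2-out-of-3 system with component reliability 0.9.
    std::cout << kOutOfNReliability(2, 3, 0.9) << '\n';  // prints 0.972
}
```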
More than ever before, complicated mathematical procedures are integral to the success and advancement of technology, engineering, and even industrial production. Knowledge of and experience with these procedures are therefore vital to present and future scientists, engineers and technologists.
Originally published in 1979. An input/output database is an information system carrying current data on the intermediate consumption of any product or service by all the specified major firms that consume it. This book begins with a survey of how the interrelationships of an economic system can be represented in a two-dimensional model which traces the output of each economic sector to all other sectors. It discusses how the use of such databases to identify major buyers and sellers can illuminate problems of economic policy at the national, regional, and corporate level and aid in analyzing factors affecting the control of inflation, energy use, transportation, and environmental pollution. The book also discusses how advances in database technology have brought to the fore such issues as the right to individual privacy, corporate secrecy, the public's right of access to stored data, and the use of such information for national planning in a free-enterprise society.
How can one be assured that computer codes that solve differential equations are correct? Standard practice using benchmark testing no longer provides full coverage because today's production codes solve more complex equations using more powerful algorithms. By verifying the order of accuracy of the numerical algorithm implemented in the code, one can detect virtually any coding mistake that would prevent correct solutions from being computed.
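The order-of-accuracy check described here can be sketched in a few lines: compute the error of a numerical approximation at two grid spacings and estimate the observed order from their ratio. The C++ example below is a generic illustration (a second-order central difference verified against a manufactured solution), not code from the book.

```cpp
#include <cmath>
#include <iostream>

// Manufactured test function and its exact derivative.
double f(double x)      { return std::sin(x); }
double fprime(double x) { return std::cos(x); }

// Central-difference approximation of f'(x); formally second-order accurate.
double centralDiff(double x, double h) {
    return (f(x + h) - f(x - h)) / (2.0 * h);
}

int main() {
    const double x  = 1.0;
    const double h1 = 0.1, h2 = 0.05;

    double e1 = std::fabs(centralDiff(x, h1) - fprime(x));
    double e2 = std::fabs(centralDiff(x, h2) - fprime(x));

    // Observed order of accuracy: p = log(e1/e2) / log(h1/h2).
    double p = std::log(e1 / e2) / std::log(h1 / h2);
    std::cout << "observed order of accuracy: " << p << '\n';  // close to 2
}
```

If the observed order comes out well below the formal order of the scheme (here, 2), that discrepancy points to a coding mistake in the implementation.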
'Points, questions, stories, and occasional rants introduce the 24 chapters of this engaging volume. With a focus on mathematics and peppered with a scattering of computer science settings, the entries range from lightly humorous to curiously thought-provoking. Each chapter includes sections and sub-sections that illustrate and supplement the point at hand. Most topics are self-contained within each chapter, and a solid high school mathematics background is all that is needed to enjoy the discussions. There certainly is much to enjoy here.' (CHOICE) Ever notice how people sometimes use math words inaccurately? Or how sometimes you instinctively know a math statement is false (or not known)? Each chapter of this book makes a point like those above and then illustrates the point by doing some real mathematics through step-by-step mathematical techniques. This book gives readers valuable information about how mathematics and theoretical computer science work, while teaching them some actual mathematics and computer science through examples and exercises. Much of the mathematics could be understood by a bright high school student. The points made can be understood by anyone with an interest in math, from the bright high school student to a Fields Medal winner.
Fuzzy Cluster Analysis presents advanced and powerful fuzzy clustering techniques. This thorough and self-contained introduction to fuzzy clustering methods and applications covers classification, image recognition, data analysis and rule generation. Combining theoretical and practical perspectives, each method is analysed in detail and fully illustrated with examples.
Among the most exciting developments in science today is the design and construction of the quantum computer. Its realization will be the result of multidisciplinary efforts, but ultimately, it is mathematics that lies at the heart of theoretical quantum computer science.
Recently, molecular biology has undergone unprecedented development, generating vast quantities of data that require sophisticated computational methods for analysis, processing and archiving. This requirement has given birth to the truly interdisciplinary field of computational biology, or bioinformatics, a subject reliant on both theoretical and practical contributions from statistics, mathematics, computer science and biology.
This book explores and articulates the concepts of the continuous and the infinitesimal from two points of view: the philosophical and the mathematical. The first section covers the history of these ideas in philosophy. Chapter one, entitled 'The continuous and the discrete in Ancient Greece, the Orient and the European Middle Ages,' reviews the work of Plato, Aristotle, Epicurus, and other Ancient Greeks; the elements of early Chinese, Indian and Islamic thought; and early Europeans including Henry of Harclay, Nicholas of Autrecourt, Duns Scotus, William of Ockham, Thomas Bradwardine and Nicole Oresme. The second chapter of the book covers European thinkers of the sixteenth and seventeenth centuries: Galileo, Newton, Leibniz, Descartes, Arnauld, Fermat, and more. Chapter three, 'The age of continuity,' discusses eighteenth-century mathematicians including Euler and Carnot, and philosophers, among them Hume, Kant and Hegel. Examining the nineteenth and early twentieth centuries, the fourth chapter describes the reduction of the continuous to the discrete, citing the contributions of Bolzano, Cauchy and Riemann. Part one of the book concludes with a chapter on divergent conceptions of the continuum, with the work of nineteenth and early twentieth century philosophers and mathematicians, including Veronese, Poincaré, Brouwer, and Weyl. Part two of this book covers contemporary mathematics, discussing topology and manifolds, categories and functors, Grothendieck topologies, sheaves, and elementary topoi. Among the theories presented in detail are non-standard analysis, constructive and intuitionist analysis, and smooth infinitesimal analysis/synthetic differential geometry. No other book so thoroughly covers the history and development of the concepts of the continuous and the infinitesimal.
Rapid developments in the field of genetic algorithms along with the popularity of the first edition precipitated this completely revised, thoroughly updated second edition of The Practical Handbook of Genetic Algorithms. Like its predecessor, this edition helps practitioners stay up to date on recent developments in the field and provides material they can use productively in their own endeavors. |