Floating-point arithmetic is ubiquitous in modern computing, as it is the tool of choice to approximate real numbers. Due to its limited range and precision, its use can become quite involved and potentially lead to numerous failures. One way to greatly increase confidence in floating-point software is by computer-assisted verification of its correctness proofs. This book provides a comprehensive view of how to formally specify and verify tricky floating-point algorithms with the Coq proof assistant. It describes the Flocq formalization of floating-point arithmetic and some methods to automate theorem proofs. It then presents the specification and verification of various algorithms, from error-free transformations to a numerical scheme for a partial differential equation. The examples cover not only mathematical algorithms but also C programs as well as issues related to compilation.
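The error-free transformations mentioned above can be illustrated with a short sketch (my own illustration, not code from the book): Knuth's TwoSum splits a rounded floating-point sum into the computed result and its exact rounding error, so the pair represents the true sum with no error at all.

```python
from fractions import Fraction

def two_sum(a, b):
    """Knuth's TwoSum: return (s, e) where s = fl(a + b) and
    a + b == s + e exactly (assuming no overflow)."""
    s = a + b
    a_round = s - b                     # the part of s attributed to a
    b_round = s - a_round               # the part of s attributed to b
    e = (a - a_round) + (b - b_round)   # exact rounding error of s
    return s, e

s, e = two_sum(0.1, 0.2)
# The pair (s, e) represents the sum of the doubles 0.1 and 0.2 exactly:
assert Fraction(0.1) + Fraction(0.2) == Fraction(s) + Fraction(e)
```

Verifying such an identity for *all* inputs, rather than spot-checking it, is precisely the kind of task a proof assistant like Coq is used for.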
First published in 1975, this classic book gives a systematic account of transcendental number theory, that is, the theory of those numbers that cannot be expressed as the roots of algebraic equations having rational coefficients. Their study has developed into a fertile and extensive theory, which continues to see rapid progress today. Expositions are presented of theories relating to linear forms in the logarithms of algebraic numbers, of Schmidt's generalization of the Thue-Siegel-Roth theorem, of Shidlovsky's work on Siegel's E-functions and of Sprindzuk's solution to the Mahler conjecture. This edition includes an introduction written by David Masser describing Baker's achievement, surveying the content of each chapter and explaining the main argument of Baker's method in broad strokes. A new afterword lists recent developments related to Baker's work.
In modern computing a program is usually distributed among several processes. The fundamental challenge when developing reliable and secure distributed programs is to support the cooperation of processes required to execute a common task, even when some of these processes fail. Failures may range from crashes to adversarial attacks by malicious processes. Cachin, Guerraoui, and Rodrigues present an introductory description of fundamental distributed programming abstractions together with algorithms to implement them in distributed systems, where processes are subject to crashes and malicious attacks. The authors follow an incremental approach by first introducing basic abstractions in simple distributed environments, before moving to more sophisticated abstractions and more challenging environments. Each core chapter is devoted to one topic, covering reliable broadcast, shared memory, consensus, and extensions of consensus. For every topic, many exercises and their solutions enhance the understanding. This book represents the second edition of "Introduction to Reliable Distributed Programming". Its scope has been extended to include security against malicious actions by non-cooperating processes. This important domain has become widely known under the name "Byzantine fault-tolerance".
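As a rough illustration of the kind of abstraction covered (a toy sketch of my own, not the authors' algorithm), eager reliable broadcast has every process relay a message on first delivery, so a sender crashing mid-broadcast cannot leave some correct processes without the message:

```python
from collections import deque

def eager_reliable_broadcast(n, sender, sender_sends):
    """Simulate eager reliable broadcast among n processes.
    The sender crashes after sending to only `sender_sends` processes;
    every process that delivers the message relays it to everyone else."""
    delivered = [False] * n
    delivered[sender] = True
    queue = deque(p for p in range(n) if p != sender)  # intended recipients
    queue = deque(list(queue)[:sender_sends])          # ...but sender crashes early
    while queue:
        p = queue.popleft()
        if delivered[p]:
            continue
        delivered[p] = True
        for q in range(n):                 # relay on first delivery
            if not delivered[q]:
                queue.append(q)
    return delivered

# One successful send before the crash is enough for everyone to deliver:
assert all(eager_reliable_broadcast(5, sender=0, sender_sends=1))
```

If the sender crashes before any send (`sender_sends=0`), no other process delivers, which is still allowed by reliable broadcast: agreement is only required among processes that deliver.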
Stochastic games have an element of chance: the state of the next round is determined probabilistically depending upon players' actions and the current state. Successful players need to balance the need for short-term payoffs while ensuring future opportunities remain high. The various techniques needed to analyze these often highly non-trivial games are a showcase of attractive mathematics, including methods from probability, differential equations, algebra, and combinatorics. This book presents a course on the theory of stochastic games going from the basics through to topics of modern research, focusing on conceptual clarity over complete generality. Each of its chapters introduces a new mathematical tool - including contracting mappings, semi-algebraic sets, infinite orbits, and Ramsey's theorem, among others - before discussing the game-theoretic results they can be used to obtain. The author assumes no more than a basic undergraduate curriculum and illustrates the theory with numerous examples and exercises, with solutions available online.
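The contracting mappings mentioned above are the engine behind value iteration in discounted stochastic games: the Shapley operator is a contraction, so iterating it converges to a unique fixed point. A minimal one-dimensional illustration (not from the book) of Banach fixed-point iteration:

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Banach fixed-point iteration: for a contraction f,
    x_{n+1} = f(x_n) converges to the unique fixed point of f."""
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    raise RuntimeError("did not converge")

# cos is a contraction on [0, 1] (|cos'| <= sin(1) < 1), so iteration converges.
x = fixed_point(math.cos, 1.0)
assert abs(x - math.cos(x)) < 1e-10
```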
This text discusses the applications and optimization of emerging smart technologies in the field of healthcare. It further explains different modeling scenarios of the latest technologies in the healthcare system and compares the results to better understand the nature and progress of disease in the human body, leading to earlier diagnosis, better cure, and better treatment with the help of distributed technology. The book:
- Covers implementation models using technologies such as artificial intelligence, machine learning, and deep learning with distributed systems for better diagnosis and treatment of diseases
- Gives an in-depth review of technological advancements such as plasmonic sensors, RFIDs, and electronic diagnostic tools in the field of healthcare engineering
- Discusses possibilities of augmented reality and virtual reality interventions for providing unique solutions in medical science, clinical research, psychology, and neurological disorders
- Highlights the future challenges and risks involved in the application of smart technologies such as cloud computing, fog computing, IoT, and distributed computing in healthcare
- Shows how to utilize AI, ML, and associated aids in healthcare sectors post COVID-19 to revitalize the medical setup
Contributions included in the book will motivate technological developers and researchers to develop new algorithms and protocols in the healthcare field. It will serve as a broad source of knowledge regarding healthcare delivery, healthcare management, healthcare governance, and health-monitoring approaches using distributed environments. It will serve as an ideal reference text for graduate students and researchers in diverse engineering fields including electrical, electronics and communication, computer, and biomedical.
The Christoffel-Darboux kernel, a central object in approximation theory, is shown to have many potential uses in modern data analysis, including applications in machine learning. This is the first book to offer a rapid introduction to the subject, illustrating the surprising effectiveness of a simple tool. Bridging the gap between classical mathematics and current evolving research, the authors present the topic in detail and follow a heuristic, example-based approach, assuming only a basic background in functional analysis, probability and some elementary notions of algebraic geometry. They cover new results in both pure and applied mathematics and introduce techniques that have a wide range of potential impacts on modern quantitative and qualitative science. Comprehensive notes provide historical background, discuss advanced concepts and give detailed bibliographical references. Researchers and graduate students in mathematics, statistics, engineering or economics will find new perspectives on traditional themes, along with challenging open problems.
Legal Programming: Designing Legally Compliant RFID and Software Agent Architectures for Retail Processes and Beyond provides a process-oriented discussion of the legal concerns presented by agent-based technologies, processes and programming. It offers a general outline of the potential legal difficulties that could arise in relation to them, focusing on the programming of negotiation and contracting processes in a privacy, consumer and commercial context. The authors elucidate how it is possible to create a form of legal framework and design methodology for transaction agents, applicable in any environment and not just in a specific proprietary framework, that provides the right level of compliance and trust. Key elements considered include the design and programming of legally compliant methods, the determination of rights in respect of objects and variables, and ontologies and programming frameworks for agent interactions. Examples are used to illustrate the points made and provide a practical perspective.
Handbook on Numerical Methods for Hyperbolic Problems: Applied and Modern Issues details the large amount of literature in the design, analysis, and application of various numerical algorithms for solving hyperbolic equations that has been produced in the last several decades. This volume provides concise summaries from experts in different types of algorithms, so that readers can find a variety of algorithms under different situations and become familiar with their relative advantages and limitations.
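As a tiny illustration of the family of algorithms surveyed (my own sketch, not code from the handbook), here is one explicit first-order upwind step for the linear advection equation u_t + a u_x = 0, the simplest hyperbolic model problem:

```python
def upwind_step(u, a, dt, dx):
    """One first-order upwind step for u_t + a u_x = 0 (a > 0),
    on a periodic grid. Stable when the CFL number c = a*dt/dx <= 1."""
    c = a * dt / dx
    # For a > 0, information travels left-to-right, so difference backwards.
    return [u[i] - c * (u[i] - u[i - 1]) for i in range(len(u))]

# With c == 1 the scheme transports the profile exactly one cell per step:
u = [0.0, 1.0, 0.0, 0.0]
assert upwind_step(u, a=1.0, dt=0.1, dx=0.1) == [0.0, 0.0, 1.0, 0.0]
```

For c < 1 the same update is stable but numerically diffusive, one of the trade-offs such handbooks analyze in depth.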
Understand advanced data analytics concepts such as time series and principal component analysis with ETL, supervised learning, and PySpark using Python. This book covers architectural patterns in data analytics, text and image classification, optimization techniques, natural language processing, and computer vision in the cloud environment. Generic design patterns in Python programming are clearly explained, emphasizing architectural practices such as hot potato anti-patterns. You'll review recent advances in databases such as Neo4j, Elasticsearch, and MongoDB. You'll then study feature engineering in images and texts while implementing business logic and see how to build machine learning and deep learning models using transfer learning. Advanced Analytics with Python, 2nd edition features a chapter on clustering with a neural network, regularization techniques, and algorithmic design patterns in data analytics with reinforcement learning. Finally, the recommender system in PySpark explains how to optimize models for a specific application.
What You'll Learn:
- Build intelligent systems for the enterprise
- Review time series analysis, classification, regression, and clustering
- Explore supervised learning, unsupervised learning, reinforcement learning, and transfer learning
- Use cloud platforms like GCP and AWS in data analytics
- Understand design patterns in Python
Who This Book Is For: Data scientists and software developers interested in the field of data analytics.
Is everything Information? This is a tantalizing question which emerges in modern physics, life sciences, astronomy and in today's information and technology-driven society. In Powers of Two, expert authors undertake a unique expedition - in words and images - throughout the world (and scales) of information. The story resembles, in a way, the classic Powers of Ten journeys through space: from us to the macro and the micro worlds. However, by following Powers of Two through the world of information, a completely different and timely paradigm unfolds. Every power of two, 1, 2, 4, 8..., tells us a different story: starting from the creation of the very first bit at the Big Bang and the evolution of life, through 50 years of computational science, and finally into deep space, describing the information in black holes and even in the entire universe and beyond. All this to address one question: Is our universe made of information? In this book, we experience the Information Universe in nature and in our society and how information lies at the very foundation of our understanding of the Universe. From the Foreword by Robbert Dijkgraaf: This book is in many ways a vastly extended version of Shannon's one-page blueprint. It carries us all the way to the total information content of the Universe. And it bears testimony of how widespread the use of data has become in all aspects of life. Information is the connective tissue of the modern sciences. [...] Undoubtedly, future generations will look back at this time, so much enthralled by Big Data and quantum computers, as beholden to the information metaphor. But that is exactly the value of this book. With its crisp descriptions and evocative illustrations, it brings the reader into the here and now, at the very frontier of scientific research, including the excitement and promise of all the outstanding questions and future discoveries.
Message for the e-reader of the book Powers of Two: the book has been designed to be read in two-page spreads in full-screen mode. For an optimal reading experience with a downloaded .pdf file, we strongly recommend the following settings in Adobe Acrobat Reader:
- View > Page Display > Two Page View
- View > Page Display > Show Cover Page in Two Page View
- Preferences > Full Screen > deselect "Fill screen with one page at a time"
- View > Full Screen Mode, or Ctrl+L (Cmd+L on a Mac)
Note: for reading the previews on SpringerLink (and online reading in a browser), the full-screen two-page view only works with these browsers:
- Firefox: in the toolbar above the text, open the drop-down menu (>>) at the upper right and choose even double pages; full screen with F11 (Ctrl+Cmd+F on a Mac)
- Edge: in the middle of the toolbar, choose two-page view and select show cover page separately
For 80 years, mathematics has driven fundamental innovation in computing and communications. This timely book provides a panorama of some recent ideas in mathematics and how they will drive continued innovation in computing, communications and AI in the coming years. It provides a unique insight into how the new techniques that are being developed can be used to provide theoretical foundations for technological progress, just as mathematics was used in earlier times by Turing, von Neumann, Shannon and others. Edited by leading researchers in the field, chapters cover the application of new mathematics in computer architecture, software verification, quantum computing, compressed sensing, networking, Bayesian inference, machine learning, reinforcement learning and many other areas.
Big Data: Principles and Paradigms captures the state-of-the-art research on the architectural aspects, technologies, and applications of Big Data. The book identifies potential future directions and technologies that facilitate insight into numerous scientific, business, and consumer applications. To help realize Big Data's full potential, the book addresses numerous challenges, offering the conceptual and technological solutions for tackling them. These challenges include life-cycle data management, large-scale storage, flexible processing infrastructure, data modeling, scalable machine learning, data analysis algorithms, sampling techniques, and privacy and ethical issues.
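Among the sampling techniques such a book addresses, reservoir sampling is a classic for streams too large to hold in memory. A short illustrative sketch (not drawn from the book) of Vitter's Algorithm R, which keeps a uniform sample of k items from a stream of unknown length:

```python
import random

def reservoir_sample(stream, k, rng):
    """Algorithm R: keep a uniform random sample of k items from a stream.
    Each item ends up in the reservoir with probability k/n."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)      # fill the reservoir first
        else:
            j = rng.randrange(i + 1)    # replace a slot with prob. k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(10_000), 5, random.Random(42))
assert len(sample) == 5 and all(0 <= x < 10_000 for x in sample)
```

The stream is consumed once, in constant memory, which is what makes the method suitable for large-scale data.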
Enterprises all over the world are experiencing a rapid development of networked computing for applications that are required for the daily survival of an organization. Client-server computing offers great potential for cost-effective networked computing. However, many organizations have now learned that the cost of maintenance and support of these networked distributed systems far exceeds the cost of buying them. Computer Supported Cooperative Work (CSCW) is the new evolving area that promotes the understanding of business processes and relevant communication technologies. Cooperative Management of Enterprise Networks uses CSCW as the medium for conveying ideas on the integration of business processes with network and systems management. This book will be useful for systems management professionals wishing to know about business process integration; business managers wishing to integrate their tasks with network/systems management; software system developers wishing to adopt participatory design practices; and students and researchers.
Current database technology and computer hardware allow us to gather, store, access, and manipulate massive volumes of raw data in an efficient and inexpensive manner. In addition, the amount of data collected and warehoused in all industries is growing every year at a phenomenal rate. Nevertheless, our ability to discover critical, non-obvious nuggets of useful information in data that could influence or help in the decision-making process is still limited. Knowledge Discovery (KDD) and Data Mining (DM) form a new, multidisciplinary field that focuses on the overall process of information discovery from large volumes of data. The field combines database concepts and theory, machine learning, pattern recognition, statistics, artificial intelligence, uncertainty management, and high-performance computing. To remain competitive, businesses must apply data mining techniques such as classification, prediction, and clustering using tools such as neural networks, fuzzy logic, and decision trees to facilitate making strategic decisions on a daily basis. Knowledge Discovery for Business Information Systems contains a collection of 16 high-quality articles written by experts in the KDD and DM field from the following countries: Austria, Australia, Bulgaria, Canada, China (Hong Kong), Estonia, Denmark, Germany, Italy, Poland, Singapore and USA.
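Clustering, one of the data mining techniques named above, can be sketched in a few lines with Lloyd's k-means algorithm; this one-dimensional toy version is my own illustration, not code from the book:

```python
def kmeans_1d(points, centers, iters=20):
    """Lloyd's algorithm on 1-D data: repeatedly assign each point to its
    nearest center, then move each center to the mean of its points."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]   # keep empty centers put
                   for i, c in enumerate(clusters)]
    return centers

# Two clear groups around 1 and 9 pull the centers onto their means:
centers = kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], centers=[0.0, 10.0])
assert abs(centers[0] - 1.0) < 1e-9 and abs(centers[1] - 9.0) < 1e-9
```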
Algorithms for Automating Open Source Intelligence (OSINT) presents information on the gathering of information and extraction of actionable intelligence from openly available sources, including news broadcasts, public repositories, and more recently, social media. As OSINT has applications in crime fighting, state-based intelligence, and social research, this book provides recent advances in text mining, web crawling, and other algorithms that have led to advances in methods that can largely automate this process. The book is beneficial to both practitioners and academic researchers, with discussions of the latest advances in applications, a coherent set of methods and processes for automating OSINT, and interdisciplinary perspectives on the key problems identified within each discipline. Drawing upon years of practical experience and using numerous examples, editors Robert Layton, Paul Watters, and a distinguished list of contributors discuss Evidence Accumulation Strategies for OSINT, Named Entity Resolution in Social Media, Analyzing Social Media Campaigns for Group Size Estimation, Surveys and qualitative techniques in OSINT, and Geospatial reasoning of open data.
A Comprehensive Study of SQL - Practice and Implementation is designed as a textbook and provides a comprehensive approach to SQL (Structured Query Language), the standard programming language for defining, organizing, and exploring data in relational databases. It demonstrates how to leverage the two most vital tools for data query and analysis - SQL and Excel - to perform comprehensive data analysis without the need for a sophisticated and expensive data mining tool or application.
Features:
- Provides a complete collection of modeling techniques, beginning with fundamentals and gradually progressing through increasingly complex real-world case studies
- Explains how to build, populate, and administer high-performance databases and develop robust SQL-based applications
- Gives a solid foundation in best practices and relational theory
- Offers self-contained lessons on key SQL concepts or techniques at the end of each chapter, using numerous illustrations and annotated examples
This book is aimed primarily at advanced undergraduates and graduates with a background in computer science and information technology. Researchers and professionals will also find this book useful.
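For a self-contained taste of the kind of query such a textbook covers, here is a grouped aggregate run through Python's built-in sqlite3 driver; the `sales` table and its rows are invented purely for illustration:

```python
import sqlite3

# In-memory database: define a table, insert rows, run an aggregate query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100.0), ("east", 50.0), ("west", 75.0)])
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
assert rows == [("east", 150.0), ("west", 75.0)]
conn.close()
```

The same GROUP BY / aggregate pattern is what a pivot table performs on the Excel side of the SQL-and-Excel pairing the book describes.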
With Chromatic Graph Theory, Second Edition, the authors present various fundamentals of graph theory that lie outside of graph colorings, including basic terminology and results, trees and connectivity, Eulerian and Hamiltonian graphs, matchings and factorizations, and graph embeddings. Readers will see that the authors accomplished the primary goal of this textbook, which is to introduce graph theory with a coloring theme and to look at graph colorings in various ways. The textbook also covers vertex colorings and bounds for the chromatic number, vertex colorings of graphs embedded on surfaces, and a variety of restricted vertex colorings. The authors also describe edge colorings, monochromatic and rainbow edge colorings, complete vertex colorings, and several distinguishing vertex and edge colorings.
Features of the Second Edition:
- The book can be used for a first course in graph theory as well as a graduate course
- The primary topic in the book is graph coloring
- The book begins with an introduction to graph theory, so assumes no previous course
- The authors are the most widely published team on graph theory
- Many new examples and exercises enhance the new edition
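As a quick, hedged illustration of vertex coloring (my own sketch, not an algorithm from the textbook), the greedy heuristic assigns each vertex the smallest color unused by its neighbors, and so never needs more than Delta + 1 colors, where Delta is the maximum degree:

```python
def greedy_coloring(adj):
    """Greedy vertex coloring: scan vertices in order, giving each the
    smallest color not already used by a colored neighbor."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# The 5-cycle has chromatic number 3; greedy finds a proper coloring.
cycle5 = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
coloring = greedy_coloring(cycle5)
assert all(coloring[u] != coloring[v] for u in cycle5 for v in cycle5[u])
```

Greedy colorings give the upper bounds for the chromatic number; showing when fewer colors suffice is where the theory begins.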
Paul Erdos published more papers during his lifetime than any other mathematician, especially in discrete mathematics. He had a nose for beautiful, simply-stated problems with solutions that have far-reaching consequences across mathematics. This captivating book, written for students, provides an easy-to-understand introduction to discrete mathematics by presenting questions that intrigued Erdos, along with his brilliant ways of working toward their answers. It includes young Erdos's proof of Bertrand's postulate, the Erdos-Szekeres Happy End Theorem, De Bruijn-Erdos theorem, Erdos-Rado delta-systems, Erdos-Ko-Rado theorem, Erdos-Stone theorem, the Erdos-Renyi-Sos Friendship Theorem, Erdos-Renyi random graphs, the Chvatal-Erdos theorem on Hamilton cycles, and other results of Erdos, as well as results related to his work, such as Ramsey's theorem or Deza's theorem on weak delta-systems. Its appendix covers topics normally missing from introductory courses. Filled with personal anecdotes about Erdos, this book offers a behind-the-scenes look at interactions with the legendary collaborator.
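Bertrand's postulate, whose proof by the young Erdos is covered in the book, states that for every n > 1 there is a prime p with n < p < 2n. It can be checked empirically with a brute-force sketch (illustrative only, nothing like Erdos's elegant binomial-coefficient argument):

```python
def is_prime(n):
    """Trial division, adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def bertrand_holds(n):
    """Is there a prime p with n < p < 2n?"""
    return any(is_prime(p) for p in range(n + 1, 2 * n))

assert all(bertrand_holds(n) for n in range(2, 1000))
```

Erdos's proof replaces this case-checking with a counting argument on the central binomial coefficient, an early showcase of his style.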
In recent years, machine learning has gained a lot of interest. Due to the advances in processor technology and the availability of large amounts of data, machine learning techniques have provided astounding results in areas such as object recognition or natural language processing. New approaches, e.g. deep learning, have provided groundbreaking outcomes in fields such as multimedia mining or voice recognition. Machine learning is now used in virtually every domain and deep learning algorithms are present in many devices such as smartphones, cars, drones, healthcare equipment, or smart home devices. The Internet, cloud computing and the Internet of Things produce a tsunami of data and machine learning provides the methods to effectively analyze the data and discover actionable knowledge. This book describes the most common machine learning techniques such as Bayesian models, support vector machines, decision tree induction, regression analysis, and recurrent and convolutional neural networks. It first gives an introduction into the principles of machine learning. It then covers the basic methods including the mathematical foundations. The biggest part of the book provides common machine learning algorithms and their applications. Finally, the book gives an outlook into some of the future developments and possible new research areas of machine learning and artificial intelligence in general. This book is meant to be an introduction into machine learning. It does not require prior knowledge in this area. It covers some of the basic mathematical principles but intends to be understandable even without a background in mathematics. It can be read chapter-wise and intends to be comprehensible even when not started at the beginning. Finally, it also intends to be a reference book.
Key Features:
- Describes real-world problems that can be solved using machine learning
- Provides methods for directly applying machine learning techniques to concrete real-world problems
- Demonstrates how to apply machine learning techniques using different frameworks such as TensorFlow, MALLET, and R
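Of the techniques the book lists, regression is the easiest to sketch; here is a closed-form ordinary least squares fit for a single feature (an illustrative sketch of my own, not code from the book):

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = a*x + b with one feature:
    a = cov(x, y) / var(x), b = mean(y) - a * mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Data generated from y = 2x + 1 is recovered exactly:
a, b = linear_fit([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
assert abs(a - 2.0) < 1e-12 and abs(b - 1.0) < 1e-12
```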
Statistically-derived algorithms, adopted by many jurisdictions in an effort to identify the risk of reoffending posed by criminal defendants, have been lambasted as racist, de-humanizing, and antithetical to the foundational tenets of criminal justice. Just Algorithms argues that these attacks are misguided and that, properly regulated, risk assessment tools can be a crucial means of safely and humanely dismantling our massive jail and prison complex. The book explains how risk algorithms work, the types of legal questions they should answer, and the criteria for judging whether they do so in a way that minimizes bias and respects human dignity. It also shows how risk assessment instruments can provide leverage for curtailing draconian prison sentences and the plea-bargaining system that produces them. The ultimate goal of Christopher Slobogin's insightful analysis is to develop the principles that should govern, in both the pretrial and sentencing settings, the criminal justice system's consideration of risk.
In the last few years, Algorithms for Convex Optimization have revolutionized algorithm design, both for discrete and continuous optimization problems. For problems like maximum flow, maximum matching, and submodular function minimization, the fastest algorithms involve essential methods such as gradient descent, mirror descent, interior point methods, and ellipsoid methods. The goal of this self-contained book is to enable researchers and professionals in computer science, data science, and machine learning to gain an in-depth understanding of these algorithms. The text emphasizes how to derive key algorithms for convex optimization from first principles and how to establish precise running time bounds. This modern text explains the success of these algorithms in problems of discrete optimization, as well as how these methods have significantly pushed the state of the art of convex optimization itself.
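Of the methods named above, gradient descent is the simplest to sketch; a minimal one-dimensional illustration (my own, not from the book) on a smooth convex objective:

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent: x_{k+1} = x_k - lr * grad(x_k)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2; its gradient is 2(x - 3), minimizer x = 3.
x_star = gradient_descent(lambda x: 2 * (x - 3.0), x0=0.0)
assert abs(x_star - 3.0) < 1e-6
```

For this quadratic the error contracts by a constant factor per step; deriving such rates precisely, and then accelerating them, is the kind of analysis the book develops from first principles.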
Every other day we hear about new ways to put deep learning to good use: improved medical imaging, accurate credit card fraud detection, long range weather forecasting, and more. PyTorch puts these superpowers in your hands, providing a comfortable Python experience that gets you started quickly and then grows with you as you, and your deep learning skills, become more sophisticated. Deep Learning with PyTorch teaches you how to implement deep learning algorithms with Python and PyTorch. This book takes you into a fascinating case study: building an algorithm capable of detecting malignant lung tumors using CT scans. As the authors guide you through this real example, you'll discover just how effective and fun PyTorch can be.
Key features:
- Using the PyTorch tensor API
- Understanding automatic differentiation in PyTorch
- Training deep neural networks
- Monitoring training and visualizing results
- Interoperability with NumPy
Audience: Written for developers with some knowledge of Python as well as basic linear algebra skills. Some understanding of deep learning will be helpful; however, no experience with PyTorch or other deep learning frameworks is required.
About the technology: PyTorch is a machine learning framework with a strong focus on deep neural networks. Because it emphasizes GPU-based acceleration, PyTorch performs exceptionally well on readily-available hardware and scales easily to larger systems. Eli Stevens has worked in Silicon Valley for the past 15 years as a software engineer, and the past 7 years as Chief Technical Officer of a startup making medical device software. Luca Antiga is co-founder and CEO of an AI engineering company located in Bergamo, Italy, and a regular contributor to PyTorch.