Welcome to Loot.co.za!
The Christoffel-Darboux kernel, a central object in approximation theory, is shown to have many potential uses in modern data analysis, including applications in machine learning. This is the first book to offer a rapid introduction to the subject, illustrating the surprising effectiveness of a simple tool. Bridging the gap between classical mathematics and current evolving research, the authors present the topic in detail and follow a heuristic, example-based approach, assuming only a basic background in functional analysis, probability and some elementary notions of algebraic geometry. They cover new results in both pure and applied mathematics and introduce techniques that have a wide range of potential impacts on modern quantitative and qualitative science. Comprehensive notes provide historical background, discuss advanced concepts and give detailed bibliographical references. Researchers and graduate students in mathematics, statistics, engineering or economics will find new perspectives on traditional themes, along with challenging open problems.
Anyone browsing at the stationery store will see an incredible array of pop-up cards available for any occasion. The workings of pop-up cards and pop-up books can be remarkably intricate. Behind such designs lies beautiful geometry involving the intersection of circles, cones, and spheres, the movements of linkages, and other constructions. The geometry can be modelled by algebraic equations, whose solutions explain the dynamics. For example, several pop-up motions rely on the intersection of three spheres, a computation made every second for GPS location. Connecting the motions of the card structures with the algebra and geometry reveals abstract mathematics performing tangible calculations. Beginning with the nephroid in the 19th century, the mathematics of pop-up design is now at the frontiers of rigid origami and algorithmic computational complexity. All topics are accessible to those familiar with high-school mathematics; no calculus required. Explanations are supplemented by 140+ figures and 20 animations.
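The three-sphere intersection mentioned above can be made concrete. The following Python routine is our own illustration (not taken from the book): given three sphere centers and radii, it recovers the two candidate intersection points, the same trilateration computation a GPS receiver performs with satellite positions and measured ranges.

```python
import math

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Intersect three spheres (centers p1..p3, radii r1..r3).

    Returns the two candidate points, or raises ValueError if the
    spheres do not intersect.
    """
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    add = lambda a, b: tuple(x + y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    scale = lambda a, s: tuple(x * s for x in a)
    cross = lambda a, b: (a[1] * b[2] - a[2] * b[1],
                          a[2] * b[0] - a[0] * b[2],
                          a[0] * b[1] - a[1] * b[0])
    norm = lambda a: math.sqrt(dot(a, a))

    # Build an orthonormal frame with p1 at the origin and p2 on the x-axis.
    ex = scale(sub(p2, p1), 1 / norm(sub(p2, p1)))
    i = dot(ex, sub(p3, p1))
    ey_dir = sub(sub(p3, p1), scale(ex, i))
    ey = scale(ey_dir, 1 / norm(ey_dir))
    ez = cross(ex, ey)
    d = norm(sub(p2, p1))
    j = dot(ey, sub(p3, p1))

    # Solve for the intersection point's coordinates in the new frame.
    x = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    y = (r1 ** 2 - r3 ** 2 + i ** 2 + j ** 2) / (2 * j) - (i / j) * x
    z2 = r1 ** 2 - x ** 2 - y ** 2
    if z2 < 0:
        raise ValueError("spheres do not intersect")
    z = math.sqrt(z2)

    base = add(p1, add(scale(ex, x), scale(ey, y)))
    return add(base, scale(ez, z)), add(base, scale(ez, -z))
```

Note the two return values: three spheres generically meet in two mirror-image points, and GPS receivers discard the one far from the Earth's surface.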
Is everything Information? This is a tantalizing question which emerges in modern physics, the life sciences, astronomy, and today's information- and technology-driven society. In Powers of Two, expert authors undertake a unique expedition - in words and images - throughout the world (and scales) of information. The story resembles, in a way, the classic Powers of Ten journeys through space: from us to the macro and the micro worlds. However, by following Powers of Two through the world of information, a completely different and timely paradigm unfolds. Every power of two (1, 2, 4, 8, ...) tells us a different story: starting from the creation of the very first bit at the Big Bang and the evolution of life, through 50 years of computational science, and finally into deep space, describing the information in black holes and even in the entire universe and beyond - all to address one question: is our universe made of information? In this book, we experience the Information Universe in nature and in our society, and how information lies at the very foundation of our understanding of the Universe. From the Foreword by Robbert Dijkgraaf: "This book is in many ways a vastly extended version of Shannon's one-page blueprint. It carries us all the way to the total information content of the Universe. And it bears testimony to how widespread the use of data has become in all aspects of life. Information is the connective tissue of the modern sciences. [...] Undoubtedly, future generations will look back at this time, so much enthralled by Big Data and quantum computers, as beholden to the information metaphor. But that is exactly the value of this book. With its crisp descriptions and evocative illustrations, it brings the reader into the here and now, at the very frontier of scientific research, including the excitement and promise of all the outstanding questions and future discoveries."
Message for the e-reader of the book Powers of Two: the book has been designed to be read in two-page spreads in full-screen mode. For the optimal reading experience in a downloaded .pdf file, we strongly recommend the following settings in Adobe Acrobat Reader:
- View > Page Display > Two Page View
- View > Page Display > Show Cover Page in Two Page View
- Preferences > Full Screen > deselect "Fill screen with one page at a time"
- View > Full Screen Mode, or Ctrl+L (Cmd+L on a Mac)
Note: for reading the previews on SpringerLink (and for on-line reading in a browser), the full-screen two-page view only works in these browsers:
- Firefox: in the toolbar above the text, open the >> drop-down menu at the upper right and select even double pages; enter full screen with F11 (Control+Cmd+F on a Mac)
- Edge: in the middle of the toolbar, select Two-page view, then Show cover page separately
A Comprehensive Study of SQL - Practice and Implementation is designed as a textbook and provides a comprehensive approach to SQL (Structured Query Language), the standard programming language for defining, organizing, and exploring data in relational databases. It demonstrates how to leverage the two most vital tools for data query and analysis - SQL and Excel - to perform comprehensive data analysis without the need for a sophisticated and expensive data mining tool or application. Features:
- The book provides a complete collection of modeling techniques, beginning with fundamentals and gradually progressing through increasingly complex real-world case studies
- It explains how to build, populate, and administer high-performance databases and develop robust SQL-based applications
- It also gives a solid foundation in best practices and relational theory
- The book offers self-contained lessons on key SQL concepts or techniques at the end of each chapter, using numerous illustrations and annotated examples
This book is aimed primarily at advanced undergraduates and graduates with a background in computer science and information technology. Researchers and professionals will also find this book useful.
For 80 years, mathematics has driven fundamental innovation in computing and communications. This timely book provides a panorama of some recent ideas in mathematics and how they will drive continued innovation in computing, communications and AI in the coming years. It provides a unique insight into how the new techniques that are being developed can be used to provide theoretical foundations for technological progress, just as mathematics was used in earlier times by Turing, von Neumann, Shannon and others. Edited by leading researchers in the field, chapters cover the application of new mathematics in computer architecture, software verification, quantum computing, compressed sensing, networking, Bayesian inference, machine learning, reinforcement learning and many other areas.
Based on a new classification of algorithm design techniques and a clear delineation of analysis methods, Introduction to the Design and Analysis of Algorithms presents the subject in a coherent and innovative manner. Written in a student-friendly style, the book emphasises the understanding of ideas over excessively formal treatment while thoroughly covering the material required in an introductory algorithms course. Popular puzzles are used to motivate students' interest and strengthen their skills in algorithmic problem solving. Other learning-enhancement features include chapter summaries, hints to the exercises, and a detailed solution manual.
In the last few years, Algorithms for Convex Optimization have revolutionized algorithm design, both for discrete and continuous optimization problems. For problems like maximum flow, maximum matching, and submodular function minimization, the fastest algorithms involve essential methods such as gradient descent, mirror descent, interior point methods, and ellipsoid methods. The goal of this self-contained book is to enable researchers and professionals in computer science, data science, and machine learning to gain an in-depth understanding of these algorithms. The text emphasizes how to derive key algorithms for convex optimization from first principles and how to establish precise running time bounds. This modern text explains the success of these algorithms in problems of discrete optimization, as well as how these methods have significantly pushed the state of the art of convex optimization itself.
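The simplest of the methods named above, gradient descent, fits in a few lines. The following Python sketch is our own illustration (not the book's code): it minimizes a smooth convex quadratic by repeatedly stepping against the gradient, with a step size chosen below the standard 2/L stability threshold for this function.

```python
def gradient_descent(grad, x0, step, iters):
    """Plain gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x, y) = (x - 3)^2 + 2*(y + 1)^2, a smooth convex function
# whose unique minimizer is (3, -1).
grad_f = lambda p: [2 * (p[0] - 3), 4 * (p[1] + 1)]
minimum = gradient_descent(grad_f, [0.0, 0.0], step=0.1, iters=200)
```

For this quadratic the gradient is 4-Lipschitz, so any step below 0.5 converges; establishing such running-time and step-size bounds from first principles is exactly the kind of analysis the book develops.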
Paul Erdos published more papers during his lifetime than any other mathematician, especially in discrete mathematics. He had a nose for beautiful, simply-stated problems with solutions that have far-reaching consequences across mathematics. This captivating book, written for students, provides an easy-to-understand introduction to discrete mathematics by presenting questions that intrigued Erdos, along with his brilliant ways of working toward their answers. It includes young Erdos's proof of Bertrand's postulate, the Erdos-Szekeres Happy End Theorem, De Bruijn-Erdos theorem, Erdos-Rado delta-systems, Erdos-Ko-Rado theorem, Erdos-Stone theorem, the Erdos-Renyi-Sos Friendship Theorem, Erdos-Renyi random graphs, the Chvatal-Erdos theorem on Hamilton cycles, and other results of Erdos, as well as results related to his work, such as Ramsey's theorem or Deza's theorem on weak delta-systems. Its appendix covers topics normally missing from introductory courses. Filled with personal anecdotes about Erdos, this book offers a behind-the-scenes look at interactions with the legendary collaborator.
With Chromatic Graph Theory, Second Edition, the authors present various fundamentals of graph theory that lie outside of graph colorings, including basic terminology and results, trees and connectivity, Eulerian and Hamiltonian graphs, matchings and factorizations, and graph embeddings. Readers will see that the authors accomplished the primary goal of this textbook, which is to introduce graph theory with a coloring theme and to look at graph colorings in various ways. The textbook also covers vertex colorings and bounds for the chromatic number, vertex colorings of graphs embedded on surfaces, and a variety of restricted vertex colorings. The authors also describe edge colorings, monochromatic and rainbow edge colorings, complete vertex colorings, and several distinguishing vertex and edge colorings. Features of the Second Edition:
- The book can be used for a first course in graph theory as well as a graduate course
- The primary topic in the book is graph coloring
- The book begins with an introduction to graph theory, so it assumes no previous course
- The authors are the most widely-published team on graph theory
- Many new examples and exercises enhance the new edition
In recent years, machine learning has gained a lot of interest. Due to advances in processor technology and the availability of large amounts of data, machine learning techniques have provided astounding results in areas such as object recognition and natural language processing. New approaches, e.g. deep learning, have provided groundbreaking outcomes in fields such as multimedia mining and voice recognition. Machine learning is now used in virtually every domain, and deep learning algorithms are present in many devices such as smartphones, cars, drones, healthcare equipment, and smart home devices. The Internet, cloud computing and the Internet of Things produce a tsunami of data, and machine learning provides the methods to effectively analyze the data and discover actionable knowledge. This book describes the most common machine learning techniques, such as Bayesian models, support vector machines, decision tree induction, regression analysis, and recurrent and convolutional neural networks. It first gives an introduction to the principles of machine learning. It then covers the basic methods, including the mathematical foundations. The largest part of the book presents common machine learning algorithms and their applications. Finally, the book gives an outlook on some future developments and possible new research areas of machine learning and artificial intelligence in general. This book is meant to be an introduction to machine learning. It does not require prior knowledge in this area. It covers some of the basic mathematical principles but intends to be understandable even without a background in mathematics. It can be read chapter by chapter and intends to be comprehensible even when not starting at the beginning. Finally, it also intends to serve as a reference book.
Key Features:
- Describes real-world problems that can be solved using machine learning
- Provides methods for directly applying machine learning techniques to concrete real-world problems
- Demonstrates how to apply machine learning techniques using different frameworks, such as TensorFlow, MALLET, and R
Statistically-derived algorithms, adopted by many jurisdictions in an effort to identify the risk of reoffending posed by criminal defendants, have been lambasted as racist, de-humanizing, and antithetical to the foundational tenets of criminal justice. Just Algorithms argues that these attacks are misguided and that, properly regulated, risk assessment tools can be a crucial means of safely and humanely dismantling our massive jail and prison complex. The book explains how risk algorithms work, the types of legal questions they should answer, and the criteria for judging whether they do so in a way that minimizes bias and respects human dignity. It also shows how risk assessment instruments can provide leverage for curtailing draconian prison sentences and the plea-bargaining system that produces them. The ultimate goal of Christopher Slobogin's insightful analysis is to develop the principles that should govern, in both the pretrial and sentencing settings, the criminal justice system's consideration of risk.
Ever since Lorensen and Cline published their paper on the Marching Cubes algorithm, isosurfaces have been a standard technique for the visualization of 3D volumetric data. Yet there is no book exclusively devoted to isosurfaces. Isosurfaces: Geometry, Topology, and Algorithms represents the first book to focus on basic algorithms for isosurface construction. It also gives a rigorous mathematical perspective on some of the algorithms and results. In color throughout, the book covers the Marching Cubes algorithm and variants, dual contouring algorithms, multilinear interpolation, multiresolution isosurface extraction, isosurfaces in four dimensions, interval volumes, and contour trees. It also describes data structures for faster isosurface extraction as well as methods for selecting significant isovalues. For designers of visualization software, the book presents an organized overview of the various algorithms associated with isosurfaces. For graduate students, it provides a solid introduction to research in this area. For visualization researchers, the book serves as a reference to the vast literature on isosurfaces.
Algorithms influence every facet of modern life: criminal justice, education, housing, entertainment, elections, social media, news feeds, work... the list goes on. Delegating important decisions to machines, however, gives rise to deep moral concerns about responsibility, transparency, freedom, fairness, and democracy. Algorithms and Autonomy connects these concerns to the core human value of autonomy in the contexts of algorithmic teacher evaluation, risk assessment in criminal sentencing, predictive policing, background checks, news feeds, ride-sharing platforms, social media, and election interference. Using these case studies, the authors provide a better understanding of machine fairness and algorithmic transparency. They explain why interventions in algorithmic systems are necessary to ensure that algorithms are not used to control citizens' participation in politics and undercut democracy. This title is also available as Open Access on Cambridge Core.
Genetic Programming Theory and Practice VI was developed from the sixth workshop at the University of Michigan's Center for the Study of Complex Systems to facilitate the exchange of ideas and information related to the rapidly advancing field of Genetic Programming (GP). Contributions from the foremost international researchers and practitioners in the GP arena examine the similarities and differences between theoretical and empirical results on real-world problems. The text explores the synergy between theory and practice, producing a comprehensive view of the state of the art in GP application. These contributions address several significant interdependent themes which emerged from this year's workshop, including: (1) making efficient and effective use of test data; (2) sustaining the long-term evolvability of our GP systems; (3) exploiting discovered subsolutions for reuse; and (4) increasing the role of a domain expert.
This is the first comprehensive overview of the 'science of science,' an emerging interdisciplinary field that relies on big data to unveil the reproducible patterns that govern individual scientific careers and the workings of science. It explores the roots of scientific impact, the role of productivity and creativity, when and what kind of collaborations are effective, the impact of failure and success in a scientific career, and what metrics can tell us about the fundamental workings of science. The book relies on data to draw actionable insights, which can be applied by individuals to further their career or decision makers to enhance the role of science in society. With anecdotes and detailed, easy-to-follow explanations of the research, this book is accessible to all scientists and graduate students, policymakers, and administrators with an interest in the wider scientific enterprise.
Here, the authors propose a method for the formal development of parallel programs - or multiprograms, as they prefer to call them. They accomplish this with a minimum of formal gear, i.e. with the predicate calculus and the well-established theory of Owicki and Gries. They show that the Owicki/Gries theory can be effectively put to work for the formal development of multiprograms, regardless of whether these algorithms are distributed or not.
In recent years, the United Kingdom's Home Office has started using automated systems to make immigration decisions. These systems promise faster, more accurate, and cheaper decision-making, but in practice they have exposed people to distress, disruption, and even deportation. This book identifies a pattern of risky experimentation with automated systems in the Home Office. It analyses three recent case studies including: a voice recognition system used to detect fraud in English-language testing; an algorithm for identifying 'risky' visa applications; and automated decision-making in the EU Settlement Scheme. The book argues that a precautionary approach is essential to ensure that society benefits from government automation without exposing individuals to unacceptable risks.
There are no silver bullets in algorithm design, and no single algorithmic idea is powerful and flexible enough to solve every computational problem. Nor are there silver bullets in algorithm analysis, as the most enlightening method for analyzing an algorithm often depends on the problem and the application. However, typical algorithms courses rely almost entirely on a single analysis framework, that of worst-case analysis, wherein an algorithm is assessed by its worst performance on any input of a given size. The purpose of this book is to popularize several alternatives to worst-case analysis and their most notable algorithmic applications, from clustering to linear programming to neural network training. Forty leading researchers have contributed introductions to different facets of this field, emphasizing the most important models and results, many of which can be taught in lectures to beginning graduate students in theoretical computer science and machine learning.
The book is a concise, self-contained and fully updated introduction to automata theory - a fundamental topic of computer science and engineering. The material is presented in a rigorous yet accessible way and is supplied with a wealth of examples, exercises and down-to-earth explanatory notes. An ideal text for a spectrum of one-term courses in computer science, at both the senior undergraduate and graduate levels.
Want to kill it at your job interview in the tech industry? Want to win that coding competition? Learn all the algorithmic techniques and programming skills you need from two experienced coaches, problem setters, and jurors for coding competitions. The authors highlight the versatility of each algorithm by considering a variety of problems and show how to implement algorithms in simple and efficient code. Readers can expect to master 128 algorithms in Python and discover the right way to tackle a problem and quickly implement a solution of low complexity. Classic problems like Dijkstra's shortest path algorithm and Knuth-Morris-Pratt's string matching algorithm are featured alongside lesser known data structures like Fenwick trees and Knuth's dancing links. The book provides a framework to tackle algorithmic problem solving, including: Definition, Complexity, Applications, Algorithm, Key Information, Implementation, Variants, In Practice, and Problems. Python code included in the book and on the companion website.
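Dijkstra's shortest-path algorithm, one of the classics featured in the book, can be sketched in a few lines of Python. The following is our own illustration using a binary heap from the standard library; the book's implementations may differ in detail.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths on a weighted digraph given as
    {node: [(neighbour, weight), ...]}.  Returns {node: distance}
    for every node reachable from source."""
    dist = {source: 0}
    heap = [(0, source)]               # (distance, node) priority queue
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                   # stale entry; u already settled
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd           # found a shorter path to v
                heapq.heappush(heap, (nd, v))
    return dist
```

With the lazy-deletion trick shown (skipping stale heap entries rather than decreasing keys in place), the running time is O((V + E) log E), which is the style of low-complexity implementation the blurb advertises.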
The real world is perceived and broken down as data, models and algorithms in the eyes of physicists and engineers. Data is noisy by nature and classical statistical tools have so far been successful in dealing with relatively smaller levels of randomness. The recent emergence of Big Data and the required computing power to analyse them have rendered classical tools outdated and insufficient. Tools such as random matrix theory and the study of large sample covariance matrices can efficiently process these big data sets and help make sense of modern, deep learning algorithms. Presenting an introductory calculus course for random matrices, the book focusses on modern concepts in matrix theory, generalising the standard concept of probabilistic independence to non-commuting random variables. Concretely worked out examples and applications to financial engineering and portfolio construction make this unique book an essential tool for physicists, engineers, data analysts, and economists.
Networks powered by algorithms are pervasive. Major contemporary technology trends - Internet of Things, Big Data, Digital Platform Power, Blockchain, and the Algorithmic Society - are manifestations of this phenomenon. The internet, which once seemed an unambiguous benefit to society, is now the basis for invasions of privacy, massive concentrations of power, and wide-scale manipulation. The algorithmic networked world poses deep questions about power, freedom, fairness, and human agency. The influential 1997 Federal Communications Commission whitepaper "Digital Tornado" hailed the "endless spiral of connectivity" that would transform society, and today, little remains untouched by digital connectivity. Yet fundamental questions remain unresolved, and even more serious challenges have emerged. This important collection, which offers a reckoning and a foretelling, features leading technology scholars who explain the legal, business, ethical, technical, and public policy challenges of building pervasive networks and algorithms for the benefit of humanity. This title is also available as Open Access on Cambridge Core.