The World Wide Web exploded into public consciousness in 1995, a year which saw the coming of age of the Internet. People are communicating, working, shopping, learning, and entertaining themselves, as well as satisfying carnal desires and even finding God through the simple act of connecting their computers to the wide universe of cyberspace. We are assured, at the same time, that this progress will have profound effects on work, culture, leisure--everything, including the ways in which we interact with each other. Yet just what these effects will be, how power will be distributed, and what recourse will be available to those adversely affected by the new technologies, are issues that have yet to be negotiated. Aside from the occasional panic over cyber-porn, few have considered the wide-ranging effects of our increasing reliance on interactive technologies. "Cyberfutures" offers a close examination of issues that will become increasingly important as computers, networks, and technologies occupy crucial roles in our everyday lives. Comprising essays from a range of occupational and disciplinary perspectives, including those of Vivian Sobchack and Arturo Escobar, this volume makes essential reading for students in cultural and media studies and anthropology, as well as for citizens interested in considering the larger implications of the Information Superhighway.
No aspect of business, public, or private lives in developed economies can be discussed today without acknowledging the role of information and communication technologies (ICT). A shortage of studies still exists, however, on how ICTs can help developing economies. Leveraging Developing Economies with the Use of Information Technology: Trends and Tools moves toward filling the gap in research on ICT and developing nations, bringing these countries one step closer to advancement through technology. This essential publication will bring together ideas, views, and perspectives helpful to government officials, business professionals, and other individuals worldwide as they consider the use of ICT for socio-economic progress in the developing world.
This unique text/reference presents a fresh look at nonlinear processing through nonlinear eigenvalue analysis, highlighting how one-homogeneous convex functionals can induce nonlinear operators that can be analyzed within an eigenvalue framework. The text opens with an introduction to the mathematical background, together with a summary of classical variational algorithms for vision. This is followed by a focus on the foundations and applications of the new multi-scale representation based on nonlinear eigenproblems. The book then concludes with a discussion of new numerical techniques for finding nonlinear eigenfunctions, and promising research directions beyond the convex case. Topics and features: introduces the classical Fourier transform and its associated operator and energy, and asks how these concepts can be generalized in the nonlinear case; reviews the basic mathematical notions, briefly outlining the use of variational and flow-based methods to solve image-processing and computer vision algorithms; describes the properties of the total variation (TV) functional, and how the concept of nonlinear eigenfunctions relates to convex functionals; provides a spectral framework for one-homogeneous functionals, and applies this framework for denoising, texture processing and image fusion; proposes novel ways to solve the nonlinear eigenvalue problem using special flows that converge to eigenfunctions; examines graph-based and nonlocal methods, for which a TV eigenvalue analysis gives rise to strong segmentation, clustering and classification algorithms; presents an approach to generalizing the nonlinear spectral concept beyond the convex case, based on pixel decay analysis; discusses relations to other branches of image processing, such as wavelets and dictionary-based methods.
This original work offers fascinating new insights into established signal processing techniques, integrating deep mathematical concepts from a range of different fields, which will be of great interest to all researchers involved with image processing and computer vision applications, as well as computations for more general scientific problems.
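The spectral framework described above builds on total variation denoising. As a rough, hypothetical illustration (not code from the book), gradient descent on a smoothed Rudin-Osher-Fatemi-style energy in 1D might be sketched as follows; all parameter values here are illustrative choices, not the book's:

```python
import numpy as np

def tv_denoise_1d(f, lam=0.5, eps=0.1, dt=0.02, steps=2000):
    """Gradient descent on a smoothed ROF energy:
    E(u) = sum sqrt(u'^2 + eps^2) + (lam/2) * ||u - f||^2.
    eps smooths the non-differentiable TV term; the small step dt
    keeps the explicit scheme stable."""
    u = f.astype(float).copy()
    for _ in range(steps):
        du = np.gradient(u)
        p = du / np.sqrt(du**2 + eps**2)   # smoothed TV "flux", bounded in (-1, 1)
        u -= dt * (-np.gradient(p) + lam * (u - f))
    return u

# A noisy step edge: TV-type denoising flattens the noise but keeps the jump,
# which is why step-like signals behave as (approximate) nonlinear eigenfunctions.
x = np.linspace(0.0, 1.0, 200)
rng = np.random.default_rng(0)
f = (x > 0.5).astype(float) + 0.1 * rng.standard_normal(x.size)
u = tv_denoise_1d(f)
```

The edge-preserving behavior sketched here is the starting point for the book's eigenvalue analysis of the TV functional.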
This book is for people worrying about their sinking ship. Based on experience, it is a guide for navigating the blockers, buzzwords and bloody-mindedness that doom any analogue organisation trapped into thinking that while the internet has changed the world, it won't change their world. Companies that grew up on the web have changed our expectations of the services we rely on. We demand simplicity, speed and low cost. Organisations founded before the Internet aren't keeping up, despite spending millions on IT, marketing and 'innovation'. This revised, expanded second edition of Digital Transformation at Scale is a guide to building a digital institution. It explains how a growing band of reformers in businesses and governments around the world have helped their organisations pivot to this new way of working, and what lessons others can learn from their experience. It is based on the authors' experience designing and helping to deliver the UK's Government Digital Service (GDS). The GDS was a new institution made responsible for the digital transformation of government, designing public services for the Internet era. It snipped GBP4 billion off the government's technology bill, opened up public sector contracts to thousands of new suppliers, and delivered online services so good that citizens chose to use them over the offline alternatives, without a big marketing campaign. Other countries and companies noticed, with the GDS model now being copied around the world.
This book presents advances in alternative swarm development that have proved to be effective in several complex problems. Swarm intelligence (SI) is a problem-solving methodology that results from the cooperation between a set of agents with similar characteristics. The study of biological entities, such as animals and insects, manifesting social behavior has resulted in several computational models of swarm intelligence. While there are numerous books addressing the most widely known swarm methods, namely ant colony algorithms and particle swarm optimization, those discussing new alternative approaches are rare. The focus on developments based on the simple modification of popular swarm methods overlooks the opportunity to discover new techniques and procedures that can be useful in solving problems formulated by the academic and industrial communities. Presenting various novel swarm methods and their practical applications, the book helps researchers, lecturers, engineers and practitioners solve their own optimization problems.
As governments and policy makers take advantage of information and communication technologies, leaders must understand how to navigate the ever-shifting landscape of modern technologies in order to be most effective in enacting change and leading their constituents. The Handbook of Research on Advanced ICT Integration for Governance and Policy Modeling builds on the available literature, research, and recent advances in e-governance to explore advanced methods and applications of digital tools in government. This collection of the latest research in the field presents an essential reference for academics, researchers, and advanced-level students, as well as government leaders, policy makers, and experts in international relations.
This book provides an overview of the confluence of ideas in Turing's era and work and examines the impact of his work on mathematical logic and theoretical computer science. It combines contributions by well-known scientists on the history and philosophy of computability theory as well as on generalised Turing computability. By looking at the roots and at the philosophical and technical influence of Turing's work, it is possible to gather new perspectives and new research topics which might be considered as a continuation of Turing's working ideas well into the 21st century. "The Stored-Program Universal Computer: Did Zuse Anticipate Turing and von Neumann?" is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com
Studying narratives is often the best way to gain a good understanding of how various aspects of human information are organized and integrated: the narrator employs specific informational methods to build the whole structure of a narrative, combining temporally constructed events in light of an array of relationships to the narratee, and these methods reveal the interaction of the rational and the sensitive aspects of human information. Computational and Cognitive Approaches to Narratology discusses issues of narrative-related information and communication technologies, cognitive mechanisms and analyses, and theoretical perspectives on narratives and the story generation process. Focusing on emerging research as well as applications in a variety of fields including marketing, philosophy, psychology, art, and literature, this timely publication is an essential reference source for researchers, professionals, and graduate students in various information technology, cognitive studies, design, and creative fields.
This book reviews the algorithms for processing geometric data, with a practical focus on important techniques not covered by traditional courses on computer vision and computer graphics. Features: presents an overview of the underlying mathematical theory, covering vector spaces, metric space, affine spaces, differential geometry, and finite difference methods for derivatives and differential equations; reviews geometry representations, including polygonal meshes, splines, and subdivision surfaces; examines techniques for computing curvature from polygonal meshes; describes algorithms for mesh smoothing, mesh parametrization, and mesh optimization and simplification; discusses point location databases and convex hulls of point sets; investigates the reconstruction of triangle meshes from point clouds, including methods for registration of point clouds and surface reconstruction; provides additional material at a supplementary website; includes self-study exercises throughout the text.
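Of the techniques listed above, mesh smoothing is among the simplest to sketch. The following Python/NumPy fragment is a hypothetical illustration (not taken from the book) of the classic "umbrella" Laplacian smoothing step, which moves each vertex a fraction of the way toward the average of its neighbours:

```python
import numpy as np

def laplacian_smooth(verts, neighbors, lam=0.5, iterations=10):
    """Umbrella-operator smoothing: each vertex moves a fraction lam
    toward the centroid of its neighbours, repeated for several passes.
    verts: (n, d) array of positions; neighbors: list of index lists."""
    v = verts.astype(float).copy()
    for _ in range(iterations):
        avg = np.array([v[nbrs].mean(axis=0) for nbrs in neighbors])
        v += lam * (avg - v)
    return v

# Demo on a noisy closed polygon approximating a circle; each vertex's
# neighbours are simply the previous and next vertices along the loop.
n = 40
t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
rng = np.random.default_rng(1)
verts = np.c_[np.cos(t), np.sin(t)] * (1 + 0.1 * rng.standard_normal(n)[:, None])
neighbors = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
smooth = laplacian_smooth(verts, neighbors)
```

A known caveat, covered in texts on the subject, is that plain Laplacian smoothing shrinks the shape as it removes noise; the same loop structure applies to triangle meshes once the neighbour lists come from mesh connectivity.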
From the reviews of the 1st edition: "This book provides a comprehensive and detailed account of different topics in algorithmic 3-dimensional topology, culminating with the recognition procedure for Haken manifolds and including the up-to-date results in computer enumeration of 3-manifolds. Originating from lecture notes of various courses given by the author over a decade, the book is intended to combine the pedagogical approach of a graduate textbook (without exercises) with the completeness and reliability of a research monograph... All the material, with few exceptions, is presented from the peculiar point of view of special polyhedra and special spines of 3-manifolds. This choice contributes to keeping the level of the exposition really elementary. In conclusion, the reviewer subscribes to the quotation from the back cover: 'the book fills a gap in the existing literature and will become a standard reference for algorithmic 3-dimensional topology both for graduate students and researchers.'" Zentralblatt für Mathematik, 2004. For this 2nd edition, new results, new proofs, and commentaries for a better orientation of the reader have been added. In particular, in Chapter 7 several new sections concerning applications of the computer program "3-Manifold Recognizer" have been included.
The computer is the great technological and scientific innovation of the last half of the twentieth century. It has revolutionized how we organize information, how we communicate with each other, and even the way that we think about the human mind. Computers have eased the drudgery of such tasks as calculating sums and clerical work, making them both more bearable and more efficient. The computer has become ubiquitous in many aspects of business, recreation, and everyday life, and it continues to become both more powerful and easier to use. Computers: The Life Story of a Technology provides an accessible overview of this ever-changing technology, giving students and lay readers an understanding of the complete scope of its history from ancient times to the present day. In addition to providing a concise biography of how this technology developed, this book provides insights into how the computer has changed our lives: * Demonstrates how, just as the invention of the steam engine in the 1700s stimulated scientists to think of the laws of nature in terms of machines, the success of the computer in the late 1900s prompted scientists to think of the basic laws of the universe as being similar to the operation of a computer. * Provides a worldwide examination of computing, and shows how needs such as security and defense during the Cold War drove the development of computing technology. * Shows how the computer has entered almost every aspect of daily life in the 21st century. The volume includes a glossary of terms, a timeline of important events, and a selected bibliography of useful resources for further information.
This book describes the benefits that emerge when the fields of constraint programming and concurrency meet. On the one hand, constraints can be used in concurrency theory to increase the conciseness and the expressive power of concurrent languages from a pragmatic point of view. On the other hand, problems modeled by using constraints can be solved faster and more efficiently using a concurrent system. Both directions are explored, providing two separate lines of development. First, the expressive power of a concurrent language, Constraint Handling Rules, which supports constraints as a primitive construct, is studied, and the features of this language which make it Turing powerful are shown. Then a framework is proposed to solve constraint problems that is intended to be deployed on a concurrent system. For the development of this framework the concurrent language Jolie, which follows the service-oriented paradigm, is used. Based on this experience, an extension to service-oriented languages is also proposed in order to overcome some of their limitations and to improve the development of concurrent applications.
Information security and copyright protection are more important today than ever before. Digital watermarking is one of the most widely used techniques in the area of information security. This book introduces a number of digital watermarking techniques and is divided into four parts. The first part introduces the importance of watermarking techniques and intelligent technology. The second part includes a number of watermarking techniques. The third part includes the hybrid watermarking techniques and the final part presents conclusions. This book is directed to students, professors, researchers and application engineers who are interested in the area of information security.
later versions. In addition, the CD-ROM contains a complete solutions manual that includes detailed solutions to all the problems in the book. If the reader does not wish to consult these solutions, then a brief list of answers is provided in printed form at the end of the book. I would like to thank my family members for their help and continued support, without which this book would not have been possible. I would also like to acknowledge the help of the editor at Springer-Verlag (Dr. Thomas Ditzinger) for his assistance in bringing this book out in its present form. Finally, I would like to thank my brother, Nicola, for preparing most of the line drawings in both editions. In this edition, I am providing two email addresses for my readers to contact me (pkattan@tedata.net.jo and pkattan@lsu.edu). The old email address that appeared in the first edition was cancelled in 2004. December 2006, Peter I. Kattan. Preface to the First Edition: This is a book for people who love finite elements and MATLAB. We will use the popular computer package MATLAB as a matrix calculator for doing finite element analysis. Problems will be solved mainly using MATLAB to carry out the tedious and lengthy matrix calculations, in addition to some manual manipulations, especially when applying the boundary conditions. In particular, the steps of the finite element method are emphasized in this book. The reader will not find ready-made MATLAB programs for use as black boxes. Instead, step-by-step solutions of finite element problems are examined in detail using MATLAB.
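The book itself works in MATLAB, but the "matrix calculator" workflow it describes, assembling element stiffness matrices, applying boundary conditions by hand, then solving for nodal displacements, can be sketched in Python/NumPy. This is a hypothetical illustration of that workflow for a 1D axial bar, not an example from the book:

```python
import numpy as np

# 1D bar of length L and axial stiffness EA, fixed at x=0,
# point load P at the free end x=L, discretized into n linear elements.
L, EA, P, n = 1.0, 2.0, 3.0, 4
h = L / n
k = (EA / h) * np.array([[1.0, -1.0],
                         [-1.0, 1.0]])      # element stiffness matrix

K = np.zeros((n + 1, n + 1))
for e in range(n):                           # assemble the global stiffness matrix
    K[e:e + 2, e:e + 2] += k

F = np.zeros(n + 1)
F[-1] = P                                    # external load vector

u = np.zeros(n + 1)
# Boundary condition u(0) = 0: delete the first row/column and
# solve the reduced system for the remaining nodal displacements.
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
```

For this constant-stiffness bar with an end load, linear elements reproduce the exact solution u(x) = P*x/EA at the nodes, a standard sanity check for an assembly routine like this.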
Fundamental Problems in Computing is in honor of Professor Daniel J. Rosenkrantz, a distinguished researcher in Computer Science. Professor Rosenkrantz has made seminal contributions to many subareas of Computer Science including formal languages and compilers, automata theory, algorithms, database systems, very large scale integrated systems, fault-tolerant computing and discrete dynamical systems. For many years, Professor Rosenkrantz served as the Editor-in-Chief of the Journal of the Association for Computing Machinery (JACM), a very prestigious archival journal in Computer Science. His contributions to Computer Science have earned him many awards including the Fellowship from ACM and the ACM SIGMOD Contributions Award.
Competitive intelligence uses public sources to obtain valuable information on competition and competitors. In an open society such as our own, businesses place a great deal of information in the public domain. By using competitive intelligence aggressively and intelligently, corporations can obtain information on potential acquisition targets, markets, key personnel, the probable emergence of new products, or the financial strength or contracts of a competing firm. In fact, the authors contend that as much as 90 percent of the information required to decide on a course of litigation, acquisition, expansion, new product introduction, or financing is available through competitive intelligence. "An absolutely indispensable playbook for anyone who has to compete during the information explosion." Martin Sikora, Editor, Mergers and Acquisitions
Metamath is a computer language and an associated computer program for archiving, verifying, and studying mathematical proofs. The Metamath language is simple and robust, with an almost total absence of hard-wired syntax, and we believe that it provides about the simplest possible framework that allows essentially all of mathematics to be expressed with absolute rigor. While simple, it is also powerful; the Metamath Proof Explorer (MPE) database has over 23,000 proven theorems and is one of the top systems in the "Formalizing 100 Theorems" challenge. This book explains the Metamath language and program, with specific emphasis on the fundamentals of the MPE database.
Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting potentials can be identified. The book presents the relevant theoretical background and measuring methods as well as proposed solutions. An evaluation of network monitors and checklists rounds out the work.
This volume contains the articles presented at the 21st International Meshing Roundtable (IMR) organized, in part, by Sandia National Laboratories and was held on October 7-10, 2012 in San Jose, CA, USA. The first IMR was held in 1992, and the conference series has been held annually since. Each year the IMR brings together researchers, developers, and application experts in a variety of disciplines, from all over the world, to present and discuss ideas on mesh generation and related topics. The technical papers in this volume present theoretical and novel ideas and algorithms with practical potential, as well as technical applications in science and engineering, geometric modeling, computer graphics, and visualization.
Systems for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) are currently separate. The potential of the latest technologies and changes in operational and analytical applications over the last decade have given rise to the unification of these systems, which can be of benefit for both workloads. Research and industry have reacted and prototypes of hybrid database systems are now appearing. Benchmarks are the standard method for evaluating, comparing and supporting the development of new database systems. Because of the separation of OLTP and OLAP systems, existing benchmarks are only focused on one or the other. With the rise of hybrid database systems, benchmarks to assess these systems will be needed as well. Based on the examination of existing benchmarks, a new benchmark for hybrid database systems is introduced in this book. It is furthermore used to determine the effect of adding OLAP to an OLTP workload and is applied to analyze the impact of typically used optimizations in the historically separate OLTP and OLAP domains in mixed-workload scenarios.
This volume represents the papers reviewed and accepted for the HOIT2007 conference held at the Indian Institute of Technology Madras in the city of Chennai, India in August 2007. This volume addresses many of the major themes of current interest in the field, with a particular focus on community-based technologies. This comprehensive book is divided into five different sections reflecting the most up-to-date research on computers and society.
Jack Ganssle has been forming the careers of embedded engineers for 20+ years. He has done this with four books, over 500 articles, a weekly column, and continuous lecturing. Technology moves fast, and much has changed since the first edition of this best-selling classic. The new edition will reflect the author's new and ever-evolving philosophy in the face of new technology and realities.