With the advent of the World Wide Web, electronic commerce has revolutionized traditional commerce, boosting sales and facilitating exchanges of merchandise and information. The emergence of wireless and mobile networks has made possible the introduction of electronic commerce to a new application and research area: mobile commerce. Handheld Computing for Mobile Commerce: Applications, Concepts and Technologies offers 22 outstanding chapters from 71 world-renowned scholars and IT professionals covering themes such as handheld computing for mobile commerce, handheld computing research and technologies, wireless networks and handheld/mobile security, and handheld images and video. It includes research and development results of lasting significance in the theory, design, implementation, analysis, and application of handheld computing. This book is essential for IT students, researchers, and professionals seeking to better understand handheld devices and concepts, thereby producing more useful and effective handheld applications and products.
This book is a concise navigator across the history of cybernetics, its state of the art, and its prospects. The evolution of cybernetics (from N. Wiener to the present day) and the reasons for its ups and downs are presented. The relationship of cybernetics to the philosophy and methodology of control, as well as to systems theory and systems analysis, is clearly demonstrated. The book presents a detailed analysis of the modern trends of research in cybernetics. A new development stage of cybernetics (the so-called cybernetics 2.0) is discussed as a science of the general regularities of systems organization and control. The author makes the case for elaborating a new branch of cybernetics, organization theory, which studies an organization as a property, a process, and a system. The book is intended for theoreticians and practitioners, as well as for students, postgraduates, and doctoral candidates. In the first place, the target audience includes tutors and lecturers preparing courses on cybernetics, control theory, and systems science.
This text presents an algebraic approach to the construction of several important families of quantum codes derived from classical codes by applying the well-known Calderbank-Shor-Steane (CSS), Hermitian, and Steane enlargement constructions to certain classes of classical codes. In addition, the book presents families of asymmetric quantum codes with good parameters and provides a detailed description of the procedures adopted to construct families of asymmetric quantum convolutional codes. Featuring accessible language and clear explanations, the book is suitable for use in advanced undergraduate and graduate courses as well as for self-guided study and reference. It provides an expert introduction to algebraic techniques of code construction and, because all of the constructions are performed algebraically, it enables the reader to construct families of codes, rather than only codes with specific parameters. The text offers an abundance of worked examples, exercises, and open-ended problems to motivate the reader to further investigate this rich area of inquiry. End-of-chapter summaries and a glossary of key terms allow for easy review and reference.
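For orientation, the CSS construction invoked above admits a compact standard statement (given here from common usage, not quoted from the book): a pair of nested binary linear codes yields a quantum code whose parameters can be read off directly.

\[
  C_2 \subseteq C_1 \subseteq \mathbb{F}_2^{\,n},\quad \dim C_i = k_i
  \;\Longrightarrow\;
  \mathrm{CSS}(C_1, C_2)\ \text{is an}\ [[\,n,\ k_1 - k_2,\ d\,]]\ \text{quantum code},
\]
\[
  d \;\ge\; \min\{\,\mathrm{wt}(c) : c \in (C_1 \setminus C_2) \cup (C_2^{\perp} \setminus C_1^{\perp})\,\}.
\]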
This introductory textbook is designed for a one-semester course on the use of matrix-based and analytical methods for the performance analysis of telecommunication systems. It provides an introduction to the modelling and analysis of telecommunication systems for a broad interdisciplinary audience of students in mathematics and applied disciplines such as computer science, electronics engineering, and operations research.
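As a flavour of the matrix methods involved, here is a minimal Python sketch (illustrative only, not an example from the book; the queue parameters are made up): computing the stationary distribution of a small continuous-time Markov chain, here an M/M/1 queue truncated at three customers.

import numpy as np

# Generator matrix Q for an M/M/1/3 queue with arrival rate lam and
# service rate mu (hypothetical parameter values). Rows sum to zero.
lam, mu = 1.0, 2.0
Q = np.array([
    [-lam,       lam,        0.0,  0.0],
    [  mu, -(lam+mu),        lam,  0.0],
    [ 0.0,        mu, -(lam+mu),   lam],
    [ 0.0,       0.0,         mu,  -mu],
])

# Solve pi @ Q = 0 with sum(pi) = 1 by appending the normalization
# constraint to the transposed balance equations.
A = np.vstack([Q.T, np.ones(4)])
b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("stationary distribution:", pi)  # geometric-like in lam/mu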
We are now entering an era where the human world assumes recognition of itself as data. Much of humanity's basis for existence is becoming subordinate to software processes that tabulate, index, and sort the relations that comprise what we perceive as reality. The acceleration of data collection threatens to relinquish ephemeral modes of representation to ceaseless processes of computation. This situation compels the human world to form relations with non-human agencies, to establish exchanges with software processes in order to allow a profound upgrade of our own ontological understanding. By mediating with a higher intelligence, we may be able to rediscover the inner logic of the age of intelligent machines. In The End of the Future, Stephanie Polsky conceives an understanding of the digital through its dynamic intersection with the advent and development of the nation-state, race, colonization, navigational warfare, mercantilism, and capitalism, and the mathematical sciences over the past five centuries, the era during which the world became "modern." The book animates the twenty-first century as an era in which the screen has split off from itself and proliferated onto multiple surfaces, allowing an inverted image of totalitarianism to flash up and be altered to support our present condition of binary apperception. It progresses through a recognition of atomized political power, whose authority lies in the control not of the means of production, but of information, and in which digital media now serves to legitimize and promote a customized micropolitics of identity management. On this new apostolate plane, humanity may be able to shape a new world in which each human soul is captured and reproduced as an autonomous individual bearing affects and identities. The digital infrastructure of the twenty-first century makes it possible for power to operate through an esoteric mathematical means, and for factual material to be manipulated in the interest of advancing the means of control. This volume travels a course from Elizabethan England, to North American slavery, through cybernetic Social Engineering, Cold War counterinsurgency, and the (neo)libertarianism of Silicon Valley in order to arrive at a place where an organizing intelligence that started from an ambition to resourcefully manipulate physical bodies has ended with their profound neutralization.
Technology has impacted how many teachers develop methods of instruction in their classroom settings. The Continuous Practice Improvement (CPI) professional development program introduces teachers to infusing computers seamlessly into the curriculum and classroom activities. "Infusing Technology into the Classroom: Continuous Practice Improvement" retells the compelling story of a successful computer-related professional development program implemented in kindergarten through eighth-grade classrooms of a Philadelphia school. Through an analysis of the study, a theoretical model to guide technology-infused professional development for teachers emerges.
This volume offers readers various perspectives and visions for cutting-edge research in ubiquitous healthcare. The topics emphasize large-scale architectures and high-performance solutions for smart healthcare, healthcare monitoring using large-scale computing techniques, Internet of Things (IoT) and big data analytics for healthcare, fog computing, mobile health, large-scale medical data mining, advanced machine learning methods for mining multidimensional sensor data, smart homes, and resource allocation methods for body area networks (BANs). The book contains high-quality chapters contributed by leading international researchers working in domains such as e-Health, pervasive and context-aware computing, cloud, grid, cluster, and big-data computing. We are optimistic that the topics included in this book will provide a multidisciplinary research platform to researchers, practitioners, and students from biomedical engineering, health informatics, computer science, and computer engineering.
This book offers readers an easy introduction to quantum computing as well as to the design of corresponding devices. The authors cover several design tasks which are important for quantum computing and introduce corresponding solutions. A special feature of the book is that those tasks and solutions are explicitly discussed from a design automation perspective, i.e., utilizing clever algorithms and data structures which have been developed by the design automation community for conventional logic (that is, for electronic devices and systems) and are now applied to this new technology. In this way, relevant design tasks can be conducted in a much more efficient fashion than before - leading to improvements of several orders of magnitude (with respect to runtime and other design objectives). Describes the current state of the art for designing quantum circuits, for simulating them, and for mapping them to real hardware; Provides a first comprehensive introduction to design automation for quantum computing that tackles practically relevant tasks; Targets the quantum computing community as well as the design automation community, showing both perspectives on quantum computing, and what impressive improvements are possible when combining the knowledge of both communities.
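To make the simulation task concrete, here is a minimal sketch (assuming nothing from the book itself): naive state-vector simulation of a two-qubit circuit by matrix multiplication, the baseline that design-automation data structures such as decision diagrams improve upon.

import numpy as np

# Build a Bell state by applying H to qubit 0, then CNOT(0 -> 1).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
state = np.kron(H, I) @ state                  # H on qubit 0
state = CNOT @ state                           # entangle the qubits
print(state)  # amplitude 1/sqrt(2) on |00> and |11>

Even this toy example hints at the scaling problem: the state vector doubles with every added qubit, which is exactly why cleverer representations matter.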
This new edition of a well-received textbook provides a concise introduction to both the theoretical and experimental aspects of quantum information at the graduate level. While the previous edition focused on theory, the book now incorporates discussions of experimental platforms. Several chapters on experimental implementations of quantum information protocols have been added: implementations using neutral atoms, trapped ions, optics, and solid-state systems are each presented in a dedicated chapter. Previous chapters on entanglement, quantum measurements, quantum dynamics, quantum cryptography, and quantum algorithms have been thoroughly updated, and new additions include chapters on the stabilizer formalism and the Gottesman-Knill theorem as well as aspects of classical and quantum information theory. To facilitate learning, each chapter starts with a clear motivation for the topic and closes with exercises and a recommended reading list. Quantum Information Processing: Theory and Implementation will be essential to graduate students studying quantum information as well as researchers in other areas of physics who wish to gain knowledge in the field.
When researchers in computer-mediated communications discuss digital textuality, they rarely venture beyond the now commonplace notion that computer textuality embodies contemporary post-structuralist theories. Written for students and faculty of contemporary literature and composition theories, this book is the first to move from such general considerations to specific ones. Advancing from the general question of how computers are changing literacy, Digital Fictions moves on to consider how computers are altering one particular set of literary practices: reading and writing fiction. Suffused with the sensibility of a creative writer, the book includes a historical overview of writing stories on computers. In addition, Sloane conducts interviews with the makers of hypertext fictions (including Stuart Moulthrop, Michael Joyce, and Carolyn Guyer) and offers close readings of digital fictions. Making careful analyses of the meaning-making activities of both readers and writers of this emerging genre, the work is grounded in a perspective both feminist and semiotic. Digital Fictions explores and distinguishes among four distinct iterations of text-based digital fiction: text adventures, the Carnegie Mellon University Oz Project, hypertext fictions, and MUDs. Ultimately, Sloane revises the rhetorical triangle and proposes a new rhetorical theory, one that attends to the materials, processes, and locations of stories told online.
Alfred Tarski was one of the two giants of the twentieth-century development of logic, along with Kurt Goedel. The four volumes of this collection contain all of Tarski's papers and abstracts published during his lifetime, as well as a comprehensive bibliography. Here will be found many of the works, spanning the period 1921 through 1979, which are the bedrock of contemporary areas of logic, whether in mathematics or philosophy. These areas include the theory of truth in formalized languages, decision methods and undecidable theories, foundations of geometry, set theory, model theory, algebraic logic, and universal algebra.
This timely text/reference presents a comprehensive review of the workflow scheduling algorithms and approaches that are rapidly becoming essential for a range of software applications, due to their ability to efficiently leverage diverse and distributed cloud resources. Particular emphasis is placed on how workflow-based automation in software-defined cloud centers and hybrid IT systems can significantly enhance resource utilization and optimize energy efficiency. Topics and features: describes dynamic workflow and task scheduling techniques that work across multiple (on-premise and off-premise) clouds; presents simulation-based case studies, and details of real-time test bed-based implementations; offers analyses and comparisons of a broad selection of static and dynamic workflow algorithms; examines the considerations for the main parameters in projects limited by budget and time constraints; covers workflow management systems, workflow modeling and simulation techniques, and machine learning approaches for predictive workflow analytics. This must-read work provides invaluable practical insights from three subject matter experts in the cloud paradigm, which will empower IT practitioners and industry professionals in their daily assignments. Researchers and students interested in next-generation software-defined cloud environments will also greatly benefit from the material in the book.
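For readers new to the topic, here is a minimal sketch of the underlying idea (a generic greedy list scheduler; the task graph, runtimes, and VM names are hypothetical and not taken from the book): each ready task is assigned to whichever cloud VM can finish it earliest.

# Workflow DAG: task -> runtime, and task -> dependencies (all made up).
tasks = {"A": 3.0, "B": 2.0, "C": 4.0, "D": 1.0}
deps  = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

vm_free = {"vm1": 0.0, "vm2": 0.0}   # time at which each VM becomes idle
finish  = {}                          # task -> finish time

# Process tasks in a dependency-respecting (topological) order.
for t in ["A", "B", "C", "D"]:
    ready = max((finish[d] for d in deps[t]), default=0.0)
    vm = min(vm_free, key=lambda v: max(vm_free[v], ready))  # earliest start
    start = max(vm_free[vm], ready)
    finish[t] = start + tasks[t]
    vm_free[vm] = finish[t]
    print(f"{t} -> {vm}: start {start}, finish {finish[t]}")

print("makespan:", max(finish.values()))

Real workflow schedulers add data-transfer costs, pricing, deadlines, and energy terms on top of this skeleton, which is where the algorithms surveyed in the book come in.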
This book presents new efficient methods for optimization in realistic large-scale, multi-agent systems. These methods do not require the agents to have the full information about the system, but instead allow them to make their local decisions based only on the local information, possibly obtained during communication with their local neighbors. The book, primarily aimed at researchers in optimization and control, considers three different information settings in multi-agent systems: oracle-based, communication-based, and payoff-based. For each of these information types, an efficient optimization algorithm is developed, which leads the system to an optimal state. The optimization problems are set without such restrictive assumptions as convexity of the objective functions, complicated communication topologies, closed-form expressions for costs and utilities, and finiteness of the system's state space.
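A minimal sketch of the communication-based setting described above (illustrative only; the mixing matrix, costs, and step-size schedule are assumptions, and the book's algorithms are considerably more general): each agent mixes its estimate with its ring neighbours and then steps along its purely local gradient.

import numpy as np

# Four agents on a ring, each knowing only its own quadratic cost
# f_i(x) = (x - c_i)^2; the global optimum is mean(c).
c = np.array([1.0, 2.0, 3.0, 4.0])
W = np.array([                        # doubly stochastic mixing matrix (ring)
    [0.50, 0.25, 0.00, 0.25],
    [0.25, 0.50, 0.25, 0.00],
    [0.00, 0.25, 0.50, 0.25],
    [0.25, 0.00, 0.25, 0.50],
])
x = np.zeros(4)                       # each agent's current estimate

for t in range(2000):
    grad = 2.0 * (x - c)              # local gradient information only
    x = W @ x - (0.5 / (t + 1)) * grad  # mix with neighbours, then descend

print(x)  # all four estimates approach the global minimizer mean(c) = 2.5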
This book presents selected papers from the 3rd International Workshop on Computational Engineering, held in Stuttgart from October 6 to 10, 2014, bringing together innovative contributions from related fields, with computer science and mathematics among the most important technical foundations. The workshop discussed the state of the art and the further evolution of numerical techniques for simulation in engineering and science. The contributions focus on current trends in numerical simulation in science and engineering, new requirements arising from rapidly increasing parallelism in computer architectures, and novel mathematical approaches. Accordingly, the chapters of the book particularly focus on parallel algorithms and performance optimization, coupled systems, and complex applications and optimization.
Telemedicine has the potential to significantly alter structures, procedures, and eventually outcomes in healthcare systems worldwide. Today the field of telemedicine is still dominated by highly committed research and development efforts. However, a growing number of concepts and applications have been implemented in clinical routine or are ready to be implemented. Health care providers, patients, third-party payers, and not least policy makers should be informed about these rapidly emerging applications, which could have a considerable impact on the delivery of health care. This book offers a comprehensive source of information not only for experts but also for the target groups mentioned above. It provides background information on legal aspects, issues concerning further development, and evaluation of telemedicine applications. The work also presents numerous projects covering the clinical fields of emergency medicine, surgery, oncology, cardiology, endocrinology, ophthalmology, dermatology, radiology, pathology, psychiatry, and other clinical specialties. Recognizing the Internet as a major source of information on issues related to telemedicine and information technology in general, a 'Webliography' provides links to a selection of the most relevant Web sites.
The authors describe systematic methods for uncovering scientific laws a priori, on the basis of intuition, or "Gedanken Experiments". Mathematical expressions of scientific laws are, by convention, constrained by the rule that their form must be invariant with changes of the units of their variables. This constraint makes it possible to narrow down the possible forms of the laws. It is closely related to, but different from, dimensional analysis. It is a mathematical book, largely based on solving functional equations. In fact, one chapter is an introduction to the theory of functional equations.
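A one-line illustration of the kind of argument involved (a standard scale-invariance result, stated for orientation rather than quoted from the text): if a law $f$ must keep its form under every rescaling of its variable, so that $f(ax) = g(a)\,f(x)$ for all $a, x > 0$, then for continuous positive $f$

\[
  f(x) = f(1)\,x^{k}, \qquad g(a) = a^{k}.
\]

Setting $x = 1$ gives $g(a) = f(a)/f(1)$, so $h = f/f(1)$ satisfies the multiplicative Cauchy equation $h(ab) = h(a)\,h(b)$, whose continuous solutions are exactly the power functions. This is how the unit-invariance constraint narrows the admissible forms of a law.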
This is a volume of chapters on the historical study of information, computing, and society written by seven of the most senior, distinguished members of the History of Computing field. These are edited, expanded versions of papers presented in a distinguished lecture series in 2018 at the University of Colorado Boulder - in the shadow of the Flatirons, the front range of the Rocky Mountains. Topics range widely across the history of computing. They include the digitalization of computer and communication technologies, gender history of computing, the history of data science, incentives for innovation in the computing field, labor history of computing, and the process of standardization. Authors were given wide latitude to write on a topic of their own choice, so long as the result is an exemplary article that represents the highest level of scholarship in the field, producing articles that scholars in the field will still look to read twenty years from now. The intention is to publish articles of general interest, well situated in the research literature, well grounded in source material, and well-polished pieces of writing. The volume is primarily of interest to historians of computing, but individual articles will be of interest to scholars in media studies, communication, computer science, cognitive science, general and technology history, and business.
The book provides suggestions on how to start using bionic optimization methods, including pseudo-code examples of each of the important approaches and outlines of how to improve them. The most efficient methods for accelerating such studies are discussed. These include the selection of a study's driving parameters, such as population size and number of generations; the modification of these parameters during a run; switching to gradient methods when approaching local maxima; and the use of parallel hardware. Bionic Optimization means finding the best solution to a problem using methods found in nature. As Evolutionary Strategies and Particle Swarm Optimization seem to be the most important methods for structural optimization, we primarily focus on them. Other methods, such as neural nets or ant colonies, are more suited to control or process studies, so their basic ideas are outlined in order to motivate readers to start using them. A set of sample applications shows how Bionic Optimization works in practice. From academic studies on simple frames made of rods to earthquake-resistant buildings, readers follow the lessons learned, the difficulties encountered, and effective strategies for overcoming them. For the problem of tuned mass dampers, which play an important role in dynamic control, changing the goal and restrictions paves the way for Multi-Objective Optimization. As most structural designers today use commercial software such as FE codes or CAE systems with integrated simulation modules, ways of integrating Bionic Optimization into these software packages are outlined, and examples of typical systems and typical optimization approaches are presented. The closing section offers an overview of and outlook on reliable and robust as well as Multi-Objective Optimization, including discussions of current and upcoming research topics in the field concerning a unified theory for handling stochastic design processes.
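As a concrete taste of the pseudo-code style the book promises, here is a minimal Particle Swarm Optimization sketch (the parameter values are conventional defaults, not the book's recommendations, and the objective is a toy function):

import numpy as np

# Minimize f(x) = ||x||^2 with a small particle swarm.
rng = np.random.default_rng(0)
n_particles, dim = 20, 2
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia, cognitive, social weights

x = rng.uniform(-5, 5, (n_particles, dim))     # positions
v = np.zeros((n_particles, dim))               # velocities
f = lambda x: (x ** 2).sum(axis=1)             # toy objective
pbest, pbest_val = x.copy(), f(x)              # personal bests
gbest = pbest[pbest_val.argmin()].copy()       # global best

for _ in range(100):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    vals = f(x)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best point:", gbest, "value:", pbest_val.min())

In structural applications, f would be a finite-element simulation call rather than a closed-form function, which is why the acceleration strategies listed above matter.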
This unique text/reference provides an overview of crossbar-based interconnection networks, offering novel perspectives on these important components of high-performance, parallel-processor systems. A particular focus is placed on solutions to the blocking and scalability problems. Topics and features: introduces the fundamental concepts in interconnection networks in multi-processor systems, including issues of blocking, scalability, and crossbar networks; presents a classification of interconnection networks, and provides information on recognizing each of the networks; examines the challenges of blocking and scalability, and analyzes the different solutions that have been proposed; reviews a variety of different approaches to improve fault tolerance in multistage interconnection networks; discusses the scalable crossbar network, which is a non-blocking interconnection network that uses small-sized crossbar switches as switching elements. This invaluable work will be of great benefit to students, researchers and practitioners interested in computer networks, parallel processing and reliability engineering. The text is also essential reading for course modules on interconnection network design and reliability.
Cloud service benchmarking can provide important, sometimes surprising insights into the quality of services and leads to a more quality-driven design and engineering of complex software architectures that use such services. Starting with a broad introduction to the field, this book guides readers step-by-step through the process of designing, implementing and executing a cloud service benchmark, as well as understanding and dealing with its results. It covers all aspects of cloud service benchmarking, i.e., both benchmarking the cloud and benchmarking in the cloud, at a basic level. The book is divided into five parts: Part I discusses what cloud benchmarking is, provides an overview of cloud services and their key properties, and describes the notion of a cloud system and cloud-service quality. It also addresses the benchmarking lifecycle and the motivations behind running benchmarks in particular phases of an application lifecycle. Part II then focuses on benchmark design by discussing key objectives (e.g., repeatability, fairness, or understandability) and defining metrics and measurement methods, and by giving advice on developing one's own measurement methods and metrics. Next, Part III explores benchmark execution and implementation challenges and objectives as well as aspects like runtime monitoring and result collection. Subsequently, Part IV addresses benchmark results, covering topics such as an abstract process for turning data into insights, data preprocessing, and basic data analysis methods. Lastly, Part V concludes the book with a summary, suggestions for further reading and pointers to benchmarking tools available on the Web. The book is intended for researchers and graduate students of computer science and related subjects looking for an introduction to benchmarking cloud services, but also for industry practitioners who are interested in evaluating the quality of cloud services or who want to assess key qualities of their own implementations through cloud-based experiments.
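A deliberately minimal illustration of the measurement step (a real benchmark needs warm-up runs, repetition across times and regions, and careful workload design, as the book discusses; the target URL is a placeholder assumption):

import statistics
import time
import urllib.request

URL = "https://example.com/"   # hypothetical service under test
samples = []
for _ in range(30):
    t0 = time.perf_counter()
    urllib.request.urlopen(URL).read()          # one request-response cycle
    samples.append((time.perf_counter() - t0) * 1000)  # latency in ms

print("median ms:", statistics.median(samples))
print("p95 ms:   ", statistics.quantiles(samples, n=20)[-1])  # 95% cut point

Reporting percentiles rather than averages is one of the small design decisions that determine whether a benchmark yields repeatable, comparable results.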
This self-contained essay collection is published to commemorate half a century of Bell's theorem. Like its much acclaimed predecessor "Quantum [Un]Speakables: From Bell to Quantum Information" (published 2002), it comprises essays by many of the world's leading quantum physicists and philosophers. These revisit the foundations of quantum theory as well as elucidating the remarkable progress in quantum technologies achieved in the last couple of decades. Fundamental concepts such as entanglement, nonlocality and contextuality are described in an accessible manner and, alongside lively descriptions of the various theoretical and experimental approaches, the book also delivers interesting philosophical insights. The collection as a whole will serve as a broad introduction for students and newcomers as well as delighting the scientifically literate general reader.
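For orientation, the nonlocality at the heart of Bell's theorem is most often quoted in its CHSH form (a standard statement, not an excerpt from the collection): every local hidden-variable theory must satisfy

\[
  \bigl| E(a,b) + E(a,b') + E(a',b) - E(a',b') \bigr| \;\le\; 2,
\]

whereas quantum mechanics predicts violations up to $2\sqrt{2}$ for suitably measured entangled states.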
This book discusses the fusion of mobile and WiFi network data with semantic technologies and diverse context sources for offering semantically enriched context-aware services in the telecommunications domain. It presents the OpenMobileNetwork as a platform for providing estimated and semantically enriched mobile and WiFi network topology data using the principles of Linked Data. This platform is based on the OpenMobileNetwork Ontology, consisting of a set of network context ontology facets that describe mobile network cells as well as WiFi access points from a topological perspective and geographically relate their coverage areas to other context sources. The book also introduces Linked Crowdsourced Data and its corresponding Context Data Cloud Ontology, a crowdsourced dataset combining static location data with dynamic context information. Linked Crowdsourced Data supports the OpenMobileNetwork by providing the context data richness necessary for more sophisticated semantically enriched context-aware services. Various application scenarios and proof-of-concept services, as well as two separate evaluations, are part of the book. As the usability of the provided services depends closely on the quality of the approximated network topologies, the book compares the estimated positions of mobile network cells within the OpenMobileNetwork to a small set of real-world cell positions. The results show that context-aware services based on the OpenMobileNetwork rest on a solid and accurate network topology dataset. The book also evaluates the performance of the exemplary Semantic Tracking and Semantic Geocoding services, verifying the applicability and added value of semantically enriched mobile and WiFi network data.
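As an illustration of the position-error evaluation described above (a sketch under assumptions: this is the standard haversine formula with made-up coordinates, not the book's evaluation code):

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Estimated vs. ground-truth position of one cell (hypothetical values);
# the result is the position error in metres.
print(round(haversine_m(52.5200, 13.4050, 52.5215, 13.4095)))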