As society rushes to digitize sensitive information and services, it is imperative to adopt adequate security protections. However, such protections fundamentally conflict with the benefits we expect from commodity computers. In other words, consumers and businesses value commodity computers because they provide good performance and an abundance of features at relatively low costs. Meanwhile, attempts to build secure systems from the ground up typically abandon such goals, and hence are seldom adopted. In this book, I argue that we can resolve the tension between security and features by leveraging the trust a user has in one device to enable her to securely use another commodity device or service, without sacrificing the performance and features expected of commodity systems. At a high level, we support this premise by developing techniques to allow a user to employ a small, trusted, portable device to securely learn what code is executing on her local computer. Rather than entrusting her data to the mountain of buggy code likely running on her computer, we construct an on-demand secure execution environment which can perform security-sensitive tasks and handle private data in complete isolation from all other software (and most hardware) on the system. Meanwhile, non-security-sensitive software retains the same abundance of features and performance it enjoys today. Having established an environment for secure code execution on an individual computer, we then show how to extend trust in this environment to network elements in a secure and efficient manner. This allows us to reexamine the design of network protocols and defenses, since we can now execute code on end hosts and trust the results within the network. Lastly, we extend the user's trust one more step to encompass computations performed on a remote host (e.g., in the cloud).
We design, analyze, and prove secure a protocol that allows a user to outsource arbitrary computations to commodity computers run by an untrusted remote party (or parties) who may subject the computers to both software and hardware attacks. Our protocol guarantees that the user can both verify that the results returned are indeed the correct results of the specified computations on the inputs provided, and protect the secrecy of both the inputs and outputs of the computations. These guarantees are provided in a non-interactive, asymptotically optimal (with respect to CPU and bandwidth) manner. Thus, extending a user's trust, via software, hardware, and cryptographic techniques, allows us to provide strong security protections for both local and remote computations on sensitive data, while still preserving the performance and features of commodity computers.
Real-time computing plays a vital role in ultra-reliable and safety-critical applications in fields as diverse as flight control, telecommunication systems, nuclear plant supervision and surgical operation monitoring. Providing a comprehensive overview, this book examines the most significant real-time scheduling policies in use today. Scheduling in Real-Time Systems presents:
An increasing number of global institutions look to advancements in technology to enhance access to learning and development and, in doing so, seek collaborative opportunities to maximize the benefits of educational technology. Cases on Technology Enhanced Learning through Collaborative Opportunities analyzes and evaluates how organizations and institutions of learning in the developing and developed world are adapting to technology enhanced learning environments and exploring transnational collaborative opportunities, providing prospects for learning, growth and development through a blend of traditional and technological methods.
The only book of its kind to look at how our legal system needs to change to accommodate a world in which machines, in addition to people, make decisions. For years, robots were solely a matter of science fiction. Today, artificial intelligence technologies serve to accelerate our already fast-paced lives even further. From Apple's Siri to the Google Car to GPS, machines and technologies that make decisions and take action without direct human supervision have become commonplace in our daily lives. As a result, laws must be amended to protect companies that produce robots and the people that buy and use them. This book provides an extensive examination of how numerous legal areas, including liability, traffic, zoning, and international and constitutional law, must adapt to the widespread use of artificial intelligence in nearly every area of our society. The author scrutinizes the laws governing such fields as transportation, medicine, law enforcement, childcare, and real estate development. The book describes court cases, regulations, and statutes that are affected by the technological advances of artificial intelligence; eschews overtly technical or legalistic discussions to provide clear, accessible information; discusses a number of popular, topical, and controversial technologies, providing historical background for each along with their legal implications; and focuses on devices that are already in use to illustrate where the law falls short in governing artificial intelligence and how legal models should be amended.
In this book the editors have gathered a number of contributions by persons who have been working on problems of Cognitive Technology (CT). The present collection initiates explorations of the human mind via the technologies the mind produces. These explorations take as their point of departure the question "What happens when humans produce new technologies?" Two interdependent perspectives from which such a production can be approached are adopted: - How and why constructs that have their origins in human mental life are embodied in physical environments when people fabricate their habitat, even to the point of those constructs becoming that very habitat - How and why these fabricated habitats affect, and feed back into, human mental life. The aim of the CT research programme is to determine, in general, which technologies, and in particular, which interactive computer-based technologies, are humane with respect to the cognitive development and evolutionary adaptation of their end users. But what does it really mean to be humane in a technological world? To shed light on this central issue other pertinent questions are raised, e.g. - Why are human minds externalised, i.e., what purpose does the process of externalisation serve? - What can we learn about the human mind by studying how it externalises itself? - How does the use of externalised mental constructs (the objects we call 'tools') change people fundamentally? - To what extent does human interaction with technology serve as an amplification of human cognition, and to what extent does it lead to an atrophy of the human mind? The book calls for a reflection on what a tool is.
Strong parallels between CT and environmentalism are drawn: both are seen as trends having originated in our need to understand how we manipulate, by means of the tools we have created, our natural habitat consisting of, on the one hand, the cognitive environment which generates thought and determines action, and on the other hand, the physical environment in which thought and action are realised. Both trends endeavour to protect the human habitat from the unwanted or uncontrolled impact of technology, and are ultimately concerned with the ethics and aesthetics of tool design and tool use. Among the topics selected by the contributors to the book, the following themes emerge (the list is not exhaustive): using technology to empower the cognitively impaired; the ethics versus aesthetics of technology; the externalisation of emotive and affective life and its special dialectic ('mirror') effects; creativity enhancement: cognitive space, problem tractability; externalisation of sensory life and mental imagery; the engineering and modelling aspects of externalised life; externalised communication channels and inner dialogue; externalised learning protocols; relevance analysis as a theoretical framework for cognitive technology.
This two-volume set focuses on fundamental concepts and design goals (i.e., a switch/router's key features), architectures, and practical applications of switch/routers in IP networks. The discussion includes practical design examples to illustrate how switch/routers are designed and how the key features are implemented. Designing Switch/Routers: Fundamental Concepts, Design Methods, Architectures, and Applications begins by providing an introductory-level discussion that covers the functions and architectures of the switch/router. The first book considers the switch/router as a generic Layer 2 and Layer 3 forwarding device without placing emphasis on any particular manufacturer's device. The underlying concepts and design methods are positioned to be applicable not only to this generic switch/router, but also to the typical switch/router seen in the industry. The discussion provides a better insight into the protocols, methods, processes, and tools that go into designing switch/routers. The second volume explains the design and architectural considerations, as well as the typical processes and steps used to build practical switch/routers. It then discusses the advantages of using Ethernet in today's networks and why Ethernet continues to play a bigger role in Local Area Network (LAN), Metropolitan Area Network (MAN), and Wide Area Network (WAN) design. This book set provides a discussion of the design of switch/routers and is written in a style to appeal to undergraduate and graduate-level students, engineers, and researchers in the networking and telecoms industry, as well as academics and other industry professionals. The material and discussion are structured in such a way that they could serve as standalone teaching material for networking and telecom courses and/or supplementary material for such courses.
As computers have infiltrated virtually every facet of our lives, so has computer science influenced nearly every academic subject in science, engineering, medicine, social science, the arts and humanities. Michael Knee offers a selective guide to the major resources and tools central to the computer industry: teaching institutions, research institutes and laboratories, manufacturers, standardization organizations, professional associations and societies, and publishers. He begins with a discussion of the three subject classification systems most commonly used to describe, index, and manage computer science information: the Association for Computing Machinery, Inspec, and the Library of Congress. An annotated bibliography of over 500 items follows, grouped by material type, and featuring a mix of classic works and current sources.
This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.
The wireless medium is a shared resource. If nearby devices transmit at the same time, their signals interfere, resulting in a collision. In traditional networks, collisions cause the loss of the transmitted information. For this reason, wireless networks have been designed with the assumption that interference is intrinsically harmful and must be avoided. This book, a revised version of the author's award-winning Ph.D. dissertation, takes an alternate approach: Instead of viewing interference as an inherently counterproductive phenomenon that should be avoided, we design practical systems that transform interference into a harmless, and even a beneficial, phenomenon. To achieve this goal, we consider how wireless signals interact when they interfere, and use this understanding in our system designs. Specifically, when interference occurs, the signals get mixed on the wireless medium. By understanding the parameters of this mixing, we can invert the mixing and decode the interfered packets, thus making interference harmless. Furthermore, we can control this mixing process to create strategic interference that allows decodability at a particular receiver of interest, but prevents decodability at unintended receivers and adversaries. Hence, we can transform interference into a beneficial phenomenon that provides security. Building on this approach, we make four main contributions: We present the first WiFi receiver that can successfully reconstruct the transmitted information in the presence of packet collisions. Next, we introduce a WiFi receiver design that can decode in the presence of high-power cross-technology interference from devices like baby monitors, cordless phones, microwave ovens, or even unknown technologies. We then show how we can harness interference to improve security. In particular, we develop the first system that secures an insecure medical implant without any modification to the implant itself. Finally, we present a solution that establishes secure connections between any two WiFi devices, without having users enter passwords or use pre-shared secret keys.
Learn to Create and Write Your Own Apps. Do you have a great idea for an app or a game? Would you like to make your dream a reality? Do you need the tools and skills to start making your own apps? When you purchase Swift Programming Guide: Create a Fully Functioning App in a Day, you'll learn how to make your own apps and programs right away! These fun and easy tips transform the dreaded chore of learning programming code into a fun hobby. You'll be proud to show off your creations to your friends, coworkers, and family! Would you like to know more about: Playgrounds? Classes and Methods? Arrays and For Loops? Creating Your First iOS App? Storyboards and Interface Builders? This helpful book explains how to use Xcode and Apple's new coding language, Swift, to create amazing new products. It takes you step-by-step through the process of writing your first app! Download Swift Programming Guide: Create a Fully Functioning App in a Day now, and start making your own apps TODAY!
Ambient Intelligence (AmI) is an emerging paradigm for knowledge discovery, which originally emerged as a design language for invisible computing and smart environments. Since its introduction in the late 1990s, AmI has matured and evolved, having inspired fields including computer science, interaction design, mobile computing, and cognitive science. Ubiquitous Developments in Ambient Computing and Intelligence: Human-Centered Applications provides a comprehensive collection of knowledge in cutting-edge research in fields as diverse as distributed computing, human computer interaction, ubiquitous computing, embedded systems, and other interdisciplinary areas which all contribute to AmI. Predicting the technologies that will shape our ever-changing world is difficult; however, this book argues that Ambient Intelligent technology will develop considerably in the future.
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
The first comprehensive guide to explore the growing field of electronic information, The Text in the Machine: Electronic Texts in the Humanities will help you create and use electronic texts. This book explains the processes involved in developing computerized books on library Web sites, CD-ROMs, or your own Web site. With the information provided by The Text in the Machine, you'll be able to successfully transfer written words to a digitized form and increase access to any kind of information. Keeping the perspectives of scholars, students, librarians, users, and publishers in mind, this book outlines the necessary steps for electronic conversion in a comprehensive manner. The Text in the Machine addresses many variables that need to be taken into consideration to help you digitize texts, such as: defining types of markup, markup systems, and their uses; identifying characteristics of the written text, such as its linguistic and physical nature, before choosing a markup scheme; ensuring accuracy in electronic texts by keying in information up to three times and choosing software that is compatible with the markup systems you are using; examining the best file formats for scanning written texts and converting them to digital form; explaining the delivery systems available for electronic texts, such as CD-ROMs, the Internet, magnetic tape, and the variety of software that will interpret these interfaces; and designing the structure of electronic texts with linear presentation, segmented text, or image files to increase readability and accessibility. Containing lists of suggested readings and examples of electronic text Web sites, this book provides you with the opportunity to see how other libraries and scholars are creating and publishing digital texts. From The Text in the Machine, you'll receive the knowledge to make this medium of information accessible and beneficial to patrons and scholars around the world.
As technology becomes further meshed into our culture and everyday lives, new mediums and outlets for creative expression and innovation are necessary. The Handbook of Research on Computational Arts and Creative Informatics covers a comprehensive range of topics regarding the interaction of the sciences and the arts. Exploring new uses of technology and investigating creative insights into concepts of art and expression, this cutting-edge Handbook of Research offers a valuable resource to academicians, researchers, and field practitioners.
As science becomes increasingly computational, the limits of what is computationally tractable become a barrier to scientific progress. Many scientific problems, however, are amenable to human problem solving skills that complement computational power. By leveraging these skills on a larger scale-beyond the relatively few individuals currently engaged in scientific inquiry-there is the potential for new scientific discoveries. This book presents a framework for mapping open scientific problems into video games. The game framework combines computational power with human problem solving and creativity to work toward solving scientific problems that neither computers nor humans could previously solve alone. To maximize the potential contributors to scientific discovery, the framework designs a game to be played by people with no formal scientific background and incentivizes long-term engagement with a myriad of collaborative or competitive reward structures. The framework allows for the continual coevolution of the players and the game to each other: as players gain expertise through gameplay, the game changes to become a better tool. The framework is validated by being applied to proteomics problems with the video game Foldit. Foldit players have contributed to novel discoveries in protein structure prediction, protein design, and protein structure refinement algorithms. The coevolution of human problem solving and computer tools in an incentivized game framework is an exciting new scientific pathway that can lead to discoveries currently unreachable by other methods.
Useful to healthcare providers, severity indices conclude which patients are most at risk for infection as well as the intensity of illness while in the hospital. "Text Mining Techniques for Healthcare Provider Quality Determination: Methods for Rank Comparisons" discusses the general practice of defining a patient severity index for risk adjustments and comparison of patient outcomes to assess quality factors. This "Premier Reference Source" examines the consequences of patient severity models and investigates the general assumptions required to perform standard severity adjustment.
A comprehensive one-year graduate (or advanced undergraduate) course in mathematical logic and foundations of mathematics. No previous knowledge of logic is required; the book is suitable for self-study. Many exercises (with hints) are included.
The International Conference on Informatics and Management Science (IMS) 2012 will be held on November 16-19, 2012, in Chongqing, China. It is organized by Chongqing Normal University, Chongqing University, Shanghai Jiao Tong University, Nanyang Technological University, University of Michigan, and Chongqing University of Arts and Sciences, and sponsored by the National Natural Science Foundation of China (NSFC). The objective of IMS 2012 is to facilitate an exchange of information on best practices for the latest research advances in a range of areas. Informatics and Management Science contains over 600 contributions to suggest and inspire solutions and methods drawing from multiple disciplines, including: Computer Science; Communications and Electrical Engineering; Management Science; Service Science; and Business Intelligence.
This edited volume collects the research results presented at the 14th International Symposium on Computer Methods in Biomechanics and Biomedical Engineering, Tel Aviv, Israel, 2016. The topical focus includes, but is not limited to, cardiovascular fluid dynamics, computer modeling of tissue engineering, skin and spine biomechanics, as well as biomedical image analysis and processing. The target audience primarily comprises research experts in the field of bioengineering, but the book may also benefit graduate students.