The development of social technologies has brought about a new era of political planning and government interactions. In addition to reducing costs in city resource management, ICT and social media can be used in emergency situations as a mechanism for citizen engagement, to facilitate public administration communication, etc. In spite of all these advantages, the application of technologies by governments and the public sector has also fostered debate in terms of cyber security due to the vulnerabilities and risks that can befall different stakeholders. It is necessary to review the most recent research about the implementation of ICTs in the public sector with the aim of understanding both the strengths and the vulnerabilities that the management models can entail. Special Applications of ICTs in Digital Government and the Public Sector: Emerging Research and Opportunities is a collection of innovative research on the methods and applications of ICT implementation in the public sector that seeks to allow readers to understand how ICTs have forced public administrations to undertake reforms to both their workflow and their means of interacting with citizens. While highlighting topics including e-government, emergency communications, and urban planning, this book is ideally designed for government officials, public administrators, public managers, policy holders, policymakers, public consultants, professionals, academicians, students, and researchers seeking current research on the digital communication channels between elected officials and the citizens they represent.
Filling the gaps between subjective vehicle assessment, classical vehicle dynamics and computer-based multibody approaches, "The Multibody Systems Approach to Vehicle Dynamics" offers unique coverage of both the virtual and practical aspects of vehicle dynamics from concept design to system analysis and handling development. The book provides valuable foundation knowledge of vehicle dynamics as well as drawing on laboratory studies, test-track work, and finished vehicle applications to gel theory with practical examples and observations. Combined with insights into the capabilities and limitations of multibody simulation, this comprehensive mix provides the background understanding, practical reality and simulation know-how needed to make and interpret useful models. New to this edition you will find coverage of the latest tire
models, changes to the modeling of light commercial vehicles,
developments in active safety systems, torque vectoring, and
examples in AView, as well as updates to theory, simulation, and
modeling techniques throughout.
As society rushes to digitize sensitive information and services, it is imperative to adopt adequate security protections. However, such protections fundamentally conflict with the benefits we expect from commodity computers. In other words, consumers and businesses value commodity computers because they provide good performance and an abundance of features at relatively low costs. Meanwhile, attempts to build secure systems from the ground up typically abandon such goals, and hence are seldom adopted. In this book, I argue that we can resolve the tension between security and features by leveraging the trust a user has in one device to enable her to securely use another commodity device or service, without sacrificing the performance and features expected of commodity systems. At a high level, we support this premise by developing techniques to allow a user to employ a small, trusted, portable device to securely learn what code is executing on her local computer. Rather than entrusting her data to the mountain of buggy code likely running on her computer, we construct an on-demand secure execution environment which can perform security-sensitive tasks and handle private data in complete isolation from all other software (and most hardware) on the system. Meanwhile, non-security-sensitive software retains the same abundance of features and performance it enjoys today. Having established an environment for secure code execution on an individual computer, we then show how to extend trust in this environment to network elements in a secure and efficient manner. This allows us to reexamine the design of network protocols and defenses, since we can now execute code on end hosts and trust the results within the network. Lastly, we extend the user's trust one more step to encompass computations performed on a remote host (e.g., in the cloud).
We design, analyze, and prove secure a protocol that allows a user to outsource arbitrary computations to commodity computers run by an untrusted remote party (or parties) who may subject the computers to both software and hardware attacks. Our protocol guarantees that the user can both verify that the results returned are indeed the correct results of the specified computations on the inputs provided, and protect the secrecy of both the inputs and outputs of the computations. These guarantees are provided in a non-interactive, asymptotically optimal (with respect to CPU and bandwidth) manner. Thus, extending a user's trust, via software, hardware, and cryptographic techniques, allows us to provide strong security protections for both local and remote computations on sensitive data, while still preserving the performance and features of commodity computers.
Real-time computing plays a vital role in ultra-reliable and safety-critical applications in fields as diverse as flight control, telecommunication systems, nuclear plant supervision and surgical operation monitoring. Providing a comprehensive overview, Scheduling in Real-Time Systems examines the most significant real-time scheduling policies in use today.
In this book the editors have gathered a number of contributions by persons who have been working on problems of Cognitive Technology (CT). The present collection initiates explorations of the human mind via the technologies the mind produces. These explorations take as their point of departure the question: "What happens when humans produce new technologies?" Two interdependent perspectives from which such a production can be approached are adopted:
- How and why constructs that have their origins in human mental life are embodied in physical environments when people fabricate their habitat, even to the point of those constructs becoming that very habitat
- How and why these fabricated habitats affect, and feed back into, human mental life.
The aim of the CT research programme is to determine, in general, which technologies, and in particular, which interactive computer-based technologies, are humane with respect to the cognitive development and evolutionary adaptation of their end users. But what does it really mean to be humane in a technological world? To shed light on this central issue, other pertinent questions are raised, e.g.:
- Why are human minds externalised, i.e., what purpose does the process of externalisation serve?
- What can we learn about the human mind by studying how it externalises itself?
- How does the use of externalised mental constructs (the objects we call 'tools') change people fundamentally?
- To what extent does human interaction with technology serve as an amplification of human cognition, and to what extent does it lead to an atrophy of the human mind?
The book calls for a reflection on what a tool is.
Strong parallels between CT and environmentalism are drawn: both are seen as trends having originated in our need to understand how we manipulate, by means of the tools we have created, our natural habitat, consisting of, on the one hand, the cognitive environment which generates thought and determines action, and, on the other hand, the physical environment in which thought and action are realised. Both trends endeavour to protect the human habitat from the unwanted or uncontrolled impact of technology, and are ultimately concerned with the ethics and aesthetics of tool design and tool use. Among the topics selected by the contributors to the book, the following themes emerge (the list is not exhaustive): using technology to empower the cognitively impaired; the ethics versus aesthetics of technology; the externalisation of emotive and affective life and its special dialectic ('mirror') effects; creativity enhancement: cognitive space, problem tractability; externalisation of sensory life and mental imagery; the engineering and modelling aspects of externalised life; externalised communication channels and inner dialogue; externalised learning protocols; relevance analysis as a theoretical framework for cognitive technology.
The wireless medium is a shared resource. If nearby devices transmit at the same time, their signals interfere, resulting in a collision. In traditional networks, collisions cause the loss of the transmitted information. For this reason, wireless networks have been designed with the assumption that interference is intrinsically harmful and must be avoided. This book, a revised version of the author's award-winning Ph.D. dissertation, takes an alternate approach: instead of viewing interference as an inherently counterproductive phenomenon that should be avoided, we design practical systems that transform interference into a harmless, and even a beneficial, phenomenon. To achieve this goal, we consider how wireless signals interact when they interfere, and use this understanding in our system designs. Specifically, when interference occurs, the signals get mixed on the wireless medium. By understanding the parameters of this mixing, we can invert the mixing and decode the interfered packets, thus making interference harmless. Furthermore, we can control this mixing process to create strategic interference that allows decodability at a particular receiver of interest but prevents decodability at unintended receivers and adversaries. Hence, we can transform interference into a beneficial phenomenon that provides security. Building on this approach, we make four main contributions: We present the first WiFi receiver that can successfully reconstruct the transmitted information in the presence of packet collisions. Next, we introduce a WiFi receiver design that can decode in the presence of high-power cross-technology interference from devices like baby monitors, cordless phones, microwave ovens, or even unknown technologies. We then show how we can harness interference to improve security. In particular, we develop the first system that secures an insecure medical implant without any modification to the implant itself.
Finally, we present a solution that establishes secure connections between any two WiFi devices, without having users enter passwords or use pre-shared secret keys.
As computers have infiltrated virtually every facet of our lives, so has computer science influenced nearly every academic subject in science, engineering, medicine, social science, the arts and humanities. Michael Knee offers a selective guide to the major resources and tools central to the computer industry: teaching institutions, research institutes and laboratories, manufacturers, standardization organizations, professional associations and societies, and publishers. He begins with a discussion of the three subject classification systems most commonly used to describe, index, and manage computer science information: the Association for Computing Machinery, Inspec, and the Library of Congress. An annotated bibliography of over 500 items follows, grouped by material type, and featuring a mix of classic works and current sources.
The only book of its kind to look at how our legal system needs to change to accommodate a world in which machines, in addition to people, make decisions. For years, robots were solely a matter of science fiction. Today, artificial intelligence technologies serve to accelerate our already fast-paced lives even further. From Apple's Siri to the Google Car to GPS, machines and technologies that make decisions and take action without direct human supervision have become commonplace in our daily lives. As a result, laws must be amended to protect companies that produce robots and the people that buy and use them. This book provides an extensive examination of how numerous legal areas, including liability, traffic, zoning, and international and constitutional law, must adapt to the widespread use of artificial intelligence in nearly every area of our society. The author scrutinizes the laws governing such fields as transportation, medicine, law enforcement, childcare, and real estate development.
- Describes court cases, regulations, and statutes that are affected by the technological advances of artificial intelligence
- Eschews overtly technical or legalistic discussions to provide clear, accessible information
- Discusses a number of popular, topical, and controversial technologies, providing historical background for each and their legal implications
- Focuses on devices that are already in use to illustrate where the law falls short in governing artificial intelligence and how legal models should be amended
Software history has a deep impact on current software designers, computer scientists, and technologists. System constraints imposed in the past and the designs that responded to them are often unknown or poorly understood by students and practitioners, yet modern software systems often include "old" software and "historical" programming techniques. This work looks at software history through specific software areas to develop student-consumable practices, design principles, lessons learned, and trends useful in current and future software design. It also exposes key areas that are widely used in modern software, yet infrequently taught in computing programs. Written as a textbook, this book uses specific cases from the past and present to explore the impact of software trends and techniques. Building on concepts from the history of science and technology, software history examines such areas as fundamentals, operating systems, programming languages, programming environments, networking, and databases. These topics are covered from their earliest beginnings to their modern variants. There are focused case studies on UNIX, APL, SAGE, GNU Emacs, Autoflow, internet protocols, System R, and others. Extensive problems and suggested projects enable readers to deeply delve into the history of software in areas that interest them most.
Since its first volume in 1960, Advances in Computers has
presented detailed coverage of innovations in computer hardware,
software, theory, design, and applications. It has also provided
contributors with a medium in which they can explore their subjects
in greater depth and breadth than journal articles usually allow.
As a result, many articles have become standard references that
continue to be of significant, lasting value in this rapidly
expanding field.
The first comprehensive guide to explore the growing field of electronic information, The Text in the Machine: Electronic Texts in the Humanities will help you create and use electronic texts. This book explains the processes involved in developing computerized books on library Web sites, CD-ROMs, or your own Web site. With the information provided by The Text in the Machine, you'll be able to successfully transfer written words to a digitized form and increase access to any kind of information. Keeping the perspectives of scholars, students, librarians, users, and publishers in mind, this book outlines the necessary steps for electronic conversion in a comprehensive manner. The Text in the Machine addresses many variables that need to be taken into consideration to help you digitize texts, such as:
- defining types of markup, markup systems, and their uses
- identifying characteristics of the written text, such as its linguistic and physical nature, before choosing a markup scheme
- ensuring accuracy in electronic texts by keying in information up to three times and choosing software that is compatible with the markup systems you are using
- examining the best file formats for scanning written texts and converting them to digital form
- explaining the delivery systems available for electronic texts, such as CD-ROMs, the Internet, magnetic tape, and the variety of software that will interpret these interfaces
- designing the structure of electronic texts with linear presentation, segmented text, or image files to increase readability and accessibility
Containing lists of suggested readings and examples of electronic text Web sites, this book provides you with the opportunity to see how other libraries and scholars are creating and publishing digital texts. From The Text in the Machine, you'll receive the knowledge to make this medium of information accessible and beneficial to patrons and scholars around the world.
As technology becomes further meshed into our culture and everyday lives, new mediums and outlets for creative expression and innovation are necessary. The Handbook of Research on Computational Arts and Creative Informatics covers a comprehensive range of topics regarding the interaction of the sciences and the arts. Exploring new uses of technology and investigating creative insights into concepts of art and expression, this cutting-edge Handbook of Research offers a valuable resource to academicians, researchers, and field practitioners.
This edited volume collects the research results presented at the 14th International Symposium on Computer Methods in Biomechanics and Biomedical Engineering, Tel Aviv, Israel, 2016. The topical focus includes, but is not limited to, cardiovascular fluid dynamics, computer modeling of tissue engineering, skin and spine biomechanics, as well as biomedical image analysis and processing. The target audience primarily comprises research experts in the field of bioengineering, but the book may also be beneficial for graduate students alike.
This Festschrift is in honor of Marilyn Wolf, on the occasion of her 60th birthday. Prof. Wolf is a renowned researcher and educator in Electrical and Computer Engineering, who has made pioneering contributions in all of the major areas in Embedded, Cyber-Physical, and Internet of Things (IoT) Systems. This book provides a timely collection of contributions that cover important topics related to Smart Cameras, Hardware/Software Co-Design, and Multimedia applications. Embedded systems are everywhere; cyber-physical systems enable monitoring and control of complex physical processes with computers; and IoT technology is of increasing relevance in major application areas, including factory automation, and smart cities. Smart cameras and multimedia technologies introduce novel opportunities and challenges in embedded, cyber-physical and IoT applications. Advanced hardware/software co-design methodologies provide valuable concepts and tools for addressing these challenges. The diverse topics of the chapters in this Festschrift help to reflect the great breadth and depth of Marilyn Wolf's contributions in research and education. The chapters have been written by some of Marilyn's closest collaborators and colleagues.
An increasing number of global institutions look to advancements in technology to enhance access to learning and development and, in doing so, seek collaborative opportunities to maximize the benefits of educational technology. Cases on Technology Enhanced Learning through Collaborative Opportunities analyzes and evaluates how organizations and institutions of learning in the developing and developed world are adapting to technology enhanced learning environments and exploring transnational collaborative opportunities, providing prospects for learning, growth and development through a blend of traditional and technological methods.
Useful to healthcare providers, severity indices identify which patients are most at risk for infection, as well as the intensity of illness, while in the hospital. "Text Mining Techniques for Healthcare Provider Quality Determination: Methods for Rank Comparisons" discusses the general practice of defining a patient severity index for risk adjustments and the comparison of patient outcomes to assess quality factors. This Premier Reference Source examines the consequences of patient severity models and investigates the general assumptions required to perform standard severity adjustment.
As science becomes increasingly computational, the limits of what is computationally tractable become a barrier to scientific progress. Many scientific problems, however, are amenable to human problem solving skills that complement computational power. By leveraging these skills on a larger scale, beyond the relatively few individuals currently engaged in scientific inquiry, there is the potential for new scientific discoveries. This book presents a framework for mapping open scientific problems into video games. The game framework combines computational power with human problem solving and creativity to work toward solving scientific problems that neither computers nor humans could previously solve alone. To maximize the potential contributors to scientific discovery, the framework designs a game to be played by people with no formal scientific background and incentivizes long-term engagement with a myriad of collaborative or competitive reward structures. The framework allows the players and the game to continually coevolve: as players gain expertise through gameplay, the game changes to become a better tool. The framework is validated through its application to proteomics problems with the video game Foldit. Foldit players have contributed to novel discoveries in protein structure prediction, protein design, and protein structure refinement algorithms. The coevolution of human problem solving and computer tools in an incentivized game framework is an exciting new scientific pathway that can lead to discoveries currently unreachable by other methods.
A comprehensive one-year graduate (or advanced undergraduate)
course in mathematical logic and foundations of mathematics. No
previous knowledge of logic is required; the book is suitable for
self-study. Many exercises (with hints) are included.
Since its first volume in 1960, Advances in Computers has
presented detailed coverage of innovations in computer hardware,
software, theory, design, and applications. It has also provided
contributors with a medium in which they can explore their subjects
in greater depth and breadth than journal articles usually allow.
As a result, many articles have become standard references that
continue to be of significant, lasting value in this rapidly
expanding field.
Modern optimization approaches have attracted an increasing number of scientists, decision makers, and researchers. As new issues in this field emerge, different optimization methodologies must be developed and implemented. Exploring Critical Approaches of Evolutionary Computation is a vital scholarly publication that explores the latest developments, methods, approaches, and applications of evolutionary models in a variety of fields. It also emphasizes evolutionary models of computation such as genetic algorithms, evolutionary strategies, classifier systems, evolutionary programming, genetic programming, and related fields such as swarm intelligence and other evolutionary computation techniques. Highlighting a range of pertinent topics such as neural networks, data mining, and data analytics, this book is designed for IT developers, IT theorists, computer engineers, researchers, practitioners, and upper-level students seeking current research on enhanced information exchange methods and practical aspects of computational systems.
The International Conference on Informatics and Management Science (IMS) 2012 will be held on November 16-19, 2012, in Chongqing, China, which is organized by Chongqing Normal University, Chongqing University, Shanghai Jiao Tong University, Nanyang Technological University, University of Michigan, Chongqing University of Arts and Sciences, and sponsored by the National Natural Science Foundation of China (NSFC). The objective of IMS 2012 is to facilitate an exchange of information on best practices for the latest research advances in a range of areas. Informatics and Management Science contains over 600 contributions to suggest and inspire solutions and methods drawing from multiple disciplines, including:
- Computer Science
- Communications and Electrical Engineering
- Management Science
- Service Science
- Business Intelligence
As a field, computer science occupies a unique scientific space, in that its subject matter can exist in both physical and abstract realms. An artifact such as software is both tangible and not, and must be classified as something in between, or "liminal." The study and production of liminal artifacts allows for creative possibilities that are, and have been, possible only in computer science. In It Began With Babbage, Subrata Dasgupta examines the unique history of computer science in terms of its creative innovations, spanning back to Charles Babbage in 1819. Since all artifacts of computer science are conceived with a use in mind, the computer scientist is not concerned with the natural laws that govern disciplines like physics or chemistry; the computer scientist is more concerned with the concept of purpose. This requirement lends itself to a type of creative thinking that, as Dasgupta shows us, has exhibited itself throughout the history of computer science. From Babbage's Difference Engine, through the Second World War, to the establishment of the term "Computer Science" in 1956, It Began With Babbage traces a lively and complete history of computer science.
Augmented reality is not a technology. Augmented reality is a medium. Likewise, a book on augmented reality that only addresses the technology required to support the medium of augmented reality falls far short of providing the background that is needed to produce, or critically consume, augmented reality applications. One "reads" a book. One "watches" a movie. One "experiences" augmented reality. Understanding Augmented Reality addresses the elements that are required to create compelling augmented reality experiences. The technology that supports augmented reality will come and go, evolve and change. The underlying principles for creating exciting, useful augmented reality experiences are timeless. Augmented reality designed from a purely technological perspective will lead to an AR experience that is novel and fun for one-time consumption, but is no more than a toy. Imagine a filmmaking book that discussed cameras and special effects software but ignored cinematography and storytelling. In order to create compelling augmented reality experiences that stand the test of time and cause the participant in the AR experience to focus on the content of the experience, rather than the technology, one must consider how to maximally exploit the affordances of the medium. Understanding Augmented Reality addresses core conceptual issues regarding the medium of augmented reality as well as the technology required to support compelling augmented reality. By addressing AR as a medium at the conceptual level in addition to the technological level, the reader will learn to conceive of AR applications that are not limited by today's technology. At the same time, ample examples are provided that show what is possible with current technology. Explore the different techniques, technologies, and approaches used in developing AR applications. This book helps untangle the seemingly endless different approaches that are being taken in the market today.
Software has long been perceived as complex, at least within
Software Engineering circles. We have been living in a recognised
state of crisis since the first NATO Software Engineering
conference in 1968. Time and again we have been proven unable to
engineer reliable software as easily/cheaply as we imagined. Cost
overruns and expensive failures are the norm.