Cognitive task analysis is a broad area consisting of tools and
techniques for describing the knowledge and strategies required for
task performance. Cognitive task analysis has implications for the
development of expert systems, training and instructional design,
expert decision making and policymaking. It has been applied in a
wide range of settings, with different purposes, for instance:
specifying user requirements in system design or specifying
training requirements in training needs analysis. The topics to be
covered by this work include: general approaches to cognitive task
analysis, system design, instruction, and cognitive task analysis
for teams. The work settings to which the tools and techniques
described in this work have been applied include: 911 dispatching,
faultfinding on board naval ships, aircraft design, and various
support systems.
This book reports on the latest advances in the modeling, analysis and efficient management of information in Internet of Things (IoT) applications in the context of 5G access technologies. It presents cutting-edge applications made possible by the implementation of femtocell networks and millimeter wave communications solutions, examining them from the perspective of the universally and constantly connected IoT. Moreover, it describes novel architectural approaches to the IoT and presents the new framework possibilities offered by 5G mobile networks, including middleware requirements, node-centrality and the location of extensive functionalities at the edge. By providing researchers and professionals with a timely snapshot of emerging mobile communication systems, and highlighting the main pitfalls and potential solutions, the book fills an important gap in the literature and will foster the further development of 5G networks hosting IoT devices.
This special issue contains essays regarding the CHI '95 conference, which featured a panel titled "Discount or Disservice? Discount Usability Analysis: Evaluation at a Bargain Price or Simply Damaged Merchandise?" Wayne Gray, who organized the panel, presented a controversial critique of studies that had evaluated various usability evaluation methods (UEMs). The level of interest in this discussion led Gray to propose a review article that dealt with the issues in a more systematic fashion. The resulting essay, written by Gray and his collaborator Marilyn Salzman, conducted an in-depth review of a series of influential studies that used experimental methods to compare a variety of UEMs. Gray and Salzman's analysis was framed using Cook and Campbell's (1979) well-known discussion of various forms of validity. They used this to evaluate numerous details of these comparative studies, and they concluded that the studies fell short on the criteria by which good experimental studies are designed and interpreted.
This volume reveals the history of Information Architecture (IA), reflects on the relationship between practice and research within the discipline, and presents educators with the latest models, frameworks and theories that have emerged from the Information Architecture Academics and Practitioners Roundtable between 2014 and 2019. The most comprehensive and up-to-date overview of Information Architecture so far, this collection is a valuable tool for teachers, researchers, and practitioners interested in recent advances in information architecture in areas such as pervasive computing and embodiment, artificial intelligence, design practice, diversity and ethics in design, and critique. The information landscape has grown more complex, porous and connected--the information challenges of smartphones, sensors and IoT demand focused attention from organizations that often embrace a 'move fast and break things' ethos. This book not only explores the shift from Classical IA to Contemporary IA--it asks, are today's creators prepared to solve the challenges ahead? Have industry-led disciplines abdicated their responsibility to the people who inhabit current information environments? Will this discipline persist? Advances in Information Architecture examines the maturity of the field, revisits the discipline's efforts to transform itself in 2013 with the publication of "Reframing Information Architecture", and considers the opportunities that remain to bridge the academic and practitioner communities.
Noise is everywhere and in most applications that are related to audio and speech, such as human-machine interfaces, hands-free communications, voice over IP (VoIP), hearing aids, teleconferencing/telepresence/telecollaboration systems, and so many others, the signal of interest (usually speech) that is picked up by a microphone is generally contaminated by noise. As a result, the microphone signal has to be cleaned up with digital signal processing tools before it is stored, analyzed, transmitted, or played out. This cleaning process is often called noise reduction and this topic has attracted a considerable amount of research and engineering attention for several decades. One of the objectives of this book is to present in a common framework an overview of the state of the art of noise reduction algorithms in the single-channel (one microphone) case. The focus is on the most useful approaches, i.e., filtering techniques (in different domains) and spectral enhancement methods. The other objective of Noise Reduction in Speech Processing is to derive all these well-known techniques in a rigorous way and prove many fundamental and intuitive results often taken for granted. This book is especially written for graduate students and research engineers who work on noise reduction for speech and audio applications and want to understand the subtle mechanisms behind each approach. Many new and interesting concepts are presented in this text that we hope the readers will find useful and inspiring.
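The filtering and spectral enhancement methods surveyed in the book can be illustrated with a minimal spectral-subtraction sketch (a hypothetical toy, not one of the book's derivations): an average noise magnitude spectrum is estimated from a noise-only segment and subtracted, frame by frame, from the noisy signal's magnitude spectrum, while the noisy phase is kept.

```python
import numpy as np

def spectral_subtraction(noisy, noise_sample, frame=256, floor=0.01):
    """Toy single-channel noise reduction by magnitude spectral
    subtraction. `noise_sample` is a noise-only segment used to
    estimate the noise magnitude spectrum; a spectral floor keeps
    the subtracted magnitude from going negative."""
    # Estimate the noise magnitude spectrum from one noise-only frame.
    noise_mag = np.abs(np.fft.rfft(noise_sample[:frame]))
    out = np.zeros(len(noisy), dtype=float)
    for start in range(0, len(noisy) - frame + 1, frame):
        seg = noisy[start:start + frame]
        spec = np.fft.rfft(seg)
        mag = np.abs(spec) - noise_mag               # subtract noise estimate
        mag = np.maximum(mag, floor * np.abs(spec))  # apply spectral floor
        phase = np.angle(spec)                       # keep the noisy phase
        out[start:start + frame] = np.fft.irfft(mag * np.exp(1j * phase), frame)
    return out
```

Frame-wise processing without overlap keeps the sketch short; practical systems use overlapping windows and smoother noise estimates.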
Over the past 5 years, the concept of big data has matured, data science has grown exponentially, and data architecture has become a standard part of organizational decision-making. Throughout all this change, the basic principles that shape the architecture of data have remained the same. There remains a need for people to take a look at the "bigger picture" and to understand where their data fit into the grand scheme of things. Data Architecture: A Primer for the Data Scientist, Second Edition addresses the larger architectural picture of how big data fits within the existing information infrastructure or data warehousing systems. This is an essential topic not only for data scientists, analysts, and managers but also for researchers and engineers who increasingly need to deal with large and complex sets of data. Until data are gathered and can be placed into an existing framework or architecture, they cannot be used to their full potential. Drawing upon years of practical experience and using numerous examples and case studies from across various industries, the authors seek to explain this larger picture into which big data fits, giving data scientists the necessary context for how pieces of the puzzle should fit together.
Computers are increasingly able to mimic abilities we often think of as exclusively human - memory, decision-making and now, speech. A new generation of speech recognition systems can make at least some attempt at understanding what is said to them and can respond accordingly. These systems are coming into daily use for home banking, for airline flight enquiries and for placing orders over the telephone, and are fast becoming more powerful and more pervasive. Using data taken from a major, European Union funded project on speech understanding, the SunDial project, this book shows how this data may be analyzed to yield important conclusions about the organization of both human-human and human-computer information dialogues. It describes the Wizard-of-Oz method of collecting speech dialogues from people who believe they are interacting with a speech understanding system before that system has been fully designed or built, and it shows how the resulting dialogues may be analyzed to guide further design. This book provides detailed and comparative studies of human and human-computer speech dialogues, including analyses of opening and closing sequences and turn-taking.
"Dynamic Provisioning for Community Services" outlines a dynamic
provisioning and maintenance mechanism in a running distributed
system, e.g. the grid, which can be used to maximize the
utilization of computing resources and satisfy user demands. The book
includes a complete and reliable maintenance system solution for
the large-scale distributed system and an interoperation mechanism
for the grid middleware deployed in the United States, Europe, and
China. The experiments and evaluations have all been practically
implemented for ChinaGrid, and the best practices established can
help readers to construct reliable distributed systems.
Providing a comprehensive overview of the field of pervasive healthcare applications, this volume incorporates a variety of timely topics ranging from medical sensors and hardware infrastructures, to software platforms and applications, and addresses issues of user experience and technology acceptance. The recent developments in the area of information and communication technologies have laid the groundwork for new patient-centred healthcare solutions. While the majority of computer-supported healthcare tools designed in the last decades focused mainly on supporting care-givers and medical personnel, this trend changed with the introduction of pervasive healthcare technologies, which provide supportive and adaptive services for a broad and diverse set of end users. With contributions from key researchers the book integrates the various aspects of pervasive healthcare systems including application design, hardware development, system implementation, hardware and software infrastructures as well as end-user aspects, providing an excellent overview of this important and evolving field.
This book's purpose is to offer various perspectives relating to
the development, effectiveness, and implementation of interactive
computing technology for health promotion--programs and
interventions aimed at improving various health-related outcomes
such as involvement in care, quality of life, adherence, disease
management, healthy lifestyle, and more.
The Great East Japan Earthquake, which occurred on March 11, 2011, reminded us that we were just one species within the great cycle of life on earth, that we were allowed to survive only because of nature, and that the idea that we were somehow able to conquer nature was simply an illusion. Now more than ever it is time that we confront head-on the change from the "underground resources" type of civilization to one with a new way of life and technology that embraces a sense of nature. To do so, we must learn from nature, the only sustainable society on earth, and create technology that embraces such a view of nature. We call such technology, which cleverly revives nature's greatness, Nature Technology. Taking a casual glance at nature, a nest of termites in the savanna region can be observed to maintain a steady temperature of 30°C despite the fact that the outside air temperature ranges from 50°C during the day to nearly 0°C at night. There are countless numbers of open pores just several billionths of a meter (nanometer) wide in the "earth" of the nest, which serve to regulate the temperature and humidity. In fact, all kinds of "earth" have these pores (clay mineral with aggregated structures), and air conditioners that require no electricity have been created by hardening this earth while preserving its structure; a cooling floor or wall becomes the alternative to a conventional air conditioner. This book provides many such examples of how Nature Technology can support a new lifestyle that is both environmentally sound and spiritually uplifting.
There is perhaps no facet of modern society where the influence of
computer automation has not been felt. Flight management systems
for pilots, diagnostic and surgical aids for physicians,
navigational displays for drivers, and decision-aiding systems for
air-traffic controllers represent only a few of the numerous
domains in which powerful new automation technologies have been
introduced. The benefits that have been reaped from this
technological revolution have been many. At the same time,
automation has not always worked as planned by designers, and many
problems have arisen--from minor inefficiencies of operation to
large-scale, catastrophic accidents. Understanding how humans
interact with automation is vital for the successful design of new
automated systems that are both safe and efficient.
Based on a symposium honoring the extensive work of Allen Newell --
one of the founders of artificial intelligence, cognitive science,
human-computer interaction, and the systematic study of
computational architectures -- this volume demonstrates how
unifying themes may be found in the diversity that characterizes
current research on computers and cognition. The subject matter
includes:
The recent evolution of western societies has been characterized by
an increasing emphasis on information and communication. As the
amount of available information increases, however, the user --
worker, student, citizen -- faces a new problem: selecting and
accessing relevant information. More than ever it is crucial to
find efficient ways for users to interact with information systems
in a way that prevents them from being overwhelmed or simply
missing their targets. As a result, hypertext systems have been
developed as a means of facilitating the interactions between
readers and text. In hypertext, information is organized as a
network in which nodes are text chunks (e.g., lists of items,
paragraphs, pages) and links are relationships between the nodes
(e.g., semantic associations, expansions, definitions, examples --
virtually any kind of relation that can be imagined between two
text passages). Unfortunately, the many ways in which these
hypertext interfaces can be designed has caused a complexity that
extends far beyond the processing abilities of regular users.
Therefore, it has become widely recognized that a more rational
approach based on a thorough analysis of information users' needs,
capacities, capabilities, and skills is needed. This volume seeks
to meet that need.
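The node-and-link organization described above can be sketched as a small data structure: nodes hold text chunks, and links carry typed relations such as "definition" or "example" (all names here are illustrative, not drawn from the volume).

```python
from collections import defaultdict

class Hypertext:
    """Minimal sketch of a hypertext network: nodes are text chunks,
    links are typed relations between nodes."""

    def __init__(self):
        self.nodes = {}                 # node id -> text chunk
        self.links = defaultdict(list)  # node id -> [(relation, target id)]

    def add_node(self, node_id, text):
        self.nodes[node_id] = text

    def add_link(self, src, relation, dst):
        # relation is any label: "definition", "example", "expansion", ...
        self.links[src].append((relation, dst))

    def follow(self, node_id, relation):
        """Return the text of every node reachable from node_id
        via a link of the given relation type."""
        return [self.nodes[dst] for rel, dst in self.links[node_id]
                if rel == relation]
```

The design choice worth noting is that links are labeled rather than bare edges, which is what lets one network carry "virtually any kind of relation that can be imagined between two text passages".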
Man-machine interaction is the gateway providing access to functions and services which, due to the ever increasing complexity of smart systems, threatens to become a bottleneck. This book therefore introduces not only advanced interfacing concepts, but also gives insight into the related theoretical background. This refers mainly to the realization of video-based multimodal interaction via gesture, mimics, and speech, but also to interacting with virtual objects in virtual environments, cooperating with local or remote robots, and user assistance. While most publications in the field of human factors engineering focus on interface design, this book puts special emphasis on implementation aspects. To this end it is accompanied by software development environments for image processing, classification, and virtual environment implementation. In addition a test database is included for gestures, head pose, facial expressions, full-body person recognition, and people tracking. These data are used for the examples throughout the book, but are also meant to encourage readers to start experimenting on their own. Thus the book may serve as a self-contained introduction both for researchers and developers of man-machine interfaces. It may also be used for graduate-level university courses.
"A Journey Through Cultures" addresses one of the hottest topics in contemporary HCI: cultural diversity amongst users. For a number of years the HCI community has been investigating alternatives to enhance the design of cross-cultural systems. Most contributions to date have followed either a 'design for each' or a 'design for all' strategy. "A Journey Through Cultures" takes a very different approach. As proponents of CVM--the Cultural Viewpoint Metaphors perspective--the authors invite HCI practitioners to think of how to expose and communicate the idea of cultural diversity. A detailed case study is included which assesses the metaphors' potential in cross-cultural design and evaluation. The results show that cultural viewpoint metaphors have strong epistemic power, leveraged by a combination of theoretical foundations coming from Anthropology, Semiotics and the authors' own work in HCI and Semiotic Engineering. Luciana Salgado, Carla Leitao and Clarisse de Souza are members of SERG, the Semiotic Engineering Research Group at the Departamento de Informatica of Rio de Janeiro's Pontifical Catholic University (PUC-Rio).
The perception-action cycle is the circular flow of information that takes place between the organism and its environment in the course of a sensory-guided sequence of behaviour towards a goal. Each action causes changes in the environment that are analyzed bottom-up through the perceptual hierarchy and lead to the processing of further action, top-down through the executive hierarchy, toward motor effectors. These actions cause new changes that are analyzed and lead to new action, and so the cycle continues. Perception-Action Cycle: Models, Architectures and Hardware provides focused and easily accessible reviews of various aspects of the perception-action cycle. It is an unparalleled resource of information that will be an invaluable companion for anyone constructing and developing models, algorithms and hardware implementations of autonomous machines empowered with cognitive capabilities. The book is divided into three main parts. In the first part, leading computational neuroscientists present brain-inspired models of perception, attention, cognitive control, decision making, conflict resolution and monitoring, knowledge representation and reasoning, learning and memory, planning and action, and consciousness, grounded in experimental data. In the second part, architectures, algorithms, and systems with cognitive capabilities and minimal guidance from the brain are discussed. These architectures, algorithms, and systems are inspired by the areas of cognitive science, computer vision, robotics, information theory, machine learning, computer agents and artificial intelligence. In the third part, the analysis, design and implementation of hardware systems with robust cognitive abilities from the areas of mechatronics, sensing technology, sensor fusion, smart sensor networks, control rules, controllability, stability, model/knowledge representation, and reasoning are discussed.
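The cycle described above -- perceive, act, observe the changed environment, repeat -- can be illustrated with a toy one-dimensional example (entirely hypothetical, not a model from the book): an agent senses the gap between its position and a goal, issues a unit motor command, and repeats until the gap closes.

```python
def perception_action_cycle(start, goal, max_steps=100):
    """Toy perception-action loop on a 1-D line. Each iteration:
    perceive the environment (the gap to the goal), select an action
    (step left or right), and act, which changes the environment
    state that the next perception sees."""
    position = start
    trace = []
    for _ in range(max_steps):
        percept = goal - position          # bottom-up: sense the gap
        if percept == 0:
            break                          # goal reached; the cycle stops
        action = 1 if percept > 0 else -1  # top-down: choose a motor command
        position += action                 # acting changes the environment
        trace.append(position)
    return position, trace
```

The point of the sketch is the circularity: the action is computed from the percept, and the next percept is computed from the world the action just changed.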
There is a growing consensus in the human factors/ergonomics
community that human factors research has had little impact on
significant applied problems. Some have suggested that the problem
lies in the fact that much HF/E research has been based on the
wrong type of psychology, an information processing view of
psychology that is reductionistic and context-free. Ecological
psychology offers a viable alternative, presenting a richer view of
human behavior that is holistic and contextualized. The papers
presented in these two volumes show the conceptual impact that
ecological psychology can have on HF/E, as well as presenting a
number of specific examples illustrating the ecological approach to
human-machine systems. It is the first collection of papers that
explicitly draws a connection between these two fields. While work
in this area is only just beginning, the evidence available
suggests that taking an ecological approach to human
factors/ergonomics helps bridge the existing gap between basic
research and applied problems.
Technological development has changed the nature of industrial
production so that it is no longer a question of humans working
with a machine, but rather that a joint human machine system is
performing the task. This development, which started in the 1940s,
has become even more pronounced with the proliferation of computers
and the invasion of digital technology in all walks of working
life. It may appear that the importance of human work has been
reduced compared to what can be achieved by intelligent software
systems, but in reality, the opposite is true: the more complex a
system, the more vital the human operator's task. The conditions
have changed, however: whereas people used to be in control of
their own tasks, today they have become supervisors of tasks that
are shared between humans and machines.
Proposing a new paradigm for Computer Supported Cooperative Work (CSCW), this ground-breaking book presents a research agenda for developing and testing that paradigm. It constitutes the first attempt to outline a comprehensive model of collaboration that integrates the cognitive/conceptual and social dynamics of groups.

The challenge faced by all groups engaged in intellectual work is, on the one hand, to divide the task so that efforts of individual members may proceed in parallel and, on the other hand, to synthesize their separate contributions to form a coherent whole. Addressing this challenge, Smith examines the general form of a theory of computer-based collaboration that extends across different tasks and working situations. He uses the work of Newell, Simon, and Anderson as a base from which to consider a group as a form of distributed information processing system. Within groups, there are constructs analogous to human long-term and short-term memory, conceptual processes, and problem solving and knowledge-construction strategies. He discusses two metacognitive issues -- awareness and control -- as they occur in collaborative behavior. And he reviews a number of advanced computer systems that support collaboration, focusing on their impact on the thinking and behavior of groups.

Smith's theoretical framework combines elements of Information Processing System theory -- and its detailed process models of cognitive behavior -- with the situated perspective of activity theory. The book suggests new and useful ways of conceiving problems and solutions to all those interested in the ways in which people interact with each other and with computers to achieve goals.