Usability Testing Essentials presents a practical, step-by-step approach to learning the entire process of planning and conducting a usability test. It explains how to analyze and apply the results and what to do when confronted with budgetary and time restrictions. This is the ideal book for anyone involved in usability or user-centered design, from students to seasoned professionals. Filled with new examples and case studies, Usability Testing Essentials, Second Edition is completely updated to reflect the latest approaches, tools and techniques needed to begin usability testing or to advance in this area.
Cognitive task analysis is a broad area consisting of tools and
techniques for describing the knowledge and strategies required for
task performance. Cognitive task analysis has implications for the
development of expert systems, training and instructional design,
expert decision making and policymaking. It has been applied in a
wide range of settings, with different purposes, for instance:
specifying user requirements in system design or specifying
training requirements in training needs analysis. The topics to be
covered by this work include: general approaches to cognitive task
analysis, system design, instruction, and cognitive task analysis
for teams. The work settings to which the tools and techniques
described in this work have been applied include: 911 dispatching,
faultfinding on board naval ships, aircraft design, and various
support systems.
How do we create new ways of looking at the world? Join award-winning data storyteller RJ Andrews as he pushes beyond the usual how-to, and takes you on an adventure into the rich art of informing. Creating Info We Trust is a craft that puts the world into forms that are strong and true. It begins with maps, diagrams, and charts -- but must push further than dry defaults to be truly effective. How do we attract attention? How can we offer audiences valuable experiences worth their time? How can we help people access complexity? Dark and mysterious, but full of potential, data is the raw material from which new understanding can emerge. Become a hero of the information age as you learn how to dip into the chaos of data and emerge with new understanding that can entertain, improve, and inspire. Whether you call the craft data storytelling, data visualization, data journalism, dashboard design, or infographic creation -- what matters is that you are courageously confronting the chaos of it all in order to improve how people see the world. Info We Trust is written for everyone who straddles the domains of data and people: data visualization professionals, analysts, and all who are enthusiastic for seeing the world in new ways. This book draws from the entirety of human experience, quantitative and poetic. It teaches advanced techniques, such as visual metaphor and data transformations, in order to create more human presentations of data. It also shows how we can learn from print advertising, engineering, museum curation, and mythological archetypes. This human-centered approach works with machines to design information for people. Advance your understanding by learning from a broad tradition of putting things "in formation" to create new and wonderful ways of opening our eyes to the world. Info We Trust takes a thoroughly original point of attack on the art of informing.
It builds on decades of best practices and adds the creative enthusiasm of a world-class data storyteller. Info We Trust is lavishly illustrated with hundreds of original compositions designed to illuminate the craft, delight the reader, and inspire a generation of data storytellers.
This special issue contains essays regarding the CHI '95 conference, which featured a panel titled "Discount or Disservice? Discount Usability Analysis: Evaluation at a Bargain Price or Simply Damaged Merchandise?" Wayne Gray, who organized the panel, presented a controversial critique of studies that had evaluated various usability evaluation methods (UEMs). The level of interest in this discussion led Gray to propose a review article that dealt with the issues in a more systematic fashion. The resulting essay, written by Gray and his collaborator Marilyn Salzman, presented an in-depth review of a series of influential studies that used experimental methods to compare a variety of UEMs. Gray and Salzman's analysis was framed using Cook and Campbell's (1979) well-known discussion of various forms of validity. They used this to evaluate numerous details of these comparative studies, and they concluded that the studies fell short of the criteria by which good experimental studies are designed and interpreted.
This book, originally published in 1992, encapsulates ten years of research at the Open University's Human Cognition Research Laboratory. The research investigates the problems of novice programmers, and is strongly oriented toward the design and implementation of "programming environments" aimed at eliminating or easing novices' problems. A range of languages is studied: Pascal, SOLO, Lisp, Prolog and "Knowledge Engineering Programming". The primary emphasis of the empirical studies is to gain some understanding of novices' "mental models" of the inner workings of computers. Such (erroneous) models are constructed by novices in their own heads to account for the idiosyncrasies of particular programming languages. The primary emphasis of the implementations described in the book is the provision of "automatic debugging aids", i.e. artificial intelligence programs which can analyse novices' buggy programs, and make sense of them, thereby providing useful advice for the novices. Another related strand taken in some of the work is the concept of "pre-emptive design", i.e. the provision of tools such as syntax-directed editors and graphical tracers which help programmers avoid many frequently-occurring errors. A common thread throughout the book is its Cognitive Science/Artificial Intelligence orientation. AI tools are used, for instance, to construct simulation models of subjects writing programs, in order to provide insights into what their deep conceptual errors are. At the other extreme, AI programs which were developed in order to help students debug their programs are observed empirically in order to ensure that they provide facilities actually needed by real programmers. This book will be of great interest to advanced undergraduate, postgraduate, and professional researchers in Cognitive Science, Artificial Intelligence, and Human-Computer Interaction.
Noise is everywhere and in most applications that are related to audio and speech, such as human-machine interfaces, hands-free communications, voice over IP (VoIP), hearing aids, teleconferencing/telepresence/telecollaboration systems, and so many others, the signal of interest (usually speech) that is picked up by a microphone is generally contaminated by noise. As a result, the microphone signal has to be cleaned up with digital signal processing tools before it is stored, analyzed, transmitted, or played out. This cleaning process is often called noise reduction and this topic has attracted a considerable amount of research and engineering attention for several decades. One of the objectives of this book is to present in a common framework an overview of the state of the art of noise reduction algorithms in the single-channel (one microphone) case. The focus is on the most useful approaches, i.e., filtering techniques (in different domains) and spectral enhancement methods. The other objective of Noise Reduction in Speech Processing is to derive all these well-known techniques in a rigorous way and prove many fundamental and intuitive results often taken for granted. This book is especially written for graduate students and research engineers who work on noise reduction for speech and audio applications and want to understand the subtle mechanisms behind each approach. Many new and interesting concepts are presented in this text that we hope the readers will find useful and inspiring.
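The spectral enhancement methods the description mentions can be illustrated with a minimal single-channel spectral-subtraction sketch. The function name, signal, and parameter values below are illustrative assumptions for one frame, not code or notation from the book:

```python
import numpy as np

def spectral_subtraction(noisy, noise_mag, floor=0.01):
    # Subtract an estimated noise magnitude spectrum from the noisy
    # frame's spectrum, keep the noisy phase, and floor the result to
    # avoid negative magnitudes (a common source of "musical noise").
    spectrum = np.fft.rfft(noisy)
    mag, phase = np.abs(spectrum), np.angle(spectrum)
    clean_mag = np.maximum(mag - noise_mag, floor * mag)
    return np.fft.irfft(clean_mag * np.exp(1j * phase), n=len(noisy))

# Toy signal: a 440 Hz tone buried in white noise (8 kHz sampling rate).
rng = np.random.default_rng(0)
t = np.arange(512) / 8000.0
tone = np.sin(2 * np.pi * 440.0 * t)
noise = 0.5 * rng.standard_normal(512)
noisy = tone + noise
# In practice the noise spectrum is estimated from speech-free segments;
# here the true noise is used for illustration only.
noise_mag = np.abs(np.fft.rfft(noise))
enhanced = spectral_subtraction(noisy, noise_mag)
```

Real systems estimate the noise spectrum adaptively and process the signal frame by frame with overlap-add; this single-frame sketch shows only the core subtraction step.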
Computers are increasingly able to mimic abilities we often think of as exclusively human - memory, decision-making and now, speech. A new generation of speech recognition systems can make at least some attempt at understanding what is said to them and can respond accordingly. These systems are coming into daily use for home banking, for airline flight enquiries and for placing orders over the telephone, and are fast becoming more powerful and more pervasive. Using data taken from a major, European Union funded project on speech understanding, the SunDial project, this book shows how this data may be analyzed to yield important conclusions about the organization of both human-human and human-computer information dialogues. It describes the Wizard-of-Oz method of collecting speech dialogues from people who believe they are interacting with a speech understanding system before that system has been fully designed or built, and it shows how the resulting dialogues may be analyzed to guide further design. This book provides detailed and comparative studies of human and human-computer speech dialogues, including analyses of opening and closing sequences and turn-taking.
The three-volume set provides a systematic overview of theories and techniques in social network analysis. Volume 2 of the set mainly focuses on the formation and interaction of group behaviors. Users' behavior analysis, sentiment analysis, influence analysis and collective aggregation are discussed in detail as well. It is an essential reference for scientists and professionals in computer science.
This book's purpose is to offer various perspectives relating to
the development, effectiveness, and implementation of interactive
computing technology for health promotion--programs and
interventions aimed at improving various health-related outcomes
such as involvement in care, quality of life, adherence, disease
management, healthy lifestyle, and more.
Rapidly Prototyping Interfaces with InDesign guides readers in creating a wide range of interfaces, from mobile to desktop. With InDesign, interface prototyping takes minutes instead of days. The book is code-free and entirely hands-on with InDesign tools, acting as a guide to prototyping user interfaces with InDesign using diagrams, illustrations, and screen shots. It covers the creation and prototyping of eBooks, eMagazines, websites, desktop apps and mobile apps. InDesign is an important tool for rapid prototyping, as no coding is involved. Key features: no other available book provides this information; readers learn how to prototype a wide range of interfaces for both desktop and mobile platforms; the book includes software screen shots and guides the reader step by step; the example prototypes are interactive, so users can test them on interactive devices such as desktop computers, tablets or mobile phones; and readers learn how to prepare an effective portfolio and resume.
Based on a symposium honoring the extensive work of Allen Newell --
one of the founders of artificial intelligence, cognitive science,
human-computer interaction, and the systematic study of
computational architectures -- this volume demonstrates how
unifying themes may be found in the diversity that characterizes
current research on computers and cognition.
There is perhaps no facet of modern society where the influence of
computer automation has not been felt. Flight management systems
for pilots, diagnostic and surgical aids for physicians,
navigational displays for drivers, and decision-aiding systems for
air-traffic controllers, represent only a few of the numerous
domains in which powerful new automation technologies have been
introduced. The benefits that have been reaped from this
technological revolution have been many. At the same time,
automation has not always worked as planned by designers, and many
problems have arisen--from minor inefficiencies of operation to
large-scale, catastrophic accidents. Understanding how humans
interact with automation is vital for the successful design of new
automated systems that are both safe and efficient.
Powerful information technologies and the complex support systems they engender are evolving faster than people's ability to adjust to them. In the workplace, this leads to troublesome task performance, added stress on users, increased organizational inefficiency, and, in some cases, a heightened risk of wide-scale disaster. In the marketplace, it makes for consumer dissatisfaction. Clearly, traditional human-computer interaction (HCI) and system design (SD) solutions to this dilemma have proven woefully inadequate. What is needed is a fresh multidisciplinary approach offering a broader, more dynamic framework for assessing needs and designing usable, efficient systems. Taking modeling concepts from engineering, psychology, cognitive science, information science, and computer science, cognitive systems engineering (CSE) provides such a framework. This book is the first comprehensive guide to the emerging new field of CSE. Providing equal parts theory and practice, it is based on the authors' many years of experience with work systems in a wide range of work domains, including process control, manufacturing, hospitals, and libraries. Throughout, the emphasis is on powerful analytical techniques that enhance the systems designer's ability to see the "big picture", and to design for all crucial aspects of human-work interaction. Applicable to highly structured technical systems such as process plants, as well as less structured user-driven systems like libraries, these analytical techniques form the basis for the evaluation and design guidelines that make up the bulk of this book. And since the proof is in the pudding, the authors provide a chapter-length case history in which they demonstrate the success of their approach when applied to a full-scale software design project.
The project, a retrieval system for public libraries, is described in detail, from field studies to concept validation experiments, and, of course, the empirical evaluation of the system while in use by the library users and personnel. Computer-based information systems are rapidly becoming a fundamental part of the human landscape. How that landscape evolves over the next decade or so, whether it becomes a hostile one or one that generously supports the needs of future generations, is in the hands of all those involved with the study and design of information systems.
The recent evolution of western societies has been characterized by
an increasing emphasis on information and communication. As the
amount of available information increases, however, the user --
worker, student, citizen -- faces a new problem: selecting and
accessing relevant information. More than ever it is crucial to
find efficient ways for users to interact with information systems
in a way that prevents them from being overwhelmed or simply
missing their targets. As a result, hypertext systems have been
developed as a means of facilitating the interactions between
readers and text. In hypertext, information is organized as a
network in which nodes are text chunks (e.g., lists of items,
paragraphs, pages) and links are relationships between the nodes
(e.g., semantic associations, expansions, definitions, examples --
virtually any kind of relation that can be imagined between two
text passages). Unfortunately, the many ways in which these
hypertext interfaces can be designed has caused a complexity that
extends far beyond the processing abilities of regular users.
Therefore, it has become widely recognized that a more rational
approach based on a thorough analysis of information users' needs,
capacities, capabilities, and skills is needed. This volume seeks
to meet that need.
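The node-and-link structure described above can be made concrete with a small sketch: nodes hold text chunks, and links are typed relationships between them. The class, identifiers, and sample chunks are hypothetical illustrations, not a system from this volume:

```python
class Hypertext:
    # A minimal hypertext network: nodes are text chunks, links are
    # typed relationships (definitions, examples, expansions, ...).
    def __init__(self):
        self.nodes = {}   # node id -> text chunk
        self.links = []   # (source id, target id, relation type)

    def add_node(self, node_id, text):
        self.nodes[node_id] = text

    def add_link(self, source, target, relation):
        self.links.append((source, target, relation))

    def neighbours(self, node_id):
        # Nodes reachable in one step, paired with the relation
        # that leads to each of them.
        return [(t, r) for s, t, r in self.links if s == node_id]

ht = Hypertext()
ht.add_node("intro", "Hypertext organizes information as a network.")
ht.add_node("node-def", "A node is a text chunk: a list, paragraph, or page.")
ht.add_link("intro", "node-def", "definition")
```

The typed links ("definition", "example", and so on) are what distinguish a hypertext of this kind from a plain graph of pages, and they are the design space whose complexity the passage warns about.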
There is a growing consensus in the human factors/ergonomics
community that human factors research has had little impact on
significant applied problems. Some have suggested that the problem
lies in the fact that much HF/E research has been based on the
wrong type of psychology, an information processing view of
psychology that is reductionistic and context-free. Ecological
psychology offers a viable alternative, presenting a richer view of
human behavior that is holistic and contextualized. The papers
presented in these two volumes show the conceptual impact that
ecological psychology can have on HF/E, as well as presenting a
number of specific examples illustrating the ecological approach to
human-machine systems. It is the first collection of papers that
explicitly draws a connection between these two fields. While work
in this area is only just beginning, the evidence available
suggests that taking an ecological approach to human
factors/ergonomics helps bridge the existing gap between basic
research and applied problems.
Technological development has changed the nature of industrial
production so that it is no longer a question of humans working
with a machine, but rather that a joint human machine system is
performing the task. This development, which started in the 1940s,
has become even more pronounced with the proliferation of computers
and the invasion of digital technology in all walks of working
life. It may appear that the importance of human work has been
reduced compared to what can be achieved by intelligent software
systems, but in reality, the opposite is true: the more complex a
system, the more vital the human operator's task. The conditions
have changed, however: whereas people used to be in control of
their own tasks, today they have become supervisors of tasks which
are shared between humans and machines.
This book highlights the field of selfie biometrics, providing a clear overview and presenting recent advances and challenges. It also discusses numerous selfie authentication techniques on mobile devices. Biometric authentication using mobile devices is becoming a convenient and important means of verifying identity for secured access and services such as telebanking and electronic transactions. In this context, face and ocular biometrics in the visible spectrum has gained increased attention from the research community. However, device mobility and operation in uncontrolled environments mean that facial and ocular images captured with mobile devices exhibit substantial degradation as a result of adverse lighting conditions, specular reflections and motion and defocus blur. In addition, low spatial resolution and the small sensor of front-facing mobile cameras further degrade the sample quality, reducing the recognition accuracy of face and ocular recognition technology when integrated into smartphones. Presenting the state of the art in mobile biometric research and technology, and offering an overview of the potential problems in real-time integration of biometrics in mobile devices, this book is a valuable resource for final-year undergraduate students, postgraduate students, engineers, researchers and academics in various fields of computer engineering.
This book offers a comprehensive yet concise overview of the challenges and opportunities presented by the use of artificial intelligence in healthcare. It does so by approaching the topic from multiple perspectives, e.g. the nursing, consumer, medical practitioner, healthcare manager, and data analyst perspective. It covers human factors research, discusses patient safety issues, and addresses ethical challenges, as well as important policy issues. By reporting on cutting-edge research and hands-on experience, the book offers an insightful reference guide for health information technology professionals, healthcare managers, healthcare practitioners, and patients alike, aiding them in their decision-making processes. It will also benefit students and researchers whose work involves artificial intelligence-related research issues in healthcare.
The perception-action cycle is the circular flow of information that takes place between the organism and its environment in the course of a sensory-guided sequence of behaviour towards a goal. Each action causes changes in the environment that are analyzed bottom-up through the perceptual hierarchy and lead to the processing of further action, top-down through the executive hierarchy, toward motor effectors. These actions cause new changes that are analyzed and lead to new action, and so the cycle continues. Perception-Action Cycle: Models, Architectures and Hardware provides focused and easily accessible reviews of various aspects of the perception-action cycle. It is an unparalleled resource of information that will be an invaluable companion to anyone constructing and developing models, algorithms and hardware implementations of autonomous machines empowered with cognitive capabilities. The book is divided into three main parts. In the first part, leading computational neuroscientists present brain-inspired models of perception, attention, cognitive control, decision making, conflict resolution and monitoring, knowledge representation and reasoning, learning and memory, planning and action, and consciousness grounded on experimental data. In the second part, architectures, algorithms, and systems with cognitive capabilities and minimal guidance from the brain are discussed. These architectures, algorithms, and systems are inspired by the areas of cognitive science, computer vision, robotics, information theory, machine learning, computer agents and artificial intelligence. In the third part, the analysis, design and implementation of hardware systems with robust cognitive abilities from the areas of mechatronics, sensing technology, sensor fusion, smart sensor networks, control rules, controllability, stability, model/knowledge representation, and reasoning are discussed.
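The circular flow described above can be caricatured as a loop in which each action changes the state that is perceived on the next pass. This toy goal-seeking sketch is an illustrative assumption, not a model from the book:

```python
def perception_action_cycle(initial_state, policy, steps):
    # Each iteration: perceive the current environment (bottom-up),
    # select an action (top-down), act, and then perceive the
    # changed world on the next pass around the cycle.
    state = initial_state
    trace = []
    for _ in range(steps):
        percept = state            # perception of the environment
        action = policy(percept)   # executive choice of action
        state = state + action     # the action changes the environment
        trace.append(state)
    return trace

# A hypothetical policy that steps a one-dimensional state toward a goal.
goal = 10
policy = lambda s: 1 if s < goal else (-1 if s > goal else 0)
trace = perception_action_cycle(0, policy, 12)
```

Once the goal is reached, the policy emits a null action and the state stops changing, mirroring how the cycle terminates when the goal-directed behaviour is complete.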