This innovative monograph focuses on a contemporary form of
computer-based literature called 'literary hypertext', a digital,
interactive, communicative form of new media writing. Canonizing
Hypertext combines theoretical and hermeneutic investigations with
empirical research into the motivational and pedagogic
possibilities of this form of literature. It focuses on key
questions for literary scholars and teachers: How can literature be
taught in such a way as to make it relevant for an increasingly
hypermedia-oriented readership? How can the rapidly evolving new
media be integrated into curricula that still seek to transmit
traditional literary competence? How can the notion of literary
competence be broadened to take into account these current trends?
This study, which argues for hypertext's integration into the literary
canon, offers a critical overview of developments in hypertext
theory, an exemplary hypertext canon and an evaluation of possible
classroom applications.
Models and simulations are an important first step in developing
computer applications to solve real-world problems. However, in
order to be truly effective, computer programmers must use formal
modeling languages to evaluate these simulations. Formal Languages
for Computer Simulation: Transdisciplinary Models and Applications
investigates a variety of programming languages used in validating
and verifying models in order to assist in their eventual
implementation. This book explores different methods of
evaluating and formalizing simulation models, enabling computer and
industrial engineers, mathematicians, and students working with
computer simulations to thoroughly understand the progression from
simulation to product, improving the overall effectiveness of
modeling systems.
XML in Data Management is for IT managers and technical staff
involved in the creation, administration, or maintenance of a data
management infrastructure that includes XML. For most IT staff, XML
is either just a buzzword that is ignored or a silver bullet to be
used in every nook and cranny of their organization. The truth is
in between the two. This book provides the guidance necessary for
data managers to make measured decisions about XML within their
organizations. Readers will understand the uses of XML, its
component architecture, its strategic implications, and how these
apply to data management.
To view a sample chapter and read the Foreword by Thomas C. Redman,
visit http://books.elsevier.com/mk/?isbn=0120455994
* Takes a data-centric view of XML.
* Explains how, when, and why to apply XML to data management
systems.
* Covers XML component architecture, data engineering, frameworks,
metadata, legacy systems, and more.
* Discusses the various strengths and weaknesses of XML
technologies in the context of organizational data management and
integration.
This handbook provides a comprehensive reference for firmware
developers looking to increase their skills and productivity. It
addresses each critical step of the development process in detail,
including how to optimize hardware design for better firmware.
Topics covered include real-time issues, interrupts and ISRs,
memory management (including Flash memory), handling both digital
and analog peripherals, communications interfacing, math
subroutines, error handling, design tools, and troubleshooting and
debugging. The companion CD-ROM includes all the code used in the
design examples and a searchable ebook version of the text.
This book is not for the beginner, but rather is an in-depth,
comprehensive one-volume reference that addresses all the major
issues in firmware design and development, including the pertinent
hardware issues.
* The included CD-ROM contains all the source code used in the design
examples, so engineers can easily use it in their own designs
As today's world continues to advance, Artificial Intelligence (AI)
has become a staple of technological development and has driven
advances across numerous professional industries. An application
within AI that has gained attention is machine learning. Machine
learning uses statistical techniques and algorithms to give computer
systems the ability to learn from data, and its popularity has
spread across many sectors. Understanding
this technology and its countless implementations is pivotal for
scientists and researchers across the world. The Handbook of
Research on Emerging Trends and Applications of Machine Learning
provides a high-level understanding of various machine learning
algorithms along with modern tools and techniques using Artificial
Intelligence. In addition, this book explores the critical role
that machine learning plays in a variety of professional fields
including healthcare, business, and computer science. While
highlighting topics including image processing, predictive
analytics, and smart grid management, this book is ideally designed
for developers, data scientists, business analysts, information
architects, finance agents, healthcare professionals, researchers,
retail traders, professors, and graduate students seeking current
research on the benefits, implementations, and trends of machine
learning.
Through a systematic view of technologies, researchers are now
finding it less complicated to examine, predict, and explain
complex interactions between fields such as engineering and
computer science. Emerging Systems Approaches in Information
Technologies: Concepts, Theories, and Applications presents
innovative research findings utilizing the incorporation of the
systems approach into fields such as systems engineering, computer
science, and software engineering. Containing philosophical
evaluations and issues related to complexity, this publication
provides academicians, practitioners, and researchers with the
first resource that fully emphasizes the integration of this
approach.
Originally designed for interpersonal communication, mobile devices
today are capable of connecting their users to a wide variety of
Internet-enabled services and applications. Multimodality in Mobile
Computing and Mobile Devices: Methods for Adaptable Usability
explores a variety of perspectives on multimodal user interface
design, describes a variety of novel multimodal applications, and
provides real-life experience reports. Containing research from
leading international experts, this innovative publication presents
core concepts that define multi-modal, multi-channel, and
multi-device interactions and their role in mobile, pervasive, and
ubiquitous computing.
Clouds are being positioned as the next-generation consolidated,
centralized, yet federated IT infrastructure for hosting all kinds
of IT platforms and for deploying, maintaining, and managing a
wide variety of personal as well as professional applications and
services. Handbook of Research on Cloud Infrastructures for Big
Data Analytics focuses exclusively on the topic of cloud-sponsored
big data analytics for creating flexible and futuristic
organizations. This book helps researchers and practitioners, as
well as business entrepreneurs, to make informed decisions and
consider appropriate action to simplify and streamline the arduous
journey towards smarter enterprises.
In a digital context, trust is a multifaceted concept, including
trust in application usability, trust in information security, and
trust in fellow users. Mobile technologies have compounded the
impact of such considerations. Trust Management in Mobile
Environments: Autonomic and Usable Models explores current advances
in digital and mobile computing technologies from the user
perspective, evaluating trust models and autonomic trust
management. From the recent history of trust in digital
environments to prospective future developments, this book serves
as a potent reference source for professionals, graduate and
post-graduate students, researchers, and practitioners in the field
of trust management.
Data Quality: The Accuracy Dimension is about assessing the quality
of corporate data and improving its accuracy using the data
profiling method. Corporate data is increasingly important as
companies continue to find new ways to use it. Likewise, improving
the accuracy of data in information systems is fast becoming a
major goal as companies realize how much it affects their bottom
line. Data profiling is a new technology that supports and enhances
the accuracy of databases throughout major IT shops. Jack Olson
explains data profiling and shows how it fits into the larger
picture of data quality.
* Provides an accessible, enjoyable introduction to the subject of
data accuracy, peppered with real-world anecdotes.
* Provides a framework for data profiling with a discussion of
analytical tools appropriate for assessing data accuracy.
* Is written by one of the original developers of data profiling
technology.
* Is a must-read for any data management staff, IT management
staff, and CIOs of companies with data assets.
In recent years, the development of distributed systems, in
particular the Internet, has been influenced heavily by three
paradigms: peer-to-peer, autonomous agents, and service
orientation. Developing Advanced Web Services through P2P Computing
and Autonomous Agents: Trends and Innovations establishes an
understanding of autonomous peer-to-peer Web Service models and
developments and extends the growing literature on emerging
technologies. This scholarly publication is an important reference
for researchers and academics working in the fields of peer-to-peer
computing, Web and grid services, and agent technologies.
Web service technologies are constantly evolving, continually
challenging Web service professionals and researchers.
Modern Technologies in Web Services Research facilitates
communication and networking among Web services and e-business
researchers and engineers in a period where considerable changes
are taking place in Web services technology and innovation. Modern
Technologies in Web Services Research provides mathematical
foundations for service-oriented computing, Web services
architecture and security, frameworks for building Web service
applications, and dynamic invocation mechanisms for Web services,
among other innovative approaches.
The computer graphics (CG) industry is an attractive field for
undergraduate students, but employers often find that graduates of
CG art programmes are not proficient. The result is that many
positions are left vacant, despite large numbers of job applicants.
This book investigates how student CG artists develop proficiency.
The subject is important to the rapidly growing number of educators
in this sector, employers of graduates, and students who intend to
develop proficiency for the purpose of obtaining employment.
Educators will see why teaching software-oriented knowledge to
students does not lead to proficiency, whereas developing
problem-solving and visualisation skills does. This book takes a
narrow focus, following students as they develop proficiency in a
cognitively challenging task known as 'NURBS modelling'. This task
was chosen because of an observed relationship between students who
succeeded in the task and students who successfully obtained
employment after graduation. In the study on which this book is
based, readers will be shown
that knowledge-based explanations for the development of
proficiency do not adequately account for proficiency or expertise
in this field, where visualisation has been observed to develop
suddenly rather than over an extended period of time. This is an
unusual but not unique observation. Other studies have shown rapid
development of proficiency and expertise in certain professions,
such as among telegraph operators, composers and chess players.
Based on these observations, the book argues that threshold
concepts play a key role in the development of expertise among CG
artists.
This monograph aims to provide a well-rounded and detailed account
of designs constructed from linear codes. Most chapters of this
monograph cover the designs of linear codes, while a few chapters
deal with designs obtained from linear codes in other ways.
Connections among ovals, hyperovals, maximal arcs, ovoids, linear
codes, and designs are also investigated. This book consists of both
classical results on designs from linear codes and recent results
not yet published elsewhere. This monograph is intended to be a
reference for postgraduates and researchers who work in
combinatorics, coding theory, digital communications, or finite
geometry.
As both intelligence and software science revolutionize the modern
world, the contributions that each makes to the other combine into a
new transdisciplinary field. Breakthroughs in Software Science and
Computational Intelligence charts the new ground broken by
researchers exploring these two disciplines. A vital resource to
students and practitioners working in computer science, theoretical
software engineering, cognitive science, cognitive informatics, and
intelligence science, this book establishes itself in this new
field, emphasizing the abundance of future applications and
advancement.
Java Card is one of the latest developments in the area of
multi-application and platform-independent smart cards. As a
working tool for professionals, this easy-to-understand resource
provides clear, detailed guidance on smart cards, credit and debit
cards, Java Card and Open Card Framework (OCF). It offers in-depth
coverage of important standards, open specifications and critical
security issues, including common threats and security mechanisms
regarding the card and its connection interface. The book explains
how to program a Java Card applet, an OCF card service and a
terminal application. What's more, the book presents an informative
case study on the credit-debit application, offering a detailed
road map of the application development process.
As semantic technologies prove their value with targeted
applications, there are increasing opportunities to consider their
usefulness in social contexts for knowledge, learning, and human
development. Social Web Evolution: Integrating Semantic
Applications and Web 2.0 Technologies explores the potential of
Web 2.0 and its synergies with the Semantic Web and provides
state-of-the-art theoretical foundations and technological
applications. A reference edition for academicians, practitioners,
policy makers, and government officers eager for knowledge on Web
2.0 and the social Web, this book emphasizes practical aspects of the
integration of semantic applications into social Web technologies.