This book provides an overview of the problems involved in
engineering scalable, elastic, and cost-efficient cloud computing
services and describes the CloudScale method - its supporting
tools and the steps required to exploit them. It
allows readers to analyze the scalability problem in detail and
identify scalability anti-patterns and bottlenecks within an
application. With the CloudScale method, software architects can
analyze both existing and planned IT services. The method allows
readers to answer questions like:
* With an increasing number of users, can my service still deliver acceptable quality of service?
* What if each user uses the service more intensively? Can my service still handle it with acceptable quality of service?
* What if the number of users suddenly increases? Will my service still be able to handle it?
* Will my service be cost-efficient?
First the
book addresses the importance of scalability, elasticity, and
cost-efficiency as vital quality-related attributes of modern cloud
computing applications. Following a brief overview of CloudScale,
cloud computing applications are then introduced in detail and the
aspects that need to be captured in models of such applications are
discussed. In CloudScale, these aspects are captured in instances
of the ScaleDL modeling language. Subsequently, the book describes
the forward engineering part of CloudScale, which is applicable
when developing a new service. It also outlines the reverse and
reengineering parts of CloudScale, which come into play when an
existing (legacy) service is modified. Lastly, the book directly
focuses on the needs of both business-oriented and technical
managers by providing guidance on all steps of implementing
CloudScale as well as making decisions during that implementation.
The demonstrators and reference projects described serve as a
valuable starting point for learning from experience. This book is
meant for all stakeholders interested in delivering scalable,
elastic, and cost-efficient cloud computing applications: managers,
product owners, software architects and developers alike. With this
book, they can both see the overall picture as well as dive into
issues of particular interest.
This book introduces the development of self-interference
(SI)-cancellation techniques for full-duplex wireless communication
systems. The authors rely on estimation theory and signal
processing to develop SI-cancellation algorithms by generating an
estimate of the received SI and subtracting it from the received
signal. The authors also cover two new SI-cancellation methods
based on the concept of active signal injection (ASI) for
full-duplex MIMO-OFDM systems. The ASI approach adds an appropriate
cancelling signal to each transmitted signal such that the combined
signals from transmit antennas attenuate the SI at the receive
antennas. The authors illustrate that the SI-pre-cancelling signal
does not affect the data-bearing signal. This book is for
researchers and professionals working in wireless communications
and for engineers who want to understand the challenges of deploying
full-duplex communication and the practical solutions for implementing
such a system. Advanced-level students in electrical engineering and
computer science studying wireless communications will also find
this book useful as a secondary textbook.
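The core digital-cancellation idea described above - generating an estimate of the received SI and subtracting it from the received signal - can be sketched in a few lines. The following NumPy simulation is an illustrative assumption, not the book's actual algorithms: the channel length, signal powers, and least-squares estimator are all placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known transmitted (self-interference) baseband samples.
n = 1000
x = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

# Unknown SI channel (short FIR filter), plus a weak signal of interest.
h_true = np.array([0.8 + 0.1j, 0.3 - 0.2j, 0.05 + 0.02j])
si = np.convolve(x, h_true)[:n]
desired = 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
y = si + desired

# Build a delayed-copies matrix of the known transmit samples and
# estimate the SI channel by least squares (estimation-theory step).
L = 3
X = np.column_stack(
    [np.concatenate([np.zeros(k, complex), x[:n - k]]) for k in range(L)]
)
h_est, *_ = np.linalg.lstsq(X, y, rcond=None)

# Digital cancellation: regenerate the SI estimate and subtract it.
y_clean = y - X @ h_est

# Ratio of received power before vs. after cancellation.
gain = np.mean(np.abs(y) ** 2) / np.mean(np.abs(y_clean) ** 2)
print(gain)
```

Because the SI dominates the received signal, the least-squares fit recovers the channel almost exactly and the residual after subtraction is close to the desired signal alone.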
This book presents state-of-the-art research on robust resource
allocation in current and future wireless networks. The authors
describe the nominal resource allocation problems in wireless
networks and explain why introducing robustness in such networks is
desirable. Then, depending on the objectives of the problem, namely
maximizing the social utility or the per-user utility, cooperative
or competitive approaches are explained and their corresponding
robust problems are considered in detail. For each approach, the
costs and benefits of robust schemes are discussed and the
algorithms for reducing their costs and improving their benefits
are presented. Considering the fact that such problems are
inherently non-convex and intractable, a taxonomy of different
relaxation techniques is presented, and applications of such
techniques are shown via several examples throughout the book.
Finally, the authors argue that resource allocation continues to be
an important issue in future wireless networks, and propose
specific problems for future research.
The Internet has dramatically altered the landscape of crime and
national security, creating new threats, such as identity theft,
computer viruses, and cyberattacks. Moreover, because cybercrimes
are often not limited to a single site or nation, crime scenes
themselves have changed. Consequently, law enforcement must
confront these new dangers and embrace novel methods of prevention,
as well as produce new tools for digital surveillance - which can
jeopardize privacy and civil liberties. Cybercrime brings together
leading experts in law, criminal justice, and security studies to
describe crime prevention and security protection in the electronic
age. Ranging from new government requirements that facilitate
spying to new methods of digital proof, the book is essential for
understanding how criminal law - and even crime itself - have been
transformed in our networked world. Contributors: Jack M. Balkin,
Susan W. Brenner, Daniel E. Geer, Jr., James Grimmelmann, Emily
Hancock, Beryl A. Howell, Curtis E.A. Karnow, Eddan Katz, Orin S.
Kerr, Nimrod Kozlovski, Helen Nissenbaum, Kim A. Taipale, Lee Tien,
Shlomit Wagman, and Tal Zarsky. Jack M. Balkin is Knight Professor
of Constitutional Law and the First Amendment at Yale Law School,
and the Founder and Director of Yale's Information Society Project
(ISP). He is the co-editor of The State of Play: Law, Games, and
Virtual Worlds, also from NYU Press. James Grimmelmann, Nimrod
Kozlovski, Shlomit Wagman, and Tal Zarsky are Fellows of the ISP.
Eddan Katz is the Executive Director of the Information Society
Project.
By joining bodies of research in media theory, cultural studies,
and critical pedagogy, "Developing Media Literacy in Cyberspace"
offers a vision of learning that values social empowerment over
technical skills. An inquiry into the existence and range of models
equipped to cultivate critical teaching and learning in the
Internet-supported classroom, this new study argues that media
literacy offers the best long-term training for today's youth to
become experienced practitioners of 21st-century technology. Author
Julie Frechette helps educators develop and provide concrete
learning strategies that enable students to judge the validity and
worth of what they see on the Internet as they strive to become
critically autonomous in a technology-laden world.
Part of this effort lies in developing a keen awareness of the
institutional, political, and economic structure of the Internet as
a means of communication that is increasingly marketing products
and targeting advertisements toward youth. Values on the Internet
are discussed constantly both by the major media and by the private
sector, with little regard for the pervasive interests and
authority of profitable industries staking out their territory in
this new global village. Unlike other studies that provide a broad
sociohistorical context for the development of theoretical uses of
new technologies in the classroom, "Developing Media Literacy in
Cyberspace" lays the groundwork for establishing critical thinking
skills that will serve students' interests as they navigate this
vast and complicated cyberterritory.
Internet use for business-to-business e-commerce is expected to
grow at spectacular rates. Many experts feel that a perceived lack of
trust in e-commerce transactions on the Internet has contributed to
the slow adoption of e-commerce in the recent past. This book
provides an avenue for managers and researchers to explore, examine
and describe interorganizational trust relationships in e-commerce
participation. Identifying trust behaviours in business
relationships will increase the awareness of e-commerce
participants, who can then examine their own and their trading
partners' trust behaviours.
This book presents a design methodology that is practically
applicable to the architectural design of a broad range of systems.
It is based on fundamental design concepts to conceive and specify
the required functional properties of a system, while abstracting
from the specific implementation functions and technologies that
can be chosen to build the system. Abstraction and precision are
indispensable when it comes to understanding complex systems and
precisely creating and representing them at a high functional
level. Once understood, these concepts appear natural, self-evident
and extremely powerful, since they can directly, precisely and
concisely reflect what is considered essential for the functional
behavior of a system. The first two chapters present the global
views on how to design systems and how to interpret terms and
meta-concepts. This informal introduction provides the general
context for the remainder of the book. On a more formal level,
Chapters 3 through 6 present the main basic design concepts,
illustrating them with examples. Language notations are introduced
along with the basic design concepts. Lastly, Chapters 7 to 12
discuss the more intricate basic design concepts of interactive
systems by focusing on their common functional goal. These chapters
are recommended to readers who have a particular interest in the
design of protocols and interfaces for various systems. The
didactic approach makes it suitable for graduate students who want
to develop insights into and skills in developing complex systems,
as well as practitioners in industry and large organizations who
are responsible for the design and development of large and complex
systems. It includes numerous tangible examples from various
fields, and several appealing exercises with their solutions.
Web technologies have become a vital element within educational,
professional, and social settings as they have the potential to
improve performance and productivity across organizations.
Artificial Intelligence Technologies and the Evolution of Web 3.0
brings together emergent research and best practices surrounding
the effective usage of Web 3.0 technologies in a variety of
environments. Featuring the latest technologies and applications
across industries, this publication is a vital reference source for
academics, researchers, students, and professionals who are
interested in new ways to use intelligent web technologies within
various settings.
This book provides a general and comprehensible overview of
imbalanced learning. It contains a formal description of the problem
and focuses on its main features and the most relevant proposed
solutions. Additionally, it considers the different scenarios in
Data Science for which the imbalanced classification can create a
real challenge. This book stresses the gap with standard
classification tasks by reviewing the case studies and ad-hoc
performance metrics that are applied in this area. It also covers
the different approaches that have been traditionally applied to
address the binary skewed class distribution. Specifically, it
reviews cost-sensitive learning, data-level preprocessing methods
and algorithm-level solutions, taking also into account those
ensemble-learning solutions that embed any of the former
alternatives. Furthermore, it covers the extension of the problem
to multi-class settings, where the classical methods can no longer
be applied in a straightforward way. The book also examines the
intrinsic data characteristics that, added to the uneven class
distribution, are the main causes that truly hinder the performance
of classification algorithms in this scenario. Some notes on data
reduction are then provided to explain the advantages of this type
of approach. Finally, the book introduces some novel areas of study
that are attracting growing attention to the imbalanced data issue.
Specifically, it considers the classification of data streams,
non-classical classification problems, and the scalability related
to Big Data. Examples of software libraries and modules to address
imbalanced classification are provided. This book is highly
suitable for technical professionals, senior undergraduate and
graduate students in the areas of data science, computer science
and engineering. It will also be useful for scientists and
researchers to gain insight on the current developments in this
area of study, as well as future research directions.
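As a concrete instance of the data-level preprocessing methods mentioned in this description, random oversampling of minority classes can be sketched briefly. The function name and NumPy-based interface below are illustrative assumptions, not the API of any library the book surveys.

```python
import numpy as np

def random_oversample(X, y, rng=None):
    """Balance a skewed dataset by resampling each minority class
    with replacement until it matches the majority class size."""
    rng = np.random.default_rng(rng)
    X, y = np.asarray(X), np.asarray(y)
    classes, counts = np.unique(y, return_counts=True)
    majority = counts.max()
    parts_X, parts_y = [], []
    for c, cnt in zip(classes, counts):
        idx = np.flatnonzero(y == c)
        # Draw extra copies of this class's samples (none for the majority).
        extra = rng.choice(idx, size=majority - cnt, replace=True)
        keep = np.concatenate([idx, extra])
        parts_X.append(X[keep])
        parts_y.append(y[keep])
    return np.concatenate(parts_X), np.concatenate(parts_y)

X = np.arange(10).reshape(-1, 1)
y = np.array([0] * 8 + [1] * 2)      # skewed 8:2 class distribution
Xb, yb = random_oversample(X, y, rng=0)
print(np.bincount(yb))               # -> [8 8]
```

Cost-sensitive learning and algorithm-level solutions, also reviewed in the book, attack the same skew without altering the data itself.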
This textbook addresses the conceptual and practical aspects of the
various phases of the service system lifecycle, ranging from
service ideation through design, implementation, analysis,
improvement, and trading, all associated with service systems
engineering. Written by
leading experts in the field, this indispensable textbook will
enable a new wave of future professionals to think in a
service-focused way with the right balance of competencies in
computer science, engineering, and management. Fundamentals of
Service Systems is a centerpiece for a course syllabus on service
systems. Each chapter includes a summary, a list of learning
objectives, an opening case, and a review section with questions, a
project description, a list of key terms, and a list of further
reading bibliography. All these elements enable students to learn
at a faster and more comfortable pace. For researchers, teachers,
and students who want to learn about this new emerging science,
Fundamentals of Service Systems provides an overview of the core
disciplines underlying the study of service systems. It is aimed at
students of information systems, information technology, and
business and economics. It also targets business and IT
practitioners, especially those who are looking for better ways of
innovating, designing, modeling, analyzing, and optimizing service
systems.
This book demonstrates to managers the strategic significance of
intra-organizational social networks. It argues that strategic
management is embedded in the complexity of social relations that
shape the strategic direction of a company. Currently there are few
tools available to systematically collect information about the
social functioning of an organization. This book fills this gap by
shifting attention to the social relations that contribute to
strategic advantage and that build on relationships that provide
unique resources and create value for the business. It considers
three perspectives on how social networks have a strategic
function: first, social networks constitute everyday strategic
action; second, social networks convey cultural meanings; and
third, social networks depict social processes that continually
illustrate what the organization is and what it can become. The
book shows top and upper-middle management how cultivating an
understanding of intra-firm social relations can help them to build
unique strategic advantage and make use of the day-to-day knowledge
that emerges in the social connections and interactions within an
organization.
Cloud service benchmarking can provide important, sometimes
surprising insights into the quality of services and leads to a
more quality-driven design and engineering of complex software
architectures that use such services. Starting with a broad
introduction to the field, this book guides readers step-by-step
through the process of designing, implementing and executing a
cloud service benchmark, as well as understanding and dealing with
its results. It covers all aspects of cloud service benchmarking,
i.e., both benchmarking the cloud and benchmarking in the cloud, at
a basic level. The book is divided into five parts: Part I
discusses what cloud benchmarking is, provides an overview of cloud
services and their key properties, and describes the notion of a
cloud system and cloud-service quality. It also addresses the
benchmarking lifecycle and the motivations behind running
benchmarks in particular phases of an application lifecycle. Part
II then focuses on benchmark design by discussing key objectives
(e.g., repeatability, fairness, or understandability) and defining
metrics and measurement methods, and by giving advice on developing
one's own measurement methods and metrics. Next, Part III explores
benchmark execution and implementation challenges and objectives as
well as aspects like runtime monitoring and result collection.
Subsequently, Part IV addresses benchmark results, covering topics
such as an abstract process for turning data into insights, data
preprocessing, and basic data analysis methods. Lastly, Part V
concludes the book with a summary, suggestions for further reading
and pointers to benchmarking tools available on the Web. The book
is intended for researchers and graduate students of computer
science and related subjects looking for an introduction to
benchmarking cloud services, but also for industry practitioners
who are interested in evaluating the quality of cloud services or
who want to assess key qualities of their own implementations
through cloud-based experiments.
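A toy version of the measure-then-analyze loop outlined in Parts II through IV might look as follows. The stub service call and the chosen metrics (mean, median, 99th percentile) are common examples assumed for illustration, not the book's prescribed benchmark design.

```python
import statistics
import time

def benchmark(call, n=100):
    """Run `call` n times and collect per-request latencies in seconds.

    `call` is a stand-in for one invocation of the cloud service
    under test (benchmark execution, result collection)."""
    latencies = []
    for _ in range(n):
        t0 = time.perf_counter()
        call()
        latencies.append(time.perf_counter() - t0)
    return latencies

def summarize(latencies):
    """Turn raw measurements into summary metrics (data analysis step)."""
    q = statistics.quantiles(latencies, n=100)
    return {
        "mean": statistics.fmean(latencies),
        "p50": statistics.median(latencies),
        "p99": q[98],   # 99th-percentile latency
    }

stats = summarize(benchmark(lambda: time.sleep(0.001), n=50))
print(sorted(stats))  # -> ['mean', 'p50', 'p99']
```

Even this toy loop shows why design objectives such as repeatability matter: rerunning `benchmark` against a real service will produce different raw latencies, and only a carefully designed setup makes the summary metrics comparable across runs.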
This book throws new light on the way in which the Internet impacts
on democracy. Based on Jurgen Habermas' discourse-theoretical
reconstruction of democracy, it examines one of the world's
largest, most diverse but also most unequal democracies, Brazil, in
terms of the broad social and legal effects the Internet has had.
Focusing on the Brazilian constitutional evolution, the book
examines how the Internet might impact on the legitimacy of a
democratic order and if, and how, it might yield opportunities for
democratic empowerment. The book also assesses the ways in which
law, as an institution and a system, reacts to the changes and
challenges brought about by the Internet: the ways in which law may
retain its strength as an integrative force, avoiding a 'virtual'
legitimacy crisis.
The widespread use of XML in business and scientific databases has
prompted the development of methodologies, techniques, and systems
for effectively managing and analyzing XML data. This has
increasingly attracted the attention of different research
communities, including database, information retrieval, pattern
recognition, and machine learning, from which several proposals
have been offered to address problems in XML data management and
knowledge discovery. XML Data Mining: Models, Methods, and
Applications aims to collect knowledge from experts in the
database, information retrieval, machine learning, and knowledge
management communities on developing models, methods, and systems
for XML data
mining. This book addresses key issues and challenges in XML data
mining, offering insights into the various existing solutions and
best practices for modeling, processing, and analyzing XML data, and
for evaluating performance of XML data mining algorithms and
systems.
Innovations in cloud and service-oriented architectures continue to
attract attention by offering interesting opportunities for
research in scientific communities. Although advancements such as
computational power, storage, networking, and infrastructure have
aided in making major progress in the implementation and
realization of cloud-based systems, there are still significant
concerns that need to be taken into account. Principles,
Methodologies, and Service-Oriented Approaches for Cloud Computing
aims to present insight into Cloud principles, examine associated
methods and technologies, and investigate the use of
service-oriented computing technologies. In addressing supporting
infrastructure of the Cloud, including associated challenges and
pressing issues, this reference source aims to present researchers,
engineers, and IT professionals with various approaches in Cloud
computing.
This book covers aspects of human re-identification problems
related to computer vision and machine learning. Working from a
practical perspective, it introduces novel algorithms and designs
for human re-identification that bridge the gap between research
and reality. The primary focus is on building a robust, reliable,
distributed and scalable smart surveillance system that can be
deployed in real-world scenarios. This book also includes detailed
discussions on pedestrian candidates detection, discriminative
feature extraction and selection, dimension reduction,
distance/metric learning, and decision/ranking enhancement. This
book is intended for professionals and researchers working in
computer vision and machine learning. Advanced-level students of
computer science will also find the content valuable.
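To illustrate the distance/metric-learning and ranking stages this description mentions, a minimal (unlearned) Euclidean ranking over feature vectors could look like this. Real re-identification systems would replace both the hand-made features and the plain Euclidean metric with learned ones; everything below is a simplifying assumption.

```python
import numpy as np

def rank_gallery(query_feat, gallery_feats):
    """Rank gallery entries by Euclidean distance to the query descriptor.

    The feature vectors stand in for the discriminative features a
    re-identification pipeline would extract from pedestrian images."""
    d = np.linalg.norm(gallery_feats - query_feat, axis=1)
    return np.argsort(d)  # indices of gallery entries, nearest first

gallery = np.array([[1.0, 0.0],    # candidate 0
                    [0.0, 1.0],    # candidate 1
                    [0.9, 0.1]])   # candidate 2
order = rank_gallery(np.array([1.0, 0.0]), gallery)
print(order)  # -> [0 2 1]
```

Metric learning replaces the implicit identity matrix in this distance with a learned transform, and ranking enhancement re-orders exactly such a candidate list.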
Readers seeking to gain a handle on the internet's global expansion
will find this book rich in scholarly foundations combined with
cutting-edge discussion of emerging ICTs and services and the
complex societal contexts in which they are embedded. To explore
possibilities to the fullest extent, a sociotechnical systems
approach is employed, focusing on the interplay of technical,
social, cultural, political, and economic dynamics to explore
alternative futures (ones that are not part of the dominant
discourse about the internet). These shared perspectives are not
well addressed elsewhere in current discussions. Awareness of these
dynamics, and of the fluidity of what lies ahead, is important as
humankind moves forward into an uncertain future. Due to the
sociotechnical complexity of the Internet, policymakers,
businesspeople, and academics worldwide have struggled to keep
abreast of developments. This volume's approach is intended to
stimulate dialogue between academics and practitioners on a topic
that will affect most aspects of human life in the near-term
future.
As the growing relationship between individuals and technology
continues to play a vital role in our society and workplace, the
progress and execution of information technology communication
systems are important in maintaining our current way of life.
Knowledge and Technological Development Effects on Organizational
and Social Structures provides a wide-ranging discussion on the
exchanging of research ideas and practices in an effort to bring
together the social and technical aspects within organizations and
society. This collection focuses on new ideas and studies for
research, students, and practitioners.