In practice, the design and architecture of a cloud varies among cloud providers. We present a generic evaluation framework for the performance, availability and reliability characteristics of various cloud platforms, and describe a generic benchmark architecture for cloud databases, specifically NoSQL database-as-a-service offerings, which measures replication delay and monetary cost. A Service Level Agreement (SLA) is the contract that captures the guarantees agreed between a service provider and its customers. The specifications of existing SLAs for cloud services are not designed to flexibly handle even relatively straightforward performance and technical requirements of consumer applications. We present a novel approach to SLA-based management of cloud-hosted databases from the consumer perspective, and an end-to-end framework for consumer-centric SLA management of cloud-hosted databases. The framework facilitates adaptive and dynamic provisioning of the database tier of software applications based on application-defined policies, satisfying their SLA performance requirements, avoiding the cost of SLA violations and controlling the monetary cost of the allocated computing resources. In this framework, the SLAs of consumer applications are declaratively defined in terms of goals subject to a number of constraints specific to the application requirements. The framework continuously monitors the application-defined SLAs and automatically triggers the necessary corrective actions (scaling the database tier out or in) when required. The framework is database-platform-agnostic, uses virtualization-based database replication mechanisms and requires no source code changes to the cloud-hosted applications.
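The monitor-and-react loop such a consumer-centric framework automates can be pictured with a minimal sketch. All names, thresholds and the percentile goal here are illustrative assumptions, not the book's actual framework: an application-defined latency goal is checked against monitored samples, and a scale-out/in decision for the database tier falls out.

```python
from dataclasses import dataclass

@dataclass
class SlaPolicy:
    """Hypothetical application-defined SLA: a latency goal plus cost constraints."""
    max_p95_latency_ms: float   # response-time goal declared by the application
    max_replicas: int           # cost ceiling on allocated database replicas
    min_replicas: int = 1

def percentile(samples, p):
    """Return the p-th percentile of a list of latency samples (nearest-rank)."""
    s = sorted(samples)
    idx = min(len(s) - 1, int(round(p / 100 * (len(s) - 1))))
    return s[idx]

def decide_action(policy, latency_samples_ms, current_replicas):
    """Map the monitored SLA state to a corrective action:
    'scale_out', 'scale_in', or 'hold'."""
    p95 = percentile(latency_samples_ms, 95)
    if p95 > policy.max_p95_latency_ms and current_replicas < policy.max_replicas:
        return "scale_out"   # SLA violated: add a database replica
    if p95 < 0.5 * policy.max_p95_latency_ms and current_replicas > policy.min_replicas:
        return "scale_in"    # comfortably under the goal: release a replica to cut cost
    return "hold"
```

A real system would additionally damp oscillation (cool-down periods) and price each action, but the declarative goal-plus-constraints shape is the point.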
The massive growth of the Internet has made an enormous amount of information available to us. However, it is becoming very difficult for users to acquire an applicable one. Therefore, some techniques such as information filtering have been introduced to address this issue. Recommender systems filter information that is useful to a user from a large amount of information. Many e-commerce sites use recommender systems to filter specific information that users want out of an overload of information [2]. For example, Amazon.com is a good example of the success of recommender systems [1]. Over the past several years, a considerable amount of research has been conducted on recommendation systems. In general, the usefulness of the recommendation is measured based on its accuracy [3]. Although a high recommendation accuracy can indicate a user's favorite items, there is a fault in that only similar items will be recommended. Several studies have reported that users might not be satisfied with a recommendation even though it exhibits high recommendation accuracy [4]. For this reason, we consider that a recommendation having only accuracy is unsatisfactory. The serendipity of a recommendation is an important element when considering a user's long-term profits. A recommendation that brings serendipity to users would solve the problem of "user weariness" and would lead to exploitation of users' tastes. The viewpoint of the diversity of the recommendation as well as its accuracy should be required for future recommender systems.
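The accuracy-versus-serendipity trade-off described above can be illustrated with a toy re-ranker. Everything here (the Jaccard familiarity measure, the weight `alpha`, the item tags) is an illustrative assumption, not a method from the book: candidates that look too much like the user's known tastes are discounted, so a slightly less "accurate" but fresher item can win.

```python
def jaccard(a, b):
    """Similarity between two sets of item tags (0.0 = nothing in common)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(candidates, user_profile, alpha=0.7, k=2):
    """Rank candidates by alpha * predicted_relevance minus a familiarity
    penalty; lowering familiarity rewards serendipitous picks.

    candidates: list of (item_name, predicted_relevance, tag_set) tuples.
    user_profile: set of tags describing the user's known tastes.
    """
    scored = []
    for item, relevance, tags in candidates:
        familiarity = jaccard(tags, user_profile)
        scored.append((alpha * relevance - (1 - alpha) * familiarity, item))
    return [item for _, item in sorted(scored, reverse=True)[:k]]
```

With `alpha = 1.0` this degenerates to pure accuracy ranking; shrinking `alpha` trades accuracy for diversity, which is exactly the tension the passage raises.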
This book is the second installment of a two-volume series on IPv6
and the KAME implementation. This book discusses those protocols
that are found in more capable IPv6 devices, are commonly deployed
in more complex IPv6 network environments, or are not specific to
IPv6 but are extended to support IPv6. Specifically, this book
engages the readers in advanced topics such as routing,
multicasting, DNS, DHCPv6, mobility, and security.
New Internet developments pose greater and greater privacy dilemmas. In the Information Society, the need for individuals to protect their autonomy and retain control over their personal information is becoming more and more important. Today, information and communication technologies (and the people responsible for making decisions about them, designing them, and implementing them) scarcely consider those requirements, thereby potentially putting individuals' privacy at risk. The increasingly collaborative character of the Internet enables anyone to compose services and contribute and distribute information. It may become hard for individuals to manage and control information that concerns them, and particularly to eliminate outdated or unwanted personal information, thus leaving personal histories exposed permanently. These activities raise substantial new challenges for personal privacy at the technical, social, ethical, regulatory, and legal levels: How can privacy in emerging Internet applications such as collaborative scenarios and virtual communities be protected? What frameworks and technical tools could be utilized to maintain life-long privacy? During September 3-10, 2009, IFIP (International Federation for Information Processing) working groups 9.2 (Social Accountability), 9.6/11.7 (IT Misuse and the Law), 11.4 (Network Security) and 11.6 (Identity Management) held their 5th International Summer School in cooperation with the EU FP7 integrated project PrimeLife in Sophia Antipolis and Nice, France. The focus of the event was on privacy and identity management for emerging Internet applications throughout a person's lifetime. The aim of the IFIP Summer Schools has been to encourage young academic and industry entrants to share their own ideas about privacy and identity management and to build up collegial relationships with others. As such, the Summer Schools have been introducing participants to the social implications of information technology through the process of informed discussion.
The Semantic Web proposes the mark-up of content on the Web using formal ontologies that structure underlying data for the purpose of comprehensive and transportable machine understanding. "Semantic Web Services: Theory, Tools and Applications" brings contributions from researchers, scientists from both industry and academia, and representatives from different communities to study, understand, and explore the theory, tools, and applications of the Semantic Web. It binds computing involving the Semantic Web, ontologies, knowledge management, Web services, and Web processes into one fully comprehensive resource, serving as a platform for the exchange of both practical technologies and far-reaching research.
The main purpose of this book is to sum up the vital and highly topical research issue of knowledge representation on the Web and to discuss novel solutions that combine the benefits of folksonomies and Web 2.0 approaches with ontologies and semantic technologies. The book contains an overview of knowledge representation approaches past, present and future, an introduction to ontologies and Web indexing, and, above all, novel approaches to developing ontologies. It combines aspects of knowledge representation for both the Semantic Web (ontologies) and the Web 2.0 (folksonomies); currently no other monograph provides a combined overview of these topics. It focuses on using knowledge representation methods for document indexing purposes. To this end, it includes considerations from classical librarian approaches to knowledge representation (thesauri, classification schemes, etc.), which are not covered by most other books in the field, whose background lies more strongly in computer science.
In this volume, Rudi Studer and his team deliver a self-contained compendium about the exciting field of Semantic Web services, starting with the basic standards and technologies and also including advanced applications in eGovernment and eHealth. The contributions provide both the theoretical background and the practical knowledge necessary to understand the essential ideas and to design new cutting-edge applications.
Information infrastructures are integrated solutions based on the fusion of information and communication technologies. They are characterized by the large amount of data that must be managed accordingly. An information infrastructure requires an efficient and effective information retrieval system to provide access to the items stored in the infrastructure. Terminological Ontologies: Design, Management and Practical Applications presents the main problems that affect the discovery systems of information infrastructures to manage terminological models, and introduces a combination of research tools and applications in Semantic Web technologies. This book specifically analyzes the need to create, relate, and integrate the models required for an infrastructure by elaborating on the problem of accessing these models in an efficient manner via interoperable services and components. Terminological Ontologies: Design, Management and Practical Applications is geared toward information management systems and semantic web professionals working as project managers, application developers, government workers and more. Advanced undergraduate and graduate level students, professors and researchers focusing on computer science will also find this book valuable as a secondary text or reference book.
In the early days of the Web a need was recognized for a language
to display 3D objects through a browser. An HTML-like language,
VRML, was proposed in 1994 and became the standard for describing
interactive 3D objects and worlds on the Web. 3D Web courses were
started, several best-selling books were published, and VRML
continues to be used today. However VRML, because it was based on
HTML, is a stodgy language that is not easy to incorporate with
other applications and has been difficult to add features to.
Meanwhile, applications for interactive 3D graphics have been
exploding in areas such as medicine, science, industry, and
entertainment. There is a strong need for a set of modern Web-based
technologies, applied within a standard extensible framework, to
enable a new generation of modeling & simulation applications
to emerge, develop, and interoperate. X3D is the next generation
open standard for 3D on the web. It is the result of several years
of development by the Web 3D Consortium's X3D Task Group. Instead
of a large monolithic specification (like VRML), which requires
full adoption for compliance, X3D is a component-based architecture
that can support applications ranging from a simple non-interactive
animation to the latest streaming or rendering applications. X3D
replaces VRML, but also provides compatibility with existing VRML
content and browsers. Don Brutzman organized the first symposium on
VRML and is playing a similar role with X3D; he is a founding
member of the consortium. Len Daly is a professional member of the
consortium and both Len and Don have been involved with the
development of the standard from the start.
Visual Knowledge Modeling for Semantic Web Technologies: Models and Ontologies aims to make visual knowledge modeling available to individuals as an intellectual method and a set of tools at different levels of formalization. It aims to provide to its readers a simple, yet powerful visual language to structure their thoughts, analyze information, transform it to personal knowledge, and communicate information to support knowledge acquisition in collaborative activities.
This book addresses the challenges of social network and social media analysis in terms of prediction and inference. The chapters collected here tackle these issues by proposing new analysis methods and by examining mining methods for the vast amount of social content produced. Social Networks (SNs) have become an integral part of our lives; they are used for leisure, business, government, medical, educational purposes and have attracted billions of users. The challenges that stem from this wide adoption of SNs are vast. These include generating realistic social network topologies, awareness of user activities, topic and trend generation, estimation of user attributes from their social content, and behavior detection. This text has applications to widely used platforms such as Twitter and Facebook and appeals to students, researchers, and professionals in the field.
The Social and Cognitive Impacts of E-Commerce on Modern Organizations includes articles addressing the social, cultural, organizational, and cognitive impacts of e-commerce technologies and advances on organizations around the world. It looks specifically at the impacts of electronic commerce on consumer behavior, as well as the impact of e-commerce on organizational behavior, development, and management. This book aims to expand the overall body of knowledge regarding the human aspects of electronic commerce technologies and their utilization in modern organizations, and to assist researchers and practitioners in devising more effective systems for managing the human side of e-commerce.
This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies.
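As a taste of the queuing analysis such a textbook covers, the classic M/M/1 queue (Poisson arrivals at rate λ, one exponential server at rate μ) has closed-form performance measures. The formulas below are the standard textbook results, not code from the book:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Closed-form performance measures for a stable M/M/1 queue.

    arrival_rate: lambda, mean arrivals per unit time (Poisson).
    service_rate: mu, mean service completions per unit time (exponential).
    """
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: utilization must be < 1")
    rho = arrival_rate / service_rate           # server utilization
    return {
        "utilization": rho,
        "L": rho / (1 - rho),                   # mean number in system
        "Lq": rho ** 2 / (1 - rho),             # mean number waiting in queue
        "W": 1 / (service_rate - arrival_rate), # mean time in system
        "Wq": rho / (service_rate - arrival_rate),  # mean waiting time
    }
```

For example, a link serving packets at μ = 4 per ms under λ = 2 per ms runs at 50% utilization with a mean of one packet in the system; pushing λ toward μ makes every measure blow up, which is why such models matter for dimensioning high-performance networks.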
Explores the techniques that assist users in obtaining information by harnessing other users' expert knowledge or search experience.
An important aspect of managing human capital in the 21st century workplace is managing the interface between humans and information technology, particularly the World Wide Web. The Web has changed not only how and where business is conducted, but also how and where work is done. Personal web usage has created many desirable organizational outcomes, such as reducing the cost of communication and restructuring how work is performed. However, it has also generated undesirable outcomes: loss of intellectual property, sexual harassment lawsuits, productivity losses due to surfing, security threats, and network bandwidth overload from visits to web sites for travel, leisure, sports, and news. The mechanisms controlling the interface of individual and institution in this flexible, open, autonomous work environment created by the Web are emergent phenomena, and the lines between legitimate usage and misuse are just beginning to be understood. Personal Web Usage in the Workplace: A Guide to Effective Human Resources Management examines topics that embrace a wide array of personal web usage issues, such as antecedents of web usage, frameworks and models of web usage, web technologies for monitoring usage, web usage in other cultures and countries, measurement issues of web usage, and the impact of web usage, among others.
I3E 2009 was held in Nancy, France, during September 23-25, hosted by Nancy University and INRIA Grand-Est at LORIA. The conference provided scientists and practitioners of academia, industry and government with a forum where they presented their latest findings concerning the application of e-business, e-services and e-society, and the underlying technology to support these applications. The 9th IFIP Conference on e-Business, e-Services and e-Society, sponsored by IFIP WG 6.1 of Technical Committee TC6 in cooperation with TC11 and TC8, represents the continuation of previous events held in Zurich (Switzerland) in 2001, Lisbon (Portugal) in 2002, Sao Paulo (Brazil) in 2003, Toulouse (France) in 2004, Poznan (Poland) in 2005, Turku (Finland) in 2006, Wuhan (China) in 2007 and Tokyo (Japan) in 2008. The call for papers attracted papers from 31 countries on the five continents. As a result, the I3E 2009 program offered 12 sessions of full-paper presentations. The 31 selected papers cover a wide and important variety of issues in e-business, e-services and e-society, including security, trust and privacy, ethical and societal issues, business organization, provision of services as software and software as services, and others. Extended versions of selected papers submitted to I3E 2009 will be published in the International Journal of e-Adoption and in AIS Transactions on Enterprise Systems. In addition, a 500-euro prize was awarded to the authors of the best paper selected by the Program Committee. We thank all authors who submitted their papers, and the Program Committee members and external reviewers for their excellent work.
This book presents state-of-the-art technologies and solutions that tackle the critical challenges in building and developing wireless sensor network (WSN) and ecological monitoring systems, as well as their potential impact on society at the social, medical and technological levels. It is dedicated to sensing systems for sensors, wireless sensor networks and ecological monitoring. The book is aimed at Master's and PhD students, researchers and practitioners, especially WSN engineers involved with ecological monitoring, and offers a dedicated, in-depth treatment with which to deepen their knowledge of this specific field.
This volume is number 67 in the series Advances in Computers that began back in 1960. This is the longest continuously published series of books that chronicles the evolution of the computer industry. Each year three volumes are produced presenting approximately 20 chapters that describe the latest technology in the use of computers today. Volume 67, subtitled "Web technology," presents 6 chapters that show the impact that the World Wide Web is having on our society today. The general theme running throughout the volume is the ubiquity of web services. Topics such as wireless access and its problems and reliability of web communications are emphasized.
Wireless Vehicular Networks for Car Collision Avoidance focuses on the development of Intelligent Transportation Systems (ITS) in order to minimize vehicular accidents. The book presents and analyses a range of concrete accident scenarios, examining the causes of vehicular collision and proposing countermeasures based on wireless vehicular networks. It also describes the vehicular network standards and quality-of-service mechanisms, focusing on improving the critical dissemination of safety information, and offers recommendations on techniques and protocols to consider when improving road safety policies in order to minimize crashes and collision risks.
Cognitive radio (CR) technology is capable of sensing its surrounding environment and adapting its internal state by making corresponding changes in certain operating parameters. CR is envisaged to solve the problems of limited available spectrum and inefficient spectrum usage. CR has been considered in mobile ad hoc networks (MANETs), which enable wireless devices to dynamically establish networks without necessarily using a fixed infrastructure. The changing spectrum environment and the importance of protecting the transmissions of the licensed users of the spectrum are what chiefly differentiate classical MANETs from CR-MANETs. The cognitive capability and reconfigurability of CR-MANETs have opened up several areas of research that have been explored extensively and continue to attract research and development. The book describes the concepts, intrinsic properties and research challenges of CR-MANETs. Distributed spectrum management functionalities, such as spectrum sensing and sharing, are presented. The design, optimization and performance evaluation of security issues and of the upper layers in CR-MANETs, such as the transport and application layers, are investigated.
Readers will progress from an understanding of what the Internet is now towards an understanding of the motivations and techniques that will drive its future.
You may like...
News Search, Blogs and Feeds - A Toolkit
Lars Vage, Lars Iselid
Paperback
R1,332