Welcome to Loot.co.za!
Showing 1 - 14 of 14 matches in All Departments
The new multimedia standards (for example, MPEG-21) facilitate the seamless integration of multiple modalities into interoperable multimedia frameworks, transforming the way people work and interact with multimedia data. These key technologies and multimedia solutions interact and collaborate with each other in increasingly effective ways, contributing to the multimedia revolution and having a significant impact across a wide spectrum of consumer, business, healthcare, education, and governmental domains. This book aims to provide complete coverage of the areas outlined and to bring together researchers from academia and industry as well as practitioners to share ideas, challenges, and solutions relating to the multifaceted aspects of this field.
Temporal Data Mining via Unsupervised Ensemble Learning provides the principal knowledge of temporal data mining in association with unsupervised ensemble learning and addresses the fundamental problems of temporal data clustering from different perspectives. By presenting three proposed ensemble approaches to temporal data clustering, this book offers a practical focus on fundamental knowledge and techniques, along with a rich blend of theory and practice. Furthermore, the book includes illustrations of the proposed approaches based on data and simulation experiments to demonstrate all methodologies, and is a guide to the proper usage of these methods. As no universal approach can solve all problems, it is important to understand the characteristics of both the clustering algorithms and the target temporal data so that the correct approach can be selected for a given clustering problem. Scientists, researchers, and data analysts working with machine learning and data mining will benefit from this innovative book, as will undergraduate and graduate students following courses in computer science, engineering, and statistics.
Computation and Storage in the Cloud is the first comprehensive and systematic work investigating the issue of the computation and storage trade-off in the cloud in order to reduce the overall application cost. Scientific applications are usually computation- and data-intensive, where complex computation tasks take a long time to execute and the generated datasets are often terabytes or petabytes in size. Storing valuable generated application datasets saves their regeneration cost when they are reused, not to mention the waiting time caused by regeneration. However, the large size of scientific datasets is a big challenge for their storage. By proposing innovative concepts, theorems and algorithms, this book will help bring the cost down dramatically for both cloud users and service providers running computation- and data-intensive scientific applications in the cloud.
- Covers cost models and benchmarking that explain the necessary trade-offs for both cloud providers and users
- Describes several novel strategies for storing application datasets in the cloud
- Includes real-world case studies of scientific research applications
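The storage-versus-regeneration trade-off described above can be sketched as a simple cost comparison. The following is only an illustrative model with hypothetical parameters (dataset size, price per GB-month, regeneration cost, reuse count), not the book's actual cost model:

```python
# Illustrative sketch of the store-vs-regenerate decision: store a generated
# dataset only if its cumulative storage cost is below the expected cost of
# regenerating it on demand. All parameter names and values are hypothetical.

def should_store(size_gb, storage_cost_per_gb_month, months,
                 regen_cost, expected_reuses):
    """Return True if storing the dataset is cheaper than regenerating it."""
    storage_total = size_gb * storage_cost_per_gb_month * months
    regen_total = regen_cost * expected_reuses
    return storage_total < regen_total

# A 1000 GB dataset kept 12 months at $0.02/GB-month costs $240 to store;
# regenerating it twice at $500 per run would cost $1000, so storing wins.
print(should_store(1000, 0.02, 12, 500, 2))  # True
```

A real cloud cost model would also account for regeneration waiting time and for datasets whose regeneration depends on other stored or deleted datasets, which is where the book's theorems and algorithms come in.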
Cloud computing can provide virtually unlimited, scalable, high-performance computing resources. Cloud workflows often underlie many large-scale data- and computation-intensive e-science applications such as earthquake modelling, weather forecasting and astrophysics. During application modelling, these sophisticated processes are redesigned as cloud workflows, and at runtime, the models are executed by employing the supercomputing and data-sharing ability of the underlying cloud computing infrastructures. "Temporal QoS Management in Scientific Cloud Workflow Systems"
focuses on real world scientific applications which often must be
completed by satisfying a set of temporal constraints such as
milestones and deadlines. Meanwhile, activity duration, as a
measurement of system performance, often needs to be monitored and
controlled. This book demonstrates how to guarantee on-time
completion of most, if not all, workflow applications. Offering a
comprehensive framework to support the lifecycle of
time-constrained workflow applications, this book will enhance the
overall performance and usability of scientific cloud workflow
systems.
Cloud computing is the latest market-oriented computing paradigm which brings software design and development into a new era characterized by "XaaS", i.e. everything as a service. Cloud workflows, as typical software applications in the cloud, are composed of a set of partially ordered cloud software services to achieve specific goals. However, due to the low QoS (quality of service) nature of the cloud environment, the design of workflow systems in the cloud becomes a challenging issue for the delivery of high-quality cloud workflow applications. To address this issue, this book presents a systematic investigation of the three critical aspects of the design of a cloud workflow system, viz. system architecture, system functionality and quality of service. Specifically, the system architecture for a cloud workflow system is designed based on the general four-layer cloud architecture, viz. application layer, platform layer, unified resources layer and fabric layer. The system functionality for a cloud workflow system is designed based on the general workflow reference model but with significant extensions to accommodate software services in the cloud. The support of QoS is critical for the quality of cloud workflow applications. This book presents a generic framework to facilitate a unified design and development process for software components that deliver lifecycle support for different QoS requirements. While the general QoS requirements for cloud workflow applications can have many dimensions, this book mainly focuses on three of the most important ones, viz. performance, reliability and security. In this book, the architecture, functionality and QoS management of the SwinDeW-C prototype cloud workflow system are demonstrated in detail as a case study to evaluate the generic design for cloud workflow systems.
To conclude, this book offers a general overview of cloud workflow systems and provides comprehensive introductions to the design of the system architecture, system functionality and QoS management.
Design of complex artifacts and systems requires the cooperation of multidisciplinary design teams using multiple sophisticated commercial and non-commercial engineering tools such as CAD tools, modeling, simulation and optimization software, engineering databases, and knowledge-based systems. Individuals or individual groups of multidisciplinary design teams usually work in parallel and independently with various engineering tools, which are located on different sites, often for quite a long period of time. At any moment, individual members may be working on different versions of a design or viewing the design from various perspectives, at different levels of detail. In order to meet these requirements, it is necessary to have efficient computer-supported collaborative design systems. These systems should not only automate individual tasks, in the manner of traditional computer-aided engineering tools, but also enable individual members to share information, collaborate, and coordinate their activities within the context of a design project. Based on close international collaboration between the University of Technology of Compiegne in France and the Institute of Computing Technology of the Chinese Academy of Sciences in the early 1990s, a series of international workshops on CSCW in Design started in 1996. In order to facilitate the organization of these workshops, an International Working Group on CSCW in Design (CSCWD) was established and an International Steering Committee was formed in 1998. The series was converted to international conferences in 2000, building on the success of the four previous workshops.
This book constitutes the refereed proceedings of the joint 9th Asia-Pacific Web Conference, APWeb 2007, and the 8th International Conference on Web-Age Information Management, WAIM 2007, held in Huang Shan, China in June 2007. The 47 revised full papers and 36 revised short papers presented
together with 4 invited papers and the abstracts of 4 keynote
papers were carefully reviewed and selected from a total of 554
submissions. The papers are organized in topical sections on data
mining and knowledge discovery, information retrieval, P2P systems,
sensor networks, spatial and temporal databases, Web mining, XML
and semi-structured data, sensor networks and grids, query
processing and optimization, data streams, data integration and
collaborative systems, data mining and e-learning, data mining,
privacy and security, as well as data mining and data
streams.
This book constitutes the refereed combined proceedings of four international workshops held in conjunction with the joint 9th Asia-Pacific Web Conference, APWeb 2007, and the 8th International Conference on Web-Age Information Management, WAIM 2007, held in Huang Shan, China in June 2007 (see LNCS 4505). The 50 revised full papers and 25 revised short papers presented together with the abstract of 1 keynote talk were carefully reviewed and selected from a total of 266 submissions. The papers of the four workshops are very specific and contribute to enlarging the spectrum of the more general topics treated in the APWeb 2007 and WAIM 2007 main conferences. Topics addressed by the workshops are: Database Management and Application over Networks (DBMAN 2007), Emerging Trends of Web Technologies and Applications (WebETrends 2007), Process Aware Information Systems (PAIS 2007), and Application and Security Service in Web and Pervasive Environments (ASWAN 2007).
Shao-yun Yang challenges assumptions that the cultural and socioeconomic watershed of the Tang-Song transition (800–1127 CE) was marked by a xenophobic or nationalist hardening of ethnocultural boundaries in response to growing foreign threats. In that period, reinterpretations of Chineseness and its supposed antithesis, “barbarism,” were not straightforward products of political change but had their own developmental logic based in two interrelated intellectual shifts among the literati elite: the emergence of Confucian ideological and intellectual orthodoxy and the rise of neo-Confucian (daoxue) philosophy. New discourses emphasized the fluidity of the Chinese-barbarian dichotomy, subverting the centrality of cultural or ritual practices to Chinese identity and redefining the essence of Chinese civilization and its purported superiority. The key issues at stake concerned the acceptability of intellectual pluralism in a Chinese society and the importance of Confucian moral values to the integrity and continuity of the Chinese state. Through close reading of the contexts and changing geopolitical realities in which new interpretations of identity emerged, this intellectual history engages with ongoing debates over the relevance of the concepts of culture, nation, and ethnicity to premodern China.
With the rapid growth of Cloud computing, the size of Cloud data is expanding at a dramatic speed. A huge amount of data is generated and processed by Cloud applications, putting a higher demand on Cloud storage. While data reliability is already a baseline requirement, data in the Cloud also needs to be stored in a highly cost-effective manner. This book focuses on the trade-off between data storage cost and data reliability assurance for big data in the Cloud. Across the whole Cloud data lifecycle, four major features are presented: first, a novel generic data reliability model for describing data reliability in the Cloud; second, a minimum replication calculation approach for meeting a given data reliability requirement to facilitate data creation; third, a novel cost-effective data reliability assurance mechanism for big data maintenance, which could dramatically reduce the storage space needed in the Cloud; fourth, a cost-effective strategy for facilitating data creation and recovery, which could significantly reduce the energy consumption during data transfer.
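The minimum-replication idea mentioned above can be illustrated with a textbook calculation: assuming independent replica failures with a known per-replica reliability, find the smallest replica count that meets a target reliability. This is only a hedged sketch with hypothetical numbers, not the book's generic data reliability model:

```python
import math

# Illustrative sketch: smallest replica count k such that the combined
# reliability 1 - (1 - r)^k reaches a target R, assuming each replica
# survives independently with probability r. Parameters are hypothetical.

def min_replicas(replica_reliability, target_reliability):
    r, R = replica_reliability, target_reliability
    if not (0 < r < 1 and 0 < R < 1):
        raise ValueError("reliabilities must be in the open interval (0, 1)")
    # 1 - (1 - r)^k >= R  <=>  k >= log(1 - R) / log(1 - r)
    return math.ceil(math.log(1 - R) / math.log(1 - r))

# With 99%-reliable replicas and a 99.999% target, three replicas suffice.
print(min_replicas(0.99, 0.99999))  # 3
```

A cost-aware mechanism like the one the book describes would go further, trading replica count against storage cost over the data lifecycle rather than fixing a single target.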