Internet heterogeneity is driving a new challenge in application development: adaptive software. Alongside increased Internet capacity and new access technologies, network congestion, the continued use of older technologies, wireless access, and peer-to-peer networking are all increasing the heterogeneity of the Internet. Applications should provide gracefully degraded levels of service when network conditions are poor, and enhanced services when network conditions exceed expectations. Existing adaptive technologies, which are primarily end-to-end or proxy-based and often focus on a single deficient link, can perform poorly in heterogeneous networks. Instead, heterogeneous networks frequently require multiple, coordinated, and distributed remedial actions.

Conductor: Distributed Adaptation for Heterogeneous Networks describes a new approach to graceful degradation in the face of network heterogeneity: distributed adaptation, in which adaptive code is deployed at multiple points within a network. The feasibility of this approach is demonstrated by Conductor, a middleware framework that enables distributed adaptation of connection-oriented, application-level protocols. By adapting protocols, Conductor provides application-transparent adaptation, supporting both existing applications and applications designed with adaptation in mind. The book introduces new techniques that make distributed adaptation automatic, reliable, and secure. In particular, it introduces the notion of semantic segmentation, which maintains exactly-once delivery of the semantic elements of a data stream while allowing the stream to be arbitrarily adapted in transit. It also introduces a secure architecture for automatic adaptor selection, protecting user data from unauthorized adaptation. These techniques are described both in the context of Conductor and in the broader context of distributed systems.

Finally, the book presents empirical evidence from several case studies indicating that distributed adaptation allows applications to degrade gracefully in heterogeneous networks, providing a higher quality of service to users than other adaptive techniques. Further, experimental results indicate that the proposed techniques can be employed without excessive cost. Thus, distributed adaptation is both practical and beneficial. The book is designed to meet the needs of a professional audience of researchers and practitioners in industry, and of graduate-level students in computer science.
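The blurb's central mechanism, semantic segmentation, lends itself to a small illustration. The sketch below is a minimal rendering of the idea as described, not Conductor's actual API: each semantic element of a stream carries a stable identity, so in-network adaptors may rewrite the payload in transit while the receiver still enforces exactly-once delivery per element.

```python
# Illustrative sketch (not Conductor's real API): semantic segmentation
# tags each semantic unit of a stream with a stable ID, so adaptors may
# transform the bytes in transit while the receiver still delivers each
# semantic element exactly once.

from dataclasses import dataclass

@dataclass
class Segment:
    seq: int        # stable identity of the semantic element
    payload: bytes  # content, which adaptors are free to transform

def segment_stream(elements):
    """Sender: wrap each semantic element with a sequence number."""
    return [Segment(seq, data) for seq, data in enumerate(elements)]

def lossy_adaptor(segment: Segment) -> Segment:
    """An in-network adaptor may arbitrarily transform the payload
    (e.g., downsample an image) but must preserve segment identity."""
    return Segment(segment.seq, segment.payload[:16])  # crude "compression"

class Receiver:
    def __init__(self):
        self.delivered = set()

    def accept(self, segment: Segment):
        """Deliver each semantic element exactly once, even if the
        network retransmits it through a different adaptor path."""
        if segment.seq in self.delivered:
            return None  # duplicate: this element was already delivered
        self.delivered.add(segment.seq)
        return segment.payload

# Usage: a retransmitted element is suppressed despite altered bytes.
rx = Receiver()
for seg in segment_stream([b"hello world, this is element 0", b"element 1"]):
    rx.accept(lossy_adaptor(seg))
print(rx.accept(Segment(0, b"retransmit")))  # -> None (exactly-once)
```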
ICT-D refers to the trend in development thinking and practice that regards the deployment of new technologies such as computers, mobile phones, and the Internet as important for spurring economic growth, enabling good governance, and facilitating human development. In this context, telecentres have emerged as an immensely popular strategy for providing shared and mediated access to ICTs. The rapid proliferation of telecentres in rural India during the last decade was driven by multiple agencies, each with its own purpose, priorities, and pre-designed set of services. This volume juxtaposes the global discourse on ICT-D and telecentres with in-depth empirical case studies on the patterns of access and use of telecentres in rural India to draw implications for policy and practice.
Healthcare is significantly affected by technological advancements, as technology both shapes and changes health systems locally and globally. As computer science, information technology, and healthcare merge, it is important to understand the current and future implications of health informatics. Healthcare and the Effect of Technology: Developments, Challenges and Advancements bridges the gap between today's empirical research findings and healthcare practice. It provides the reader with information on current technological integrations, potential uses for technology in healthcare, and the implications, both positive and negative, of health informatics for one's health. Technology in healthcare can improve efficiency, make patient records more accessible, increase professional communication, create global health networking, and increase access to healthcare. However, it is important to consider the ethical, confidentiality, and cultural implications that technology in healthcare may impose. That is what makes this book a must-read for policymakers, human resource professionals, and management personnel, as well as for researchers, scholars, students, and healthcare professionals.
This book represents the compilation of papers presented at the IFIP Working Group 8.2 conference entitled "Information Technology in the Service Economy: Challenges and Possibilities for the 21st Century." The conference took place at Ryerson University, Toronto, Canada, on August 10-13, 2008. Participation in the conference spanned the continents from Asia to Europe, with paper submissions global in focus as well. Conference submissions included completed research papers and research-in-progress reports. Papers submitted to the conference went through a double-blind review process in which the program co-chairs, an associate editor, and reviewers provided assessments and recommendations. The editorial efforts of the associate editors and reviewers in this process were outstanding. To foster high-quality research publications in this field of study, authors of accepted papers were then invited to revise and resubmit their work. Through this rigorous review and revision process, 12 completed research papers and 11 research-in-progress reports were accepted for presentation and publication. Paper workshop sessions were also established to provide authors of emergent work an opportunity to receive feedback from the IFIP 8.2 community. Abstracts of these new projects are included in this volume. Four panels were presented at the conference to provide discussion forums for the varied aspects of IT, service, and globalization. Panel abstracts are also included here.
Computer-based information technologies have been extensively used to help industries manage their processes, and information systems have thereby become their nervous center. More specifically, databases are designed to support the data storage, processing, and retrieval activities related to data management in information systems. Database management systems provide efficient task support, and database systems are the key to implementing industrial data management. Industrial data management requires database technique support. Industrial applications, however, are typically data- and knowledge-intensive and have some unique characteristics that make their management difficult. Besides, some new techniques, such as the Web and artificial intelligence, have been introduced into industrial applications. These unique characteristics and the usage of new technologies have placed many potential requirements on industrial data management, which challenge today's database systems and promote their evolution. Viewed from database technology, information modeling in databases can be identified at two levels: (conceptual) data modeling and (logical) database modeling. This results in a conceptual (semantic) data model and a logical database model. Generally, a conceptual data model is designed first, and the designed conceptual data model is then transformed into a chosen logical database schema. Database systems based on the logical database model are used to build information systems for data management. Much attention has been directed at conceptual data modeling of industrial information systems. Product data models, for example, can be viewed as a class of semantic data models.
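The transformation step this blurb describes, from a conceptual data model to a logical database schema, can be made concrete with a deliberately small sketch. The entity description and naming scheme below are illustrative assumptions, not the book's notation.

```python
# Hedged illustration: mapping one conceptual entity to a logical
# relational schema (a CREATE TABLE statement). The entity, attribute
# types, and naming convention are our own assumptions.

def entity_to_ddl(entity, attributes, key):
    """Map one conceptual entity to a CREATE TABLE statement."""
    cols = [f"  {name} {sqltype}" for name, sqltype in attributes]
    cols.append(f"  PRIMARY KEY ({key})")
    return f"CREATE TABLE {entity} (\n" + ",\n".join(cols) + "\n);"

# Usage: a 'product' entity from a conceptual product data model.
print(entity_to_ddl(
    "product",
    [("product_id", "INTEGER"), ("name", "VARCHAR(80)"), ("mass_kg", "REAL")],
    "product_id",
))
```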
Living theory is a way of making use of personal accounts of experienced practice. Because the Pac-Man perspective on organisational change helps the change agent articulate the personal values he is committed to, and how these values may be resisted in practice, living theory is useful for developing knowledge that has a practical impact on self-improvement and social change. It is, however, a type of theory that is difficult to publish in academic outlets. As a consequence, publishing Pac-Man living-theory research becomes a Pac-Man game in itself, with journal editors as one of the four adversary gatekeepers; but it is a rewarding game for those who want to contribute both theoretically and practically to making the world a better place.
Collaborative Networks for a Sustainable World. Aiming to reach a sustainable world calls for wider collaboration among multiple stakeholders from different origins, as the changes needed for sustainability exceed the capacity and capability of any individual actor. In recent years there has been a growing awareness, both in the political sphere and in civil society, including the business sectors, of the importance of sustainability. Therefore, this is an important and timely research issue, not only in terms of systems design but also as an effort to borrow and integrate contributions from different disciplines when designing and/or governing those systems. The discipline of collaborative networks especially, which has already emerged in many application sectors, shall play a key role in the implementation of effective sustainability strategies. PRO-VE 2010 focused on sharing knowledge and experiences as well as identifying directions for further research and development in this area. The conference addressed models, infrastructures, support tools, and governance principles developed for collaborative networks, as important resources to support multi-stakeholder sustainable developments. Furthermore, the challenges of this theme open new research directions for CNs. PRO-VE 2010 was held in St. Etienne, France.
The book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general, and specifically the future of high-performance systems and heterogeneous architectures. The application contributions cover computational fluid dynamics, material science, medical applications, and climate research. Innovative fields like coupled multi-physics and multi-scale simulations are presented. All papers were chosen from presentations given at the 14th Teraflop Workshop, held in December 2011 at HLRS, University of Stuttgart, Germany, and at the Workshop on Sustained Simulation Performance at Tohoku University in March 2012.
Evolutionary computation has emerged as a major topic in the scientific community, as many of its techniques have been successfully applied to solve problems in a wide variety of fields. Modeling Applications and Theoretical Innovations in Interdisciplinary Evolutionary Computation provides comprehensive research on emerging theories and their application to intelligent computation. Focusing particularly on emerging trends in evolutionary computing, algorithms, and programming, this publication serves professionals, government employees, policy and decision makers, as well as students in this scientific field.
Do Smart Adaptive Systems Exist? is intended as a reference and a guide summarising and focusing on best practices when using intelligent techniques and building systems requiring a degree of adaptation and intelligence. It is therefore not intended as a collection of the most recent research results, but as a practical guide for experts from other areas and industrial users interested in building solutions to their problems using intelligent techniques. One of the main issues covered is an attempt to answer the question of how to select and/or combine suitable intelligent techniques from a large pool of potential solutions. Another attractive feature of the book is that it brings together experts from the neural network, fuzzy, machine learning, evolutionary, and hybrid systems communities, who provide their views on how these different intelligent technologies have contributed and will contribute to the creation of smart adaptive systems of the future.
Over the past years, business schools have been experimenting with distance learning and online education. In many cases this new technology has not brought the anticipated results. The questions raised by online education can be linked to the fundamental problem of education and teaching, and more specifically to the models and philosophy of education and teaching. Virtual Corporate Universities: A Matrix of Knowledge and Learning for the New Digital Dawn offers a source of new thinking about those processes in view of the use of new technologies. Learning is considered a key strategic tool for new strategies, innovation, and significantly improving organizational effectiveness. The book blends the elements of knowledge management with organizational and individual learning. It is not just a treatment of technology, but a fusion of a novel, dynamic, learner (student)-driven learning concept, the management and creation of dynamic knowledge, and the application of next-generation technologies to generic business, organizational, and managerial processes and to the development of human capital. Obviously, the implications of online learning go far beyond the field of business as presented in this book.
Over the last five to six years, ontology has received increased attention within the information systems field. Ontology provides a basis for evaluating, analyzing, and engineering business analysis methods, and this approach has allowed many organizations utilizing ontology to become more competitive within today's global environment. Business Systems Analysis with Ontologies thoroughly examines the area of ontologies. All aspects of ontologies are covered: the analysis, evaluation, and engineering of business systems analysis methods. Readers are shown the world of ontologies through a number of research methods. For example, survey methodologies, case studies, experimental methodologies, analytical modeling, and field studies are all used within this book to help the reader understand the usefulness of ontologies.
As more and more hardware platforms support parallelism, parallel programming is gaining momentum. Applications can only leverage the performance of multi-core processors or graphics processing units if they are able to split a problem into smaller ones that can be solved in parallel. The challenges emerging from the development of parallel applications have led to a great number of tools for debugging, performance analysis, and other tasks. The proceedings of the 3rd International Workshop on Parallel Tools for High Performance Computing provide a technical overview in order to help engineers, developers, and computer scientists decide which tools are best suited to enhancing their current development processes.
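The point about splitting a problem into independently solvable pieces can be illustrated with a minimal sketch; the workload (counting primes) and the chunking scheme below are illustrative choices, not drawn from the workshop proceedings.

```python
# A minimal illustration of the blurb's point: a problem only benefits
# from multiple cores once it is split into independent sub-problems.
# The worker function and chunk sizes here are our own assumptions.

from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi); each chunk is independent of the rest."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    n, workers = 200_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:          # one process per chunk
        total = sum(pool.map(count_primes, chunks))
    print(total)
```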
We are extremely pleased to present a comprehensive book comprising a collection of research papers which is basically an outcome of the Second IFIP TC 13.6 Working Group conference on Human Work Interaction Design, HWID2009. The conference was held in Pune, India, during October 7-8, 2009. It was hosted by the Centre for Development of Advanced Computing, India, and jointly organized with Copenhagen Business School, Denmark; Aarhus University, Denmark; and the Indian Institute of Technology, Guwahati, India. The theme of HWID2009 was Usability in Social, Cultural and Organizational Contexts. The conference was held under the auspices of IFIP TC 13 on Human-Computer Interaction. The committees under IFIP include the Technical Committee TC13 on Human-Computer Interaction, within which the work of this volume has been conducted. TC13 aims to encourage theoretical and empirical human science research to promote the design and evaluation of human-oriented ICT. Within TC13 there are different working groups concerned with different aspects of human-computer interaction. The flagship event of TC13 is the biennial international conference called INTERACT, at which both invited and contributed papers are presented. Contributed papers are rigorously refereed, and the rejection rate is high.
Covering the years 2008-2012, this book profiles the life and work of recent winners of the Abel Prize. The book also presents a history of the Abel Prize written by the historian Kim Helsvig, and includes a facsimile of a letter from Niels Henrik Abel, which is transcribed, translated into English, and placed into historical perspective by Christian Skau. This book follows on The Abel Prize: 2003-2007, The First Five Years (Springer, 2010), which profiles the work of the first Abel Prize winners.
High-speed, power-efficient analog integrated circuits can be used as standalone devices or to interface modern digital signal processors and micro-controllers in various applications, including multimedia, communication, instrumentation, and control systems. New architectures and the low device geometry of complementary metal-oxide-semiconductor (CMOS) technologies have accelerated the movement toward system-on-a-chip design, which merges analog circuits with digital and radio-frequency components.
Since the COVID-19 pandemic, researchers have been evaluating the healthcare system for improvements that can be made. Understanding how global healthcare systems operate is essential if preventative measures are to be taken before the next global health crisis. A key part of bettering healthcare is the implementation of information management and One Health. The Handbook of Research on Information Management and One Health evaluates concepts in global health and the application of essential information management in healthcare organizations' strategic contexts. This text promotes understanding of how health evaluation and information management are decisive for health planning, management, and the implementation of the One Health concept. Covering topics like development partnerships, global health, and the nature of pandemics, this text is essential for health administrators, policymakers, government officials, public health officials, information systems experts, data scientists, analysts, health information science and global health scholars, researchers, practitioners, doctors, students, and academicians.
A presentation of the central and basic concepts, techniques, and tools of computer science, with the emphasis on a problem-solving approach and on surveying the most important topics covered in degree programmes. Scheme is used throughout as the programming language, and the author stresses a functional programming approach: simple functions are created to achieve the desired programming goal, and such simple functions are easily tested individually, which greatly helps in producing programs that work correctly the first time. Throughout, the author provides aids to writing programs and makes liberal use of boxes with "Mistakes to Avoid." Programming examples include: abstracting a problem; creating pseudo-code as an intermediate solution; top-down and bottom-up design; building procedural and data abstractions; and writing programs in modules which are easily testable. Numerous exercises help readers test their understanding of the material and develop ideas in greater depth, making this an ideal first course for all students coming to computer science for the first time.
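The book works in Scheme; as a hedged illustration of the same discipline, the Python sketch below builds tiny pure functions, checks each in isolation, and only then composes them, which is what makes the composed program likely to work the first time. All names here are ours, not the book's.

```python
# Illustrative sketch of the book's methodology in Python: small pure
# functions, each tested on its own before being composed.

def fahrenheit_to_celsius(f):
    """One small, pure step that is trivial to test in isolation."""
    return (f - 32) * 5 / 9

def average(values):
    """Another independent building block."""
    return sum(values) / len(values)

def average_celsius(fahrenheit_readings):
    """Compose the tested pieces to reach the actual goal."""
    return average([fahrenheit_to_celsius(f) for f in fahrenheit_readings])

# Each unit is checked separately, so the composition works first time.
assert fahrenheit_to_celsius(212) == 100
assert average([1, 2, 3]) == 2
print(average_celsius([32, 212]))  # -> 50.0
```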
This volume is a post-conference publication of the 4th World Congress on Social Simulation (WCSS), with contents selected from among the 80 papers originally presented at the conference. WCSS is a biennial event, jointly organized by three scientific communities in computational social science, namely, the Pacific-Asian Association for Agent-Based Approach in Social Systems Sciences (PAAA), the European Social Simulation Association (ESSA), and the Computational Social Science Society of the Americas (CSSSA). It is, therefore, currently the most prominent conference in the area of agent-based social simulation. The papers selected for this volume give a holistic view of the current development of social simulation, indicating the directions for future research and creating an important archival document and milestone in the history of computational social science. Specifically, the papers included here cover substantial progress in artificial financial markets, macroeconomic forecasting, supply chain management, bank networks, social networks, urban planning, social norms and group formation, cross-cultural studies, political party competition, voting behavior, computational demography, computational anthropology, evolution of languages, public health and epidemics, AIDS, security and terrorism, methodological and epistemological issues, empirical agent-based modeling, modeling of experimental social science, gaming simulation, cognitive agents, and participatory simulation. Furthermore, pioneering studies in some new research areas, such as the theoretical foundations of social simulation and categorical social science, are also included in the volume.
With the ever-increasing speed of integrated circuits, violations of performance specifications are becoming a major factor affecting product quality. The need for testing timing defects is expected to grow further with the current design trend of moving toward deep submicron devices. After a long period of prevailing belief that high stuck-at fault coverage is sufficient to guarantee high quality of shipped products, the industry is now forced to rethink other types of testing. Delay testing has been a topic of extensive research both in industry and in academia for more than a decade. As a result, several delay fault models and numerous testing methodologies have been proposed. Delay Fault Testing for VLSI Circuits presents a selection of existing delay testing research results. It combines introductory material with state-of-the-art techniques that address some of the current problems in delay testing. The book covers basic topics such as fault modeling and test application schemes for detecting delay defects. It also presents summaries and conclusions of several recent case studies and experiments related to delay testing. A selection of delay testing issues and test techniques such as delay fault simulation, test generation, design for testability, and synthesis for testability are also covered. Delay Fault Testing for VLSI Circuits is intended for use by CAD and test engineers, researchers, tool developers, and graduate students. It requires a basic background in digital testing. The book can be used as supplementary material for a graduate-level course on VLSI testing.
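One basic idea the book covers, detecting delay defects with two-pattern tests, can be sketched minimally. The toy circuit, fault site, and function names below are illustrative assumptions for this sketch, not the book's fault models: a slow-to-rise defect is caught by a vector pair that first drives the faulty line low, then high, and observes whether the output reflects the transition within the clock period.

```python
# Hedged sketch of the two-pattern idea behind delay-fault testing.
# Toy circuit (ours, not the book's): out = (a AND b) OR c, with a
# slow-to-rise delay fault modeled on the internal line (a AND b).

def circuit(a, b, c, slow_to_rise_ab=False, prev_ab=0):
    ab = a & b
    if slow_to_rise_ab and prev_ab == 0 and ab == 1:
        ab = 0  # the 0->1 transition did not complete within the clock period
    return ab | c, ab

def two_pattern_test(v1, v2, faulty):
    _, ab1 = circuit(*v1)                                   # initialization vector
    out, _ = circuit(*v2, slow_to_rise_ab=faulty, prev_ab=ab1)  # launch/capture
    return out

v1, v2 = (0, 1, 0), (1, 1, 0)  # launches a 0->1 transition on (a AND b)
print(two_pattern_test(v1, v2, faulty=False))  # -> 1 (good circuit)
print(two_pattern_test(v1, v2, faulty=True))   # -> 0 (delay fault caught)
```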
The building blocks of today's embedded systems-on-a-chip are complex IP components and programmable processor cores. This means that more and more system functionality is implemented in software rather than in custom hardware, which in turn indicates a growing need for high-level language compilers capable of generating efficient code for embedded processors. However, traditional compiler technology hardly keeps pace with new developments in embedded processor architectures. Many existing compilers for DSPs and multimedia processors therefore produce code of insufficient quality with respect to performance and/or code size, and a large part of software for embedded systems is still being developed in assembly languages. As both embedded software and processor architectures become more and more complex, assembly programming clearly violates the demands for a short time-to-market and high dependability in embedded system design. The goal of this book is to provide software and compiler developers with new methods and techniques that help them make the necessary step from assembly programming to the use of compilers in embedded system design. Code Optimization Techniques for Embedded Processors discusses the state of the art in the area of compilers for embedded processors. It presents a collection of new code optimization techniques dedicated to DSP and multimedia processors. These include: compiler support for DSP address generation units, efficient mapping of data flow graphs to irregular architectures, exploitation of SIMD and conditional instructions, and function inlining under code size constraints. Comprehensive experimental evaluations are given for real-life processors, indicating the code quality improvements that can be achieved compared to earlier techniques. In addition, C compiler frontend issues are discussed from a practical viewpoint. Code Optimization Techniques for Embedded Processors is intended for researchers and engineers active in software development for embedded systems, and for compiler developers in academia and industry.
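One of the listed techniques, function inlining under code size constraints, invites a small illustration. The greedy speedup-per-byte policy and the numbers below are our own assumptions, a sketch rather than the book's actual algorithm.

```python
# Hedged sketch of function inlining under a code-size budget: greedily
# inline the candidates with the best estimated speedup per byte of
# added code. The cost model here is an illustrative assumption.

def choose_inline_candidates(functions, size_budget):
    """functions: list of (name, size_increase, estimated_speedup).
    Returns the names chosen without exceeding size_budget."""
    ranked = sorted(functions, key=lambda f: f[2] / f[1], reverse=True)
    chosen, used = [], 0
    for name, size_increase, _speedup in ranked:
        if used + size_increase <= size_budget:
            chosen.append(name)
            used += size_increase
    return chosen

# Usage: with a 100-byte budget, the densest wins are inlined first.
candidates = [("saturate", 40, 12.0), ("dot", 80, 15.0), ("clip", 30, 12.0)]
print(choose_inline_candidates(candidates, 100))  # -> ['clip', 'saturate']
```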
This is the first book to treat two areas of speech synthesis: natural language processing and the inherent problems it presents for speech synthesis; and digital signal processing, with an emphasis on the concatenative approach. The text guides the reader through the material in a step-by-step, easy-to-follow way. The book will be of interest to researchers and students in phonetics and speech communication, in both academia and industry.
The field of high-performance computing achieved prominence through advances in electronic and integrated-circuit technologies beginning in the 1940s. Current times are very exciting, and the years to come will witness a proliferation of the use of parallel and distributed systems. The scientific and engineering application domains have a key role in shaping future research and development activities in academia and industry, especially when the solution of large and complex problems must cope with harder and harder timing constraints.