Homeland security information systems are an important area of inquiry due to the tremendous influence information systems have on the government's preparation for and response to a terrorist attack or natural disaster. "Homeland Security Preparedness and Information Systems: Strategies for Managing Public Policy" delves into the issues and challenges that public managers face in the adoption and implementation of information systems for homeland security. A defining collection of field advancements, this publication provides solutions for those interested in adopting additional information systems security measures in their governments.
Focusing on the critical role IT plays in organizational development, the book shows how to employ action learning to improve the competitiveness of an organization. Defining the current IT problem from an operational and strategic perspective, it presents a collection of case studies that illustrate key learning issues. It details a dynamic model for effective IT management through adaptive learning techniques, supplying proven educational theories and practices to foster the required changes in your staff. It examines existing organizational learning theories and the historical problems that occurred with companies that have used them, as well as those that have failed to use them.
This book covers the wide-ranging scientific areas of computational science, from basic research fields such as algorithms and soft-computing to diverse applied fields targeting macro, micro, nano, genome and complex systems. It presents the proceedings of the International Symposium on Frontiers of Computational Science 2005, held in Nagoya in December 2005.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about the vagueness of expectations appears to be adequate to real situations in which the coalitional bargaining anticipates a proper realization of the game with strategic behaviour of players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision-making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set theoretical approaches to game theoretical models of human behaviour.
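Vague pay-offs of the kind this blurb describes are commonly modelled with fuzzy quantities; one standard illustrative choice (not necessarily the book's exact definition) is the triangular fuzzy number:

```latex
% A triangular fuzzy quantity a = (a_l, a_m, a_r), a_l < a_m < a_r,
% has the piecewise-linear membership function
\mu_a(x) =
\begin{cases}
\dfrac{x - a_l}{a_m - a_l} & a_l \le x \le a_m,\\[4pt]
\dfrac{a_r - x}{a_r - a_m} & a_m \le x \le a_r,\\[4pt]
0 & \text{otherwise.}
\end{cases}
```

In a fuzzy coalition game of this style, the characteristic function then assigns each coalition $K$ a fuzzy quantity $v(K)$ rather than a crisp real pay-off, so that "the expected profit of $K$ is about $a_m$" can be expressed directly.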
Research argues that e-government technologies have positive influences on politics and democracy, improving citizens' environment as well as their engagement with their government. Although much research indicates that e-government technologies have increased citizen participation, there is much more that can be developed. Politics, Democracy and E-Government: Participation and Service Delivery examines how e-government impacts politics and democracy in both developed and developing countries, discussing the participation of citizens in government service delivery. This book brings forth the idea that e-government has a direct influence on the important function of governing through participation and service delivery. Containing chapters from leading e-government scholars and practitioners from across the globe, the overall objective of this book is accomplished through its discussion on the influences of e-government on democratic institutions and processes.
Sadly enough, war, conflicts and terrorism appear to stay with us in the 21st century. But what is our outlook on new methods for preventing and ending them? Present-day hard- and software enables the development of large crisis, conflict, and conflict management databases with many variables, sometimes with automated updates, statistical analyses of a high complexity, elaborate simulation models, and even interactive uses of these databases. In this book, these methods are presented, further developed, and applied in relation to the main issue: the resolution and prevention of intra- and international conflicts. Conflicts are a worldwide phenomenon. Therefore, internationally leading researchers from the USA, Austria, Canada, Germany, New Zealand and Switzerland have contributed.
The design process of embedded systems has changed substantially in recent years. One of the main reasons for this change is the pressure to shorten time-to-market when designing digital systems. To shorten the product cycles, programmable processors are used to implement more and more functionality of the embedded system. Therefore, nowadays, embedded systems are very often implemented by heterogeneous systems consisting of ASICs, processors, memories and peripherals. As a consequence, the research topic of hardware/software co-design, dealing with the problems of designing these heterogeneous systems, has gained great importance. Hardware/Software Co-design for Data Flow Dominated Embedded Systems introduces the different tasks of hardware/software co-design including system specification, hardware/software partitioning, co-synthesis and co-simulation. The book summarizes and classifies state-of-the-art co-design tools and methods for these tasks. In addition, the co-design tool COOL is presented which solves the co-design tasks for the class of data-flow dominated embedded systems. In Hardware/Software Co-design for Data Flow Dominated Embedded Systems the primary emphasis has been put on the hardware/software partitioning and the co-synthesis phase and their coupling. In contrast to many other publications in this area, a mathematical formulation of the hardware/software partitioning problem is given. This problem formulation supports target architectures consisting of multiple processors and multiple ASICs. Several novel approaches are presented and compared for solving the partitioning problem, including an MILP approach, a heuristic solution and an approach based on genetic algorithms. The co-synthesis phase is based on the idea of controlling the system by means of a static run-time scheduler implemented in hardware. New algorithms are introduced which generate a complete set of hardware and software specifications required to implement heterogeneous systems.
All of these techniques are described in detail and exemplified. Hardware/Software Co-design for Data Flow Dominated Embedded Systems is intended to serve students and researchers working on hardware/software co-design. At the same time the variety of presented techniques automating the design tasks of hardware/software systems will be of interest to industrial engineers and designers of digital systems. From the foreword by Peter Marwedel: "Niemann's method should be known by all persons working in the field. Hence, I recommend this book for everyone who is interested in hardware/software co-design."
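To give a feel for the partitioning problem the blurb mentions, here is a minimal greedy sketch under an invented cost model: each task runs faster in hardware but consumes ASIC area, and tasks are moved to hardware until a deadline is met. This is illustrative only; it is not the COOL tool's MILP or genetic-algorithm formulation, and all task names and numbers are hypothetical.

```python
def partition(tasks, area_budget, deadline):
    """Greedy HW/SW partitioning sketch (hypothetical cost model).

    tasks: list of (name, sw_time, hw_time, hw_area) tuples.
    Returns the hardware set, the software set, and total latency,
    assuming tasks execute sequentially."""
    hw, sw = set(), {t[0] for t in tasks}
    total = sum(t[1] for t in tasks)     # start with everything in software
    area = 0
    # Rank candidates by time saved per unit of silicon area.
    ranked = sorted(tasks, key=lambda t: (t[1] - t[2]) / t[3], reverse=True)
    for name, sw_t, hw_t, a in ranked:
        if total <= deadline:
            break                        # deadline already met
        if area + a <= area_budget and sw_t > hw_t:
            hw.add(name)
            sw.discard(name)
            area += a
            total -= sw_t - hw_t         # latency saved by moving to HW
    return hw, sw, total

# Hypothetical task set: (name, software time, hardware time, ASIC area).
tasks = [("fft", 50, 5, 30), ("ctrl", 10, 8, 20), ("filter", 40, 10, 25)]
hw, sw, t = partition(tasks, area_budget=60, deadline=40)
```

With these numbers the "fft" and "filter" tasks are pulled into hardware and "ctrl" stays in software. A real co-synthesis flow would additionally schedule communication between the partitions, which this sketch ignores.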
Introduction The exponential scaling of feature sizes in semiconductor technologies has side-effects on layout optimization, related to effects such as interconnect delay, noise and crosstalk, signal integrity, parasitic effects, and power dissipation, that invalidate the assumptions that form the basis of previous design methodologies and tools. This book is intended to sample the most important, contemporary, and advanced layout optimization problems emerging with the advent of very deep submicron technologies in semiconductor processing. We hope that it will stimulate more people to perform research that leads to advances in the design and development of more efficient, effective, and elegant algorithms and design tools. Organization of the Book The book is organized as follows. A multi-stage simulated annealing algorithm that integrates floorplanning and interconnect planning is presented in Chapter 1. To reduce the run time, different interconnect planning approaches are applied in different ranges of temperatures. Chapter 2 introduces a new design methodology, the interconnect-centric design methodology, and its centerpiece, interconnect planning, which consists of physical hierarchy generation, floorplanning with interconnect planning, and interconnect architecture planning. Chapter 3 investigates a net-cut minimization based placement tool, Dragon, which integrates state-of-the-art partitioning and placement techniques.
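As a toy illustration of the simulated-annealing style of placement mentioned in Chapter 1's summary, the sketch below places cells on a one-dimensional row and minimises total wirelength of two-pin nets by swapping cells, accepting uphill moves with a temperature-dependent probability. It is a generic textbook annealer, not the book's multi-stage algorithm (which integrates interconnect planning), and the cells and nets are invented.

```python
import math
import random

def wirelength(order, nets):
    """Total 1-D wirelength of two-pin nets for a given cell ordering."""
    pos = {cell: i for i, cell in enumerate(order)}
    return sum(abs(pos[a] - pos[b]) for a, b in nets)

def anneal(cells, nets, t0=10.0, cooling=0.995, steps=2000, seed=0):
    """Classic simulated annealing over cell orderings (illustrative)."""
    rng = random.Random(seed)
    order = list(cells)
    cost = wirelength(order, nets)
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]   # propose a swap
        new = wirelength(order, nets)
        if new <= cost or rng.random() < math.exp((cost - new) / t):
            cost = new                            # accept (possibly uphill)
        else:
            order[i], order[j] = order[j], order[i]  # reject: undo swap
        t *= cooling                              # geometric cooling schedule
    return order, cost

cells = list("abcde")
nets = [("a", "e"), ("b", "d"), ("a", "c")]       # hypothetical netlist
order, cost = anneal(cells, nets)
```

The multi-stage idea the book describes would vary what is evaluated inside the accept/reject loop as the temperature falls, using cheaper interconnect estimates at high temperatures and more accurate ones near convergence.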
Covering the years 2008-2012, this book profiles the life and work of recent winners of the Abel Prize. The book also presents a history of the Abel Prize written by the historian Kim Helsvig, and includes a facsimile of a letter from Niels Henrik Abel, which is transcribed, translated into English, and placed into historical perspective by Christian Skau. This book follows on The Abel Prize: 2003-2007, The First Five Years (Springer, 2010), which profiles the work of the first Abel Prize winners.
Facing the challenge of a fast-changing technological environment, many companies are developing an interest in the field of technology intelligence. Their aim is to support the decision-making process by taking advantage of the well-timed preparation of relevant information by means of systematic identification, collection, analysis, dissemination, and application of this information. This book fills the gap in the literature by showing how a technology intelligence system can be designed and implemented.
Reputation In Artificial Societies discusses the role of reputation in the achievement of social order. The book proposes that reputation is an agent property that results from transmission of beliefs about how the agents are evaluated with regard to a socially desirable conduct. This desirable conduct represents one or another of the solutions to the problem of social order and may consist of cooperation or altruism, reciprocity, or norm obedience.
This Proceedings Volume documents recent cutting-edge developments in multi-robot systems research and is the result of the Second International Workshop on Multi-Robot Systems that was held in March 2003 at the Naval Research Laboratory in Washington, D.C. This Workshop brought together top researchers working in areas relevant to designing teams of autonomous vehicles, including robots and unmanned ground, air, surface, and undersea vehicles. The workshop focused on the challenging issues of team architectures, vehicle learning and adaptation, heterogeneous group control and cooperation, task selection, dynamic autonomy, mixed initiative, and human and robot team interaction. A broad range of applications of this technology are presented in this volume, including UCAVs (Unmanned Combat Air Vehicles), micro-air vehicles, UUVs (Unmanned Underwater Vehicles), UGVs (Unmanned Ground Vehicles), planetary exploration, assembly in space, clean-up, and urban search and rescue. This Proceedings Volume represents the contributions of the top researchers in this field and serves as a valuable tool for professionals in this interdisciplinary field.
The book presents the state of the art in high performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general and specifically the future of high performance systems and heterogeneous architectures. The application contributions cover computational fluid dynamics, material science, medical applications and climate research. Innovative fields like coupled multi-physics or multi-scale simulations are presented. All papers were chosen from presentations given at the 14th Teraflop Workshop held in December 2011 at HLRS, University of Stuttgart, Germany and the Workshop on Sustained Simulation Performance at Tohoku University in March 2012.
Collaborative Networks for a Sustainable World Aiming to reach a sustainable world calls for wider collaboration among multiple stakeholders from different origins, as the changes needed for sustainability exceed the capacity and capability of any individual actor. In recent years there has been a growing awareness, both in the political sphere and in civil society including the business sectors, of the importance of sustainability. Therefore, this is an important and timely research issue, not only in terms of systems design but also as an effort to borrow and integrate contributions from different disciplines when designing and/or governing those systems. The discipline of collaborative networks especially, which has already emerged in many application sectors, shall play a key role in the implementation of effective sustainability strategies. PRO-VE 2010 focused on sharing knowledge and experiences as well as identifying directions for further research and development in this area. The conference addressed models, infrastructures, support tools, and governance principles developed for collaborative networks, as important resources to support multi-stakeholder sustainable developments. Furthermore, the challenges of this theme open new research directions for CNs. PRO-VE 2010 held in St.
This volume is a post-conference publication of the 4th World Congress on Social Simulation (WCSS), with contents selected from among the 80 papers originally presented at the conference. WCSS is a biennial event, jointly organized by three scientific communities in computational social science, namely, the Pacific-Asian Association for Agent-Based Approach in Social Systems Sciences (PAAA), the European Social Simulation Association (ESSA), and the Computational Social Science Society of the Americas (CSSSA). It is, therefore, currently the most prominent conference in the area of agent-based social simulation. The papers selected for this volume give a holistic view of the current development of social simulation, indicating the directions for future research and creating an important archival document and milestone in the history of computational social science. Specifically, the papers included here cover substantial progress in artificial financial markets, macroeconomic forecasting, supply chain management, bank networks, social networks, urban planning, social norms and group formation, cross-cultural studies, political party competition, voting behavior, computational demography, computational anthropology, evolution of languages, public health and epidemics, AIDS, security and terrorism, methodological and epistemological issues, empirical-based agent-based modeling, modeling of experimental social science, gaming simulation, cognitive agents, and participatory simulation. Furthermore, pioneering studies in some new research areas, such as the theoretical foundations of social simulation and categorical social science, also are included in the volume.
With the development of networked computing and the increased complexity of applications and software systems development, the importance of computer-supported collaborative work (CSCW) has dramatically increased. Globalization has further accentuated the necessity of collaboration, while the Web has made geographically distributed collaborative systems technologically feasible in a manner that was impossible until recently. The software environments needed to support such distributed teams are referred to as groupware. Groupware is intended to address the logistical, managerial, social, organizational and cognitive difficulties that arise in the application of distributed expertise. These issues represent the fundamental challenges to the next generation of process management. Computer-Supported Collaboration with Applications to Software Development reviews the theory of collaborative groups and the factors that affect collaboration, particularly collaborative software development. The influences considered derive from diverse sources: social and cognitive psychology, media characteristics, the problem-solving behavior of groups, process management, group information processing, and organizational effects. It also surveys empirical studies of computer-supported problem solving, especially for software development. The concluding chapter describes a collaborative model for program development. Computer-Supported Collaboration with Applications to Software Development is designed for an academic and professional market in software development: professionals and researchers in the areas of software engineering, collaborative development, management information systems, problem solving, and cognitive and social psychology. This book also meets the needs of graduate-level students in computer science and information systems.
This book introduces context-aware computing, providing definitions, categories, and characteristics of context and context awareness, and discussing its applications with a particular focus on smart learning environments. It also examines the elements of a context-aware system, including acquisition, modelling, reasoning, and distribution of context. It then reviews applications of context-aware computing, both past and present, to offer readers the knowledge needed to critically analyse how context awareness can be put to use. It is particularly suited to those new to the subject area who are interested in learning how to develop context-aware computing-oriented applications, as well as postgraduates and researchers in computer engineering, communications engineering, and related areas of information technology (IT). Further, it provides practical know-how for professionals working in IT support and technology, consultants and business decision-makers, and those working in the medical, human, and social sciences.
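The four elements the blurb names, acquisition, modelling, reasoning, and distribution of context, can be sketched as a tiny pipeline. Everything here (the sensor attributes, the occupancy rule, the subscriber mechanism) is hypothetical, chosen only to make the four stages concrete.

```python
class ContextPipeline:
    """Minimal sketch of a context-aware system's four elements."""

    def __init__(self):
        self.model = {}          # modelling: attribute -> current value
        self.subscribers = []    # distribution: interested consumers

    def acquire(self, attribute, value):
        """Acquisition: ingest a raw sensor reading into the model."""
        self.model[attribute] = value
        self._reason()

    def _reason(self):
        """Reasoning: derive higher-level context from raw attributes."""
        lux = self.model.get("light_lux")
        motion = self.model.get("motion")
        if lux is not None and motion is not None:
            # Hypothetical rule: a lit room with motion counts as occupied.
            self._distribute("room_occupied", motion and lux > 50)

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def _distribute(self, attribute, value):
        """Distribution: push derived context to every subscriber."""
        for callback in self.subscribers:
            callback(attribute, value)

events = []
pipeline = ContextPipeline()
pipeline.subscribe(lambda key, value: events.append((key, value)))
pipeline.acquire("light_lux", 300)   # only one input so far: no inference
pipeline.acquire("motion", True)     # both inputs present: reasoning fires
```

In a smart learning environment of the kind the book discusses, the derived attribute might instead be a learner's situation (in a lecture, commuting, at home), with content delivery adapted by the subscribers.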
This book represents the compilation of papers presented at the IFIP Working Group 8.2 conference entitled "Information Technology in the Service Economy: Challenges and Possibilities for the 21st Century." The conference took place at Ryerson University, Toronto, Canada, on August 10-13, 2008. Participation in the conference spanned the continents from Asia to Europe, with paper submissions global in focus as well. Conference submissions included completed research papers and research-in-progress reports. Papers submitted to the conference went through a double-blind review process in which the program co-chairs, an associate editor, and reviewers provided assessments and recommendations. The editorial efforts of the associate editors and reviewers in this process were outstanding. To foster high-quality research publications in this field of study, authors of accepted papers were then invited to revise and resubmit their work. Through this rigorous review and revision process, 12 completed research papers and 11 research-in-progress reports were accepted for presentation and publication. Paper workshop sessions were also established to provide authors of emergent work an opportunity to receive feedback from the IFIP 8.2 community. Abstracts of these new projects are included in this volume. Four panels were presented at the conference to provide discussion forums for the varied aspects of IT, service, and globalization. Panel abstracts are also included here.
Healthcare is significantly affected by technological advancements, as technology both shapes and changes health systems locally and globally. As areas of computer science, information technology, and healthcare merge, it is important to understand the current and future implications of health informatics. Healthcare and the Effect of Technology: Developments, Challenges and Advancements bridges the gap between today's empirical research findings and healthcare practice. It provides the reader with information on current technological integrations, potential uses for technology in healthcare, and the implications, both positive and negative, of health informatics for one's health. Technology in healthcare can improve efficiency, make patient records more accessible, increase professional communication, create global health networking, and increase access to healthcare. However, it is important to consider the ethical, confidentiality, and cultural implications technology in healthcare may impose. That is what makes this book a must-read for policymakers, human resource professionals, and management personnel, as well as for researchers, scholars, students, and healthcare professionals.
Computer-based information technologies have been extensively used to help industries manage their processes, and information systems have thereby become their nervous center. More specifically, databases are designed to support the data storage, processing, and retrieval activities related to data management in information systems. Database management systems provide efficient task support, and database systems are the key to implementing industrial data management. Industrial data management requires database technique support. Industrial applications, however, are typically data- and knowledge-intensive applications and have some unique characteristics that make their management difficult. Besides, some new techniques such as the Web, artificial intelligence, etc. have been introduced into industrial applications. These unique characteristics and the usage of new technologies have placed many potential requirements on industrial data management, which challenge today's database systems and promote their evolution. Viewed from database technology, information modeling in databases can be identified at two levels: (conceptual) data modeling and (logical) database modeling. This results in a conceptual (semantic) data model and a logical database model. Generally a conceptual data model is designed first, and the designed conceptual data model is then transformed into a chosen logical database schema. Database systems based on the logical database model are used to build information systems for data management. Much attention has been directed at conceptual data modeling of industrial information systems. Product data models, for example, can be viewed as a class of semantic data models (i.e.
As more and more hardware platforms support parallelism, parallel programming is gaining momentum. Applications can only leverage the performance of multi-core processors or graphics processing units if they are able to split a problem into smaller ones that can be solved in parallel. The challenges emerging from the development of parallel applications have led to the development of a great number of tools for debugging, performance analysis and other tasks. The proceedings of the 3rd International Workshop on Parallel Tools for High Performance Computing provide a technical overview in order to help engineers, developers and computer scientists decide which tools are best suited to enhancing their current development processes.
Introduces the reader to the technical aspects of real-time visual effects. Built upon a career of over twenty years in the feature film visual effects and real-time video game industries, and tested on graduate and undergraduate students. Explores all real-time visual effects in four categories: in-camera effects, in-material effects, simulations, and particles.