Research argues that e-government technologies have a positive influence on politics and democracy, improving citizens' environment as well as their engagement with their government. Although much research indicates that e-government technologies have increased citizen participation, there is much more that can be developed. Politics, Democracy and E-Government: Participation and Service Delivery examines how e-government affects politics and democracy in both developed and developing countries, discussing the participation of citizens in government service delivery. The book advances the idea that e-government directly influences the important function of governing through participation and service delivery. Containing chapters from leading e-government scholars and practitioners from across the globe, the book accomplishes its overall objective through its discussion of the influences of e-government on democratic institutions and processes.
Sadly, war, conflict, and terrorism appear to be staying with us in the twenty-first century. But what is the outlook for new methods of preventing and ending them? Present-day hardware and software enable the development of large crisis, conflict, and conflict-management databases with many variables, sometimes with automated updates, statistical analyses of high complexity, elaborate simulation models, and even interactive uses of these databases. In this book, these methods are presented, further developed, and applied to the central issue: the resolution and prevention of intra- and international conflicts. Conflicts are a worldwide phenomenon, and accordingly internationally leading researchers from the USA, Austria, Canada, Germany, New Zealand, and Switzerland have contributed.
CAO (computer-assisted ordering) is one of the most misunderstood and underutilized weapons available to retailers today. International consultant Barbara Anderson makes clear that only in a limited sense does CAO replace manual ordering. In its full sense it is much more: the optimization of manufacturer, supplier, and retailer distribution to the retail store, based on consumer and store data and corporate policy. Anderson provides a framework and checklist for implementing CAO, an understanding of key terminology, solutions to likely problems, and ways to make CAO implementation successful, and in doing so she covers the full spectrum of retailing. The result is a readable, easily grasped, comprehensive book for retailing management and for their colleagues teaching the subject in colleges and universities. Anderson points out that CAO is not an off-the-shelf system but an ongoing project, each phase with its own unique set of benefits and cost justification. Retail systems must support a vision in which a product may bypass the store on the way to the consumer, or even the distribution center on the way to the stores. Consumers have a wide range of choices, not only of where to shop but of how to shop, and this demands ever greater levels of service. CAO systems help ensure that the correct product is available at the store, that it can be located throughout the supply chain, and that it can be moved easily from any location. In CAO, all levels of operation work with real-time information, using decision-making tools that react to and learn from new information. Her book thus shows that there is no one right system, product, or approach to successful CAO: it is too big a leap to make in one step, consisting instead of modules and functions that can grow in sophistication over time, and not all retailers, nor all categories within one retailer, will use the same methods for forecasting and ordering.
She also shows that the distinct separation of replenishment product from planning product is artificially imposed, and that the separation of headquarters from stores is likewise artificial. Indeed, integration does not mean the integration of separate systems, but of the business functions themselves. Readers will thus get not only a knowledgeable discussion of what CAO should be, what it is, and how it works, but an immediately useful understanding of how to make it work in their own companies.
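As a rough illustration of the kind of replenishment logic a CAO module automates, here is a minimal reorder-point sketch in Python; the function name, parameters, and numbers are illustrative assumptions, not drawn from Anderson's book, and real CAO systems layer forecasting, corporate policy, and supply-chain constraints on top of rules like this one.

```python
def order_quantity(on_hand, on_order, daily_forecast, lead_time_days,
                   safety_stock, case_pack):
    """Order enough stock, in whole case packs, to cover lead-time demand."""
    target = daily_forecast * lead_time_days + safety_stock
    need = target - (on_hand + on_order)   # shortfall against target level
    if need <= 0:
        return 0                           # already covered, order nothing
    cases = -(-need // case_pack)          # ceiling division to whole cases
    return cases * case_pack

# A store/SKU needing 4 units/day over a 5-day lead time, plus 6 safety units:
print(order_quantity(on_hand=12, on_order=0, daily_forecast=4,
                     lead_time_days=5, safety_stock=6, case_pack=12))
```

The case-pack rounding reflects a practical constraint Anderson emphasizes throughout: orders flow through the supply chain in shippable units, not ideal quantities.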
This comprehensive, detailed reference to Mathematica provides the reader with both a working knowledge of Mathematica programming in general and a detailed knowledge of the key aspects of Mathematica needed to create the fastest, shortest, and most elegant implementations possible for solving problems from the natural and physical sciences. The Guidebook gives the user a deeper understanding of Mathematica through instructive implementations, explanations, and examples from a range of disciplines at varying levels of complexity. After an overview of the syntax of Mathematica and its programming, graphic, numeric, and symbolic capabilities in Chapter 1, "Programming" covers the structure of Mathematica expressions. Chapters 2 through 6 cover the hierarchical construction of all Mathematica objects out of symbolic expressions, the definition of functions, the recognition of patterns and their efficient application, program flow and program structuring, the manipulation of lists, and additional topics. An appendix contains general references on algorithms and applications of computer algebra, on Mathematica itself, and on comparisons of various computer algebra systems. The multiplatform CD contains Mathematica 4.1 notebooks with detailed descriptions and explanations of the Mathematica commands needed in each chapter and used in applications, supplemented by a variety of mathematical, physical, and graphic examples and worked-out solutions to all exercises. The Mathematica Guidebook is an indispensable resource for practitioners, researchers, and professionals in mathematics, the sciences, and engineering. It will find a natural place on the bookshelf as an essential reference work.
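To give a flavor of the symbolic-expression model the Guidebook builds on (everything in Mathematica is a head applied to arguments, and computation proceeds by pattern-based rewriting), here is a rough Python analogue; the tuple representation and the single rewrite rule are illustrative assumptions, not Mathematica itself.

```python
# Mathematica expressions have the shape head[arg1, arg2, ...]. We mimic
# that with nested tuples ("Head", arg1, arg2, ...) and apply one rewrite
# rule, x + 0 -> x, bottom-up over the whole tree.

def rewrite(expr):
    """Apply the rule Plus[..., 0, ...] -> Plus without zeros, everywhere."""
    if not isinstance(expr, tuple):
        return expr                          # atom: a symbol or a number
    head, *args = expr
    args = [rewrite(a) for a in args]        # rewrite subexpressions first
    if head == "Plus":
        args = [a for a in args if a != 0]   # drop additive zeros
        if len(args) == 1:
            return args[0]                   # Plus[x] collapses to x
    return (head, *args)

# (a + 0) * Sin[b + 0]  simplifies to  a * Sin[b]
e = ("Times", ("Plus", "a", 0), ("Sin", ("Plus", "b", 0)))
print(rewrite(e))                            # ('Times', 'a', ('Sin', 'b'))
```

Mathematica's own pattern mechanism is far more general, but the bottom-up traversal of a uniform expression tree is the core idea behind the hierarchical construction the book describes.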
Homeland security information systems are an important area of inquiry due to the tremendous influence information systems have on the preparation and response of government to a terrorist attack or natural disaster. "Homeland Security Preparedness and Information Systems: Strategies for Managing Public Policy" delves into the issues and challenges that public managers face in the adoption and implementation of information systems for homeland security. A defining collection of field advancements, this publication provides solutions for those interested in adopting additional information systems security measures in their governments.
Science made great progress in the twentieth century, with the establishment of proper disciplines in the fields of physics, computer science, molecular biology, and many others. At the same time, many engineering ideas have emerged that are interdisciplinary in nature, beyond the realm of such orthodox disciplines: artificial intelligence, fuzzy logic, artificial neural networks, evolutionary computation, data mining, and so on. In order to generate new technology that is truly human-friendly in the twenty-first century, the integration of various methods beyond specific disciplines is required. Soft computing is a key concept for the creation of such human-friendly technology in our modern information society. Professor Rutkowski is a pioneer in this field, having devoted himself for many years to publishing a large variety of original work. The present volume, based mostly on his own work, is a milestone in the development of soft computing, integrating various disciplines from the fields of information science and engineering. The book consists of three parts, the first of which is devoted to probabilistic neural networks. Neural excitation is stochastic, so it is natural to investigate the Bayesian properties of the connectionist structures developed by Professor Rutkowski. This new approach has proven to be particularly useful for handling regression and classification problems in time-varying environments. Throughout the book, major themes are selected from theoretical subjects that are tightly connected with challenging applications.
Volume 55 covers some particularly hot topics. Linda Harasim writes about education and the Web in "The Virtual University: A State of the Art," discussing the issues that will need to be addressed if online education is to live up to expectations. Neville Holmes covers a related subject in his chapter "The Net, the Web, and the Children," arguing that the Web is an evolutionary, rather than revolutionary, development and highlighting the division between the rich and the poor within and across nations. Continuing the WWW theme, George Mihaila, Louiqa Raschid, and Maria-Esther Vidal look at the problems of using the Web and finding the information you want.
The design process for embedded systems has changed substantially in recent years. One of the main reasons is the pressure to shorten time-to-market when designing digital systems. To shorten product cycles, programmable processors are used to implement more and more of the functionality of the embedded system. As a result, embedded systems are nowadays very often implemented as heterogeneous systems consisting of ASICs, processors, memories, and peripherals, and the research topic of hardware/software co-design, which deals with the problems of designing such heterogeneous systems, has gained great importance. Hardware/Software Co-design for Data Flow Dominated Embedded Systems introduces the different tasks of hardware/software co-design, including system specification, hardware/software partitioning, co-synthesis, and co-simulation, and summarizes and classifies state-of-the-art co-design tools and methods for these tasks. In addition, the co-design tool COOL is presented, which solves the co-design tasks for the class of data-flow dominated embedded systems. The primary emphasis is on the hardware/software partitioning and co-synthesis phases and their coupling. In contrast to many other publications in this area, a mathematical formulation of the hardware/software partitioning problem is given; this problem formulation supports target architectures consisting of multiple processors and multiple ASICs. Several novel approaches to solving the partitioning problem are presented and compared, including an MILP approach, a heuristic solution, and an approach based on genetic algorithms. The co-synthesis phase is based on the idea of controlling the system by means of a static run-time scheduler implemented in hardware. New algorithms are introduced which generate the complete set of hardware and software specifications required to implement heterogeneous systems.
All of these techniques are described in detail and exemplified. Hardware/Software Co-design for Data Flow Dominated Embedded Systems is intended to serve students and researchers working on hardware/software co-design. At the same time, the variety of techniques presented for automating the design tasks of hardware/software systems will be of interest to industrial engineers and designers of digital systems. From the foreword by Peter Marwedel: "Niemann's method should be known by all persons working in the field. Hence, I recommend this book for everyone who is interested in hardware/software co-design."
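The partitioning problem at the heart of the book can be sketched in miniature: each task is mapped either to software (slow, cheap) or to an ASIC (fast, costly), and the goal is the cheapest mapping that meets a deadline. The toy brute-force search below, including all task names and numbers, is an illustration of the problem only, not the book's MILP formulation or the COOL tool.

```python
from itertools import product

tasks = {            # name: (sw_time, hw_time, hw_cost); illustrative values
    "filter":  (8, 2, 5),
    "fft":     (9, 3, 7),
    "control": (3, 2, 6),
}
DEADLINE = 14        # total execution-time budget
SW_COST = 0          # software reuses the existing processor, no extra cost

def partition(tasks, deadline):
    """Exhaustively search all SW/HW mappings for the cheapest feasible one."""
    best = None
    names = list(tasks)
    for mapping in product(("SW", "HW"), repeat=len(names)):
        time = cost = 0
        for name, target in zip(names, mapping):
            sw_t, hw_t, hw_c = tasks[name]
            time += sw_t if target == "SW" else hw_t
            cost += SW_COST if target == "SW" else hw_c
        if time <= deadline and (best is None or cost < best[0]):
            best = (cost, dict(zip(names, mapping)))
    return best

cost, mapping = partition(tasks, DEADLINE)
print(cost, mapping)
```

Exhaustive search is exponential in the number of tasks, which is exactly why the book resorts to MILP, heuristics, and genetic algorithms for realistic problem sizes.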
Reputation in Artificial Societies discusses the role of reputation in the achievement of social order. The book proposes that reputation is an agent property that results from the transmission of beliefs about how agents are evaluated with regard to socially desirable conduct. This desirable conduct represents one or another of the solutions to the problem of social order and may consist of cooperation or altruism, reciprocity, or norm obedience.
Computational finance deals with the mathematics of computer programs that realize financial models or systems. This book outlines the epistemic risks associated with the current valuations of different financial instruments and discusses the corresponding risk management strategies. It covers most of the research and practical areas in computational finance. Starting from traditional fundamental analysis and using algebraic and geometric tools, it is guided by the logic of science to explore information from financial data without prejudice. In fact, this book has the unique feature that it is structured around the simple requirement of objective science: the geometric structure of the data = the information contained in the data.
A complete introduction to the many mathematical tools used to solve practical problems in coding. Mathematicians have been fascinated with the theory of error-correcting codes since the publication of Shannon's classic papers fifty years ago. With the proliferation of communications systems, computers, and digital audio devices that employ error-correcting codes, the theory has taken on practical importance in the solution of coding problems. This solution process requires the use of a wide variety of mathematical tools and an understanding of how to find mathematical techniques to solve applied problems. Introduction to the Theory of Error-Correcting Codes, Third Edition demonstrates this process and prepares students to cope with coding problems. Like its predecessor, which was awarded a three-star rating by the Mathematical Association of America, this updated and expanded edition gives readers a firm grasp of the timeless fundamentals of coding as well as the latest theoretical advances.
Introduction to the Theory of Error-Correcting Codes, Third Edition is the ideal textbook for senior-undergraduate and first-year graduate courses on error-correcting codes in mathematics, computer science, and electrical engineering.
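As a small taste of the subject matter, here is a Python sketch of the classic Hamming(7,4) code, one of the simplest error-correcting codes: four data bits protected by three parity bits, able to correct any single-bit error. This is a standard textbook construction, not an excerpt from this book; the bit layout follows the usual positions 1..7 with parity bits at 1, 2, and 4.

```python
def encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    p1 = d[0] ^ d[1] ^ d[3]          # covers positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]          # covers positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]          # covers positions 4, 5, 6, 7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):
    """Correct up to one flipped bit and return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the flipped bit
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1         # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = encode(word)
code[5] ^= 1                         # inject a single-bit channel error
print(decode(code) == word)          # the error is corrected
```

The syndrome computation, reading the three parity checks as the binary address of the flipped bit, is the key trick; the book develops this idea into far more powerful code families.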
This lively and fascinating text traces the key developments in computation - from 3000 B.C. to the present day - in an easy-to-follow and concise manner. Topics and features: ideal for self-study, offering many pedagogical features such as chapter-opening key topics, chapter introductions and summaries, exercises, and a glossary; presents detailed information on major figures in computing, such as Boole, Babbage, Shannon, Turing, Zuse and von Neumann; reviews the history of software engineering and of programming languages, including syntax and semantics; discusses the progress of artificial intelligence, with extension to such key disciplines as philosophy, psychology, linguistics, neural networks and cybernetics; examines the impact on society of the introduction of the personal computer, the World Wide Web, and the development of mobile phone technology; follows the evolution of a number of major technology companies, including IBM, Microsoft and Apple.
Internet heterogeneity is driving a new challenge in application development: adaptive software. Increased Internet capacity and new access technologies, together with network congestion, the use of older technologies, wireless access, and peer-to-peer networking, are increasing the heterogeneity of the Internet. Applications should provide gracefully degraded levels of service when network conditions are poor, and enhanced services when network conditions exceed expectations. Existing adaptive technologies, which are primarily end-to-end or proxy-based and often focus on a single deficient link, can perform poorly in heterogeneous networks; such networks frequently require multiple, coordinated, and distributed remedial actions. Conductor: Distributed Adaptation for Heterogeneous Networks describes a new approach to graceful degradation in the face of network heterogeneity - distributed adaptation - in which adaptive code is deployed at multiple points within a network. The feasibility of this approach is demonstrated by Conductor, a middleware framework that enables distributed adaptation of connection-oriented, application-level protocols. By adapting protocols, Conductor provides application-transparent adaptation, supporting both existing applications and applications designed with adaptation in mind. The book introduces new techniques that make distributed adaptation automatic, reliable, and secure. In particular, it introduces the notion of semantic segmentation, which maintains exactly-once delivery of the semantic elements of a data stream while allowing the stream to be arbitrarily adapted in transit, and a secure architecture for automatic adaptor selection, protecting user data from unauthorized adaptation. These techniques are described both in the context of Conductor and in the broader context of distributed systems.
Finally, this book presents empirical evidence from several case studies indicating that distributed adaptation can allow applications to degrade gracefully in heterogeneous networks, providing a higher quality of service to users than other adaptive techniques. Further, experimental results indicate that the proposed techniques can be employed without excessive cost. Thus, distributed adaptation is both practical and beneficial. Conductor: Distributed Adaptation for Heterogeneous Networks is designed to meet the needs of a professional audience composed of researchers and practitioners in industry and graduate-level students in computer science.
This proceedings volume documents recent cutting-edge developments in multi-robot systems research and is the result of the Second International Workshop on Multi-Robot Systems, held in March 2003 at the Naval Research Laboratory in Washington, D.C. The workshop brought together top researchers working in areas relevant to designing teams of autonomous vehicles, including robots and unmanned ground, air, surface, and undersea vehicles. It focused on the challenging issues of team architectures, vehicle learning and adaptation, heterogeneous group control and cooperation, task selection, dynamic autonomy, mixed initiative, and human-robot team interaction. A broad range of applications of this technology are presented in this volume, including UCAVs (unmanned combat air vehicles), micro-air vehicles, UUVs (unmanned underwater vehicles), UGVs (unmanned ground vehicles), planetary exploration, assembly in space, clean-up, and urban search and rescue. The volume represents the contributions of top researchers in this field and serves as a valuable tool for professionals in this interdisciplinary field.
Auctions have long been a popular method for allocation and procurement of products and services. Traditional auctions are constrained by time, place, number of bidders, number of bids, and the bidding experience. With the advent of internet communication technologies, the online auction environment has blossomed to support a bustling enterprise. Up until this time, the functional inner workings of these online exchange mechanisms have only been described using anecdotal accounts. Best Practices for Online Procurement Auctions offers a systematic approach to auction examination that will become invaluable to both practitioners and researchers alike.
This book covers the wide-ranging scientific areas of computational science, from basic research fields such as algorithms and soft-computing to diverse applied fields targeting macro, micro, nano, genome and complex systems. It presents the proceedings of the International Symposium on Frontiers of Computational Science 2005, held in Nagoya in December 2005.
Facing the challenge of a fast-changing technological environment, many companies are developing an interest in the field of technology intelligence. Their aim is to support the decision-making process through the well-timed preparation of relevant information by means of its systematic identification, collection, analysis, dissemination, and application. This book fills a gap in the literature by showing how a technology intelligence system can be designed and implemented.
With the development of networked computing and the increased complexity of applications and software systems development, the importance of computer-supported collaborative work (CSCW) has dramatically increased. Globalization has further accentuated the necessity of collaboration, while the Web has made geographically distributed collaborative systems technologically feasible in a manner that was impossible until recently. The software environments needed to support such distributed teams are referred to as groupware. Groupware is intended to address the logistical, managerial, social, organizational, and cognitive difficulties that arise in the application of distributed expertise. These issues represent the fundamental challenges to the next generation of process management. Computer-Supported Collaboration with Applications to Software Development reviews the theory of collaborative groups and the factors that affect collaboration, particularly collaborative software development. The influences considered derive from diverse sources: social and cognitive psychology, media characteristics, the problem-solving behavior of groups, process management, group information processing, and organizational effects. It also surveys empirical studies of computer-supported problem solving, especially for software development, and the concluding chapter describes a collaborative model for program development. The book is designed for an academic and professional market in software development: professionals and researchers in the areas of software engineering, collaborative development, management information systems, problem solving, and cognitive and social psychology. It also meets the needs of graduate-level students in computer science and information systems.
Do Smart Adaptive Systems Exist? is intended as a reference and a guide summarising best practices for using intelligent techniques and building systems that require a degree of adaptation and intelligence. It is therefore not intended as a collection of the most recent research results, but as a practical guide for experts from other areas and industrial users interested in building solutions to their problems using intelligent techniques. One of the main issues covered is how to select and/or combine suitable intelligent techniques from a large pool of potential solutions. Another attractive feature of the book is that it brings together experts from the neural network, fuzzy, machine learning, evolutionary, and hybrid systems communities, who provide their views on how these different intelligent technologies have contributed and will contribute to the creation of smart adaptive systems of the future.
As more and more hardware platforms support parallelism, parallel programming is gaining momentum. Applications can only leverage the performance of multi-core processors or graphics processing units if they are able to split a problem into smaller ones that can be solved in parallel. The challenges emerging from the development of parallel applications have led to the development of a great number of tools for debugging, performance analysis and other tasks. The proceedings of the 3rd International Workshop on Parallel Tools for High Performance Computing provide a technical overview in order to help engineers, developers and computer scientists decide which tools are best suited to enhancing their current development processes.
We are extremely pleased to present a comprehensive book comprising a collection of research papers which is essentially an outcome of the Second IFIP TC 13.6 Working Group conference on Human Work Interaction Design, HWID2009. The conference was held in Pune, India, during October 7-8, 2009. It was hosted by the Centre for Development of Advanced Computing, India, and jointly organized with Copenhagen Business School, Denmark; Aarhus University, Denmark; and the Indian Institute of Technology, Guwahati, India. The theme of HWID2009 was Usability in Social, Cultural and Organizational Contexts. The conference was held under the auspices of IFIP TC 13 on Human-Computer Interaction. The committees under IFIP include the Technical Committee TC13 on Human-Computer Interaction, within which the work of this volume has been conducted. TC13 aims to encourage theoretical and empirical human science research to promote the design and evaluation of human-oriented ICT. Within TC13 there are different working groups concerned with different aspects of human-computer interaction. The flagship event of TC13 is the biennial international conference called INTERACT, at which both invited and contributed papers are presented; contributed papers are rigorously refereed and the rejection rate is high.
Information engineering and applications is the field of study concerned with constructing information computing, intelligent systems, mathematical models, and numerical solution techniques, and with using computers and other electronic devices to analyze and solve natural scientific, social scientific, and engineering problems. Information engineering is an important underpinning for techniques used in information and computational science, and there are many unresolved problems worth studying. The Proceedings of the 2nd International Conference on Information Engineering and Applications (IEA 2012), held in Chongqing, China, October 26-28, 2012, discusses the most innovative research and developments, including technical challenges and social, legal, political, and economic issues. A forum for engineers and scientists in academia, industry, and government, the proceedings present ideas, results, works in progress, and experience in all aspects of information engineering and applications.
This volume is a post-conference publication of the 4th World Congress on Social Simulation (WCSS), with contents selected from among the 80 papers originally presented at the conference. WCSS is a biennial event, jointly organized by three scientific communities in computational social science, namely, the Pacific-Asian Association for Agent-Based Approach in Social Systems Sciences (PAAA), the European Social Simulation Association (ESSA), and the Computational Social Science Society of the Americas (CSSSA). It is, therefore, currently the most prominent conference in the area of agent-based social simulation. The papers selected for this volume give a holistic view of the current development of social simulation, indicating the directions for future research and creating an important archival document and milestone in the history of computational social science. Specifically, the papers included here cover substantial progress in artificial financial markets, macroeconomic forecasting, supply chain management, bank networks, social networks, urban planning, social norms and group formation, cross-cultural studies, political party competition, voting behavior, computational demography, computational anthropology, evolution of languages, public health and epidemics, AIDS, security and terrorism, methodological and epistemological issues, empirically based agent-based modeling, modeling of experimental social science, gaming simulation, cognitive agents, and participatory simulation. Furthermore, pioneering studies in some new research areas, such as the theoretical foundations of social simulation and categorical social science, are also included in the volume.