This lively and fascinating text traces the key developments in computation - from 3000 B.C. to the present day - in an easy-to-follow and concise manner. Topics and features: ideal for self-study, offering many pedagogical features such as chapter-opening key topics, chapter introductions and summaries, exercises, and a glossary; presents detailed information on major figures in computing, such as Boole, Babbage, Shannon, Turing, Zuse and Von Neumann; reviews the history of software engineering and of programming languages, including syntax and semantics; discusses the progress of artificial intelligence, with extension to such key disciplines as philosophy, psychology, linguistics, neural networks and cybernetics; examines the impact on society of the introduction of the personal computer, the World Wide Web, and the development of mobile phone technology; follows the evolution of a number of major technology companies, including IBM, Microsoft and Apple.
CAO is one of the most misunderstood and underutilized weapons available to retailers today. International consultant Barbara Anderson makes clear that in only a limited sense does CAO replace manual ordering. In its full sense it is much more: the optimization of manufacturer, supplier, and retailer distribution to the retail store, based on consumer and store data and corporate policy. Anderson thus provides a framework and checklist for implementing CAO, an understanding of key terminology, solutions to likely problems, and ways to make a CAO implementation successful, and in doing so she covers the full spectrum of retailing. A readable, easily grasped, comprehensive, unique book for retailing management and for their colleagues teaching it in colleges and universities. Anderson points out that CAO is not an off-the-shelf system but an ongoing project, each phase with its own unique set of benefits and cost justification. Retail systems must support a vision where a product may bypass the store on the way to the consumer, or even the distribution center on the way to the stores. Consumers have a wide range of choices, not only of where to shop but of how to shop, and this demands ever greater levels of service. CAO systems help assure that the correct product is available at the store, that it can be located throughout the supply chain, and that it can be moved easily from any location. In CAO, all levels of operation work with real-time information, using decision-making tools that react to and learn from new information. Her book thus shows there is no one right system, product, or approach for successful CAO. CAO is too big a leap to make in one step; it consists of modules and functions that can grow in sophistication over time, and not all retailers, nor all categories within one retailer, will use the same methods for forecasting and ordering.
She also shows that the distinct separation of replenishment product from planning product is artificially imposed, and that the separation of headquarters from stores is also artificial. Indeed, integration does not mean the integration of separate systems but of the business functions themselves. Readers will thus get not only a knowledgeable discussion of what CAO should be, what it is, and how it works, but an immediately useful understanding of how to make it work in their own companies.
Rule-based evolutionary online learning systems, often referred to as Michigan-style learning classifier systems (LCSs), were proposed nearly thirty years ago (Holland, 1976; Holland, 1977), originally under the name cognitive systems. LCSs combine the strength of reinforcement learning with the generalization capabilities of genetic algorithms, promising a flexible, online generalizing, solely reinforcement-dependent learning system. However, despite several initial successful applications of LCSs and their interesting relations with animal learning and cognition, understanding of the systems remained somewhat obscured. Questions concerning learning complexity or convergence remained unanswered. Performance in different problem types, problem structures, concept spaces, and hypothesis spaces stayed nearly unpredictable. This book has the following three major objectives: (1) to establish a facetwise theory approach for LCSs that promotes system analysis, understanding, and design; (2) to analyze, evaluate, and enhance the XCS classifier system (Wilson, 1995) by means of the facetwise approach, establishing a fundamental XCS learning theory; (3) to identify both the major advantages of an LCS-based learning approach as well as the most promising potential application areas. Achieving these three objectives leads to a rigorous understanding of LCS functioning that enables the successful application of LCSs to diverse problem types and problem domains. The quantitative analysis of XCS shows that the interactive, evolutionary-based online learning mechanism works machine-learning competitively, yielding a low-order polynomial learning complexity. Moreover, the facetwise analysis approach facilitates the successful design of more advanced LCSs, including Holland's originally envisioned cognitive systems. Martin V.
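The matching step at the heart of such a rule-based system can be sketched in a few lines. The ternary condition alphabet ('0', '1', and '#' for "don't care") is the classic LCS convention; the rule population and fitness values below are invented purely for illustration.

```python
# Minimal sketch of the rule-matching step in a Michigan-style LCS.
# Conditions use the classic ternary alphabet: '0', '1', and '#' (don't care).
# The population and fitness values are illustrative, not from any real system.

def matches(condition: str, state: str) -> bool:
    """A rule matches when every non-'#' symbol equals the input bit."""
    return all(c == '#' or c == s for c, s in zip(condition, state))

def match_set(population, state):
    """Collect all rules whose condition matches the current input."""
    return [rule for rule in population if matches(rule["condition"], state)]

# Toy population: each rule maps a condition to an action with a fitness.
population = [
    {"condition": "1#0", "action": 1, "fitness": 0.8},
    {"condition": "0##", "action": 0, "fitness": 0.5},
    {"condition": "11#", "action": 1, "fitness": 0.3},
]

print([r["condition"] for r in match_set(population, "110")])  # -> ['1#0', '11#']
```

In a full LCS the match set then feeds action selection, reinforcement updates, and the genetic algorithm; this sketch stops at matching.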
History of the Book The last three decades have witnessed an explosive development in integrated circuit fabrication technologies. The complexities of current CMOS circuits are reaching beyond the 100 nanometer feature size and multi-hundred million transistors per integrated circuit. To fully exploit this technological potential, circuit designers use sophisticated Computer-Aided Design (CAD) tools. While supporting the talents of innumerable microelectronics engineers, these CAD tools have become the enabling factor responsible for the successful design and implementation of thousands of high performance, large scale integrated circuits. This research monograph originated from a body of doctoral dissertation research completed by the first author at the University of Rochester from 1994 to 1999 while under the supervision of Prof. Eby G. Friedman. This research focuses on issues in the design of the clock distribution network in large scale, high performance digital synchronous circuits and, particularly, on algorithms for non-zero clock skew scheduling. During the development of this research, it has become clear that incorporating timing issues into the successful integrated circuit design process is of fundamental importance, particularly in that advanced theoretical developments in this area have been slow to reach the designers' desktops.
This book presents scientific metrics and their applications for approaching scientific findings in the fields of physics, economics, and scientometrics. Based on a collection of the author's publications in these fields, the book reveals the profound links between the measures and the findings in the natural laws, from micro-particles to the macro-cosmos, in the economic rules of human society, and in the core knowledge among mass information. With this book readers can gain insights and ideas on addressing the questions of how to measure the physical world, economic processes, and human knowledge from the perspective of scientific metrics. The book is also useful to scientists, particularly specialists in physics, economics, and scientometrics, for promoting and stimulating their creative ideas based on scientific metrics.
This book is an introduction to the fundamental concepts and tools needed for solving problems of a geometric nature using a computer. It attempts to fill the gap between standard geometry books, which are primarily theoretical, and applied books on computer graphics, computer vision, robotics, or machine learning. This book covers the following topics: affine geometry, projective geometry, Euclidean geometry, convex sets, SVD and principal component analysis, manifolds and Lie groups, quadratic optimization, basics of differential geometry, and a glimpse of computational geometry (Voronoi diagrams and Delaunay triangulations). Some practical applications of the concepts presented in this book include computer vision, more specifically contour grouping, motion interpolation, and robot kinematics. In this extensively updated second edition, more material on convex sets, Farkas's lemma, quadratic optimization, and the Schur complement has been added. The chapter on SVD has been greatly expanded and now includes a presentation of PCA. The book is well illustrated and has chapter summaries and a large number of exercises throughout. It will be of interest to a wide audience including computer scientists, mathematicians, and engineers. Reviews of the first edition: "Gallier's book will be a useful source for anyone interested in applications of geometrical methods to solve problems that arise in various branches of engineering. It may help to develop the sophisticated concepts from the more advanced parts of geometry into useful tools for applications." (Mathematical Reviews, 2001) "...it will be useful as a reference book for postgraduates wishing to find the connection between their current problem and the underlying geometry." (The Australian Mathematical Society, 2001)
Auctions have long been a popular method for allocation and procurement of products and services. Traditional auctions are constrained by time, place, number of bidders, number of bids, and the bidding experience. With the advent of internet communication technologies, the online auction environment has blossomed to support a bustling enterprise. Up until this time, the functional inner workings of these online exchange mechanisms have only been described using anecdotal accounts. Best Practices for Online Procurement Auctions offers a systematic approach to auction examination that will become invaluable to both practitioners and researchers alike.
This comprehensive, detailed reference to Mathematica provides the reader with both a working knowledge of Mathematica programming in general and a detailed knowledge of key aspects of Mathematica needed to create the fastest, shortest, and most elegant implementations possible to solve problems from the natural and physical sciences. The Guidebook gives the user a deeper understanding of Mathematica through instructive implementations, explanations, and examples from a range of disciplines at varying levels of complexity. "Programming" covers the structure of Mathematica expressions, after an overview in Chapter 1 of Mathematica's syntax and its programming, graphic, numeric, and symbolic capabilities. Chapters 2-6 cover the hierarchical construction of all Mathematica objects out of symbolic expressions, the definition of functions, the recognition of patterns and their efficient application, program flows and program structuring, the manipulation of lists, and additional topics. An Appendix contains some general references on algorithms and applications of computer algebra, Mathematica itself, and comparisons of various algebra systems. The multiplatform CD contains Mathematica 4.1 notebooks with detailed descriptions and explanations of the Mathematica commands needed in each chapter and used in applications, supplemented by a variety of mathematical, physical, and graphic examples and worked-out solutions to all exercises. The Mathematica Guidebook is an indispensable resource for practitioners, researchers, and professionals in mathematics, the sciences, and engineering. It will find a natural place on the bookshelf as an essential reference work.
Science has made great progress in the twentieth century, with the establishment of proper disciplines in the fields of physics, computer science, molecular biology, and many others. At the same time, there have also emerged many engineering ideas that are interdisciplinary in nature, beyond the realm of such orthodox disciplines. These include, for example, artificial intelligence, fuzzy logic, artificial neural networks, evolutionary computation, data mining, and so on. In order to generate new technology that is truly human-friendly in the twenty-first century, integration of various methods beyond specific disciplines is required. Soft computing is a key concept for the creation of such human-friendly technology in our modern information society. Professor Rutkowski is a pioneer in this field, having devoted himself for many years to publishing a large variety of original work. The present volume, based mostly on his own work, is a milestone in the development of soft computing, integrating various disciplines from the fields of information science and engineering. The book consists of three parts, the first of which is devoted to probabilistic neural networks. Neural excitation is stochastic, so it is natural to investigate the Bayesian properties of connectionist structures developed by Professor Rutkowski. This new approach has proven to be particularly useful for handling regression and classification problems in time-varying environments. Throughout this book, major themes are selected from theoretical subjects that are tightly connected with challenging applications.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about the vagueness of expectations appears to be adequate to real situations, in which the coalitional bargaining anticipates a proper realization of the game with strategic behaviour of players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision-making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which has recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set theoretical approaches to game-theoretical models of human behaviour.
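As a rough illustration of how vague payoffs can be represented, one common fuzzy-quantity model uses triangular fuzzy numbers (low, peak, high), whose sums are computed endpoint-wise. The coalition values below are invented, and this is only one of several representations the theory offers, not necessarily the one used in the book.

```python
# Hedged sketch: vague coalition payoffs as triangular fuzzy numbers.
# A triangular fuzzy number (low, peak, high) says "surely at least low,
# most plausibly peak, surely at most high". Values here are invented.
from collections import namedtuple

Tri = namedtuple("Tri", ["low", "peak", "high"])

def add(a: Tri, b: Tri) -> Tri:
    """The sum of two triangular fuzzy numbers is again triangular,
    computed endpoint by endpoint."""
    return Tri(a.low + b.low, a.peak + b.peak, a.high + b.high)

# Vague expected payoffs of two disjoint coalitions:
v_A = Tri(2.0, 3.0, 5.0)
v_B = Tri(1.0, 2.0, 2.5)

print(add(v_A, v_B))  # -> Tri(low=3.0, peak=5.0, high=7.5)
```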
The design process of embedded systems has changed substantially in recent years. One of the main reasons for this change is the pressure to shorten time-to-market when designing digital systems. To shorten product cycles, programmable processors are used to implement more and more functionality of the embedded system. Therefore, nowadays, embedded systems are very often implemented as heterogeneous systems consisting of ASICs, processors, memories, and peripherals. As a consequence, the research topic of hardware/software co-design, dealing with the problems of designing these heterogeneous systems, has gained great importance. Hardware/Software Co-design for Data Flow Dominated Embedded Systems introduces the different tasks of hardware/software co-design, including system specification, hardware/software partitioning, co-synthesis, and co-simulation. The book summarizes and classifies state-of-the-art co-design tools and methods for these tasks. In addition, the co-design tool COOL is presented, which solves the co-design tasks for the class of data-flow dominated embedded systems. In Hardware/Software Co-design for Data Flow Dominated Embedded Systems the primary emphasis has been put on the hardware/software partitioning and co-synthesis phases and their coupling. In contrast to many other publications in this area, a mathematical formulation of the hardware/software partitioning problem is given. This problem formulation supports target architectures consisting of multiple processors and multiple ASICs. Several novel approaches are presented and compared for solving the partitioning problem, including an MILP approach, a heuristic solution, and an approach based on genetic algorithms. The co-synthesis phase is based on the idea of controlling the system by means of a static run-time scheduler implemented in hardware. New algorithms are introduced which generate a complete set of hardware and software specifications required to implement heterogeneous systems.
All of these techniques are described in detail and exemplified. Hardware/Software Co-design for Data Flow Dominated Embedded Systems is intended to serve students and researchers working on hardware/software co-design. At the same time the variety of presented techniques automating the design tasks of hardware/software systems will be of interest to industrial engineers and designers of digital systems. From the foreword by Peter Marwedel: Niemann's method should be known by all persons working in the field. Hence, I recommend this book for everyone who is interested in hardware/software co-design.
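To give a flavor of the kind of trade-off a hardware/software partitioner must resolve, here is a minimal greedy sketch (not the book's MILP or genetic-algorithm formulations): tasks with the best speedup-per-area ratio are moved to hardware until a silicon area budget is exhausted. All task data, the cost model, and the budget are invented for illustration.

```python
# Hedged sketch of greedy hardware/software partitioning.
# Each task has a software execution time, a (faster) hardware execution
# time, and a hardware area cost; all numbers are invented for illustration.

def greedy_partition(tasks, hw_area_budget):
    """Move tasks with the best speedup-per-area ratio into hardware
    until the area budget is exhausted; the rest run in software."""
    ranked = sorted(
        tasks,
        key=lambda t: (t["sw_time"] - t["hw_time"]) / t["area"],
        reverse=True,
    )
    hw, sw, used = [], [], 0
    for t in ranked:
        if used + t["area"] <= hw_area_budget and t["sw_time"] > t["hw_time"]:
            hw.append(t["name"])
            used += t["area"]
        else:
            sw.append(t["name"])
    return hw, sw

tasks = [
    {"name": "fft",    "sw_time": 90, "hw_time": 10, "area": 40},
    {"name": "ctrl",   "sw_time": 15, "hw_time": 12, "area": 30},
    {"name": "filter", "sw_time": 60, "hw_time": 20, "area": 20},
]
hw, sw = greedy_partition(tasks, hw_area_budget=60)
print(hw, sw)  # -> ['fft', 'filter'] ['ctrl']
```

An MILP formulation would instead search the full design space for a provably optimal assignment, which is why the exact formulation in the book matters for larger architectures with multiple processors and ASICs.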
This book covers the wide-ranging scientific areas of computational science, from basic research fields such as algorithms and soft-computing to diverse applied fields targeting macro, micro, nano, genome and complex systems. It presents the proceedings of the International Symposium on Frontiers of Computational Science 2005, held in Nagoya in December 2005.
Research argues that e-government technologies have positive influences on politics and democracy, improving citizens' environment as well as their engagement with their government. Although much research indicates that e-government technologies have increased citizen participation, there is much more that can be developed. Politics, Democracy and E-Government: Participation and Service Delivery examines how e-government impacts politics and democracy in both developed and developing countries, discussing the participation of citizens in government service delivery. This book brings forth the idea that e-government has a direct influence on the important function of governing through participation and service delivery. Containing chapters from leading e-government scholars and practitioners from across the globe, the overall objective of this book is accomplished through its discussion on the influences of e-government on democratic institutions and processes.
Sadly enough, war, conflicts and terrorism appear to stay with us in the 21st century. But what is our outlook on new methods for preventing and ending them? Present-day hard- and software enables the development of large crisis, conflict, and conflict management databases with many variables, sometimes with automated updates, statistical analyses of a high complexity, elaborate simulation models, and even interactive uses of these databases. In this book, these methods are presented, further developed, and applied in relation to the main issue: the resolution and prevention of intra- and international conflicts. Conflicts are a worldwide phenomenon. Therefore, internationally leading researchers from the USA, Austria, Canada, Germany, New Zealand and Switzerland have contributed.
Homeland security information systems are an important area of inquiry due to the tremendous influence information systems have on the preparation and response of government to a terrorist attack or natural disaster. "Homeland Security Preparedness and Information Systems: Strategies for Managing Public Policy" delves into the issues and challenges that public managers face in the adoption and implementation of information systems for homeland security. A defining collection of field advancements, this publication provides solutions for those interested in adopting additional information systems security measures in their governments.
Facing the challenge of a fast-changing technological environment, many companies are developing an interest in the field of technology intelligence. Their aim is to support the decision-making process through the well-timed preparation of relevant information by means of its systematic identification, collection, analysis, dissemination, and application. This book fills a gap in the literature by showing how a technology intelligence system can be designed and implemented.
Reputation In Artificial Societies discusses the role of reputation in the achievement of social order. The book proposes that reputation is an agent property that results from the transmission of beliefs about how the agents are evaluated with regard to a socially desirable conduct. This desirable conduct represents one or another of the solutions to the problem of social order and may consist of cooperation or altruism, reciprocity, or norm obedience.
This Proceedings Volume documents recent cutting-edge developments in multi-robot systems research and is the result of the Second International Workshop on Multi-Robot Systems that was held in March 2003 at the Naval Research Laboratory in Washington, D.C. This Workshop brought together top researchers working in areas relevant to designing teams of autonomous vehicles, including robots and unmanned ground, air, surface, and undersea vehicles. The workshop focused on the challenging issues of team architectures, vehicle learning and adaptation, heterogeneous group control and cooperation, task selection, dynamic autonomy, mixed initiative, and human and robot team interaction. A broad range of applications of this technology are presented in this volume, including UCAVs (Unmanned Combat Air Vehicles), micro-air vehicles, UUVs (Unmanned Underwater Vehicles), UGVs (Unmanned Ground Vehicles), planetary exploration, assembly in space, clean-up, and urban search and rescue. This Proceedings Volume represents the contributions of the top researchers in this field and serves as a valuable tool for professionals in this interdisciplinary field.
Information engineering and applications is the field of study concerned with constructing information computing, intelligent systems, mathematical models, numerical solution techniques, and using computers and other electronic devices to analyze and solve natural scientific, social scientific and engineering problems. Information engineering is an important underpinning for techniques used in information and computational science and there are many unresolved problems worth studying. The Proceedings of the 2nd International Conference on Information Engineering and Applications (IEA 2012), which was held in Chongqing, China, from October 26-28, 2012, discusses the most innovative research and developments including technical challenges and social, legal, political, and economic issues. A forum for engineers and scientists in academia, industry, and government, the Proceedings of the 2nd International Conference on Information Engineering and Applications presents ideas, results, works in progress, and experience in all aspects of information engineering and applications.
With the development of networked computing and the increased complexity of applications and software systems development, the importance of computer-supported collaborative work (CSCW) has dramatically increased. Globalization has further accentuated the necessity of collaboration, while the Web has made geographically distributed collaborative systems technologically feasible in a manner that was impossible until recently. The software environments needed to support such distributed teams are referred to as Groupware. Groupware is intended to address the logistical, managerial, social, organizational, and cognitive difficulties that arise in the application of distributed expertise. These issues represent the fundamental challenges to the next generation of process management. Computer-Supported Collaboration with Applications to Software Development reviews the theory of collaborative groups and the factors that affect collaboration, particularly collaborative software development. The influences considered derive from diverse sources: social and cognitive psychology, media characteristics, the problem-solving behavior of groups, process management, group information processing, and organizational effects. It also surveys empirical studies of computer-supported problem solving, especially for software development. The concluding chapter describes a collaborative model for program development. Computer-Supported Collaboration with Applications to Software Development is designed for an academic and professional market in software development: professionals and researchers in the areas of software engineering, collaborative development, management information systems, problem solving, and cognitive and social psychology. This book also meets the needs of graduate-level students in computer science and information systems.
Internet heterogeneity is driving a new challenge in application development: adaptive software. Alongside increased Internet capacity and new access technologies, network congestion, the use of older technologies, wireless access, and peer-to-peer networking are increasing the heterogeneity of the Internet. Applications should provide gracefully degraded levels of service when network conditions are poor, and enhanced services when network conditions exceed expectations. Existing adaptive technologies, which are primarily end-to-end or proxy-based and often focus on a single deficient link, can perform poorly in heterogeneous networks. Instead, heterogeneous networks frequently require multiple, coordinated, and distributed remedial actions. Conductor: Distributed Adaptation for Heterogeneous Networks describes a new approach to graceful degradation in the face of network heterogeneity: distributed adaptation, in which adaptive code is deployed at multiple points within a network. The feasibility of this approach is demonstrated by Conductor, a middleware framework that enables distributed adaptation of connection-oriented, application-level protocols. By adapting protocols, Conductor provides application-transparent adaptation, supporting both existing applications and applications designed with adaptation in mind. Conductor: Distributed Adaptation for Heterogeneous Networks introduces new techniques that enable distributed adaptation, making it automatic, reliable, and secure. In particular, we introduce the notion of semantic segmentation, which maintains exactly-once delivery of the semantic elements of a data stream while allowing the stream to be arbitrarily adapted in transit. We also introduce a secure architecture for automatic adaptor selection, protecting user data from unauthorized adaptation. These techniques are described both in the context of Conductor and in the broader context of distributed systems.
Finally, this book presents empirical evidence from several case studies indicating that distributed adaptation can allow applications to degrade gracefully in heterogeneous networks, providing a higher quality of service to users than other adaptive techniques. Further, experimental results indicate that the proposed techniques can be employed without excessive cost. Thus, distributed adaptation is both practical and beneficial. Conductor: Distributed Adaptation for Heterogeneous Networks is designed to meet the needs of a professional audience composed of researchers and practitioners in industry and graduate-level students in computer science.
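The core idea behind semantic segmentation can be sketched in a few lines: tag each semantic unit of a stream with a sequence number, so that in-network adaptors may rewrite payloads in transit while the receiver still enforces exactly-once, in-order delivery. The framing, function names, and the uppercasing "adaptation" below are illustrative inventions, not Conductor's actual API or wire format.

```python
# Hedged sketch of the idea behind semantic segmentation:
# adaptors may rewrite unit payloads, but sequence tags survive,
# so the receiver can enforce exactly-once, in-order delivery.

def segment(units):
    """Tag each semantic unit with a sequence number before transmission."""
    return [{"seq": i, "data": d} for i, d in enumerate(units)]

def adapt(segments, transform):
    """An in-network adaptor may rewrite payloads but preserves seq tags."""
    return [{"seq": s["seq"], "data": transform(s["data"])} for s in segments]

def deliver(segments):
    """Receiver: accept each semantic unit at most once, in order."""
    delivered, expected = [], 0
    for s in segments:
        if s["seq"] == expected:  # duplicates and out-of-order units are dropped
            delivered.append(s["data"])
            expected += 1
    return delivered

stream = segment(["hello", "world"])
adapted = adapt(stream, str.upper)   # an illustrative in-transit rewrite
received = adapted + [adapted[1]]    # the network duplicates one unit
print(deliver(received))             # -> ['HELLO', 'WORLD']
```

The point of the sketch is that adaptation and reliability are decoupled: the payload was arbitrarily rewritten in transit, yet the duplicate unit was still discarded by the sequence-number check.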
Healthcare is significantly affected by technological advancements, as technology both shapes and changes health systems locally and globally. As the areas of computer science, information technology, and healthcare merge, it is important to understand the current and future implications of health informatics. Healthcare and the Effect of Technology: Developments, Challenges and Advancements bridges the gap between today's empirical research findings and healthcare practice. It provides the reader with information on current technological integrations, potential uses for technology in healthcare, and the implications, both positive and negative, of health informatics for one's health. Technology in healthcare can improve efficiency, make patient records more accessible, increase professional communication, create global health networking, and increase access to healthcare. However, it is important to consider the ethical, confidentiality, and cultural implications technology in healthcare may impose. That is what makes this book a must-read for policymakers, human resource professionals, and management personnel, as well as for researchers, scholars, students, and healthcare professionals.
This book represents the compilation of papers presented at the IFIP Working Group 8.2 conference entitled "Information Technology in the Service Economy: Challenges and Possibilities for the 21st Century." The conference took place at Ryerson University, Toronto, Canada, on August 10-13, 2008. Participation in the conference spanned the continents from Asia to Europe, with paper submissions global in focus as well. Conference submissions included completed research papers and research-in-progress reports. Papers submitted to the conference went through a double-blind review process in which the program co-chairs, an associate editor, and reviewers provided assessments and recommendations. The editorial efforts of the associate editors and reviewers in this process were outstanding. To foster high-quality research publications in this field of study, authors of accepted papers were then invited to revise and resubmit their work. Through this rigorous review and revision process, 12 completed research papers and 11 research-in-progress reports were accepted for presentation and publication. Paper workshop sessions were also established to provide authors of emergent work an opportunity to receive feedback from the IFIP 8.2 community. Abstracts of these new projects are included in this volume. Four panels were presented at the conference to provide discussion forums for the varied aspects of IT, service, and globalization. Panel abstracts are also included here.