Business-to-business (B2B) integration is a buzzword that has been used a lot in recent years, with a variety of meanings. Starting with a clear technical definition of this term and its relation to topics like A2A (Application-to-Application), ASP (Application Service Provider), and B2C (Business-to-Consumer), Christoph Bussler outlines a complete and consistent B2B integration architecture based on a coherent conceptual model. He shows that B2B integration not only requires the exchange of business events between distributed trading partners across networks like the Internet, but also demands back-end application integration within business processes, and thus goes far beyond traditional approaches to enterprise application integration. His detailed presentation describes how B2B integration standards like RosettaNet or SWIFT, the application integration standard J2EE Connector Architecture, and basic standards like XML act together to enable business process integration. The book is the first of its kind to discuss B2B concepts and architectures independently of specific, short-term industrial or academic approaches, and thus provides solid and long-lasting knowledge for researchers, students, and professionals interested in the field of B2B integration.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about vagueness of expectations appears adequate to real situations, in which the coalitional bargaining anticipates a proper realization of the game with strategic behaviour of the players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision-making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which has recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on the two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set theoretical approaches to game-theoretical models of human behaviour.
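A rough aside, not from the book: vaguely known pay-offs of this kind are commonly modelled by triangular fuzzy quantities. The Python sketch below is purely illustrative (the class and the numbers are invented); it shows a fuzzy pay-off and the fuzzy addition used when players pool their vague expectations in a coalition.

from dataclasses import dataclass

@dataclass
class TriangularFuzzyNumber:
    low: float   # smallest pay-off considered possible
    peak: float  # most plausible pay-off (membership degree 1)
    high: float  # largest pay-off considered possible

    def __add__(self, other):
        # Adding triangular fuzzy numbers adds the endpoints component-wise.
        return TriangularFuzzyNumber(self.low + other.low,
                                     self.peak + other.peak,
                                     self.high + other.high)

    def membership(self, x):
        # Degree in [0, 1] to which x is a possible realization of the pay-off.
        if self.low < x <= self.peak:
            return (x - self.low) / (self.peak - self.low)
        if self.peak < x < self.high:
            return (self.high - x) / (self.high - self.peak)
        return 1.0 if x == self.peak else 0.0

a = TriangularFuzzyNumber(2.0, 3.0, 5.0)   # player 1's vague expectation
b = TriangularFuzzyNumber(1.0, 4.0, 6.0)   # player 2's vague expectation
coalition = a + b                          # vague pay-off of the coalition {1, 2}
print(coalition)                           # TriangularFuzzyNumber(low=3.0, peak=7.0, high=11.0)
print(coalition.membership(9.0))           # 0.5: earning 9 is half-plausible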
The design process of embedded systems has changed substantially in recent years. One of the main reasons for this change is the pressure to shorten time-to-market when designing digital systems. To shorten product cycles, programmable processors are used to implement more and more of the functionality of the embedded system. Therefore, nowadays, embedded systems are very often implemented as heterogeneous systems consisting of ASICs, processors, memories, and peripherals. As a consequence, the research topic of hardware/software co-design, dealing with the problems of designing these heterogeneous systems, has gained great importance. Hardware/Software Co-design for Data Flow Dominated Embedded Systems introduces the different tasks of hardware/software co-design, including system specification, hardware/software partitioning, co-synthesis, and co-simulation. The book summarizes and classifies state-of-the-art co-design tools and methods for these tasks. In addition, the co-design tool COOL is presented, which solves the co-design tasks for the class of data-flow dominated embedded systems. In Hardware/Software Co-design for Data Flow Dominated Embedded Systems, the primary emphasis is put on the hardware/software partitioning and co-synthesis phases and their coupling. In contrast to many other publications in this area, a mathematical formulation of the hardware/software partitioning problem is given. This problem formulation supports target architectures consisting of multiple processors and multiple ASICs. Several novel approaches for solving the partitioning problem are presented and compared, including an MILP approach, a heuristic solution, and an approach based on genetic algorithms. The co-synthesis phase is based on the idea of controlling the system by means of a static run-time scheduler implemented in hardware. New algorithms are introduced which generate a complete set of hardware and software specifications required to implement heterogeneous systems. All of these techniques are described in detail and exemplified. Hardware/Software Co-design for Data Flow Dominated Embedded Systems is intended to serve students and researchers working on hardware/software co-design. At the same time, the variety of presented techniques for automating the design tasks of hardware/software systems will be of interest to industrial engineers and designers of digital systems. From the foreword by Peter Marwedel: "Niemann's method should be known by all persons working in the field. Hence, I recommend this book for everyone who is interested in hardware/software co-design."
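To make the partitioning task concrete, here is a toy Python sketch with an invented cost model: each task runs either in software (slower, no silicon cost) or in hardware (faster, costs ASIC area), and we search for the minimum-area assignment that still meets a deadline. The exhaustive search merely stands in for the MILP, heuristic, and genetic-algorithm approaches the book actually develops; the task names and numbers are hypothetical.

from itertools import product

# (name, software_time, hardware_time, hardware_area) -- illustrative values
tasks = [("fir", 8.0, 2.0, 40.0),
         ("fft", 12.0, 3.0, 70.0),
         ("ctl", 3.0, 1.0, 25.0)]
DEADLINE = 15.0

best = None
for assignment in product(("SW", "HW"), repeat=len(tasks)):
    time = sum(sw if a == "SW" else hw
               for a, (_, sw, hw, _) in zip(assignment, tasks))
    area = sum(ar for a, (_, _, _, ar) in zip(assignment, tasks) if a == "HW")
    if time <= DEADLINE and (best is None or area < best[0]):
        best = (area, assignment)

print(best)  # (65.0, ('HW', 'SW', 'HW')): fir and ctl move into hardware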
Volume 11, Reviews in Computational Chemistry, Kenny B. Lipkowitz and Donald B. Boyd. The theme of this eleventh volume is computer-aided ligand design and modeling of biomolecules. A stellar group of scientists from around the world join in this volume to provide tutorials for beginners and experts. Chapters 1 and 2 take a detailed look at de novo design methodologies for discovering new ligands which may become pharmaceuticals. Chapters 3 and 4 cover the methods and applications of three-dimensional quantitative structure-activity relationships (3D-QSAR) currently used in drug discovery. Ways to compute the correct lipophilic/hydrophilic behavior of molecules are taught in Chapter 5. Chapter 6 is an exposition of realistically simulating DNA in the complex milieu of ions that surround it. An appendix to this volume gives a compendium of software and Internet tools for computational chemistry. From reviews of the series: "This well-respected series continues the fine selection of topics and presentation qualities set forth by the previous members. For example, each chapter contains thorough treatment of the theory behind the topic being covered. Moreover, the background material is followed by ample timely examples culled from recent literature." (Journal of Medicinal Chemistry)
Research argues that e-government technologies have positive influences on politics and democracy, improving citizens' environment as well as their engagement with their government. Although much research indicates that e-government technologies have increased citizen participation, there is much more that can be developed. Politics, Democracy and E-Government: Participation and Service Delivery examines how e-government impacts politics and democracy in both developed and developing countries, discussing the participation of citizens in government service delivery. This book brings forth the idea that e-government has a direct influence on the important function of governing through participation and service delivery. Containing chapters from leading e-government scholars and practitioners from across the globe, the book accomplishes its overall objective through its discussion of the influences of e-government on democratic institutions and processes.
Most of the intriguing social phenomena of our time, such as international terrorism, social inequality, and urban ethnic segregation, are consequences of complex forms of agent interaction that are difficult to observe methodically and experimentally. This book looks at a new research stream that makes use of advanced computer simulation modelling techniques to spotlight agent interaction, allowing us to explain the emergence of social patterns. It presents a method for pursuing analytical sociology investigations that look at relevant social mechanisms in various empirical situations, such as markets, urban cities, and organisations. This book: provides a comprehensive introduction to the epistemological, theoretical, and methodological features of agent-based modelling in sociology through various discussions and examples; presents the pros and cons of using agent-based models in sociology; explores how agent-based models combine quantitative and qualitative aspects, and micro- and macro-levels of analysis; looks at how to pose an agent-based research question, identify the model building blocks, and validate simulation results; features examples of agent-based models that look at crucial sociology issues; and is supported by an accompanying website featuring data sets and code for the models included in the book. "Agent-Based Computational Sociology" is written in a common sociological language and features examples of models that look at all the traditional explanatory challenges of sociology. Researchers and graduate students involved in the field of agent-based modelling and computer simulation in areas such as social sciences, cognitive sciences and computer sciences will benefit from this book.
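For readers who have not seen an agent-based model before, here is a minimal Schelling-style segregation sketch in Python, the textbook example of macro-level patterns emerging from micro-level interaction. It is purely illustrative and is not taken from the book or its companion website; the grid size and tolerance threshold are arbitrary.

import random

random.seed(1)
SIZE = 20        # the city is a SIZE x SIZE grid
THRESHOLD = 0.5  # an agent is content if at least half its neighbours match it

cells = ["A"] * 180 + ["B"] * 180 + [None] * 40  # two groups plus empty sites
random.shuffle(cells)
grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

def neighbours(r, c):
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr or dc) and 0 <= r + dr < SIZE and 0 <= c + dc < SIZE:
                yield grid[r + dr][c + dc]

def unhappy(r, c):
    kind = grid[r][c]
    occupied = [n for n in neighbours(r, c) if n is not None]
    return bool(occupied) and sum(n == kind for n in occupied) / len(occupied) < THRESHOLD

for step in range(50):  # each round, every unhappy agent moves to a random empty site
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE)
              if grid[r][c] is not None and unhappy(r, c)]
    if not movers:
        break
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
    for r, c in movers:
        er, ec = empties.pop(random.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], None
        empties.append((r, c))

content = sum(not unhappy(r, c) for r in range(SIZE) for c in range(SIZE)
              if grid[r][c] is not None)
print(f"content agents after {step} rounds: {content} / 360")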
Sadly enough, war, conflicts and terrorism appear to stay with us in the 21st century. But what is our outlook on new methods for preventing and ending them? Present-day hardware and software enable the development of large crisis, conflict, and conflict-management databases with many variables, sometimes with automated updates, statistical analyses of high complexity, elaborate simulation models, and even interactive uses of these databases. In this book, these methods are presented, further developed, and applied in relation to the main issue: the resolution and prevention of intra- and international conflicts. Conflicts are a worldwide phenomenon, and therefore internationally leading researchers from the USA, Austria, Canada, Germany, New Zealand and Switzerland have contributed.
CAO is one of the most misunderstood and underutilized weapons available to retailers today. International consultant Barbara Anderson makes clear that only in a limited sense does CAO replace manual ordering. In its full sense it is much more: the optimization of manufacturer, supplier, and retailer distribution to the retail store, based on consumer and store data and corporate policy. Anderson thus provides a framework and checklist for implementing CAO, an understanding of key terminology, solutions to likely problems, and ways to make CAO implementation successful, and in doing so she covers the full spectrum of retailing. A readable, easily grasped, comprehensive, unique book for retailing management and for their colleagues teaching it in colleges and universities. Anderson points out that CAO is not an off-the-shelf system but an ongoing project, each phase with its own unique set of benefits and cost justification. Retail systems must support a vision where a product may bypass the store on the way to the consumer, or even the distribution center on the way to the stores. Consumers have a wide range of choices, not only of where to shop but how to shop, and this demands ever greater levels of service. CAO systems help assure that the correct product is available at the store, that it can be located throughout the supply chain, and that it can be moved easily from any location. In CAO, all levels of operation work with real-time information, using decision-making tools that react to and learn from new information. Her book thus shows that there is no one right system, product, or approach for successful CAO: it is too big a leap to make in one step, consisting instead of modules and functions that can grow in sophistication over time, and not all retailers, nor all categories within one retailer, will use the same methods for forecasting and ordering. She also shows that the distinct separation of replenishment product from planning product is artificially imposed, and that the separation of headquarters from stores is also artificial. Indeed, integration does not mean the integration of separate systems, but of the business functions themselves. Readers will thus get not only a knowledgeable discussion of what CAO should be, what it is, and how it works, but an immediately useful understanding of how to make it work in their own companies.
This comprehensive, detailed reference to Mathematica provides the reader with both a working knowledge of Mathematica programming in general and a detailed knowledge of the key aspects of Mathematica needed to create the fastest, shortest, and most elegant implementations possible for solving problems from the natural and physical sciences. The Guidebook gives the user a deeper understanding of Mathematica through instructive implementations, explanations, and examples from a range of disciplines at varying levels of complexity. "Programming" covers the structure of Mathematica expressions, following an overview in chapter 1 of the syntax of Mathematica and its programming, graphic, numeric, and symbolic capabilities. Chapters 2-6 cover the hierarchical construction of all Mathematica objects out of symbolic expressions, the definition of functions, the recognition of patterns and their efficient application, program flow and program structuring, the manipulation of lists, and additional topics. An appendix contains general references on algorithms and applications of computer algebra, on Mathematica itself, and on comparisons of various computer algebra systems. The multiplatform CD contains Mathematica 4.1 notebooks with detailed descriptions and explanations of the Mathematica commands needed in each chapter and used in applications, supplemented by a variety of mathematical, physical, and graphic examples and worked-out solutions to all exercises. The Mathematica Guidebook is an indispensable resource for practitioners, researchers, and professionals in mathematics, the sciences, and engineering. It will find a natural place on the bookshelf as an essential reference work.
Science has made great progress in the twentieth century, with the establishment of proper disciplines in the fields of physics, computer science, molecular biology, and many others. At the same time, there have also emerged many engineering ideas that are interdisciplinary in nature, beyond the realm of such orthodox disciplines. These include, for example, artificial intelligence, fuzzy logic, artificial neural networks, evolutionary computation, data mining, and so on. In order to generate new technology that is truly human-friendly in the twenty-first century, the integration of various methods beyond specific disciplines is required. Soft computing is a key concept for the creation of such human-friendly technology in our modern information society. Professor Rutkowski is a pioneer in this field, having devoted himself for many years to publishing a large variety of original work. The present volume, based mostly on his own work, is a milestone in the development of soft computing, integrating various disciplines from the fields of information science and engineering. The book consists of three parts, the first of which is devoted to probabilistic neural networks. Neural excitation is stochastic, so it is natural to investigate the Bayesian properties of the connectionist structures developed by Professor Rutkowski. This new approach has proven to be particularly useful for handling regression and classification problems in time-varying environments. Throughout this book, major themes are selected from theoretical subjects that are tightly connected with challenging applications.
Reputation in Artificial Societies discusses the role of reputation in the achievement of social order. The book proposes that reputation is an agent property that results from transmission of beliefs about how the agents are evaluated with regard to a socially desirable conduct. This desirable conduct represents one or another of the solutions to the problem of social order and may consist of cooperation or altruism, reciprocity, or norm obedience.
This Proceedings Volume documents recent cutting-edge developments in multi-robot systems research and is the result of the Second International Workshop on Multi-Robot Systems, held in March 2003 at the Naval Research Laboratory in Washington, D.C. This workshop brought together top researchers working in areas relevant to designing teams of autonomous vehicles, including robots and unmanned ground, air, surface, and undersea vehicles. The workshop focused on the challenging issues of team architectures, vehicle learning and adaptation, heterogeneous group control and cooperation, task selection, dynamic autonomy, mixed initiative, and human and robot team interaction. A broad range of applications of this technology are presented in this volume, including UCAVs (Unmanned Combat Air Vehicles), micro-air vehicles, UUVs (Unmanned Underwater Vehicles), UGVs (Unmanned Ground Vehicles), planetary exploration, assembly in space, clean-up, and urban search and rescue. This Proceedings Volume represents the contributions of the top researchers in this field and serves as a valuable tool for professionals in this interdisciplinary field.
Computational finance deals with the mathematics of computer programs that realize financial models or systems. This book outlines the epistemic risks associated with the current valuations of different financial instruments and discusses the corresponding risk management strategies. It covers most of the research and practical areas in computational finance. Starting from traditional fundamental analysis and using algebraic and geometric tools, it is guided by the logic of science to explore information from financial data without prejudice. In fact, this book has the unique feature that it is structured around the simple requirement of objective science: the geometric structure of the data = the information contained in the data.
There are many myths about Artificial Intelligence (AI) relating to what it is and what it can and cannot do. The people making decisions on AI projects are often not technologically savvy and unable to find easy answers. The spending on and the returns from AI projects are not necessarily straightforward. Part of the reason for this is the lack of understanding of the impact of critical decision criteria. AI touches on many ethical concepts - data privacy, validity, and, more importantly, its potential misuse. AI often replaces human decision-making, as managers do not clearly understand the implications of those choices. This book provides an easy and accessible guide for practitioners without a technological background to understand AI. It guides the reader through the fundamental issues confronting decision-makers. It offers advice on 'how to ask relevant questions' using the 15 decision scales. There is currently no comparable book on the market that acts as a pocketbook management reference guide for the AI layman.
A complete introduction to the many mathematical tools used to solve practical problems in coding. Mathematicians have been fascinated with the theory of error-correcting codes since the publication of Shannon's classic papers fifty years ago. With the proliferation of communications systems, computers, and digital audio devices that employ error-correcting codes, the theory has taken on practical importance in the solution of coding problems. This solution process requires the use of a wide variety of mathematical tools and an understanding of how to find mathematical techniques to solve applied problems. Introduction to the Theory of Error-Correcting Codes, Third Edition demonstrates this process and prepares students to cope with coding problems. Like its predecessor, which was awarded a three-star rating by the Mathematical Association of America, this updated and expanded edition gives readers a firm grasp of the timeless fundamentals of coding as well as the latest theoretical advances.
Introduction to the Theory of Error-Correcting Codes, Third Edition is the ideal textbook for senior-undergraduate and first-year graduate courses on error-correcting codes in mathematics, computer science, and electrical engineering.
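As a quick, hedged taste of the subject (not an excerpt from the book), the Python sketch below implements the classic Hamming(7,4) code, which protects four data bits with three parity bits and corrects any single flipped bit:

def encode(d):
    d1, d2, d3, d4 = d                   # four data bits
    p1 = d1 ^ d2 ^ d4                    # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                    # parity over codeword positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4                    # parity over codeword positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]  # codeword, positions 1..7

def decode(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]       # re-check positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]       # re-check positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]       # re-check positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3      # 1-based position of the error, 0 if none
    if syndrome:
        c = c.copy()
        c[syndrome - 1] ^= 1             # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]      # recover the data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                             # corrupt one bit in transit
assert decode(word) == [1, 0, 1, 1]      # the single error is corrected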
Introduction The exponential scaling of feature sizes in semiconductor technologies has side-effects on layout optimization, related to effects such as interconnect delay, noise and crosstalk, signal integrity, parasitic effects, and power dissipation, that invalidate the assumptions that form the basis of previous design methodologies and tools. This book is intended to sample the most important, contemporary, and advanced layout optimization problems emerging with the advent of very deep submicron technologies in semiconductor processing. We hope that it will stimulate more people to perform research that leads to advances in the design and development of more efficient, effective, and elegant algorithms and design tools. Organization of the Book The book is organized as follows. A multi-stage simulated annealing algorithm that integrates floorplanning and interconnect planning is presented in Chapter 1. To reduce the run time, different interconnect planning approaches are applied in different ranges of temperatures. Chapter 2 introduces a new design methodology, the interconnect-centric design methodology, and its centerpiece, interconnect planning, which consists of physical hierarchy generation, floorplanning with interconnect planning, and interconnect architecture planning. Chapter 3 investigates a net-cut minimization based placement tool, Dragon, which integrates state-of-the-art partitioning and placement techniques.
This book covers the wide-ranging scientific areas of computational science, from basic research fields such as algorithms and soft-computing to diverse applied fields targeting macro, micro, nano, genome and complex systems. It presents the proceedings of the International Symposium on Frontiers of Computational Science 2005, held in Nagoya in December 2005.
This lively and fascinating text traces the key developments in computation - from 3000 B.C. to the present day - in an easy-to-follow and concise manner. Topics and features: ideal for self-study, offering many pedagogical features such as chapter-opening key topics, chapter introductions and summaries, exercises, and a glossary; presents detailed information on major figures in computing, such as Boole, Babbage, Shannon, Turing, Zuse and Von Neumann; reviews the history of software engineering and of programming languages, including syntax and semantics; discusses the progress of artificial intelligence, with extension to such key disciplines as philosophy, psychology, linguistics, neural networks and cybernetics; examines the impact on society of the introduction of the personal computer, the World Wide Web, and the development of mobile phone technology; follows the evolution of a number of major technology companies, including IBM, Microsoft and Apple.
Internet heterogeneity is driving a new challenge in application development: adaptive software. Alongside increased Internet capacity and new access technologies, network congestion, the continued use of older technologies, wireless access, and peer-to-peer networking are increasing the heterogeneity of the Internet. Applications should provide gracefully degraded levels of service when network conditions are poor, and enhanced services when network conditions exceed expectations. Existing adaptive technologies, which are primarily end-to-end or proxy-based and often focus on a single deficient link, can perform poorly in heterogeneous networks. Instead, heterogeneous networks frequently require multiple, coordinated, and distributed remedial actions. Conductor: Distributed Adaptation for Heterogeneous Networks describes a new approach to graceful degradation in the face of network heterogeneity, distributed adaptation, in which adaptive code is deployed at multiple points within a network. The feasibility of this approach is demonstrated by Conductor, a middleware framework that enables distributed adaptation of connection-oriented, application-level protocols. By adapting protocols, Conductor provides application-transparent adaptation, supporting both existing applications and applications designed with adaptation in mind. Conductor: Distributed Adaptation for Heterogeneous Networks introduces new techniques that enable distributed adaptation, making it automatic, reliable, and secure. In particular, the authors introduce the notion of semantic segmentation, which maintains exactly-once delivery of the semantic elements of a data stream while allowing the stream to be arbitrarily adapted in transit. They also introduce a secure architecture for automatic adaptor selection, protecting user data from unauthorized adaptation. These techniques are described both in the context of Conductor and in the broader context of distributed systems. Finally, this book presents empirical evidence from several case studies indicating that distributed adaptation can allow applications to degrade gracefully in heterogeneous networks, providing a higher quality of service to users than other adaptive techniques. Further, experimental results indicate that the proposed techniques can be employed without excessive cost. Thus, distributed adaptation is both practical and beneficial. Conductor: Distributed Adaptation for Heterogeneous Networks is designed to meet the needs of a professional audience composed of researchers and practitioners in industry and graduate-level students in computer science.
Learn application security from the very start with this comprehensive and approachable guide! Alice and Bob Learn Application Security is an accessible and thorough resource for anyone seeking to incorporate best security practices into software development from the beginning of the system development life cycle. This book covers all the basic subjects, such as threat modeling and security testing, but also dives deep into more complex and advanced topics for securing modern software systems and architectures. Throughout, the book offers analogies, stories of the characters Alice and Bob, real-life examples, technical explanations, and diagrams to ensure maximum clarity of the many abstract and complicated subjects. Topics include: secure requirements, design, coding, and deployment; security testing (all forms); common pitfalls; application security programs; securing modern applications; and software developer security hygiene. Alice and Bob Learn Application Security is perfect for aspiring application security engineers and practicing software developers, as well as software project managers, penetration testers, and chief information security officers who seek to build or improve their application security programs. It illustrates all the included concepts with easy-to-understand examples and concrete practical applications, furthering the reader's ability to grasp and retain the foundational and advanced topics contained within.
Auctions have long been a popular method for allocation and procurement of products and services. Traditional auctions are constrained by time, place, number of bidders, number of bids, and the bidding experience. With the advent of Internet communication technologies, the online auction environment has blossomed to support a bustling enterprise. Up until now, the functional inner workings of these online exchange mechanisms have only been described in anecdotal accounts. Best Practices for Online Procurement Auctions offers a systematic approach to auction examination that will prove invaluable to practitioners and researchers alike.