This book is about the verification of reactive systems. A reactive system is a system that maintains an ongoing interaction with its environment, as opposed to computing some final value on termination. The family of reactive systems includes many classes of programs whose correct and reliable construction is considered to be particularly challenging, including concurrent programs, embedded and process control programs, and operating systems. Typical examples of such systems are an air traffic control system, programs controlling mechanical devices such as a train, or perpetually ongoing processes such as a nuclear reactor. With the expanding use of computers in safety-critical areas, where failure is potentially disastrous, correctness is crucial. This has led to the introduction of formal verification techniques, which give both users and designers of software and hardware systems greater confidence that the systems they build meet the desired specifications. Framework: The approach promoted in this book is based on the use of temporal logic for specifying properties of reactive systems, and develops an extensive verification methodology for proving that a system meets its temporal specification. Reactive programs must be specified in terms of their ongoing behavior, and temporal logic provides an expressive and natural language for specifying this behavior. Our framework for specifying and verifying temporal properties of reactive systems is based on the following four components: 1. A computational model to describe the behavior of reactive systems. The model adopted in this book is that of a Fair Transition System (FTS).
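As a rough illustration of the kind of reasoning the blurb describes (a sketch of my own, not taken from the book; the toy mutual-exclusion system, its encoding and all names are invented, and the fairness constraints needed for liveness properties are omitted), a transition system can be represented as states plus a successor relation, and the simplest temporal property, an invariance ("always p"), can be checked by exploring the reachable states:

```python
# Minimal sketch: a transition system for two processes that cycle through
# idle -> trying -> crit, gated by a shared turn variable, and a breadth-first
# check of the invariance property "always not (both processes critical)".

from collections import deque

def initial_states():
    # state = (pc1, pc2, turn); each pc is 'idle', 'trying' or 'crit'
    return [("idle", "idle", 1)]

def successors(state):
    pc1, pc2, turn = state
    succs = []
    # process 1 transitions
    if pc1 == "idle":
        succs.append(("trying", pc2, turn))
    elif pc1 == "trying" and turn == 1:
        succs.append(("crit", pc2, turn))
    elif pc1 == "crit":
        succs.append(("idle", pc2, 2))      # release and pass the turn
    # process 2 transitions (symmetric)
    if pc2 == "idle":
        succs.append((pc1, "trying", turn))
    elif pc2 == "trying" and turn == 2:
        succs.append((pc1, "crit", turn))
    elif pc2 == "crit":
        succs.append((pc1, "idle", 1))
    return succs

def invariant(state):
    pc1, pc2, _ = state
    return not (pc1 == "crit" and pc2 == "crit")   # mutual exclusion

def check_invariance():
    seen, frontier = set(), deque(initial_states())
    while frontier:
        s = frontier.popleft()
        if s in seen:
            continue
        seen.add(s)
        if not invariant(s):
            return False, s
        frontier.extend(successors(s))
    return True, None

if __name__ == "__main__":
    ok, bad = check_invariance()
    print("invariant holds" if ok else f"violated in state {bad}")
```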
This book is a result of the Seventh International Conference on Information Systems Development - Methods and Tools, Theory and Practice, held in Bled, Slovenia, September 21-23, 1998. The purpose of the conference was to address issues facing academia and industry when specifying, developing, managing, and improving computerized information systems. During the past few years, many new concepts and approaches emerged in the Information Systems Development (ISD) field. The various theories, methods, and tools available to system developers also bring problems, such as choosing the most effective approach for a specific task. This conference provides a meeting place for IS researchers and practitioners from Eastern and Western Europe as well as from other parts of the world. An objective of the conference is not only to share scientific knowledge and interests but also to establish strong professional ties among the participants. The Seventh International Conference on Information Systems Development (ISD'98) continues the concepts of the first Polish-Scandinavian Seminar on Current Trends in Information Systems Development Methodologies, held in Gdansk, Poland in 1988. Through the years, the Seminar developed into the International Conference on Information Systems Development. ISD'99 will be held in Boise, Idaho. The selection of papers was carried out by the International Program Committee. All papers were reviewed in advance by three people and judged according to their originality, relevance, and presentation quality, on their own merits and independent of other submissions.
The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.
In this book we describe new methods for intelligent manufacturing using soft computing techniques and fractal theory. Soft Computing (SC) consists of several computing paradigms, including fuzzy logic, neural networks, and genetic algorithms, which can be used to produce powerful hybrid intelligent systems. Fractal theory provides us with the mathematical tools to understand the geometrical complexity of natural objects and can be used for identification and modeling purposes. Combining SC techniques with fractal theory, we can take advantage of the "intelligence" provided by the computer methods and also of the descriptive power of the fractal mathematical tools. Industrial manufacturing systems can be considered as non-linear dynamical systems, and as a consequence can have highly complex dynamic behaviors. For this reason, the need for computational intelligence in these manufacturing systems has now been well recognized. We consider in this book the concept of "intelligent manufacturing" as the application of soft computing techniques and fractal theory for achieving the goals of manufacturing, which are production planning and control, monitoring and diagnosis of faults, and automated quality control. As a prelude, we provide a brief overview of the existing methodologies in Soft Computing. We then describe our own approach to dealing with the problems in achieving intelligent manufacturing. Our particular point of view is that to really achieve intelligent manufacturing in real-world applications we need to use SC techniques and fractal theory.
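As a small illustration of the fractal tools the blurb mentions (a generic sketch, not the authors' method; the test pattern, box sizes and threshold are illustrative assumptions), the box-counting dimension of a binary pattern can be estimated as follows:

```python
# Rough sketch: estimating the box-counting (fractal) dimension of a binary
# image, one common way to quantify geometrical complexity.  The test pattern
# here is random; in practice it might be a segmented image of a surface.

import numpy as np

def box_count(image, box_size):
    """Number of box_size x box_size boxes containing at least one set pixel."""
    h, w = image.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if image[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def box_counting_dimension(image, box_sizes=(1, 2, 4, 8, 16, 32)):
    counts = np.array([box_count(image, s) for s in box_sizes], dtype=float)
    sizes = np.array(box_sizes, dtype=float)
    # The slope of log N(s) versus log(1/s) estimates the dimension.
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((256, 256)) > 0.7          # placeholder binary pattern
    print(f"estimated box-counting dimension: {box_counting_dimension(img):.2f}")
```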
The vigorous development of the internet and other information technologies has significantly expanded the amount and variety of sources of information available for decision making. This book presents current trends in soft computing applications in the fields of measurement and information acquisition. The main topics are the production and presentation of information, including multimedia, virtual environments, and computer animation, as well as the improvement of decisions made on the basis of this information in various applications ranging from engineering to business. In order to make high-quality decisions, one has to fuse information of different kinds from a variety of sources with differing degrees of reliability and uncertainty. The necessity of using intelligent methodologies in the analysis of such systems is demonstrated, as well as the inspiring relation of computational intelligence to its natural counterpart. This book includes several contributions demonstrating a further movement towards interdisciplinary collaboration between the biological and computer sciences, with examples from biology and robotics.
A day does not pass without a newspaper report about yet another company that has started outsourcing technology or other business processes to India. The Senate recently voted 70 to 26 in favor of preventing federal contracts going offshore, yet US managers continue to beat a path to India because it is the global leader for offshore IT-enabled services. Many CEOs seek to reduce their costs or improve service quality, but not many understand India on their first visit and some are confused by the culture. In this book author Mark Kobayashi-Hillary introduces India and the major players in the Indian service industry. He offers a balanced view on the trend to outsource to India, describing the reasons why a business should utilize India as an offshore outsourcing destination and the steps needed to find and work with a local partner. Not only does the book make a compelling economic case for outsourcing to this region, it also discusses how to manage the entire transition process, including the potential impact on local resources. Mark Kobayashi-Hillary is a British writer and independent outsourcing consultant based in London. He has worked at a senior level for several leading banking and technology groups and has been involved in managing outsourced relationships in the UK, Singapore and India. He is a regular commentator on India and outsourcing in the European press. Outsourcing To India is written from personal experience and several years of research. This practical guide will help managers navigate through the offshore outsourcing maze, allowing them to avoid many of the major pitfalls others have faced when setting up shop in India.
Symmetries and Groups in Signal Processing: An Introduction deals with the subject of symmetry, and with its place and role in modern signal processing. In the sciences, symmetry considerations and related group-theoretic techniques have had a place of central importance since the early 1920s. In engineering, however, a matching recognition of their power is a relatively recent development. Despite that, the related literature, in the form of journal papers and research monographs, has grown enormously. A proper understanding of the concepts that have emerged in the process requires a mathematical background that goes beyond what is traditionally covered in an engineering undergraduate curriculum. Admittedly, there is a wide selection of excellent introductory textbooks on the subject of symmetry and group theory, but they are all primarily addressed to students of the sciences and mathematics. Addressed to students with an engineering background, this book is meant to help bridge the gap.
This book is an extension of one author's doctoral thesis on the false path problem. The work was begun with the idea of systematizing the various solutions to the false path problem that had been proposed in the literature, with a view to determining the computational expense of each versus the gain in accuracy. However, it became clear that some of the proposed approaches in the literature were wrong, in that they underestimated the critical delay of some circuits under reasonable conditions. Further, some other approaches were vague and so of questionable accuracy. The focus of the research therefore shifted to establishing a theory (the viability theory) and algorithms which could be guaranteed correct, and then using this theory to justify (or not) existing approaches. Our quest was successful enough to justify presenting the full details in a book. After it was discovered that some existing approaches were wrong, it became apparent that the root of the difficulties lay in the attempts to balance computational efficiency and accuracy by separating the temporal and logical (or functional) behaviour of combinational circuits. This separation rests on several unstated assumptions: first, that one can ignore the logical relationships of wires in a network when considering timing behaviour, and second, that one can ignore timing considerations when attempting to discover the values of wires in a circuit.
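To make the notion of a false path concrete (a toy example of my own, not the viability theory or any algorithm from the book; the delays, signal names and circuit are invented), consider two multiplexers sharing a select line:

```python
# Illustrative sketch: a tiny circuit in which the topologically longest path
# can never propagate an event, i.e. a false path.  The late input x1 would
# have to pass through the first mux (needs s = 1) and then through the b-leg
# of the second mux (needs s = 0), which no input vector can satisfy.

from itertools import product

ARRIVAL = {"x1": 10, "x2": 1, "x3": 1, "s": 0}   # assumed input arrival times
MUX_DELAY = 2                                    # assumed delay per multiplexer

def mux(s, a, b):
    """2:1 multiplexer: returns a when s is 1, else b."""
    return a if s else b

def output(s, x1, x2, x3):
    m1 = mux(s, x1, x2)          # first stage
    return mux(s, x3, m1)        # second stage

# A purely topological analysis charges the output with the longest path:
topological_delay = ARRIVAL["x1"] + 2 * MUX_DELAY
print("topological longest-path delay:", topological_delay)       # 14

# But no input vector lets a change on x1 reach the output, so that path is
# false and the true delay is set by the shorter paths from x2, x3 and s.
x1_can_propagate = any(
    output(s, 0, x2, x3) != output(s, 1, x2, x3)
    for s, x2, x3 in product([0, 1], repeat=3)
)
print("path from x1 is sensitizable:", x1_can_propagate)           # False
```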
Initially proposed as rivals of classical logic, alternative logics have become increasingly important in sciences such as quantum physics, computer science, and artificial intelligence. The contributions collected in this volume address and explore the question whether the usage of logic in the sciences, especially in modern physics, requires a deviation from classical mathematical logic. The articles in the first part of the book set the scene by describing the context and the dilemma when applying logic in science. In part II the authors offer several logics that deviate in different ways from classical logics. The twelve papers in part III investigate in detail specific aspects such as quantum logic, quantum computation, computer-science considerations, praxic logic, and quantum probability. Most of the contributions are revised and partially extended versions of papers presented at a conference of the same title, organized by the Academie Internationale de Philosophie des Sciences and held at the Internationales Forschungszentrum Salzburg in May 1999. Others have been added to complete the picture of recent research in alternative logics as they have been developed for applications in the sciences.
Welcome to the proceedings of the Seventh International Conference of the UK Systems Society, being held at York University, United Kingdom from July 7th to 10th, 2002. It is a pleasure to be able to share with you this collection of papers that have been contributed by systems thinkers from around the world. As with previous UKSS conferences, the aim of this conference is to encourage debate and promote development of pertinent issues in systems theory and practice. In current times, where the focus has moved from 'information' to 'knowledge' and where 'knowledge management', 'knowledge assets' and so on have become part of everyday speak, it seemed fitting to offer a conference title of 'Systems Theory and Practice in the Knowledge Age'. In keeping with another tradition of previous conferences, the UKSS Conference 2002 Committee decided to compile a collection of delegates' papers before the event as a platform from which to launch discussions in York. Ideas presented in the following papers will, undoubtedly, be developed during the dialogue generated at the conference, and new papers will emerge. In the abstract for his plenary at this conference, Professor Peter Checkland throws down the gauntlet to systems thinking and its relevance in the knowledge age with the following statement: "30 Years In The Systems Movement: Disappointments I Have Known and Hopes for the Future. Springing from a lunchtime conversation at an American university, the Systems Movement is now nearly 50 years old.
High-Speed Clock Network Design is a collection of design concepts, techniques and research works from the author for clock distribution in microprocessors and high-performance chips. It is organized in 11 chapters as follows. Chapter 1 provides an overview to the design of clock networks. Chapter 2 specifies the timing requirements in digital design. Chapter 3 shows the circuits of sequential elements including latches and flip-flops. Chapter 4 describes the domino circuits, which need special clock signals. Chapter 5 discusses the phase-locked loop (PLL) and delay-locked loop (DLL), which provide the clock generation and de-skewing for the on-chip clock distribution. Chapter 6 summarizes the clock distribution techniques published in the state-of-the-art microprocessor chips. Chapter 7 describes the CAD flow on the clock network simulation. Chapter 8 gives the research work on low-voltage swing clock distribution. Chapter 9 explores the possibility of placing the global clock tree on the package layers. Chapter 10 shows the algorithms of balanced clock routing and wire sizing for the skew minimization. Chapter 11 shows a commercial CAD tool that deals with clock tree synthesis in the ASIC design flow. The glossary is attached at the end of this book. The clock network design is still a challenging task in most high-speed VLSI chips, since the clock frequency and power consumption requirements are increasingly difficult to meet for multiple clock networks on the chip. Many research works and industry examples will be shown in this area to continually improve the clock distribution networks for future high-performance chips.
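As a minimal illustration of the first-order skew estimate that underlies the balanced-routing and wire-sizing topics listed above (a generic Elmore-delay sketch, not a method from the book; all RC values are made-up):

```python
# Elmore delay of an RC ladder: each resistance is charged with all of the
# capacitance downstream of it.  Skew is the difference between branch delays.

def elmore_delay(segments):
    """segments: list of (R_ohms, C_farads) tuples from driver to sink."""
    delay = 0.0
    for i, (r, _) in enumerate(segments):
        downstream_c = sum(c for _, c in segments[i:])
        delay += r * downstream_c
    return delay

# Two branches of a toy clock tree with different (invented) wire sizing.
branch_a = [(50.0, 20e-15)] * 10
branch_b = [(50.0, 20e-15)] * 8 + [(80.0, 35e-15)] * 2
skew = abs(elmore_delay(branch_a) - elmore_delay(branch_b))
print(f"branch delays: {elmore_delay(branch_a):.3e} s, {elmore_delay(branch_b):.3e} s")
print(f"skew: {skew:.3e} s")
```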
Logic Synthesis for Control Automata provides techniques for the logic design of very complex control units with hardly any constraints on their size, i.e. the number of inputs, outputs and states. These techniques cover all stages of control unit design, including: description of control unit behavior by using operator schemes of algorithms (binary decision trees) and various transformations of these descriptions -- composition, decomposition, minimization, etc.; synthesis of a control automaton (finite-state machine); and synthesis of an automaton logic circuit: with a matrix structure as a part of LSI or VLSI circuits; as a multilevel circuit with logic gates; or with standard LSI and VLSI circuits with and without memory. Each chapter contains many examples, illustrating the use of the models and methods described. Moreover, the special last chapter demonstrates in detail the whole design methodology presented in the previous chapters, through the example of the logic design of a control unit. The models, methods and algorithms described in the book can be applied to a broad class of digital system design problems, including the design of complex controllers, robots, and control units of computers, and to the design of CAD systems for VLSI circuits using FPGA, PLD and SIC technologies. Logic Synthesis for Control Automata is a valuable reference for graduate students, researchers and engineers involved in the design of very complex controllers, VLSI circuits and CAD systems. The inclusion of many examples and problems makes it most suitable for a course on the subject.
This is a collection of state-of-the-art surveys on topics at the interface between transportation modeling and operations research given by leading international experts. Based on contributions to a NATO workshop, the surveys are up-to-date and rigorous presentations or applications of quantitative methods in the area. The subjects covered include dynamic traffic simulation techniques and dynamic routing in congested networks, operation and control of traffic management tools, optimized transportation data collection, and vehicle routing problems.
As the world proceeds quickly into the Information Age, it encounters both successes and challenges, and it is well recognized that Intelligent Information Processing provides the key to the Information Age and to mastering many of these challenges. Intelligent Information Processing supports the most advanced productive tools that are said to be able to change human life and the world itself. However, the path is never a straight one and every new technology brings with it a spate of new research problems to be tackled by researchers. As such, the demand for Information Processing research is ever-increasing. This book presents the proceedings of the 4th IFIP International Conference on Intelligent Information Processing. The conference provides a forum for engineers and scientists in academia and industry to present their latest research findings in all aspects of Intelligent Information Processing.
COLLABORATIVE NETWORKS: Becoming a pervasive paradigm. In recent years the area of collaborative networks has been consolidating as a new discipline (Camarinha-Matos, Afsarmanesh, 2005) that encompasses and gives more structured support to a large diversity of collaboration forms. In terms of applications, besides the "traditional" sectors represented by advanced supply chains, virtual enterprises, virtual organizations, virtual teams, and their breeding environments, new forms of collaborative structures are emerging in all sectors of society. Examples can be found in e-government, intelligent transportation systems, collaborative virtual laboratories, agribusiness, elderly care, the silver economy, etc. In some cases those developments tend to adopt terminology that is specific to the domain in question; often the actors involved in a given domain are not fully aware of the developments in mainstream research on collaborative networks. For instance, the grid community adopted the term "virtual organization" but focused mainly on the resource-sharing perspective, ignoring most of the other aspects involved in collaboration. The European enterprise interoperability community, which was initially focused on intra-enterprise aspects, is moving towards inter-enterprise collaboration. Collaborative networks are thus becoming a pervasive paradigm, providing the basis for new socio-organizational structures.
Focusing on both theoretical and practical aspects of online learning by introducing a variety of online instructional models, this work also looks at the best practices that help educators and professional trainers to better understand the dynamics of online learning.
This book will resonate with anyone no matter where you reside on this journey, whether newbie or old guard. If you want to be part of this change, you need to understand all about the messy middle that Leda so expertly describes in this book. If you read this book and it doesn't resonate, then I suggest you think about stepping aside. -Curt Queyrouze, President, CCBX, A Division of Coastal Community Bank. The world is going digital, and so is banking - in fits, starts, and circles. Why is it so hard? Why is the industry constantly getting in the way of its own technological progress, and what can we do about it all? This book looks at the human and structural obstacles to innovation-driven transformation and at the change in habits, mindsets and leadership needed for the next stage of the digital journey, and it argues that this change will be brought about not by external heroes and saviours, not by a generation yet to be born, but by people just like us. People who understand the industry and its quirks. Bankers who have the grit, determination and energy to drive change. Bankers like us. This book celebrates and chronicles the shared experience of bankers like us. It starts with a 'this is who we are' piece, including the author's trench credentials. It then presents an overview of corporate culture (this is what we deal with, and a few ideas on how to handle it), as well as a piece on why transformation is so difficult and why so many get it wrong; a piece on the challenges our lack of diversity brings or compounds; and a hopeful look ahead at what a team of principled, dedicated folks can do despite everything.
Unconventional computing is a niche for interdisciplinary science, a cross-breeding of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This second volume presents experimental laboratory prototypes and applied computing implementations. Emergent molecular computing is presented by enzymatic logical gates and circuits, and DNA nano-devices. Reaction-diffusion chemical computing is exemplified by logical circuits in the Belousov-Zhabotinsky medium and geometrical computation in precipitating chemical reactions. Logical circuits realised with solitons and impulses in polymer chains show advances in collision-based computing. Photo-chemical and memristive devices give us a glimpse of hot topics in novel hardware. Practical computing is represented by algorithms of collective and immune computing and nature-inspired optimisation. Living computing devices are implemented in real and simulated cells, regenerating organisms, plant roots and slime mould. The book is an encyclopedia, the first complete authoritative account of the theoretical and experimental findings in unconventional computing, written by world leaders in the field. All chapters are self-contained; no specialist background is required to appreciate the ideas, findings, constructs and designs presented. This treatise on unconventional computing appeals to readers from all walks of life, from high-school pupils to university professors, from mathematicians, computer scientists and engineers to chemists and biologists.
The consequences of recent floods and flash floods in many parts of the world have been devastating. One way of improving flood management practice is to invest in data collection and modelling activities which enable an understanding of the functioning of a system and the selection of optimal mitigation measures. A Digital Terrain Model (DTM) provides the most essential information for flood managers. Light Detection and Ranging (LiDAR) surveys, which enable the capture of spot heights at a spacing of 0.5 m to 5 m with a horizontal accuracy of 0.3 m and a vertical accuracy of 0.15 m, can be used to develop a high-accuracy DTM, but the raw data need careful processing before being used for any application. This book presents the augmentation of an existing Progressive Morphological filtering algorithm for processing raw LiDAR data to support a 1D/2D urban flood modelling framework. The key characteristics of this improved algorithm are: (1) the ability to deal with different kinds of buildings; (2) the ability to detect elevated road/rail lines and represent them in accordance with reality; (3) the ability to deal with bridges and riverbanks; and (4) the ability to recover curbs and to apply an appropriate Manning's roughness coefficient to represent close-to-ground vegetation (e.g. grass and small bushes).
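For orientation, a bare-bones version of progressive morphological filtering on a gridded surface might look like the sketch below. It is only in the spirit of the base algorithm the book augments; the window sizes, thresholds, slope parameter and test data are illustrative assumptions, not the author's settings, and none of the book's extensions (buildings, elevated roads, bridges, curbs) are handled here.

```python
# Rough sketch: non-ground cells (buildings, vegetation) are flagged where the
# surface drops by more than a threshold after a morphological opening, with
# the window size and the threshold growing at each pass.

import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(dsm, windows=(3, 5, 9, 17),
                                     base_threshold=0.3, slope=0.15,
                                     cell_size=1.0):
    """Return a boolean mask of cells classified as ground."""
    surface = dsm.astype(float).copy()
    ground = np.ones(dsm.shape, dtype=bool)
    prev_window = 1
    for window in windows:
        opened = grey_opening(surface, size=(window, window))
        # The elevation-difference threshold grows with the window size so
        # that sloped terrain is not removed along with buildings.
        threshold = base_threshold + slope * (window - prev_window) * cell_size
        non_ground = (surface - opened) > threshold
        ground &= ~non_ground
        surface = opened
        prev_window = window
    return ground

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    terrain = rng.normal(0.0, 0.05, size=(100, 100)).cumsum(axis=0)  # gentle slope
    dsm = terrain.copy()
    dsm[40:50, 40:50] += 8.0          # a "building" block on top of the terrain
    mask = progressive_morphological_filter(dsm)
    print("cells flagged as non-ground:", int((~mask).sum()))
```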
In the last decade there have been rapid developments in the field of computer-based learning environments. A whole new generation of computer-based learning environments has appeared, requiring new approaches to design and development. One main feature of current systems is that they distinguish different knowledge bases that are assumed to be necessary to support learning processes. Current computer-based learning environments often require explicit representations of large bodies of knowledge, including knowledge of instruction. This book focuses on instructional models as explicit, potentially implementable representations of knowledge concerning one or more aspects of instruction. The book has three parts, relating to different aspects of the knowledge that should be made explicit in instructional models: knowledge of instructional planning, knowledge of instructional strategies, and knowledge of instructional control. The book is based on a NATO Advanced Research Workshop held at the University of Twente, The Netherlands in July 1991.
Over the past twenty years, the conventional knowledge management approach has evolved into a strategic management approach that has found applications and opportunities outside of business, in society at large, through education, urban development, governance, and healthcare, among others. Knowledge-Based Development for Cities and Societies: Integrated Multi-Level Approaches examines the concepts and challenges of knowledge management for both urban environments and entire regions, enhancing the expertise and knowledge of scholars, researchers, practitioners, managers and urban developers in the development of successful knowledge-based development policies, the creation of knowledge cities and prosperous knowledge societies. This reference creates a large knowledge base for scholars, managers and urban developers and increases awareness of the role of knowledge cities and knowledge societies in the knowledge era, as well as of the challenges and opportunities for future research.
Underwater Robots reports on the latest progress in underwater robotics. In spite of its importance, the ocean is generally overlooked, since we focus more of our attention on land and atmospheric issues. We have not yet been able to explore the full depths of the ocean and its resources. The deepest parts of the ocean range from 19,000 to 36,000 feet. At a mere 33-foot depth, the pressure is already about 29.4 psi, twice the normal atmospheric pressure of 14.7 psi. This obstacle, compounded with other complex issues due to the unstructured and hazardous environment, makes it difficult to travel in the ocean even though today's technologies allow humans to land on the moon. Only recently, using manned submersibles, did we discover that a large amount of carbon dioxide comes from the sea-floor and that extraordinary groups of organisms live in hydrothermal vent areas. On March 24, 1995, Kaiko (a remotely operated vehicle) navigated the deepest region of the ocean, the Mariana Trench. This vehicle successfully dived to a depth of 33000 feet and instantly showed scenes from the trench through a video camera. New tools like this enable us to gain knowledge of mysterious places. However, extensive use of manned submersibles and remotely operated vehicles is limited to a few applications because of very high operational costs, operator fatigue and safety issues. In spite of these hindrances, the demand for advanced underwater robot technologies is growing and will eventually lead to fully autonomous, specialized, reliable underwater robotic vehicles. Underwater Robots is an edited volume of peer-reviewed original research comprising thirteen invited contributions by leading researchers. This research work has also been published as a special issue of Autonomous Robots (Volume 3, Numbers 2 and 3).
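The pressure figures quoted above follow from ordinary hydrostatics; here is a quick back-of-the-envelope check using standard constants (not taken from the book):

```python
# Absolute pressure under water = atmospheric pressure + rho * g * depth.

RHO_SEAWATER = 1025.0        # kg/m^3
G = 9.81                     # m/s^2
ATMOSPHERE_PA = 101325.0     # Pa (about 14.7 psi)
PSI_PER_PA = 1.0 / 6894.76

def absolute_pressure_psi(depth_ft):
    depth_m = depth_ft * 0.3048
    return (ATMOSPHERE_PA + RHO_SEAWATER * G * depth_m) * PSI_PER_PA

print(f"{absolute_pressure_psi(33):.1f} psi at 33 ft")        # ~29.4 psi, about 2 atm
print(f"{absolute_pressure_psi(33000):.0f} psi at 33,000 ft") # roughly 14,700 psi
```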
You may like...
- Discovering Computers, Essentials… by Susan Sebok, Jennifer Campbell, … (Paperback)
- Dynamic Web Application Development… by David Parsons, Simon Stobart (Paperback)
- Infinite Words, Volume 141 - Automata… by Dominique Perrin, Jean-Eric Pin (Hardcover, R4,319 / Discovery Miles 43 190)
- Computer-Graphic Facial Reconstruction by John G. Clement, Murray K. Marks (Hardcover, R2,470 / Discovery Miles 24 700)
- The Internet of Medical Things… by Subhendu Kumar Pani, Priyadarsan Patra, … (Hardcover)
- Numerical Computation, Data Analysis and… by Yumin Cheng (Hardcover)