This book is a result of the Seventh International Conference on Information Systems Development-Methods and Tools, Theory and Practice held in Bled, Slovenia, September 21-23, 1998. The purpose of the conference was to address issues facing academia and industry when specifying, developing, managing, and improving computerized information systems. During the past few years, many new concepts and approaches emerged in the Information Systems Development (ISD) field. The various theories, methods, and tools available to system developers also bring problems, such as choosing the most effective approach for a specific task. This conference provides a meeting place for IS researchers and practitioners from Eastern and Western Europe as well as from other parts of the world. An objective of the conference is not only to share scientific knowledge and interests but to establish strong professional ties among the participants. The Seventh International Conference on Information Systems Development-ISD'98 continues the concepts of the first Polish-Scandinavian Seminar on Current Trends in Information Systems Development Methodologies held in Gdansk, Poland in 1988. Through the years, the Seminar developed into the International Conference on Information Systems Development. ISD'99 will be held in Boise, Idaho. The selection of papers was carried out by the International Program Committee. All papers were reviewed in advance by three people. Papers were judged according to their originality, relevance, and presentation quality. All papers were judged only on their own merits, independent of other submissions.
The 11th IFIP International Conference on Very Large Scale Integration, held in Montpellier, France, December 3-5, 2001, was a great success. The main focus was on IP Cores, Circuits and System Designs & Applications, as well as SOC Design Methods and CAD. This book contains the best papers (39 of 70) presented during the conference. These papers deal with all aspects of importance for the design of current and future integrated systems. System on Chip (SOC) design is today a big challenge for designers, as a SOC may contain very different blocks, such as microcontrollers, DSPs, memories including embedded DRAM, analog, FPGA, RF front-ends for wireless communications, and integrated sensors. The complete design of such chips, in very deep submicron technologies down to 0.13 µm, with several hundreds of millions of transistors, supplied at less than 1 Volt, is a very challenging task when design, verification, debug and industrial test are all considered. The microelectronic revolution is fascinating: 55 years ago, in late 1947, the transistor was invented, and everybody knows that it was by William Shockley, John Bardeen and Walter H. Brattain of Bell Telephone Laboratories, who received the Nobel Prize in Physics in 1956. Probably, everybody thinks that it was recognized immediately as a major invention.
Initially proposed as rivals of classical logic, alternative logics have become increasingly important in sciences such as quantum physics, computer science, and artificial intelligence. The contributions collected in this volume address and explore the question of whether the usage of logic in the sciences, especially in modern physics, requires a deviation from classical mathematical logic. The articles in the first part of the book set the scene by describing the context and the dilemma when applying logic in science. In part II the authors offer several logics that deviate in different ways from classical logics. The twelve papers in part III investigate in detail specific aspects such as quantum logic, quantum computation, computer-science considerations, praxic logic, and quantum probability. Most of the contributions are revised and partially extended versions of papers presented at a conference of the same title of the Academie Internationale de Philosophie des Sciences held at the Internationales Forschungszentrum Salzburg in May 1999. Others have been added to complete the picture of recent research in alternative logics as they have been developed for applications in the sciences.
A day does not pass without a newspaper report about yet another company that has started outsourcing technology or other business processes to India. The Senate recently voted 70 to 26 in favor of preventing federal contracts going offshore, yet US managers continue to beat a path to India because it is the global leader for offshore IT-enabled services. Many CEOs seek to reduce their costs or improve service quality, but not many understand India on their first visit and some are confused by the culture. In this book author Mark Kobayashi-Hillary introduces India and the major players in the Indian service industry. He offers a balanced view on the trend to outsource to India, describing the reasons why a business should utilize India as an offshore outsourcing destination and the steps needed to find and work with a local partner. Not only does the book make a compelling economic case for outsourcing to this region, it also discusses how to manage the entire transition process, including the potential impact on local resources. Mark Kobayashi-Hillary is a British writer and independent outsourcing consultant based in London. He has worked at a senior level for several leading banking and technology groups and has been involved in managing outsourced relationships in the UK, Singapore and India. He is a regular commentator on India and outsourcing in the European press. Outsourcing To India is written from personal experience and several years of research. This practical guide will help managers navigate through the offshore outsourcing maze, allowing them to avoid many of the major pitfalls others have faced when setting up shop in India.
This is a collection of state-of-the-art surveys on topics at the interface between transportation modeling and operations research given by leading international experts. Based on contributions to a NATO workshop, the surveys are up-to-date and rigorous presentations or applications of quantitative methods in the area. The subjects covered include dynamic traffic simulation techniques and dynamic routing in congested networks, operation and control of traffic management tools, optimized transportation data collection, and vehicle routing problems.
Logic Synthesis for Control Automata provides techniques for the logic design of very complex control units with hardly any constraints on their size, i.e. the number of inputs, outputs and states. These techniques cover all stages of control unit design, including: description of control unit behavior using operator schemes of algorithms (binary decision trees) and various transformations of these descriptions -- composition, decomposition, minimization, etc.; synthesis of a control automaton (finite-state machine); and synthesis of an automaton logic circuit: with a matrix structure as part of LSI or VLSI circuits; as a multilevel circuit with logic gates; or with standard LSI and VLSI circuits with and without memory. Each chapter contains many examples illustrating the use of the models and methods described. Moreover, the special last chapter demonstrates in detail the whole design methodology presented in the previous chapters, through examples of the logic design for a control unit. The models, methods and algorithms described in the book can be applied to a broad class of digital system design problems, including the design of complex controllers, robots and control units of computers, and the design of CAD systems for VLSI circuits using FPGA, PLD and ASIC technologies. Logic Synthesis for Control Automata is a valuable reference for graduate students, researchers and engineers involved in the design of very complex controllers, VLSI circuits and CAD systems. The inclusion of many examples and problems makes it most suitable for a course on the subject.
This book is an extension of one author's doctoral thesis on the false path problem. The work was begun with the idea of systematizing the various solutions to the false path problem that had been proposed in the literature, with a view to determining the computational expense of each versus the gain in accuracy. However, it became clear that some of the proposed approaches in the literature were wrong in that they underestimated the critical delay of some circuits under reasonable conditions. Further, some other approaches were vague and so of questionable accuracy. The focus of the research therefore shifted to establishing a theory (the viability theory) and algorithms which could be guaranteed correct, and then using this theory to justify (or not) existing approaches. Our quest was successful enough to justify presenting the full details in a book. After it was discovered that some existing approaches were wrong, it became apparent that the root of the difficulties lay in the attempts to balance computational efficiency and accuracy by separating the temporal and logical (or functional) behaviour of combinational circuits. This separation is the fruit of several unstated assumptions; first, that one can ignore the logical relationships of wires in a network when considering timing behaviour, and, second, that one can ignore timing considerations when attempting to discover the values of wires in a circuit.
Focusing on both theoretical and practical aspects of online learning by introducing a variety of online instructional models, this work also looks at the best practices that help educators and professional trainers to better understand the dynamics of online learning.
Welcome to the proceedings of the Seventh International Conference of the UK Systems Society, held at York University, United Kingdom, from July 7th to 10th, 2002. It is a pleasure to be able to share with you this collection of papers that have been contributed by systems thinkers from around the world. As with previous UKSS conferences, the aim of this conference is to encourage debate and promote development of pertinent issues in systems theory and practice. In current times, where the focus has moved from 'information' to 'knowledge' and where 'knowledge management', 'knowledge assets' and so on have become part of everyday speak, it seemed fitting to offer a conference title of 'Systems Theory and Practice in the Knowledge Age'. In keeping with another tradition of previous conferences, the UKSS Conference 2002 Committee decided to compile a collection of delegates' papers before the event as a platform from which to launch discussions in York. Ideas presented in the following papers will, undoubtedly, be developed during the dialogue generated at the conference and new papers will emerge. In the abstract for his plenary at this conference, Professor Peter Checkland throws down the gauntlet to systems thinking and its relevance in the knowledge age with the following statement: "30 Years In The Systems Movement: Disappointments I Have Known and Hopes for the Future. Springing from a lunchtime conversation at an American University, the Systems Movement is now nearly 50 years old."
High-Speed Clock Network Design is a collection of design concepts, techniques and research works from the author for clock distribution in microprocessors and high-performance chips. It is organized in 11 chapters as follows. Chapter 1 provides an overview to the design of clock networks. Chapter 2 specifies the timing requirements in digital design. Chapter 3 shows the circuits of sequential elements including latches and flip-flops. Chapter 4 describes the domino circuits, which need special clock signals. Chapter 5 discusses the phase-locked loop (PLL) and delay-locked loop (DLL), which provide the clock generation and de-skewing for the on-chip clock distribution. Chapter 6 summarizes the clock distribution techniques published in the state-of-the-art microprocessor chips. Chapter 7 describes the CAD flow on the clock network simulation. Chapter 8 gives the research work on low-voltage swing clock distribution. Chapter 9 explores the possibility of placing the global clock tree on the package layers. Chapter 10 shows the algorithms of balanced clock routing and wire sizing for the skew minimization. Chapter 11 shows a commercial CAD tool that deals with clock tree synthesis in the ASIC design flow. The glossary is attached at the end of this book. The clock network design is still a challenging task in most high-speed VLSI chips, since the clock frequency and power consumption requirements are increasingly difficult to meet for multiple clock networks on the chip. Many research works and industry examples will be shown in this area to continually improve the clock distribution networks for future high-performance chips.
In the last decade there have been rapid developments in the field of computer-based learning environments. A whole new generation of computer-based learning environments has appeared, requiring new approaches to design and development. One main feature of current systems is that they distinguish different knowledge bases that are assumed to be necessary to support learning processes. Current computer-based learning environments often require explicit representations of large bodies of knowledge, including knowledge of instruction. This book focuses on instructional models as explicit, potentially implementable representations of knowledge concerning one or more aspects of instruction. The book has three parts, relating to different aspects of the knowledge that should be made explicit in instructional models: knowledge of instructional planning, knowledge of instructional strategies, and knowledge of instructional control. The book is based on a NATO Advanced Research Workshop held at the University of Twente, The Netherlands in July 1991.
As the world proceeds quickly into the Information Age, it encounters both successes and challenges, and it is well recognized that Intelligent Information Processing provides the key to the Information Age and to mastering many of these challenges. Intelligent Information Processing supports the most advanced productive tools that are said to be able to change human life and the world itself. However, the path is never a straight one and every new technology brings with it a spate of new research problems to be tackled by researchers. As such, the demand for Information Processing research is ever-increasing. This book presents the proceedings of the 4th IFIP International Conference on Intelligent Information Processing. This conference provides a forum for engineers and scientists from academia and industry to present their latest research findings in all aspects of Intelligent Information Processing.
Collaborative networks: becoming a pervasive paradigm. In recent years the area of collaborative networks has been consolidating as a new discipline (Camarinha-Matos, Afsarmanesh, 2005) that encompasses and gives more structured support to a large diversity of collaboration forms. In terms of applications, besides the "traditional" sectors represented by advanced supply chains, virtual enterprises, virtual organizations, virtual teams, and their breeding environments, new forms of collaborative structures are emerging in all sectors of society. Examples can be found in e-government, intelligent transportation systems, collaborative virtual laboratories, agribusiness, elderly care, the silver economy, etc. In some cases those developments tend to adopt terminology that is specific to that domain; often the actors involved in a given domain are not fully aware of the developments in the mainstream research on collaborative networks. For instance, the grid community adopted the term "virtual organization" but focused mainly on the resource-sharing perspective, ignoring most of the other aspects involved in collaboration. The European enterprise interoperability community, which was initially focused on intra-enterprise aspects, is moving towards inter-enterprise collaboration. Collaborative networks are thus becoming a pervasive paradigm, giving basis to new socio-organizational structures.
Underwater Robots reports on the latest progress in underwater robotics. In spite of its importance, the ocean is generally overlooked, since we focus more of our attention on land and atmospheric issues. We have not yet been able to explore the full depths of the ocean and its resources. The deep ocean trenches range from 19,000 to 36,000 feet. At a mere 33-foot depth, the pressure is already twice the normal atmospheric pressure of 14.7 psi. This obstacle, compounded with other complex issues due to the unstructured and hazardous environment, makes it difficult to travel in the ocean even though today's technologies allow humans to land on the moon. Only recently did we discover, using manned submersibles, that a large amount of carbon dioxide comes from the sea-floor and that extraordinary groups of organisms live in hydrothermal vent areas. On March 24, 1995, Kaiko (a remotely operated vehicle) navigated the deepest region of the ocean, the Mariana Trench. This vehicle successfully dived to a depth of nearly 36,000 feet and instantly showed scenes from the trench through a video camera. New tools like this enable us to gain knowledge of mysterious places. However, extensive use of manned submersibles and remotely operated vehicles is limited to a few applications because of very high operational costs, operator fatigue and safety issues. In spite of these hindrances, the demand for advanced underwater robot technologies is growing and will eventually arrive at fully autonomous, specialized, reliable underwater robotic vehicles. Underwater Robots is an edited volume of peer-reviewed original research comprising thirteen invited contributions by leading researchers. This research work has also been published as a special issue of Autonomous Robots (Volume 3, Numbers 2 and 3).
The design process of digital circuits is often carried out in individual steps, like logic synthesis, mapping, and routing. Since the complete process was originally too complex, it has been split up into several - more or less independent - phases. In the last 40 years powerful algorithms have been developed to find optimal solutions for each of these steps. However, the interaction of these different algorithms was not considered for a long time. This leads to quality loss, e.g. in cases where highly optimized netlists fit badly onto the target architecture. Since the resulting circuits are often far from optimal and insufficient regarding the optimization criteria, like area and delay, several iterations of the complete design process have to be carried out to get high-quality results. This is a very time-consuming and costly process. For this reason, some years ago the idea of one-pass synthesis came up. There were two main approaches to guarantee that a design gets "first time right": combining levels that were previously split, e.g. using layout information already during the logic synthesis phase; and restricting the optimization at one level such that it better fits the next one. So far, several approaches in these two directions have been presented and new techniques are under development. In Towards One-Pass Synthesis we describe the new paradigm that is used in one-pass synthesis and present examples of the two techniques above. Theoretical and practical aspects are discussed and minimization algorithms are given. This will help people working with synthesis tools and circuit design in general (in industry and academia) to keep informed about recent developments and new trends in this area.
In this book, the author traces the origin of the present information technology revolution, the technological features that underlie its impact, the organizations, and the companies and technologies which are governing current and future growth. It explains how the technology works, how it fits together, how the industry is structured and what the future might bring.
This book describes the emerging point-of-care (POC) technologies that are paving the way to the next generation healthcare monitoring and management. It provides the readers with comprehensive, up-to-date information about the emerging technologies, such as smartphone-based mobile healthcare technologies, smart devices, commercial personalized POC technologies, paper-based immunoassays (IAs), lab-on-a-chip (LOC)-based IAs, and multiplex IAs. The book also provides guided insights into the POC diabetes management software and smart applications, and the statistical determination of various bioanalytical parameters. Additionally, the authors discuss the future trends in POC technologies and personalized and integrated healthcare solutions for chronic diseases, such as diabetes, stress, obesity, and cardiovascular disorders. Each POC technology is described comprehensively and analyzed critically with its characteristic features, bioanalytical principles, applications, advantages, limitations, and future trends. This book would be a very useful resource and teaching aid for professionals working in the field of POC technologies, in vitro diagnostics (IVD), mobile healthcare, Big Data, smart technology, software, smart applications, biomedical engineering, biosensors, personalized healthcare, and other disciplines.
This book presents the proceedings of the Third International Conference on Electrical Engineering and Control (ICEECA2017). It covers new control system models and troubleshooting tips, and also addresses complex system requirements, such as increased speed, precision and remote capabilities, bridging the gap between the complex, math-heavy controls theory taught in formal courses, and the efficient implementation required in real-world industry settings. Further, it considers both the engineering aspects of signal processing and the practical issues in the broad field of information transmission and novel technologies for communication networks and modern antenna design. This book is intended for researchers, engineers, and advanced postgraduate students in control and electrical engineering, computer science, signal processing, as well as mechanical and chemical engineering.
Adaptive Learning of Polynomial Networks delivers theoretical and practical knowledge for the development of algorithms that infer linear and non-linear multivariate models, providing a methodology for inductive learning of polynomial neural network models (PNN) from data. The empirical investigations detailed here demonstrate that PNN models evolved by genetic programming and improved by backpropagation are successful when solving real-world tasks. The text emphasizes the model identification process and presents
This volume is an essential reference for researchers and practitioners interested in the fields of evolutionary computation, artificial neural networks and Bayesian inference, and will also appeal to postgraduate and advanced undergraduate students of genetic programming. Readers will strengthen their skills in creating both efficient model representations and learning operators that efficiently sample the search space, navigating the search process through the design of objective fitness functions, and examining the search performance of the evolutionary system.
Computer science is increasingly becoming an essential 21st century skill. As school systems around the world recognize the importance of computer science, demand for teachers who have the knowledge and skills to deliver computer science instruction is rapidly growing. Yet a number of recent studies indicate that teachers report low confidence and limited understanding of computer science, frequently confusing basic computer literacy skills with computer science. This is true both for teachers at the K-8 level and for secondary education teachers who frequently transition to computer science from other content areas, such as mathematics. As computer science is not yet included in most teacher preparation programs, professional development is a critical step in efforts to prepare in-service teachers to deliver high-quality computer science instruction. To date, however, research on best practices in computer science professional development has been severely lacking in the literature, making it difficult for researchers and practitioners alike to examine effective in-service preparation models. This book provides examples of professional development approaches that help teachers integrate aspects of computing in existing curricula at the K-8 level or deliver stand-alone computer science courses at the secondary school level. Further, this book identifies computational competencies for teachers, promising pedagogical strategies that advance teacher learning, as well as alternative pathways for ongoing learning, including microcredentials. The primary audience of the book is graduate students and faculty in educational technology, educational or cognitive psychology, learning theory, curriculum and instruction, computer science, instructional systems and learning sciences.
Additionally, the book will serve as a valuable resource for education practitioners and curriculum developers, as well as policy makers looking to increase the number of teachers who are prepared to deliver computing education.
Over the past two decades, many advances have been made in the decision support system (DSS) field. They range from progress in fundamental concepts, to improved techniques and methods, to widespread use of commercial software for DSS development. Still, the depth and breadth of the DSS field continues to grow, fueled by the need to better support decision making in a world that is increasingly complex in terms of volume, diversity, and interconnectedness of the knowledge on which decisions can be based. This continuing growth is facilitated by increasing computer power and decreasing per-unit computing costs. But, it is spearheaded by the multifaceted efforts of DSS researchers. The collective work of these researchers runs from the speculative to the normative to the descriptive. It includes analysis of what the field needs, designs of means for meeting recognized needs, and implementations for study. It encompasses theoretical, empirical, and applied orientations. It is concerned with the invention of concepts, frameworks, models, and languages for giving varied, helpful perspectives. It involves the discovery of principles, methods, and techniques for expeditious construction of successful DSSs. It aims to create computer-based tools that facilitate DSS development. It assesses DSS efficacy by observing systems, their developers, and their users. This growing body of research continues to be fleshed out and take shape on a strong, but still-developing, skeletal foundation.
Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This second volume presents experimental laboratory prototypes and applied computing implementations. Emergent molecular computing is presented by enzymatic logical gates and circuits, and DNA nano-devices. Reaction-diffusion chemical computing is exemplified by logical circuits in Belousov-Zhabotinsky medium and geometrical computation in precipitating chemical reactions. Logical circuits realised with solitons and impulses in polymer chains show advances in collision-based computing. Photo-chemical and memristive devices give us a glimpse of hot topics in novel hardware. Practical computing is represented by algorithms of collective and immune-computing and nature-inspired optimisation. Living computing devices are implemented in real and simulated cells, regenerating organisms, plant roots and slime mould. The book is an encyclopedia, the first ever complete authoritative account of the theoretical and experimental findings in unconventional computing, written by the world leaders in the field. All chapters are self-contained; no specialist background is required to appreciate the ideas, findings, constructs and designs presented. This treatise on unconventional computing appeals to readers from all walks of life, from high-school pupils to university professors, and from mathematicians, computer scientists and engineers to chemists and biologists.
Global competition, sluggish economies and the potential offered by emerging technologies have pushed firms to fundamentally rethink their business processes. Business Process Reengineering (BPR) has become recognized as a means to restructure aging bureaucratized processes to achieve the strategic objectives of increased efficiency, reduced costs, improved quality and greater customer satisfaction. Business Process Change: Reengineering Concepts, Methods and Technologies provides extensive coverage of the organizational, managerial and technical concepts related to business process change. Among some of the topics included in this book are: process change components; enablers of process change; methodologies, techniques and tools; team-based management; effective adoption of BPR.
System-Level Synthesis deals with the concurrent design of electronic applications, including both hardware and software. The issue has become the bottleneck in the design of electronic systems in several major industrial fields, including telecommunications, automotive and aerospace engineering. The major difficulty with the subject is that it demands contributions from several research fields, including system specification, system architecture, hardware design, and software design. Most existing books cover only a few aspects of system-level synthesis well. The present volume presents a comprehensive discussion of all the aspects of system-level synthesis. Each topic is covered by a contribution written by an international authority on the subject.