This book is a result of the Seventh International Conference on Information Systems Development - Methods and Tools, Theory and Practice, held in Bled, Slovenia, September 21-23, 1998. The purpose of the conference was to address issues facing academia and industry when specifying, developing, managing, and improving computerized information systems. During the past few years, many new concepts and approaches emerged in the Information Systems Development (ISD) field. The various theories, methods, and tools available to system developers also bring problems such as choosing the most effective approach for a specific task. This conference provides a meeting place for IS researchers and practitioners from Eastern and Western Europe as well as from other parts of the world. An objective of the conference is not only to share scientific knowledge and interests but to establish strong professional ties among the participants. The Seventh International Conference on Information Systems Development (ISD'98) continues the concepts of the first Polish-Scandinavian Seminar on Current Trends in Information Systems Development Methodologies, held in Gdansk, Poland in 1988. Through the years, the Seminar developed into the International Conference on Information Systems Development. ISD'99 will be held in Boise, Idaho. The selection of papers was carried out by the International Program Committee. All papers were reviewed in advance by three people. Papers were judged according to their originality, relevance, and presentation quality. All papers were judged only on their own merits, independent of other submissions.
This book is an extension of one author's doctoral thesis on the false path problem. The work was begun with the idea of systematizing the various solutions to the false path problem that had been proposed in the literature, with a view to determining the computational expense of each versus the gain in accuracy. However, it became clear that some of the proposed approaches in the literature were wrong in that they underestimated the critical delay of some circuits under reasonable conditions. Further, some other approaches were vague and so of questionable accuracy. The focus of the research therefore shifted to establishing a theory (the viability theory) and algorithms which could be guaranteed correct, and then using this theory to justify (or not) existing approaches. Our quest was successful enough to justify presenting the full details in a book. After it was discovered that some existing approaches were wrong, it became apparent that the root of the difficulties lay in the attempts to balance computational efficiency and accuracy by separating the temporal and logical (or functional) behaviour of combinational circuits. This separation is the fruit of two unstated assumptions: first, that one can ignore the logical relationships of wires in a network when considering timing behaviour; and second, that one can ignore timing considerations when attempting to discover the values of wires in a circuit.
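To see why separating timing from logic matters, consider a toy false-path sketch in Python (entirely illustrative; not a circuit or code from the book). Two multiplexers share one select signal, so the structurally longest path through both slow branches can never actually be exercised, and a purely topological analysis overstates the critical delay:

# False-path toy example: two muxes share select s, so a topological
# analysis pairs the two slow branches (10 + 10), while no input
# assignment can sensitize that path.

def topological_delay():
    return 10 + 10          # longest structural path: slow1 then slow2

def sensitizable_delay():
    worst = 0
    for s in (0, 1):
        d1 = 10 if s == 0 else 1   # mux1 passes its slow input when s == 0
        d2 = 10 if s == 1 else 1   # mux2 passes its slow input when s == 1
        worst = max(worst, d1 + d2)
    return worst

print(topological_delay())   # 20 -- pessimistic structural estimate
print(sensitizable_delay())  # 11 -- true critical delay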
Initially proposed as rivals of classical logic, alternative logics have become increasingly important in sciences such as quantum physics, computer science, and artificial intelligence. The contributions collected in this volume address and explore the question of whether the use of logic in the sciences, especially in modern physics, requires a deviation from classical mathematical logic. The articles in the first part of the book set the scene by describing the context and the dilemma when applying logic in science. In part II the authors offer several logics that deviate in different ways from classical logics. The twelve papers in part III investigate in detail specific aspects such as quantum logic, quantum computation, computer-science considerations, praxic logic, and quantum probability. Most of the contributions are revised and partially extended versions of papers presented at a conference of the same title held by the Academie Internationale de Philosophie des Sciences at the Internationales Forschungszentrum Salzburg in May 1999. Others have been added to complete the picture of recent research in alternative logics as they have been developed for applications in the sciences.
A day does not pass without a newspaper report about yet another company that has started outsourcing technology or other business processes to India. The Senate recently voted 70 to 26 in favor of preventing federal contracts going offshore, yet US managers continue to beat a path to India because it is the global leader for offshore IT-enabled services. Many CEOs seek to reduce their costs or improve service quality, but not many understand India on their first visit and some are confused by the culture. In this book author Mark Kobayashi-Hillary introduces India and the major players in the Indian service industry. He offers a balanced view on the trend to outsource to India, describing the reasons why a business should utilize India as an offshore outsourcing destination and the steps needed to find and work with a local partner. Not only does the book make a compelling economic case for outsourcing to this region, it also discusses how to manage the entire transition process, including the potential impact on local resources. Mark Kobayashi-Hillary is a British writer and independent outsourcing consultant based in London. He has worked at a senior level for several leading banking and technology groups and has been involved in managing outsourced relationships in the UK, Singapore and India. He is a regular commentator on India and outsourcing in the European press. Outsourcing To India is written from personal experience and several years of research. This practical guide will help managers navigate through the offshore outsourcing maze, allowing them to avoid many of the major pitfalls others have faced when setting up shop in India.
This is a collection of state-of-the-art surveys on topics at the interface between transportation modeling and operations research, given by leading international experts. Based on contributions to a NATO workshop, the surveys are up-to-date and rigorous presentations of applications of quantitative methods in the area. The subjects covered include dynamic traffic simulation techniques and dynamic routing in congested networks, operation and control of traffic management tools, optimized transportation data collection, and vehicle routing problems.
Logic Synthesis for Control Automata provides techniques for the logic design of very complex control units with hardly any constraints on their size, i.e. the number of inputs, outputs and states. These techniques cover all stages of control unit design, including: description of control unit behavior by using operator schemes of algorithms (binary decision trees) and various transformations of these descriptions -- composition, decomposition, minimization, etc.; synthesis of a control automaton (finite-state machine); synthesis of an automaton logic circuit: with matrix structure as a part of LSI or VLSI circuits; as a multilevel circuit with logic gates; or with standard LSI and VLSI circuits with and without memory. Each chapter contains many examples illustrating the use of the models and methods described. Moreover, the special last chapter demonstrates in detail the whole design methodology presented in the previous chapters, through the example of the logic design for a control unit. The models, methods and algorithms described in the book can be applied to a broad class of digital system design problems, including the design of complex controllers, robots and control units of computers, and the design of CAD systems for VLSI circuits using FPGA, PLD and ASIC technologies. Logic Synthesis for Control Automata is a valuable reference for graduate students, researchers and engineers involved in the design of very complex controllers, VLSI circuits and CAD systems. The inclusion of many examples and problems makes it most suitable for a course on the subject.
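As a rough illustration of what a control automaton looks like once extracted from a decision-tree description, here is a minimal Moore-style finite-state machine in Python (a generic sketch; the state names a0-a3, microoperations y_*, and single condition input are hypothetical, not taken from the book):

# Each state issues one microoperation and branches on a logical
# condition x, mirroring a node of a binary decision tree.
# state -> (microoperation, next state if x == 0, next state if x == 1)
FSM = {
    "a0": ("y_start", "a1", "a2"),
    "a1": ("y_load",  "a3", "a3"),
    "a2": ("y_shift", "a3", "a3"),
    "a3": ("y_done",  "a0", "a0"),
}

def run(conditions, state="a0"):
    """Step the automaton once per condition value, printing outputs."""
    for x in conditions:
        op, next0, next1 = FSM[state]
        print(state, "issues", op)
        state = next1 if x else next0
    return state

run([1, 0, 1])  # visits a0 -> a2 -> a3, then returns to a0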
Influence action through data! This is not a book. It is a one-of-a-kind immersive learning experience through which you can become--or teach others to be--a powerful data storyteller. Let's practice! helps you build confidence and credibility to create graphs and visualizations that make sense and weave them into action-inspiring stories. Expanding upon the bestseller storytelling with data's foundational lessons, Let's practice! delivers fresh content, a plethora of new examples, and over 100 hands-on exercises. Author and data storytelling maven Cole Nussbaumer Knaflic guides you along the path to hone core skills and become a well-practiced data communicator. Each chapter includes: Practice with Cole - exercises based on real-world examples, first posed for you to consider and solve, followed by detailed step-by-step illustration and explanation; Practice on your own - thought-provoking questions and even more exercises to be assigned or worked through individually, without prescribed solutions; Practice at work - practical guidance and hands-on exercises for applying storytelling with data lessons on the job, including instruction on when and how to solicit useful feedback and refine for greater impact. The lessons and exercises found within this comprehensive guide will empower you to master--or develop in others--data storytelling skills and transition your work from acceptable to exceptional. By investing in these skills for ourselves and our teams, we can all tell inspiring and influential data stories!
Focusing on both theoretical and practical aspects of online learning by introducing a variety of online instructional models, this work also looks at the best practices that help educators and professional trainers to better understand the dynamics of online learning.
Welcome to the proceedings of the Seventh International Conference of the UK Systems Society, held at the University of York, United Kingdom, from July 7th to 10th, 2002. It is a pleasure to be able to share with you this collection of papers that have been contributed by systems thinkers from around the world. As with previous UKSS conferences, the aim of this conference is to encourage debate and promote development of pertinent issues in systems theory and practice. In current times, where the focus has moved from 'information' to 'knowledge' and where 'knowledge management', 'knowledge assets' and so on have become part of everyday speak, it seemed fitting to offer a conference title of 'Systems Theory and Practice in the Knowledge Age'. In keeping with another tradition of previous conferences, the UKSS Conference 2002 Committee decided to compile a collection of delegates' papers before the event as a platform from which to launch discussions in York. Ideas presented in the following papers will, undoubtedly, be developed during the dialogue generated at the conference and new papers will emerge. In the abstract for his plenary at this conference, Professor Peter Checkland throws down the gauntlet to systems thinking and its relevance in the knowledge age with the following statement: "30 Years In The Systems Movement: Disappointments I Have Known and Hopes for the Future. Springing from a lunchtime conversation at an American university, the Systems Movement is now nearly 50 years old."
High-Speed Clock Network Design is a collection of design concepts, techniques and research works from the author on clock distribution in microprocessors and high-performance chips. It is organized in 11 chapters as follows. Chapter 1 provides an overview of the design of clock networks. Chapter 2 specifies the timing requirements in digital design. Chapter 3 shows the circuits of sequential elements, including latches and flip-flops. Chapter 4 describes domino circuits, which need special clock signals. Chapter 5 discusses the phase-locked loop (PLL) and delay-locked loop (DLL), which provide clock generation and de-skewing for the on-chip clock distribution. Chapter 6 summarizes the clock distribution techniques published for state-of-the-art microprocessor chips. Chapter 7 describes the CAD flow for clock network simulation. Chapter 8 presents research work on low-voltage-swing clock distribution. Chapter 9 explores the possibility of placing the global clock tree on the package layers. Chapter 10 shows algorithms for balanced clock routing and wire sizing for skew minimization. Chapter 11 shows a commercial CAD tool that deals with clock tree synthesis in the ASIC design flow. A glossary is included at the end of the book. Clock network design remains a challenging task in most high-speed VLSI chips, since clock frequency and power consumption requirements are increasingly difficult to meet for multiple clock networks on a chip. Many research works and industry examples are presented to continually improve clock distribution networks for future high-performance chips.
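To make the skew-minimization theme of Chapter 10 concrete, here is a small Python sketch (purely illustrative; the tree topology and the resistance/capacitance values are assumptions, not figures from the book) that estimates clock skew as the difference between Elmore delays along two branches of a clock tree:

def elmore_delay(segments, sink_load):
    """Elmore delay of an RC chain: each resistor charges all downstream C.
    segments: list of (R_ohms, C_farads) from clock source to sink."""
    delay = 0.0
    for i, (r, _c) in enumerate(segments):
        downstream = sum(c for _, c in segments[i:]) + sink_load
        delay += r * downstream
    return delay

# Two unbalanced branches of a toy clock tree:
branch_a = [(50.0, 20e-15), (50.0, 20e-15)]
branch_b = [(50.0, 20e-15), (50.0, 20e-15), (50.0, 20e-15)]

da = elmore_delay(branch_a, sink_load=10e-15)
db = elmore_delay(branch_b, sink_load=10e-15)
print(f"delay A = {da*1e12:.1f} ps, delay B = {db*1e12:.1f} ps")
print(f"skew    = {abs(da - db)*1e12:.1f} ps")
# Wire sizing rebalances this: widening a segment lowers its R (and raises
# its C), letting the two branch delays be equalized.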
In the last decade there have been rapid developments in the field of computer-based learning environments. A whole new generation of computer-based learning environments has appeared, requiring new approaches to design and development. One main feature of current systems is that they distinguish different knowledge bases that are assumed to be necessary to support learning processes. Current computer-based learning environments often require explicit representations of large bodies of knowledge, including knowledge of instruction. This book focuses on instructional models as explicit, potentially implementable representations of knowledge concerning one or more aspects of instruction. The book has three parts, relating to different aspects of the knowledge that should be made explicit in instructional models: knowledge of instructional planning, knowledge of instructional strategies, and knowledge of instructional control. The book is based on a NATO Advanced Research Workshop held at the University of Twente, The Netherlands in July 1991.
As the world proceeds quickly into the Information Age, it encounters both successes and challenges, and it is well recognized that Intelligent Information Processing provides the key to the Information Age and to mastering many of these challenges. Intelligent Information Processing supports the most advanced productive tools, which are said to be able to change human life and the world itself. However, the path is never a straight one and every new technology brings with it a spate of new research problems to be tackled by researchers. As such, the demand for Information Processing research is ever-increasing. This book presents the proceedings of the 4th IFIP International Conference on Intelligent Information Processing. The conference provides a forum for engineers and scientists in academia and industry to present their latest research findings in all aspects of Intelligent Information Processing.
COLLABORATIVE NETWORKS - Becoming a pervasive paradigm. In recent years the area of collaborative networks has been consolidating as a new discipline (Camarinha-Matos, Afsarmanesh, 2005) that encompasses and gives more structured support to a large diversity of collaboration forms. In terms of applications, besides the "traditional" sectors represented by advanced supply chains, virtual enterprises, virtual organizations, virtual teams, and their breeding environments, new forms of collaborative structures are emerging in all sectors of society. Examples can be found in e-government, intelligent transportation systems, collaborative virtual laboratories, agribusiness, elderly care, the silver economy, etc. In some cases those developments tend to adopt terminology that is specific to that domain; often the actors involved in a given domain are not fully aware of the developments in the mainstream research on collaborative networks. For instance, the grid community adopted the term "virtual organization" but focused mainly on the resource-sharing perspective, ignoring most of the other aspects involved in collaboration. The European enterprise interoperability community, which was initially focused on intra-enterprise aspects, is moving towards inter-enterprise collaboration. Collaborative networks are thus becoming a pervasive paradigm, providing the basis for new socio-organizational structures.
Underwater Robots reports on the latest progress in underwater robotics. In spite of its importance, the ocean is generally overlooked, since we focus more of our attention on land and atmospheric issues. We have not yet been able to explore the full depths of the ocean and its resources. The deep oceans range from 19,000 to 36,000 feet. At a mere 33-foot depth, the pressure is already 29.4 psi, twice the normal atmospheric pressure of 14.7 psi. This obstacle, compounded with other complex issues due to the unstructured and hazardous environment, makes it difficult to travel in the ocean even though today's technologies allow humans to land on the moon. Only recently did we discover, by using manned submersibles, that a large amount of carbon dioxide comes from the sea-floor and that extraordinary groups of organisms live in hydrothermal vent areas. On March 24, 1995, Kaiko (a remotely operated vehicle) navigated the deepest region of the ocean, the Mariana Trench. This vehicle successfully dived to a depth of 33,000 feet and instantly showed scenes from the trench through a video camera. New tools like this enable us to gain knowledge of mysterious places. However, extensive use of manned submersibles and remotely operated vehicles is limited to a few applications because of very high operational costs, operator fatigue and safety issues. In spite of these hindrances, the demand for advanced underwater robot technologies is growing and will eventually lead to fully autonomous, specialized, reliable underwater robotic vehicles. Underwater Robots is an edited volume of peer-reviewed original research comprising thirteen invited contributions by leading researchers. This research work has also been published as a special issue of Autonomous Robots (Volume 3, Numbers 2 and 3).
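As a quick sanity check on those pressure figures, hydrostatic pressure grows roughly linearly with depth; the short Python sketch below uses an approximate seawater gradient (an illustrative assumption, not a constant from the book):

# Approximate absolute pressure in seawater at a given depth.
ATM_PSI = 14.7                # surface atmospheric pressure, psi
PSI_PER_FT_SEAWATER = 0.445   # rough hydrostatic gradient of seawater

def pressure_psi(depth_ft):
    return ATM_PSI + PSI_PER_FT_SEAWATER * depth_ft

print(round(pressure_psi(33), 1))   # ~29.4 psi: twice atmospheric
print(round(pressure_psi(33000)))   # ~14700 psi: ~1000 atmospheres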
Global competition, sluggish economies and the potential offered by emerging technologies have pushed firms to fundamentally rethink their business processes. Business Process Reengineering (BPR) has become recognized as a means to restructure aging bureaucratized processes to achieve the strategic objectives of increased efficiency, reduced costs, improved quality and greater customer satisfaction. Business Process Change: Reengineering Concepts, Methods and Technologies provides extensive coverage of the organizational, managerial and technical concepts related to business process change. Among some of the topics included in this book are: process change components; enablers of process change; methodologies, techniques and tools; team-based management; effective adoption of BPR.
Our understanding of nature is often through nonuniform observations in space or time. In space, one normally observes the important features of an object, such as edges. The less important features are interpolated. History is a collection of important events that are nonuniformly spaced in time. Historians infer between events (interpolation) and politicians and stock market analysts forecast the future from past and present events (extrapolation). The 20 chapters of Nonuniform Sampling: Theory and Practice contain contributions by leading researchers in nonuniform and Shannon sampling, zero crossing, and interpolation theory. Its practical applications include NMR, seismology, speech and image coding, modulation and coding, optimal content, array processing, and digital filter design. It has a tutorial outlook for practising engineers and advanced students in science, engineering, and mathematics. It is also a useful reference for scientists and engineers working in the areas of medical imaging, geophysics, astronomy, biomedical engineering, computer graphics, digital filter design, speech and video processing, and phased array radar. A special feature of the package is a CD-ROM containing C-codes, Matlab and Mathcad programs for the algorithms presented.
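The interpolation theme can be made concrete with a toy Python sketch (illustrative only; this is not code from the book's CD-ROM): Lagrange interpolation estimates a signal's value between nonuniformly spaced samples.

import math

def lagrange_interpolate(xs, ys, x):
    """Evaluate the Lagrange polynomial through points (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Nonuniformly spaced samples of sin(t) ...
ts = [0.0, 0.4, 1.1, 1.5, 2.3]
vs = [math.sin(t) for t in ts]
# ... interpolated at an unobserved instant:
print(lagrange_interpolate(ts, vs, 0.8))  # close to sin(0.8) = 0.7174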
The design process of digital circuits is often carried out in individual steps, like logic synthesis, mapping, and routing. Since originally the complete process was too complex, it was split up into several - more or less independent - phases. In the last 40 years powerful algorithms have been developed to find optimal solutions for each of these steps. However, the interaction of these different algorithms was not considered for a long time. This leads to quality loss, e.g. in cases where highly optimized netlists fit badly onto the target architecture. Since the resulting circuits are often far from optimal and insufficient regarding optimization criteria like area and delay, several iterations of the complete design process have to be carried out to get high-quality results. This is a very time-consuming and costly process. For this reason, some years ago the idea of one-pass synthesis came up. There are two main approaches to guaranteeing that a design gets "first time right": combining levels that were split before, e.g. using layout information already during the logic synthesis phase; and restricting the optimization at one level such that it better fits the next one. So far, several approaches in these two directions have been presented and new techniques are under development. In Towards One-Pass Synthesis we describe the new paradigm that is used in one-pass synthesis and present examples of the two techniques above. Theoretical and practical aspects are discussed and minimization algorithms are given. This will help people working with synthesis tools and circuit design in general (in industry and academia) to keep informed about recent developments and new trends in this area.
In this book, the author traces the origin of the present information technology revolution, the technological features that underlie its impact, and the organizations, companies and technologies that are governing current and future growth. It explains how the technology works, how it fits together, how the industry is structured and what the future might bring.
This book describes the emerging point-of-care (POC) technologies that are paving the way to the next generation healthcare monitoring and management. It provides the readers with comprehensive, up-to-date information about the emerging technologies, such as smartphone-based mobile healthcare technologies, smart devices, commercial personalized POC technologies, paper-based immunoassays (IAs), lab-on-a-chip (LOC)-based IAs, and multiplex IAs. The book also provides guided insights into the POC diabetes management software and smart applications, and the statistical determination of various bioanalytical parameters. Additionally, the authors discuss the future trends in POC technologies and personalized and integrated healthcare solutions for chronic diseases, such as diabetes, stress, obesity, and cardiovascular disorders. Each POC technology is described comprehensively and analyzed critically with its characteristic features, bioanalytical principles, applications, advantages, limitations, and future trends. This book would be a very useful resource and teaching aid for professionals working in the field of POC technologies, in vitro diagnostics (IVD), mobile healthcare, Big Data, smart technology, software, smart applications, biomedical engineering, biosensors, personalized healthcare, and other disciplines.
The consequences of recent floods and flash floods in many parts of the world have been devastating. One way of improving flood management practice is to invest in data collection and modelling activities which enable an understanding of the functioning of a system and the selection of optimal mitigation measures. A Digital Terrain Model (DTM) provides the most essential information for flood managers. Light Detection and Ranging (LiDAR) surveys, which enable the capture of spot heights at a spacing of 0.5m to 5m with a horizontal accuracy of 0.3m and a vertical accuracy of 0.15m, can be used to develop a high-accuracy DTM, but the raw data needs careful processing before it can be used for any application. This book presents the augmentation of an existing Progressive Morphological filtering algorithm for processing raw LiDAR data to support a 1D/2D urban flood modelling framework. The key characteristics of this improved algorithm are: (1) the ability to deal with different kinds of buildings; (2) the ability to detect elevated road/rail lines and represent them in accordance with reality; (3) the ability to deal with bridges and riverbanks; and (4) the ability to recover curbs and to apply appropriate Manning's roughness coefficient values to represent close-to-earth vegetation (e.g. grass and small bushes).
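For orientation, the core idea of progressive morphological filtering can be sketched in a few lines of Python (a generic version of the published algorithm the book builds on; the window sizes, slope and thresholds below are illustrative assumptions, not the book's parameters):

import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(dsm, cell=1.0, windows=(3, 9, 21),
                                     slope=0.3, dh0=0.2, dh_max=2.5):
    """Classify raster cells as ground (True) or non-ground (False).
    dsm: 2D array of minimum elevations per grid cell."""
    ground = np.ones(dsm.shape, dtype=bool)
    surface = dsm.copy()
    prev_w = 1
    for w in windows:
        opened = grey_opening(surface, size=(w, w))
        # The threshold grows with window size: larger windows may remove
        # larger objects (buildings), so more height difference is allowed.
        dh = min(dh0 + slope * (w - prev_w) * cell, dh_max)
        ground &= ~((surface - opened) > dh)
        surface = opened
        prev_w = w
    return ground

# Example: a flat 20x20 ground grid with a 5x5, 6 m tall "building".
dsm = np.zeros((20, 20))
dsm[8:13, 8:13] = 6.0
mask = progressive_morphological_filter(dsm)
print(mask[10, 10], mask[0, 0])   # False (building), True (ground)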
Over the past twenty years, the conventional knowledge management approach has evolved into a strategic management approach that has found applications and opportunities outside of business, in society at large, through education, urban development, governance, and healthcare, among others. Knowledge-Based Development for Cities and Societies: Integrated Multi-Level Approaches illuminates the concepts and challenges of knowledge management for both urban environments and entire regions, enhancing the expertise and knowledge of scholars, researchers, practitioners, managers and urban developers in the development of successful knowledge-based development policies and the creation of knowledge cities and prosperous knowledge societies. This reference creates a large knowledge base for scholars, managers and urban developers and increases awareness of the role of knowledge cities and knowledge societies in the knowledge era, as well as of the challenges and opportunities for future research.
This book presents the proceedings of the Third International Conference on Electrical Engineering and Control (ICEECA2017). It covers new control system models and troubleshooting tips, and also addresses complex system requirements, such as increased speed, precision and remote capabilities, bridging the gap between the complex, math-heavy controls theory taught in formal courses, and the efficient implementation required in real-world industry settings. Further, it considers both the engineering aspects of signal processing and the practical issues in the broad field of information transmission and novel technologies for communication networks and modern antenna design. This book is intended for researchers, engineers, and advanced postgraduate students in control and electrical engineering, computer science, signal processing, as well as mechanical and chemical engineering.
Educational initiatives attempt to introduce or promote a culture of quality within education by raising concerns related to student learning; providing services related to assessment, the professional development of teachers, curriculum and pedagogy; and influencing educational policy in the realm of technology.