Welcome to Loot.co.za!
Performance and Reliability Analysis of Computer Systems: An Example-Based Approach Using the SHARPE Software Package provides a variety of probabilistic, discrete-state models used to assess the reliability and performance of computer and communication systems. The models included are combinatorial reliability models (reliability block diagrams, fault trees and reliability graphs), directed acyclic task precedence graphs, Markov and semi-Markov models (including Markov reward models), product-form queueing networks and generalized stochastic Petri nets. A practical approach to system modeling is followed; all of the examples described are solved and analyzed using the SHARPE tool. In structuring the book, the authors have been careful to provide the reader with a methodological approach to analytical modeling techniques. These techniques are not seen as alternatives but rather as an integral part of a single process of assessment which, by hierarchically combining results from different kinds of models, makes it possible to use state-space methods for those parts of a system that require them and non-state-space methods for the better-behaved parts of the system. The SHARPE (Symbolic Hierarchical Automated Reliability and Performance Evaluator) package is the 'toolchest' that allows the authors to specify stochastic models easily and solve them quickly, adopting model hierarchies and very efficient solution techniques. All the models described in the book are specified and solved using the SHARPE language; its syntax is described and the source code of almost all the examples discussed is provided. Audience: Suitable for use in advanced-level courses covering reliability and performance of computer and communications systems, and by researchers and practicing engineers whose work involves modeling of system performance and reliability.
"Specification and transformation of programs" is short for a methodology of software development where, from a formal specification of a problem to be solved, programs correctly solving that problem are constructed by stepwise application of formal, semantics-preserving transformation rules. The approach considers programming as a formal activity. Consequently, it requires some mathematical maturity and, above all, the will to try something new. A somewhat experienced programmer or a third- or fourth-year student in computer science should be able to master most of this material - at least, this is the level I have aimed at. This book is primarily intended as a general introductory textbook on transformational methodology. As with any methodology, reading and understanding is necessary but not sufficient. Therefore, most of the chapters contain a set of exercises for practising as homework. Solutions to these exercises exist and can, in principle, be obtained at nominal cost from the author upon request on appropriate letterhead. In addition, the book also can be seen as a comprehensive account of the particular transformational methodology developed within the Munich CIP project.
Deryn Watson and David Tinsley The topic of the conference, integrating information technology into education, is both broad and multi-faceted. In order to help focus the papers and discussion we identified 7 themes: * Current developments in society and education influencing integration; * Teachers, their roles and concerns; * Learners, their expectations of and behaviour in an integrated environment; * Developments and concerns in the curriculum; * Successes and failures in existing practice; * Organisation and management of integrated environments; * Identification of social and political influences. Each author was invited to focus on one theme, and these remained strands throughout, as can be seen from the short papers and focus group reports. The first and most significant concern therefore was to be clear about our notions of integration; what do we mean and how is this relevant? Our keynote paper from Cornu clearly marked out this debate by examining the notion of integration and alerting us to the fact that as long as the use of IT is still added to the curriculum, then integration has not yet begun.
The need for a comprehensive survey-type exposition on formal languages and related mainstream areas of computer science has been evident for some years. In the early 1970s, when the book Formal Languages by the second mentioned editor appeared, it was still quite feasible to write a comprehensive book with that title and include also topics of current research interest. This would not be possible anymore. A standard-sized book on formal languages would either have to stay on a fairly low level or else be specialized and restricted to some narrow sector of the field. The setup becomes drastically different in a collection of contributions, where the best authorities in the world join forces, each of them concentrating on their own areas of specialization. The present three-volume Handbook constitutes such a unique collection. In these three volumes we present the current state of the art in formal language theory. We were most satisfied with the enthusiastic response given to our request for contributions by specialists representing various subfields. The need for a Handbook of Formal Languages was expressed in many answers in different ways: as an easily accessible historical reference, a general source of information, an overall course-aid, and a compact collection of material for self-study. We are convinced that the final result will satisfy such various needs.
Service Intelligence and Service Science: Evolutionary Technologies and Challenges explores the emerging fields of service intelligence and service science, positioning them as the most promising directions for the evolution of service computing. This book demonstrates the critical role such areas play in supporting service computing processes, and furthers an increase in current research, best practices, and new directions in service computing technologies and applications.
This book presents the latest research in formal techniques for distributed systems, including material on theory, applications, tools and industrial usage of formal techniques.
In the last few years CMOS technology has become increasingly dominant for realizing Very Large Scale Integrated (VLSI) circuits. The popularity of this technology is due to its high density and low power requirement. The ability to realize very complex circuits on a single chip has brought about a revolution in the world of electronics and computers. However, the rapid advancements in this area pose many new problems in the area of testing. Testing has become a very time-consuming process. In order to ease the burden of testing, many schemes for designing the circuit for improved testability have been presented. These design for testability techniques have begun to catch the attention of chip manufacturers. The trend is towards placing increased emphasis on these techniques. Another byproduct of the increase in the complexity of chips is their higher susceptibility to faults. In order to take care of this problem, we need to build fault-tolerant systems. The area of fault-tolerant computing has steadily gained in importance. Today many universities offer courses in the areas of digital system testing and fault-tolerant computing. Due to the importance of CMOS technology, a significant portion of these courses may be devoted to CMOS testing. This book has been written as a reference text for such courses offered at the senior or graduate level. Familiarity with logic design and switching theory is assumed. The book should also prove to be useful to professionals working in the semiconductor industry.
A self-contained treatment of the fundamentals of quantum computing
Understanding digital modes and practices of traditional rhetoric is essential in emphasising information and interaction in human-to-human and human-computer contexts. These emerging technologies are essential in gauging information processes across global contexts. Digital Rhetoric and Global Literacies: Communication Modes and Digital Practices in the Networked World compiles relevant theoretical frameworks, current practical applications, and emerging practices of digital rhetoric. Highlighting the key principles and understandings of the underlying modes, practices, and literacies of communication, this book is a vital guide for professionals, scholars, researchers, and educators interested in finding clarity and enrichment in the diverse perspectives of digital rhetoric research.
E-ffective Writing for E-Learning Environments integrates research and practice in user-centered design and learning design for instructors in post-secondary institutions and learning organizations who are developing e-learning resources. The book is intended as a development guide for experts in areas other than instructional or educational technology (in other words, experts in cognate areas such as Biology or English or Nursing) rather than as a learning design textbook. The organization of the book reflects the development process for a resource, course, or program from planning and development through formative evaluation, and identifies trends and issues that faculty or developers might encounter along the way. The account of the process of one faculty member's course development journey illustrates the suggested design guidelines. The accompanying practice guide provides additional information, examples, learning activities, and tools to supplement the text.
Microprocessors are the key component of the infrastructure of our 21st-century electronic and digital information-based society. More than four billion are sold each year for use in 'intelligent' electronic devices, ranging from smart egg-timers through to aircraft management systems. Most of these processor devices appear in the form of highly-integrated microcontrollers, which comprise a core microprocessor together with memory and analog/digital peripheral ports. By using simple cores, these single-chip computers are the cost- and size-effective means of adding brains to previously dumb widgets, such as the credit card. Using the same winning format as the successful Springer guide, The Quintessential PIC (R) Microcontroller, this down-to-earth new textbook/guide has been completely rewritten based on the more powerful PIC18 enhanced-range Microchip MCU family. Throughout the book, commercial hardware and software products are used to illustrate the material, as readers are provided real-world in-depth guidance on the design, construction and programming of small, embedded microcontroller-based systems. Suitable for stand-alone usage, the text does not require a prerequisite deep understanding of digital systems.
Topics and features: uses an in-depth bottom-up approach to the topic of microcontroller design using the Microchip enhanced-range PIC18 (R) microcontroller family as the exemplar; includes fully worked examples and self-assessment questions, with additional support material available on an associated website; provides a standalone module on foundation topics in digital, logic and computer architecture for microcontroller engineering; discusses the hardware aspects of interfacing and interrupt handling, with an emphasis on the integration of hardware and software; covers parallel and serial input/output, timing, analog, and EEPROM data-handling techniques; presents a practical build-and-program case study, as well as illustrating simple testing strategies. This useful text/reference book will be of great value to industrial engineers, hobbyists and people in academia. Students of Electronic Engineering and Computer Science, at both undergraduate and postgraduate level, will also find this an ideal textbook, with many helpful learning tools. Dr. Sid Katzen is Associate to the School of Engineering, University of Ulster at Jordanstown, Northern Ireland.
The success of VHDL since it was balloted in 1987 as an IEEE standard may look incomprehensible to the large population of hardware designers, who had never heard of Hardware Description Languages before (for at least 90% of them), as well as to the few hundred specialists who had been working on these languages for a long time (25 years for some of them). Until 1988, only a very small subset of designers, in a few large companies, were accustomed to describing their designs using a proprietary HDL, or sometimes a HDL inherited from a University when some software environment happened to be developed around it, allowing usability by third parties. A number of benefits were definitely recognized in this practice, such as functional verification of a specification through simulation, first performance evaluation of a tentative design, and sometimes automatic microprogram generation or even automatic high-level synthesis. As there was apparently no market for HDLs, the ECAD vendors did not care about them, start-up companies were seldom able to survive in this area, and large users of proprietary tools were devoting more and more people and money just to maintaining their internal system.
Performance evaluation of increasingly complex human-made systems requires the use of simulation models. However, these systems are difficult to describe and capture by succinct mathematical models. The purpose of this book is to address the difficulties of the optimization of complex systems via simulation models or other computation-intensive models involving possible stochastic effects and discrete choices. This book establishes distinct advantages of the "softer" ordinal approach for search-based type problems, analyzes its general properties, and shows the many orders of magnitude improvement in computational efficiency that is possible.
This book contains the ceremonials and the proceedings pertaining to the International Symposium CCN2005 on "Complex Computing-Networks: A Link between Brain-like and Wave-Oriented Electrodynamics Algorithms," convened at Doğuş University of Istanbul, Turkey, on 13-14 June 2005, in connection with the bestowal of the honorary doctorate degrees on Professors Leopold B. Felsen and Leon O. Chua, for their extraordinary achievements in electromagnetics, and nonlinear systems, respectively. The symposium was co-organized by Cem Goknar and Levent Sevgi, in consultation with Leopold B. Felsen and Leon O. Chua. Istanbul is a city with wonderful natural and historical surroundings, a city not only interconnecting Asia and Europe but also Eastern and Western cultures. Therefore, CCN2005 was a memorable event not only in the lifetime of Drs. Felsen, Chua, and their families, but also for all the other participants who were there to congratulate the recipients and participate in the symposium.
Integrated circuits are finding ever wider applications through a range of industries. Introduction to VLSI Process Engineering presents the design principles for devices, describes the overall VLSI process, and deals with the essential manufacturing technologies and inspection procedures.
The CoreGRID Network of Excellence (NoE) project began in September 2004. Two months later, in November 2004, the first CoreGRID Integration Workshop was held within the framework of the prestigious international Dagstuhl seminars. CoreGRID aims at strengthening and advancing long-term research, knowledge transfer and integration in the area of Grid and Peer-to-Peer technologies. CoreGRID is a Network of Excellence - a new type of project within the European 6th Framework Programme, to ensure progressive evolution and durable integration of the European Grid research community. To achieve this objective, CoreGRID brings together a critical mass of well-established researchers and doctoral students from forty-two institutions that have constructed an ambitious joint programme of activities. Although excellence is a goal to which CoreGRID is committed, durable integration is our main concern. It means that CoreGRID has to carry out activities to improve the effectiveness of European research in Grid by coordinating and adapting the participants' activities in Grid research, to share resources such as Grid testbeds, to encourage exchange of research staff and students, and to ensure close collaboration and wide dissemination of its results to the international community. Organising CoreGRID Integration Workshops is one of the activities that aims at identifying and promoting durable collaboration between partners involved in the network.
This book investigates in detail the emerging deep learning (DL) technique in computational physics, assessing its promising potential to substitute conventional numerical solvers for calculating the fields in real time. After sufficient training, the proposed architecture can resolve both the forward computing and the inverse retrieval problems. Pursuing a holistic perspective, the book includes the following areas. The first chapter discusses the basic DL frameworks. Then, the steady heat conduction problem is solved by the classical U-net in Chapter 2, involving both the passive and active cases. Afterwards, the sophisticated heat flux on a curved surface is reconstructed by the presented Conv-LSTM, exhibiting high accuracy and efficiency. In addition, the electromagnetic parameters of complex media such as the permittivity and conductivity are retrieved by a cascaded framework in Chapter 4. Additionally, a physics-informed DL structure along with a nonlinear mapping module is employed to obtain the space/temperature/time-related thermal conductivity via the transient temperature in Chapter 5. Finally, in Chapter 6, a series of the latest advanced frameworks and the corresponding physics applications are introduced. As deep learning techniques are experiencing vigorous development in computational physics, more people desire related reading materials. This book is intended for graduate students, professional practitioners, and researchers who are interested in DL for computational physics.
This book is a tribute to Kenichi Morita's ideas and achievements in theoretical computer science, reversibility and computationally universal mathematical machines. It offers a unique source of information on universality and reversibility in computation and is an indispensable book for computer scientists, mathematicians, physicists and engineers. Morita is renowned for his works on two-dimensional language accepting automata, complexity of Turing machines, universality of cellular automata, regular and context-free array grammars, and undecidability. His high-impact works include findings on parallel generation and parsing of array languages by means of reversible automata, construction of a reversible automaton from Fredkin gates, solving a firing squad synchronization problem in reversible cellular automata, self-reproduction in reversible cellular spaces, universal reversible two-counter machines, solution of nondeterministic polynomial (NP) problems in hyperbolic cellular automata, reversible P-systems, a new universal reversible logic element with memory, and reversibility in asynchronous cellular automata. Kenichi Morita's achievements in reversibility, universality and theory of computation are celebrated in over twenty high-profile contributions from his colleagues, collaborators, students and friends. 
The theoretical constructs presented in this book are amazing in their diversity and depth of intellectual insight, addressing: queue automata, hyperbolic cellular automata, Abelian invertible automata, number-conserving cellular automata, Brownian circuits, chemical automata, logical gates implemented via glider collisions, computation in swarm networks, picture arrays, universal reversible counter machines, input-position-restricted models of language acceptors, descriptional complexity and persistence of cellular automata, partitioned cellular automata, firing squad synchronization algorithms, reversible asynchronous automata, reversible simulations of ranking trees, Shor's factorization algorithms, and power consumption of cellular automata.
BE 2002 is the second in a series of conferences on eCommerce, eBusiness, and eGovernment organised by the three IFIP committees TC6, TC8, and TC11. As BE 2001 did last year in Zurich, BE 2002 continues to provide a forum for users, engineers, and researchers from academia, industry and government to present their latest findings in eCommerce, eBusiness, and eGovernment applications and the underlying technologies which support those applications. This year's conference comprises a main track with sessions on eGovernment, Trust, eMarkets, Fraud and Security, eBusiness (both B2B and B2C), the Design of systems, eLearning, Public and Health Systems, Web Design, and the Applications of and Procedures for eCommerce and eBusiness, as well as two associated Workshops (not included in these proceedings): eBusiness Models in the Digital Online Music and Online News Sectors; and eBusiness Standardisation - Challenges and Solutions for the Networked Economy. The 47 papers accepted for presentation in these sessions and published in this book of proceedings were selected from 80 submissions. They were rigorously reviewed (all papers were double-blind refereed) before being selected by the International Programme Committee. This rejection rate of almost 50% indicates just how seriously the Committee took its quality control activities.
In April 1993, an interdisciplinary NATO Advanced Research Workshop on "Collaborative dialogue technologies in distance learning" was held in Segovia, Spain. The workshop brought together researchers in fields related to distance learning using computer-mediated communication. The statement of justification of the NATO ARW follows hereafter. Justification of the NATO Advanced Research Workshop on Collaborative Dialogue Technologies in Distance Learning Computer Mediated Communication (CMC) systems have features that reduce some temporal, physical and social constraints on communication. Theories of communication have shifted from viewing communication as a linear transmission of messages by a sender to a receiver, to viewing it as a social paradigm, where individuals are actors in a network of interdependent relationships embedded in organizational and social structures. Recent research focuses on models of information-sharing to support not only the activities of individuals but also the problem-solving activities of groups, such as decision-making, planning or co-writing. This area of research is called Computer Supported Cooperative Work (CSCW). The Artificial Intelligence (AI) approach uses knowledge-based systems to enhance and facilitate all these processes, including the possibility of using natural language. The traditional model of distance education places a strong emphasis on independent study, supported by well developed learning materials. This model can be characterized as one-way media. However, the potential of CMC to provide better guidance to the student in Higher Distance Education has been quickly recognized for at least two kinds of activities: information sharing and interaction.
As suggested by the title of this book, I will present a collection of coherently related applications and a theoretical development of a general systems theory. Hopefully, this book will invite all readers to sample an exciting and challenging (even fun) piece of interdisciplinary research, that has characterized the scientific and technological achievements of the twentieth century. And, I hope that many of them will be motivated to do additional reading and to contribute to topics along the lines described in the following pages. Since the applications in this volume range through many scientific disciplines, from sociology to atomic physics, from Einstein's relativity theory to Dirac's quantum mechanics, from optimization theory to unreasonable effectiveness of mathematics to foundations of mathematical modeling, from general systems theory to Schwartz's distributions, special care has been given to write each application in a language appropriate to that field. That is, mathematical symbols and abstractions are used at different levels so that readers in various fields will find it possible to read. Also, because of the wide range of applications, each chapter has been written so that, in general, there is no need to reference a different chapter in order to understand a specific application. At the same time, if a reader has the desire to go through the entire book without skipping any chapter, it is strongly suggested to refer back to Chapters 2 and 3 as often as possible.
The developments within the computationally and numerically oriented areas of Operations Research, Finance, Statistics and Economics have been significant over the past few decades. Each area has been developing its own computer systems and languages that suit its needs, but there is relatively little cross-fertilization among them yet. This volume contains a collection of papers that each highlights a particular system, language, model or paradigm from one of the computational disciplines, aimed at researchers and practitioners from the other fields. The 15 papers cover a number of relevant topics: Models and Modelling in Operations Research and Economics, novel High-level and Object-Oriented approaches to programming, through advanced uses of Maple and MATLAB, and applications and solution of Differential Equations in Finance. It is hoped that the material in this volume will whet the reader's appetite for discovering and exploring new approaches to old problems, and in the longer run facilitate cross-fertilization among the fields. We would like to thank the contributing authors, the reviewers, the publisher, and last, but not least, Jesper Saxtorph, Anders Nielsen, and Thomas Stidsen for invaluable technical assistance.
From a linguistic perspective, it is quantification which makes all the difference between "having no dollars" and "having a lot of dollars". And it is the meaning of the quantifier "most" which eventually decides if "Most Americans voted Kerry" or "Most Americans voted Bush" (as it stands). Natural language (NL) quantifiers like "all", "almost all", "many" etc. serve an important purpose because they permit us to speak about properties of collections, as opposed to describing specific individuals only; in technical terms, quantifiers are a 'second-order' construct. Thus the quantifying statement "Most Americans voted Bush" asserts that the set of voters of George W. Bush comprises the majority of Americans, while "Bush sneezes" only tells us something about a specific individual. By describing collections rather than individuals, quantifiers extend the expressive power of natural languages far beyond that of propositional logic and make them a universal communication medium. Hence language heavily depends on quantifying constructions. These often involve fuzzy concepts like "tall", and they frequently refer to fuzzy quantities in agreement like "about ten", "almost all", "many" etc. In order to exploit this expressive power and make fuzzy quantification available to technical applications, a number of proposals have been made for how to model fuzzy quantifiers in the framework of fuzzy set theory. These approaches usually reduce fuzzy quantification to a comparison of scalar or fuzzy cardinalities [197, 132].
In the globalizing world, knowledge and information (and the social and technological settings for their production and communication) are now seen as keys to economic prosperity. The economy of a knowledge city creates value-added products using research, technology, and brainpower. The social benefit of knowledge-based urban development (KBUD), however, extends beyond aggregate economic growth. "Knowledge-Based Urban Development" covers the theoretical, thematic, and country-specific issues of knowledge cities to underline the growing importance of KBUD all around the world, providing academics, researchers, and practitioners with substantive research on the decisive lineaments of urban development for knowledge-based production (drawing attention to new planning processes to foster such development), and worldwide best practices and case studies in the field of urban development.
Health institutions are investing in and fielding information technology solutions at an unprecedented pace. With the recommendations from the Institute of Medicine around information technology solutions for patient safety, mandates from industry groups such as Leapfrog about using information systems to improve health care, and the move toward evidence-based practice, health institutions cannot afford to retain manual practices. The installation of multi-million dollar computerized health systems represents the very life blood of contemporary clinical operations and a crucial link to the financial viability of institutions. Yet, the implementation of health information systems is exceptionally complex, expensive and often just plain messy. The need for improvement in the art and science of systems implementation is clear: up to 70-80% of information technology installations fail. The reasons are multi-faceted, ranging from the complexity of the diverse workflows being computerized, the intricate nature of health organizations, and the knowledge and skills of users, to other reasons such as strategies for obtaining key executive support, weaving through the politics peculiar to the institution, and technical facets including the usability of systems. Thus, the art and science of successfully implementing systems remains deeply layered in elusiveness. Still, given the pervasiveness of system implementations and the importance of the outcomes, this is a critical topic, especially for nurses and informatics nurse specialists.