Timing research in high-performance VLSI systems has advanced at a steady pace over the last few years. Tools, however, especially theoretical mechanisms, lag behind. Much of the present timing research relies heavily on timing diagrams, which, although intuitive, are inadequate for the analysis of large designs with many parameters. Further, timing diagrams offer only approximations, not exact solutions, to many timing problems, and they provide little insight in cases where the temporal properties of a design interact intricately with its logical functionality. Timed Boolean Functions presents a methodology for timing research that facilitates the analysis and design of circuits and systems in a unified temporal and logical domain. The goal of the book is to present the central idea of representing logical and timing information in a common structure, TBFs, and to present a canonical form suitable for efficient manipulation. This methodology is then applied to practical applications to provide intuition and insight into the subject, so that these general methods can be adapted to specific engineering problems, and also to further the research necessary to enhance understanding of the field. Timed Boolean Functions is written for professionals involved in timing research and for digital designers who want to deepen their understanding of the timing aspects of high-speed circuits. The prerequisites are a background in logic design, computer algorithms and combinatorial optimization, and a certain degree of mathematical sophistication.
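The flavour of this unified temporal-logical view can be sketched in a few lines of Python. The sketch below is only an illustration with hypothetical names, not the book's TBF notation or its canonical form:

```python
# A toy illustration of the central idea (hypothetical notation, not the
# book's): signals are Boolean functions of integer time, and each gate
# input may carry a fixed delay, so logic and timing live in one structure.

def AND(f, g, d1=0, d2=0):
    """y(t) = f(t - d1) AND g(t - d2)."""
    return lambda t: f(t - d1) and g(t - d2)

def NOT(f, d=0):
    """y(t) = NOT f(t - d)."""
    return lambda t: not f(t - d)

x = lambda t: t >= 3          # a step input rising at t = 3

# Purely logically, x AND NOT x is constant False; with a delay of 2 on the
# inverted input, a transient 1-pulse appears -- exactly the kind of
# logic/timing interaction that a timing diagram alone handles poorly.
y = AND(x, NOT(x), d2=2)

print([int(y(t)) for t in range(8)])   # [0, 0, 0, 1, 1, 0, 0, 0]
```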
Gay provides an authoritative overview of the major developments, people, and organizations that have shaped the design and use of computers. He also describes innovations in computer research and technology (including highlights from developments in supercomputers, supercomputer networks, and Internet 2), the latest trends in consumer products (new applications that influence the way individuals and businesses interface with their world), social issues related to computers (Microsoft's influence, privacy, encryption, universal access, and adaptive technologies), and information on computer careers and how to prepare for them.
The roots of the project which culminates with the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at the University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area- and performance-effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born, and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.
Open Distributed Processing contains the selected proceedings of the Third International Conference on Open Distributed Systems, organized by the International Federation for Information Processing and held in Brisbane, Australia, in February 1995. The book deals with the interconnectivity problems that advanced computer networking raises, providing those working in the area with the most recent research, including security and management issues.
Test generation is one of the most difficult tasks facing the designer of complex VLSI-based digital systems. Much of this difficulty is attributable to the almost universal use in testing of low, gate-level circuit and fault models that predate integrated circuit technology. It has long been recognized that the testing problem can be alleviated by the use of higher-level methods in which multigate modules or cells are the primitive components in test generation; however, the development of such methods has proceeded very slowly. To be acceptable, high-level approaches should be applicable to most types of digital circuits, and should provide fault coverage comparable to that of traditional, low-level methods. The fault coverage problem has, perhaps, been the most intractable, due to continued reliance in the testing industry on the single stuck-line (SSL) fault model, which is tightly bound to the gate level of abstraction. This monograph presents a novel approach to solving the foregoing problem. It is based on the systematic use of multibit vectors rather than single bits to represent logic signals, including fault signals. A circuit is viewed as a collection of high-level components such as adders, multiplexers, and registers, interconnected by n-bit buses. To match this high-level circuit model, we introduce a high-level bus fault that, in effect, replaces a large number of SSL faults and allows them to be tested in parallel. However, by reducing the bus size from n to one, we can obtain the traditional gate-level circuit and fault models.
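As a rough sketch of the multibit-vector idea, consider the following; the names and encoding here are assumptions for illustration only, not the monograph's:

```python
# A hedged sketch of the multibit-vector idea; all names are illustrative,
# not the monograph's. An n-bit bus is an integer bit-vector, and a "bus
# fault" forces chosen lines to 0 or 1, standing in for many single
# stuck-line (SSL) faults that can then be exercised in parallel.

N = 8                          # bus width; N = 1 recovers a gate-level line
MASK = (1 << N) - 1

def apply_bus_fault(value, stuck0=0, stuck1=0):
    """Force the lines in stuck0 to 0 and the lines in stuck1 to 1."""
    return ((value & ~stuck0) | stuck1) & MASK

def adder(a, b):               # a high-level component on n-bit buses
    return (a + b) & MASK

# A test vector detects the fault if faulty and fault-free outputs differ.
a, b = 0b10101010, 0b01010111
good = adder(a, b)
bad = adder(apply_bus_fault(a, stuck0=0b00000010), b)
print("detected" if good != bad else "missed")   # detected
```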
E-agriculture and e-government have transformed public service delivery across the globe, though there are still a number of associated economic, social, political, and legal challenges. E-Agriculture and E-Government for Global Policy Development: Implications and Future Directions provides critical research and knowledge on electronic agriculture and e-government development experiences from around the world. This authoritative reference source describes and evaluates real-life e-agriculture and e-government case studies, examines theoretical frameworks, and discusses key global policy development issues, challenges, and constraints on socio-economic advancements.
"Soft Computing and its Applications in Business and Economics," or SC-BE for short, is a work whose importance is hard to exaggerate. Authored by leading contributors to soft computing and its applications, SC-BE is a sequel to an earlier book by Professors R. A. Aliev and R. R. Aliev, "Soft Computing and Its Applications," World Scientific, 200l. SC-BE is a self-contained exposition of the foundations of soft computing, and presents a vast compendium of its applications to business, finance, decision analysis and economics. One cannot but be greatly impressed by the wide variety of applications - applications ranging from use of fuzzy logic in transportation and health case systems, to use of a neuro-fuzzy approach to modeling of credit risk in trading, and application of soft computing to e-commerce. To view the contents of SC-BE in a clearer perspective, a bit of history is in order. In science, as in other realms of human activity, there is a tendency to be nationalistic - to commit oneself to a particular methodology and relegate to a position of inferiority or irrelevance all alternative methodologies. As we move further into the age of machine intelligence and automated reasoning, we run into more and more problems which do not lend themselves to solution through the use of our favorite methodology.
Worldwide, the urge is being felt to pave the way towards the introduction of an electronic government. Many countries recognise the potential of digital aids in providing information and services to citizens, organisations and companies. Recent developments have put pressure on the legislature to provide an adequate legal framework for electronic administrative communication. Thus, various countries have started to draft provisions in their administrative law in order to remove legal impediments that hamper electronic services from public administrations. Written by specialists from different countries, E-Government and its Implications for Administrative Law provides an overview and analysis of such legislative developments in France, Germany, Norway and the United States. What approach has been taken in these countries? What specific provisions have been formulated to facilitate electronic administrative communication, and at what level? What requirements are introduced to gain sufficient trust in electronic service delivery? In providing an in-depth analysis of the legislative projects in the various countries, this book sheds light on the differences in policy making, as well as the lessons that can be learned for future regulatory projects to amend administrative law for the digital era. This is Volume 1 in the Information Technology and Law (IT&Law) Series.
Universally acclaimed as the book on garbage collection. A complete and up-to-date revision of the 2012 Garbage Collection Handbook. Thorough coverage of parallel, concurrent and real-time garbage collection algorithms, including C4, Garbage First, LXR, Shenandoah, Transactional Sapphire and ZGC, and garbage collection on the GPU. Clear explanation of the trickier aspects of garbage collection, including the interface to the run-time system, handling of finalisation and weak references, and support for dynamic languages. New chapters on energy-aware garbage collection, and persistence and garbage collection. The e-book includes more than 40,000 hyperlinks to algorithms, figures, glossary entries, indexed items, original research papers and much more. Backed by a comprehensive online database of over 3,400 garbage collection-related publications.
Performance and Reliability Analysis of Computer Systems: An Example-Based Approach Using the SHARPE Software Package provides a variety of probabilistic, discrete-state models used to assess the reliability and performance of computer and communication systems. The models included are combinatorial reliability models (reliability block diagrams, fault trees and reliability graphs), directed, acyclic task precedence graphs, Markov and semi-Markov models (including Markov reward models), product-form queueing networks and generalized stochastic Petri nets. A practical approach to system modeling is followed; all of the examples described are solved and analyzed using the SHARPE tool. In structuring the book, the authors have been careful to provide the reader with a methodological approach to analytical modeling techniques. These techniques are not seen as alternatives but rather as an integral part of a single process of assessment which, by hierarchically combining results from different kinds of models, makes it possible to use state-space methods for those parts of a system that require them and non-state-space methods for the better-behaved parts of the system. The SHARPE (Symbolic Hierarchical Automated Reliability and Performance Evaluator) package is the 'toolchest' that allows the authors to specify stochastic models easily and solve them quickly, adopting model hierarchies and very efficient solution techniques. All the models described in the book are specified and solved using the SHARPE language; its syntax is described and the source code of almost all the examples discussed is provided. Audience: Suitable for use in advanced level courses covering reliability and performance of computer and communications systems and by researchers and practicing engineers whose work involves modeling of system performance and reliability.
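For a flavour of the simplest model class mentioned above, here is a minimal Python sketch of a combinatorial reliability block diagram; it is deliberately not SHARPE syntax, whose actual language the book documents:

```python
# Not SHARPE syntax -- just a minimal sketch of a reliability block diagram
# with independent components, composed in series (all must work) or in
# parallel (at least one must work). The example numbers are made up.

from math import prod

def series(*rs):
    """Series composition: the system works only if every block works."""
    return prod(rs)

def parallel(*rs):
    """Parallel redundancy: the system fails only if every block fails."""
    return 1.0 - prod(1.0 - r for r in rs)

# Two redundant CPUs feeding one disk, all independent.
system = series(parallel(0.95, 0.95), 0.99)
print(f"{system:.6f}")   # 0.987525
```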
"Specification and transformation of programs" is short for a methodology of software development where, from a formal specification of a problem to be solved, programs correctly solving that problem are constructed by stepwise application of formal, semantics-preserving transformation rules. The approach considers programming as a formal activity. Consequently, it requires some mathematical maturity and, above all, the will to try something new. A somewhat experienced programmer or a third- or fourth-year student in computer science should be able to master most of this material - at least, this is the level I have aimed at. This book is primarily intended as a general introductory textbook on transformational methodology. As with any methodology, reading and understanding is necessary but not sufficient. Therefore, most of the chapters contain a set of exercises for practising as homework. Solutions to these exercises exist and can, in principle, be obtained at nominal cost from the author upon request on appropriate letterhead. In addition, the book also can be seen as a comprehensive account of the particular transformational methodology developed within the Munich CIP project.
Today, a major component of any project management effort is the combined use of qualitative and quantitative tools. While publications on qualitative approaches to project management are widely available, few project management books have focused on the quantitative approaches. This book represents the first major project management book with a practical focus on the quantitative approaches to project management. The book organizes quantitative techniques into an integrated framework for project planning, scheduling, and control. Numerous illustrative examples are presented. Topics covered in the book include PERT/CPM/PDM and extensions, mathematical project scheduling, heuristic project scheduling, project economics, statistical data analysis for project planning, computer simulation, assignment and transportation problems, and learning curve analysis. Chapter one gives a brief overview of project management, presenting a general-purpose project management model. Chapter two covers CPM, PERT, and PDM network techniques. Chapter three covers project scheduling subject to resource constraints. Chapter four covers project optimization. Chapter five discusses economic analysis for project planning and control. Chapter six discusses learning curve analysis. Chapter seven covers statistical data analysis for project planning and control. Chapter eight presents techniques for project analysis and selection. Tables and figures are used throughout the book to enhance the effectiveness of the discussions. This book is excellent as a textbook for upper-level undergraduate and graduate courses in Industrial Engineering, Engineering Management, and Business, and as a detailed, comprehensive guide for corporate management.
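As a taste of the quantitative techniques involved, the sketch below computes a CPM-style project duration by a forward pass over a small activity network; the data and names are made up for illustration and are not taken from the book:

```python
# A minimal CPM forward pass (illustrative data and names, not the book's):
# the earliest finish time of an activity is its duration plus the latest
# earliest-finish among its predecessors; the project duration is the
# longest such path through the activity network.

from functools import lru_cache

duration = {"A": 3, "B": 2, "C": 4, "D": 2}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

@lru_cache(maxsize=None)
def earliest_finish(task):
    start = max((earliest_finish(p) for p in preds[task]), default=0)
    return start + duration[task]

print(max(earliest_finish(t) for t in duration))   # 9, via the path A-C-D
```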
The need for a comprehensive survey-type exposition on formal languages and related mainstream areas of computer science has been evident for some years. In the early 1970s, when the book Formal Languages by the second mentioned editor appeared, it was still quite feasible to write a comprehensive book with that title and include also topics of current research interest. This would not be possible anymore. A standard-sized book on formal languages would either have to stay on a fairly low level or else be specialized and restricted to some narrow sector of the field. The setup becomes drastically different in a collection of contributions, where the best authorities in the world join forces, each of them concentrating on their own areas of specialization. The present three-volume Handbook constitutes such a unique collection. In these three volumes we present the current state of the art in formal language theory. We were most satisfied with the enthusiastic response given to our request for contributions by specialists representing various subfields. The need for a Handbook of Formal Languages was expressed in many of the answers, in different ways: as an easily accessible historical reference, a general source of information, an overall course-aid, and a compact collection of material for self-study. We are convinced that the final result will satisfy such various needs.
Service Intelligence and Service Science: Evolutionary Technologies and Challenges explores the emerging fields of service intelligence and service science, positioning them as the most promising directions for the evolution of service computing. This book demonstrates the critical role such areas play in supporting service computing processes, and promotes current research, best practices, and new directions in service computing technologies and applications.
This book presents the latest research in formal techniques for distributed systems, including material on theory, applications, tools and industrial usage of formal techniques.
In the last few years CMOS technology has become increasingly dominant for realizing Very Large Scale Integrated (VLSI) circuits. The popularity of this technology is due to its high density and low power requirement. The ability to realize very complex circuits on a single chip has brought about a revolution in the world of electronics and computers. However, the rapid advancements in this area pose many new problems in the area of testing. Testing has become a very time-consuming process. In order to ease the burden of testing, many schemes for designing the circuit for improved testability have been presented. These design for testability techniques have begun to catch the attention of chip manufacturers. The trend is towards placing increased emphasis on these techniques. Another byproduct of the increase in the complexity of chips is their higher susceptibility to faults. In order to take care of this problem, we need to build fault-tolerant systems. The area of fault-tolerant computing has steadily gained in importance. Today many universities offer courses in the areas of digital system testing and fault-tolerant computing. Due to the importance of CMOS technology, a significant portion of these courses may be devoted to CMOS testing. This book has been written as a reference text for such courses offered at the senior or graduate level. Familiarity with logic design and switching theory is assumed. The book should also prove to be useful to professionals working in the semiconductor industry.
A self-contained treatment of the fundamentals of quantum computing
Understanding digital modes and the practices of traditional rhetoric is essential to emphasising information and interaction in human-to-human and human-computer contexts. These emerging technologies are essential in gauging information processes across global contexts. Digital Rhetoric and Global Literacies: Communication Modes and Digital Practices in the Networked World compiles relevant theoretical frameworks, current practical applications, and emerging practices of digital rhetoric. Highlighting the key principles and understandings of the underlying modes, practices, and literacies of communication, this book is a vital guide for professionals, scholars, researchers, and educators interested in finding clarity and enrichment in the diverse perspectives of digital rhetoric research.
E-ffective Writing for E-Learning Environments integrates research and practice in user-centered design and learning design for instructors in post-secondary institutions and learning organizations who are developing e-learning resources. The book is intended as a development guide for experts in areas other than instructional or educational technology (in other words, experts in cognate areas such as Biology or English or Nursing) rather than as a learning design textbook. The organization of the book reflects the development process for a resource, course, or program from planning and development through formative evaluation, and identifies trends and issues that faculty or developers might encounter along the way. The account of the process of one faculty member's course development journey illustrates the suggested design guidelines. The accompanying practice guide provides additional information, examples, learning activities, and tools to supplement the text.
Microprocessors are the key component of the infrastructure of our 21st-century electronic and digital information-based society. More than four billion are sold each year for use in 'intelligent' electronic devices, ranging from smart egg-timers through to aircraft management systems. Most of these processor devices appear in the form of highly-integrated microcontrollers, which comprise a core microprocessor together with memory and analog/digital peripheral ports. By using simple cores, these single-chip computers are the cost- and size-effective means of adding the brains to previously dumb widgets, such as the credit card. Using the same winning format as the successful Springer guide, The Quintessential PIC (R) Microcontroller, this down-to-earth new textbook/guide has been completely rewritten based on the more powerful PIC18 enhanced-range Microchip MCU family. Throughout the book, commercial hardware and software products are used to illustrate the material, as readers are provided real-world in-depth guidance on the design, construction and programming of small, embedded microcontroller-based systems. Suitable for stand-alone usage, the text does not require a prerequisite deep understanding of digital systems. Topics and features: uses an in-depth bottom-up approach to the topic of microcontroller design using the Microchip enhanced-range PIC18 (R) microcontroller family as the exemplar; includes fully worked examples and self-assessment questions, with additional support material available on an associated website; provides a standalone module on foundation topics in digital, logic and computer architecture for microcontroller engineering; discusses the hardware aspects of interfacing and interrupt handling, with an emphasis on the integration of hardware and software; covers parallel and serial input/output, timing, analog, and EEPROM data-handling techniques; presents a practical build-and-program case study, as well as illustrating simple testing strategies. This useful text/reference book will be of great value to industrial engineers, hobbyists and people in academia. Students of Electronic Engineering and Computer Science, at both undergraduate and postgraduate level, will also find this an ideal textbook, with many helpful learning tools. Dr. Sid Katzen is Associate to the School of Engineering, University of Ulster at Jordanstown, Northern Ireland.
The success of VHDL since it was balloted in 1987 as an IEEE standard may look incomprehensible to the large population of hardware designers, who had never heard of Hardware Description Languages before (at least 90% of them), as well as to the few hundred specialists who had been working on these languages for a long time (25 years for some of them). Until 1988, only a very small subset of designers, in a few large companies, were used to describing their designs using a proprietary HDL, or sometimes an HDL inherited from a university when some software environment happened to be developed around it, allowing usability by third parties. A number of benefits were definitely recognized in this practice, such as functional verification of a specification through simulation, first performance evaluation of a tentative design, and sometimes automatic microprogram generation or even automatic high-level synthesis. As there was apparently no market for HDLs, the ECAD vendors did not care about them, start-up companies were seldom able to survive in this area, and large users of proprietary tools were spending more and more people and money just to maintain their internal systems.
Deryn Watson and David Tinsley
The topic of the conference, integrating information technology into education, is both broad and multi-faceted. In order to help focus the papers and discussion we identified 7 themes:
* Current developments in society and education influencing integration;
* Teachers, their roles and concerns;
* Learners, their expectations of and behaviour in an integrated environment;
* Developments and concerns in the curriculum;
* Successes and failures in existing practice;
* Organisation and management of integrated environments;
* Identification of social and political influences.
Each author was invited to focus on one theme, and these remained strands throughout, as can be seen from the short papers and focus group reports. The first and most significant concern, therefore, was to be clear about our notions of integration: what do we mean, and how is this relevant? Our keynote paper from Cornu clearly marked out this debate by examining the notion of integration and alerting us to the fact that as long as the use of IT is still added to the curriculum, integration has not yet begun.
Performance evaluation of increasingly complex human-made systems requires the use of simulation models. However, these systems are difficult to describe and capture by succinct mathematical models. The purpose of this book is to address the difficulties of the optimization of complex systems via simulation models or other computation-intensive models involving possible stochastic effects and discrete choices. This book establishes distinct advantages of the "softer" ordinal approach for search-based type problems, analyzes its general properties, and shows the many orders of magnitude improvement in computational efficiency that is possible.
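A minimal sketch of the ordinal intuition follows, under assumed toy data; nothing here is taken from the book's algorithms:

```python
# A hedged sketch of the ordinal idea (illustrative, not the book's
# algorithms): rank many designs with a cheap, noisy estimator and keep a
# small selected subset. Order is far more robust to noise than value, so
# the subset very likely contains some genuinely good designs even though
# every individual estimate is crude.

import random

random.seed(1)
true_perf = {d: random.random() for d in range(1000)}   # unknown in practice

def noisy_estimate(d, noise=0.3):                       # one cheap simulation run
    return true_perf[d] + random.gauss(0.0, noise)

selected = set(sorted(true_perf, key=noisy_estimate, reverse=True)[:20])
truly_best = set(sorted(true_perf, key=true_perf.get, reverse=True)[:20])
print(len(selected & truly_best), "of the true top 20 kept by ordinal selection")
```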
This book contains the ceremonials and the proceedings pertaining to the International Symposium CCN2005 on "Complex Computing-Networks: A Link between Brain-like and Wave-Oriented Electrodynamics Algorithms," convened at Doğuş University of Istanbul, Turkey, on 13-14 June 2005, in connection with the bestowal of the honorary doctorate degrees on Professors Leopold B. Felsen and Leon O. Chua, for their extraordinary achievements in electromagnetics and nonlinear systems, respectively. The symposium was co-organized by Cem Goknar and Levent Sevgi, in consultation with Leopold B. Felsen and Leon O. Chua. Istanbul is a city with wonderful natural and historical surroundings, a city not only interconnecting Asia and Europe but also Eastern and Western cultures. Therefore, CCN2005 was a memorable event not only in the lifetime of Drs. Felsen, Chua, and their families, but also for all the other participants who were there to congratulate the recipients and participate in the symposium.
Becoming a Teacher offers a broad context for understanding education, addressing issues such as social justice, educational ideology and teacher well-being and identity. The theoretical content is balanced with practical advice for the classroom on topics such as assessment for learning, behaviour management, differentiation and curriculum planning. Becoming a Teacher draws extensively on contemporary research and empirical evidence to support critical reflection about learning and teaching. Encouraging the reader to reflect on their own knowledge and beliefs, it explores some of the complex social and cultural dimensions that influence professional learning and practice. Becoming a Teacher's approach chimes with the commonly accepted recognition that all those involved in the education of young people should take a research-informed approach towards classroom practice. The substantial rethinking that has informed this sixth edition means that Becoming a Teacher continues to provide invaluable support, guidance and insight for all those training to be secondary teachers, and a rich resource for students undertaking undergraduate or postgraduate education studies programmes.