The design of asynchronous circuits is increasingly important in solving problems such as complexity management, modularity, power consumption and clock distribution in large digital integrated circuits. Since the second half of the 1980s, asynchronous circuits have been the subject of a great deal of research, following a period of relative oblivion. The lack of interest in asynchronous techniques was motivated by the progressive shift towards synchronous design techniques, which had much more structure and were much easier to verify and synthesize. System design requirements made it impossible to eliminate the use of asynchronous circuits entirely. Given the objective difficulty encountered by designers, the asynchronous components of electronic systems, such as interfaces, became a serious bottleneck in the design process. The use of new models and some theoretical breakthroughs made it possible to develop asynchronous design techniques that were reliable and effective. Algorithms for Synthesis and Testing of Asynchronous Circuits describes a variety of mathematical models and algorithms that form the backbone and the body of a new design methodology for asynchronous design. The book is intended for asynchronous hardware designers, for computer-aided tool experts, and for digital designers interested in exploring the possibility of designing asynchronous circuits. It requires a solid mathematical background in discrete event systems and algorithms. While the book has not been written as a textbook, it could nevertheless be used as a reference book in an advanced course in logic synthesis or asynchronous design. Algorithms for Synthesis and Testing of Asynchronous Circuits also includes an extensive literature review, which summarizes and compares classical papers from the 1960s with the most recent developments in the areas of asynchronous circuit design, testing and verification.
The validity and utility of employment tests have become entangled in the debate over the 1991 Civil Rights Bill. Worried about compliance with new federal guidelines for test validity, and concerned about possible lawsuits, the business world became wary of pre-employment testing in the early 1980s, but the use of employment testing increased throughout that decade.
Wave Pipelining: Theory and CMOS Implementation provides a coherent presentation of the theory of wave-pipelined operation of digital circuits and discusses practical design techniques for the realization of wave-pipelined circuits in CMOS technology. Wave pipelining is a timing methodology used in digital systems to enhance performance while conserving the number of data registers used. This is achieved by applying new data to the inputs of a combinational logic block before the previous outputs are available. In contrast to conventional pipelining, system performance is limited by the difference between maximum and minimum circuit delays rather than by the maximum circuit delay alone. Realization of practical systems using this technique requires accurate system-level and circuit-level timing analysis. At the system level, timing constraints identifying valid regions of operation for correct clocking of wave-pipelined circuits are presented. Both single-stage and multiple-stage systems, including feedback, are considered. At the circuit level, since performance is determined by the maximum circuit delay difference, highly accurate estimates of both maximum and minimum delays are needed. Thus, timing analysis based on traditional gate delay models is not sufficient. For CMOS circuits, data-dependent delay models considering the effect of simultaneous multiple input switchings must be used. An algorithm using these delay models for accurate analysis of small to medium sized circuits is implemented in a prototype timing analyzer, XTV. Results are given for a set of benchmark circuits.
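The central timing idea can be illustrated with a back-of-the-envelope model: a conventional pipeline stage must clock no faster than its maximum delay, whereas an idealized single-stage wave pipeline is limited by the delay spread plus register overheads, with several data "waves" in flight through the logic at once. The sketch below is a simplified textbook-style model, not the constraint system or the XTV analyzer from the book; the function name, parameters and numbers are assumptions made for illustration.

```python
import math

def wave_pipeline_timing(d_max, d_min, t_setup=0.1, t_hold=0.1, t_skew=0.0):
    """Simplified single-stage wave-pipelining model (illustrative only).

    Conventional pipelining needs roughly T_clk >= d_max + t_setup,
    while an idealized wave-pipelined stage is limited by the delay
    *spread*:  T_clk >= (d_max - d_min) + t_setup + t_hold + 2*t_skew.
    The number of waves coexisting in the logic is about d_max / T_clk.
    """
    t_conventional = d_max + t_setup
    t_wave = (d_max - d_min) + t_setup + t_hold + 2 * t_skew
    waves_in_flight = math.ceil(d_max / t_wave)
    return t_conventional, t_wave, waves_in_flight

# Hypothetical delays: a block with d_max = 10.0 and d_min = 8.5 time units.
conv, wave, n = wave_pipeline_timing(d_max=10.0, d_min=8.5, t_setup=0.2, t_hold=0.2)
print(conv, round(wave, 2), n)
```

Note how the achievable clock period shrinks with the delay spread rather than the total delay, which is exactly why the book insists on accurate minimum-delay (not just maximum-delay) estimation.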
The changing manufacturing environment requires more responsive and adaptable manufacturing systems. The theme of the 4th International Conference on Changeable, Agile, Reconfigurable and Virtual production (CARV2011) is "Enabling Manufacturing Competitiveness and Economic Sustainability". Leading-edge research and best implementation practices and experiences, which address these important issues and challenges, are presented. The proceedings include advances in manufacturing systems design, planning, evaluation, control and evolving paradigms such as mass customization, personalization, changeability, reconfigurability and flexibility. New and important concepts such as dynamic product families and platforms, co-evolution of products and systems, and methods for enhancing manufacturing systems' economic sustainability and prolonging their life to produce more than one product generation are treated. Enablers of change in manufacturing systems, production volume and capability scalability, and managing the volatility of markets, competition among global enterprises and the increasing complexity of products, manufacturing systems and management strategies are discussed. Industry challenges and future directions for research and development needed to help both practitioners and academicians are presented.
Document image analysis is the automatic computer interpretation of images of printed and handwritten documents, including text, drawings, maps, music scores, etc. Research in this field supports a rapidly growing international industry. This is the first book to offer a broad selection of state-of-the-art research papers, including authoritative critical surveys of the literature, and parallel studies of the architecture of complete high-performance printed-document reading systems. A unique feature is the extended section on music notation, an ideal vehicle for international sharing of basic research. Also, the collection includes important new work on line drawings, handwriting, character and symbol recognition, and basic methodological issues. The IAPR 1990 Workshop on Syntactic and Structural Pattern Recognition is summarized, including the reports of its expert working groups, whose debates provide a fascinating perspective on the field. The book is an excellent text for a first-year graduate seminar in document image analysis, and is likely to remain a standard reference in the field for years.
Manufacturing contributes over 60% of the gross national product of the highly industrialized nations of Europe. The advances in mechanization and automation in manufacturing achieved by international competitors are seriously challenging the market position of the European countries in different areas. Thus it becomes necessary to increase significantly the productivity of European industry. This has prompted many governments to support the development of new automation resources. Good engineers are also needed to develop the required automation tools and to apply these to manufacturing. It is the purpose of this book to discuss new research results in manufacturing with engineers who face the challenge of building tomorrow's factories. Early automation efforts were centered around mechanical gear-and-cam technology and hardwired electrical control circuits. Because of the decreasing life cycle of most new products and the enormous model diversification, factories cannot be automated efficiently any more by these conventional technologies. With the digital computer, its fast calculation speed and large memory capacity, a new tool was created which can substantially improve the productivity of manufacturing processes. The computer can directly control production and quality assurance functions and adapt itself quickly to changing customer orders and new products.
People use the word strategy in a variety of different contexts. The term has connotations ranging from statesmanship to economic planning, and has become pervasive in the social sciences. We also talk about "problem solving strategies" and "corporate strategy" in a large business enterprise. The concept of strategy applies whenever a sequence of goal-oriented actions is based on large-scale and long-range planning. This monograph gives a systematic overview of the theory of strategies, a new area of enquiry developed over the past two decades by the author and his team. The projects described have clearly defined research objectives and are based on realistic assumptions about the environments in which the programming systems will work, and about the constraints and requirements they have to satisfy. Applications of the systems range over various aspects of air traffic control, automatic verification and validation of discrete-event simulation models, econometric model building, distributed planning systems for manufacturing, control of traffic lights, and others. The book is aimed at researchers, teachers and students in computer science, management science and certain areas of engineering. The reader should have some maturity in computer science and mathematics, and familiarity with the basic concepts of artificial intelligence.
Interest in product data exchange and interfaces in the CAD/CAM area is steadily growing. The rapidly increasing use of graphics applications in engineering and science has led to a great variety of heterogeneous hardware and software products. This has become a major obstacle in the progress of systems integration. To improve this situation, CAD/CAM users have called for the specification and implementation of standardized product data interfaces. These needs resulted in the definition of preliminary standards in this area. Since 1975, activities have been concentrated on developing standards for three major areas: computer graphics, sculptured surfaces, and data exchange for engineering drawings. The Graphical Kernel System (GKS) was accepted as an international standard for graphics programming in 1984, Y14.26M (IGES) was adopted as an American Standard in 1981, and the VDA Surface Interface (VDAFS) has been accepted by the German National Standardization Institute (DIN NAM 96.4). Although considerable progress has been achieved, the complexity of the subject and the dynamics of CAD/CAM development still call for more generality and compatibility of the interfaces. This has resulted in an international discussion on further improvements of the standards. The major goal of this book is to bring together the different views and experiences of industry and university in the area of Product Data Interfaces, thereby contributing to the ongoing work in improving the state of the art.
Computer-aided design systems have become a big business. Advances in technology have made it commercially feasible to place a powerful engineering workstation on every designer's desk. A major selling point for these workstations is the computer-aided design software they provide, rather than the actual hardware. The trade magazines are full of advertisements promising full-menu design systems, complete with an integrated database (preferably "relational"). What does it all mean? This book focuses on the critical issues of managing the information about a large design project. While undeniably one of the most important areas of CAD, it is also one of the least understood. Merely gluing a database system to a set of existing tools is not a solution. Several additional system components must be built to create a true design management system. These are described in this book. The book has been written from the viewpoint of how and when to apply database technology to the problems encountered by builders of computer-aided design systems. Design systems provide an excellent environment for discovering how far we can generalize existing database concepts for non-commercial applications. This has emerged as a major new challenge for database system research. We have attempted to avoid a "database egocentric" view by pointing out where existing database technology is inappropriate for design systems, at least given the current state of the database art.
The term "Office Automation" implies much and means little. The word "Office" is usually reserved for units in an organization that have a rather general function. They are supposed to support different activities, but it is notoriously difficult to determine what an office is supposed to do. Automation in this loose context may mean many different things. At one extreme, it is nothing more than giving people better tools than typewriters and telephones with which to do their work more efficiently and effectively. At the opposite extreme, it implies the replacement of people by machines which perform office procedures automatically. In this book we will take the approach that "Office Automation" is much more than just better tools, but falls significantly short of replacing every person in an office. It may reduce the need for clerks, it may take over some secretarial functions, and it may lessen the dependence of principals on support personnel. Office Automation will change the office environment. It will eliminate the more mundane and well understood functions and will highlight the decision-oriented activities in an office. The goal of this book is to provide some understanding of office . activities and to evaluate the potential of Office Information Systems for office procedure automation. To achieve this goal, we need to explore concepts, elaborate on techniques, and outline tools.
Current practice dictates the separation of the hardware and software development paths early in the design cycle. These paths remain independent, with very little interaction occurring between them until system integration. In particular, hardware is often specified without fully appreciating the computational requirements of the software. Also, software development does not influence hardware development and does not track changes made during the hardware design phase. Thus, the ability to explore hardware/software tradeoffs, such as moving functionality from the software domain to the hardware domain (and vice versa) or modifying the hardware/software interface, is restricted. As a result, problems that are encountered during system integration may require modification of the software and/or hardware, resulting in potentially significant cost increases and schedule overruns. To address these problems, a cooperative design approach, one that utilizes a unified view of hardware and software, is described. This approach is called hardware/software codesign. The Codesign of Embedded Systems develops several fundamental hardware/software codesign concepts and a methodology that supports them. A unified representation, referred to as a decomposition graph, is presented which can be used to describe hardware or software using either functional abstractions or data abstractions. Using a unified representation based on functional abstractions, an abstract hardware/software model has been implemented in a common simulation environment called ADEPT (Advanced Design Environment Prototyping Tool). This model permits early hardware/software evaluation and tradeoff exploration. Techniques have been developed which support the identification of software bottlenecks and the evaluation of design alternatives with respect to multiple metrics. The application of the model is demonstrated on several examples.
A unified representation based on data abstractions is also explored. This work leads to investigations regarding the application of object-oriented techniques to hardware design. The Codesign of Embedded Systems: A Unified Hardware/Software Representation describes a novel approach to a topic of immense importance to CAD researchers and designers alike.
Motion and Structure from Image Sequences is invaluable reading for researchers, graduate students, and practicing engineers dealing with computer vision. It presents a balanced treatment of the theoretical and practical issues, including very recent results, some of which are published here for the first time. The topics covered in detail are: image matching and optical flow computation; structure from stereo; structure from motion; motion estimation; integration of multiple views; and motion modeling and prediction. Aspects such as uniqueness of the solution, degeneracy conditions, error analysis, stability, optimality, and robustness are also investigated. These details, together with the fact that the algorithms are accessible without necessarily studying the rest of the material, make this book particularly attractive to practitioners.
New and promising technologies of genetic search and evolution simulation form the kernel of this book. The authors wanted to show how these technologies are used to solve practical problems. This monograph is addressed to specialists in CAD and in intelligent information technologies in science, biology, economics, sociology and other fields. It may be used by post-graduate students and students of specialties connected to systems theory and system analysis methods, information science, optimization methods, operations research and decision-making.
This volume contains the proceedings of a workshop on Analog Integrated Neural Systems held May 8, 1989, in connection with the International Symposium on Circuits and Systems. The presentations were chosen to encompass the entire range of topics currently under study in this exciting new discipline. Stringent acceptance requirements were placed on contributions: (1) each description was required to include detailed characterization of a working chip, and (2) each design was not to have been published previously. In several cases, the status of the project was not known until a few weeks before the meeting date. As a result, some of the most recent innovative work in the field was presented. Because this discipline is evolving rapidly, each project is very much a work in progress. Authors were asked to devote considerable attention to the shortcomings of their designs, as well as to the notable successes they achieved. In this way, other workers can now avoid stumbling into the same traps, and evolution can proceed more rapidly (and less painfully). The chapters in this volume are presented in the same order as the corresponding presentations at the workshop. The first two chapters are concerned with finding solutions to complex optimization problems under a predefined set of constraints. The first chapter reports what is, to the best of our knowledge, the first neural-chip design. In each case, the physics of the underlying electronic medium is used to represent a cost function in a natural way, using only nearest-neighbor connectivity.
The summer school on VLSI CAD Tools and Applications was held from July 21 through August 1, 1986 at Beatenberg in the beautiful Bernese Oberland in Switzerland. The meeting was given under the auspices of IFIP WG 10.6 VLSI, and it was sponsored by the Swiss Federal Institute of Technology Zurich, Switzerland. Eighty-one professionals were invited to participate in the summer school, including 18 lecturers. The 81 participants came from the following countries: Australia (1), Denmark (1), Federal Republic of Germany (12), France (3), Italy (4), Norway (1), South Korea (1), Sweden (5), United Kingdom (1), United States of America (13), and Switzerland (39). Our goal in the planning for the summer school was to introduce the audience to the realities of CAD tools and their applications to VLSI design. This book contains articles by all 18 invited speakers that lectured at the summer school. The reader should realize that it was not intended to publish a textbook. However, the chapters in this book are more or less self-contained treatments of the particular subjects. Chapters 1 and 2 give a broad introduction to VLSI Design. Simulation tools and their algorithmic foundations are treated in Chapters 3 to 5 and 17. Chapters 6 to 9 provide an excellent treatment of modern layout tools. The use of CAD tools and trends in the design of 32-bit microprocessors are the topics of Chapters 10 through 16. Important aspects of VLSI testing and testing strategies are given in Chapters 18 and 19.
Mixed-Signal Layout Generation Concepts covers important physical-design issues that exist in contemporary analog and mixed-signal design flows. Due to the increasing pressure on time-to-market, the steep increase in chip fabrication costs, and the increasing design complexity, it becomes even more challenging to produce a first-time-right IC layout. The fundamental issues in creating a layout are placement and routing. Although these coupled problems have been investigated for many decades, no satisfactory automated solution has emerged yet. Fortunately, supported by modern computing power and the results of new research that further improve computation efficiency, significant steps forward have been taken.
The computer interpretation of line drawings is a classic problem in artificial intelligence (AI) which has inspired the development of some fundamental AI tools, including constraint propagation, probabilistic relaxation, the characterization of tractable constraint classes and, most recently, the propagation of soft constraints in finite-domain optimization problems. Line drawing interpretation has many distinct applications on the borderline of computer vision and computer graphics, including sketch interpretation, the input of 3D object models and the creation of 2½D illustrations in electronic documents. I hope I have made this fascinating topic accessible not only to computer scientists but also to mathematicians, psychologists and cognitive scientists and, indeed, to anyone who is intrigued by optical illusions and impossible or ambiguous figures. This book could not have been written without the support of the CNRS, the French Centre National de la Recherche Scientifique, who financed my one-year break from teaching at the University of Toulouse III. The UK Engineering and Physical Sciences Research Council also financed several extended visits to the Oxford University Computing Laboratory. Section 9.1 is just a brief summary of the results on tractable constraints that have come out of this very productive joint research programme with David Cohen, Peter Jeavons and Andrei Krokhin. The various soft arc consistency techniques described in Chapter 8 were developed in collaboration with Thomas Schiex and Simon de Givry at INRA, Toulouse. I am also grateful to Ralph Martin and Peter Varley for their comments on the line-labelling constraints presented in Chapter 3.
"Developments in Computer-Integrated Manufacturing" arose from the joint work of members of the IFIP-Working Group 5.3 - Discrete Manufacturing, and other IFIP members. Within the Technical Committee 5 of the International Federation of Information Processing (lFIP) the aim of this Working Group is the advancement of computers and their application to the field of discrete part manufacturing. Capabilities will be expanded in the general areas of planning, selection, and con trol of manufacturing equipment and systems. Tools for problem solution include: mathematics, geometry, algorithms, computer techniques, and manufacturing technology. This technology will influence many industries - machine tool, auto mation, aircraft, appliance, and electronics, to name but a few. The Working Group undertook the following specific tasks: 1. To maintain liaison with other national and international organizations work ing in the same field, cooperating with them whenever desirable to further the common goal 2. To be responsible for the IFIP's work in organizing and presenting the PRO LAMA T Conferences 3. To conduct other working conferences and symposia as deemed appropriate in furthering its mission 4. To develop and sponsor research and industrial and social studies into the various aspects of its mission. The book can be regarded as an attempt to underline the main aspects of techno logy from the point of view of its software and hardware realization. Because of limitations in size and the availability of literature, the problems of robotics and quality control are not described in detail.
The purpose of this book is to discuss the state of the art and future trends in the field of computerized production management systems. It is composed of a number of independent papers, each presented in a chapter. Some of the widely recognized experts in the field around the world have been asked to contribute. I owe each of them my sincere gratitude for their kind cooperation. I am also grateful to Peter Falster and Jim Browne for their kind support in helping me to review topics to be covered and to select the authors. This book is a result of the professional work done in the International Federation for Information Processing Technical Committee IFIP TC5 "Computer Applications in Technology" and especially in the Working Group WG5.7 "Computer-Aided Production Management." This group was established in 1978 with the aim of promoting and encouraging the advancement of the field of computer systems for the production management of manufacturing, offshore, construction, electronic and similar and related industries. The scope of the work includes, but is not limited to, the following topics: 1) design and implementation of new production planning and control systems taking into account new technology and management philosophy; 2) CAPM in a CIM environment including interfaces to CAD and CAM; 3) project management and cost engineering; 4) knowledge engineering in CAPM; 5) CAPM for Flexible Manufacturing Systems (FMS) and Flexible Assembly Systems (FAS); 6) methods and concepts in CAPM; 7) economic and social implications of CAPM.
This volume is a record of the Workshop on Window Management held at the Rutherford Appleton Laboratory's Cosener's House between 29 April and 1 May 1985. The main impetus for the Workshop came from the Alvey Programme's Man-Machine Interface Director, who was concerned at the lack of a formal definition of window management and the lack of focus for research activities in this area. Window management per se is not the complete problem in understanding interaction. However, the appearance of bitmap displays from a variety of vendors, enabling an operator to work simultaneously with a number of applications on a single display, has focussed attention on what the overall architecture for such a system should be and also on what the interfaces to both the application and operator should be. The format of the Workshop was to spend the first day with presentations from a number of invited speakers. The aim was to make the participants aware of the current state of the art and to highlight the main outstanding issues. The second day consisted of the Workshop participants splitting into three groups and discussing specific issues in depth. Plenary sessions helped to keep the individual groups working on similar lines. The third day concentrated on the individual groups presenting their results and interacting with the other groups to identify main areas of consensus and also a framework for future work.
"During the last two decades, research on structural optimization became increasingly concerned with two aspects: the application of general numeri- cal methods of optimization to structural design of complex real structures, and the analytical derivation of necessary and sufficient conditions for the optimality of broad classes of comparatively simple and more or less ideal- ized structures. Both kinds of research are important: the first for obvious reasons; the second, because it furnishes information that is useful in testing the validity, accuracy and convergence of numerical methods and in assess- ing the efficiency of practical designs. " (Prager and Rozvany, 1977a) The unexpected death of William Prager in March 1980 marked, in a sense, the end of an era in structural mechanics, but his legacy of ideas will re- main a source of inspiration for generations of researchers to come. Since his nominal retirement in the early seventies, Professor and Mrs. Prager lived in Savognin, an isolated alpine village and ski resort surrounded by some of Switzerland's highest mountains. It was there that the author's close as- sociation with Prager developed through annual pilgrimages from Australia and lengthy discussions which pivoted on Prager's favourite topic of struc- tural optimization. These exchanges took place in the picturesque setting of Graubunden, on the terrace of an alpine restaurant overlooking snow-capped peaks, on ski-lifts or mountain walks, or during evening meals in the cosy hotels of Savognin, Parsonz and Riom.
This book describes a new type of computer-aided VLSI design tool, called VLSI System Planning, that is meant to aid designers during the early, or conceptual, stage of design. During this stage, the objective is to define a general design plan, or approach, that is likely to result in an efficient implementation satisfying the initial specifications, or to determine that the initial specifications are not realizable. A design plan is a collection of high-level design decisions. As an example, the conceptual design of digital filters involves choosing the type of algorithm to implement (e.g., finite impulse response or infinite impulse response), the type of polynomial approximation (e.g., equiripple or Chebyshev), the fabrication technology (e.g., CMOS or BiCMOS), and so on. Once a particular design plan is chosen, the detailed design phase can begin. It is during this phase that various synthesis, simulation, layout, and test activities occur to refine the conceptual design, gradually filling in more detail until the design is finally realized. The principal advantage of VLSI System Planning is that the increasingly expensive resources of the detailed design process are more efficiently managed. Costly redesigns are minimized because the detailed design process is guided by a more credible, consistent, and correct design plan.
I am indebted to my thesis advisor, Michael Genesereth, for his guidance, inspiration, and support which has made this research possible. As a teacher and a sounding board for new ideas, Mike was extremely helpful in pointing out flaws, and suggesting new directions to explore. I would also like to thank Harold Brown for introducing me to the application of artificial intelligence to reasoning about designs, and for his many valuable comments as a reader of this thesis. Significant contributions by the other members of my reading committee, Mark Horowitz and Allen Peterson, have greatly improved the content and organization of this thesis by forcing me to communicate my ideas more clearly. I am extremely grateful to the other members of the Logic Group at the Heuristic Programming Project for being a sounding board for my ideas, and providing useful comments. In particular, I would like to thank Matt Ginsberg, Vineet Singh, Devika Subramanian, Richard Trietel, Dave Smith, Jock Mackinlay, and Glenn Kramer for their pointed criticisms. This research was supported by Schlumberger Palo Alto Research (previously Fairchild Laboratory for Artificial Intelligence). I am grateful to Peter Hart, the former head of the AI lab, and his successor Marty Tenenbaum for providing an excellent environment for performing this research.
This monograph represents a summary of our work over the last two years in applying the method of simulated annealing to the solution of problems that arise in the physical design of VLSI circuits. Our study is experimental in nature, in that we are concerned with issues such as solution representations, neighborhood structures, cost functions, approximation schemes, and so on, in order to obtain good design results in a reasonable amount of computation time. We hope that our experiences with the techniques we employed, some of which indeed bear certain similarities across different problems, could be useful as hints and guides for other researchers in applying the method to the solution of other problems. Work reported in this monograph was partially supported by the National Science Foundation under grant MIP 87-03273, by the Semiconductor Research Corporation under contract 87-DP-109, by a grant from the General Electric Company, and by a grant from the Sandia Laboratories.
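The experimental knobs the monograph studies (solution representation, neighborhood moves, cost function, cooling schedule) can be made concrete with a deliberately tiny one-dimensional placement example. This is an illustrative sketch only, not the monograph's tool or benchmarks; the net list, parameter values and function name are invented for the example.

```python
import math
import random

def anneal_placement(nets, n_cells, t0=5.0, alpha=0.95, moves_per_t=200,
                     t_min=0.01, seed=1):
    """Toy simulated-annealing placer (illustrative sketch).

    Representation: pos[cell] = slot on a line.
    Cost: total span (1-D half-perimeter wirelength) of each net.
    Neighborhood: swap two cells.
    Acceptance: worse moves accepted with probability exp(-delta/T),
    under a geometric cooling schedule T <- alpha * T.
    """
    rng = random.Random(seed)
    pos = list(range(n_cells))
    def cost(p):
        return sum(max(p[c] for c in net) - min(p[c] for c in net) for net in nets)
    cur = best = cost(pos)
    best_pos = pos[:]
    t = t0
    while t > t_min:
        for _ in range(moves_per_t):
            a, b = rng.sample(range(n_cells), 2)
            pos[a], pos[b] = pos[b], pos[a]          # propose a swap
            delta = cost(pos) - cur
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                cur += delta                          # accept the move
                if cur < best:
                    best, best_pos = cur, pos[:]
            else:
                pos[a], pos[b] = pos[b], pos[a]       # undo the rejected move
        t *= alpha                                    # cool down
    return best_pos, best

# Hypothetical two-pin nets over 8 cells.
nets = [(0, 5), (1, 4), (2, 3), (0, 7), (6, 7)]
placement, wirelength = anneal_placement(nets, n_cells=8)
print(wirelength)
```

Every design choice above (swap moves, wirelength cost, geometric cooling) is one of the issues the monograph explores experimentally; real placers use far richer cost functions and tuned schedules.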
This book is a result of the lectures and discussions during the conference "Theory and Practice of Geometric Modeling." The event was organized by the Wilhelm-Schickard-Institut für Informatik, Universität Tübingen, and took place at the Heinrich-Fabri-Institut in Blaubeuren from October 3 to 7, 1988. The conference brought together leading experts from academic and industrial research institutions, CAD system developers, and experienced users to exchange their ideas and to discuss new concepts and future directions in geometric modeling. The main intention was to bridge the gap between theoretical results, the performance of existing CAD systems, and the real problems of users. The contents are structured in five parts: A Algorithmic Aspects; B Surface Intersection, Blending, Ray Tracing; C Geometric Tools; D Different Representation Schemes in Solid Modeling; E Product Modeling in High Level Specifications. The material presented in this book reflects the current state of the art in geometric modeling and should therefore be of interest not only to university and industry researchers, but also to system developers and practitioners who wish to keep up to date on recent advances and new concepts in this rapidly expanding field. The editors express their sincere appreciation to the contributing authors, and to the members of the program committee, W. Boehm, J. Hoschek, A. Massabo, H. Nowacki, M. Pratt, J. Rossignac, T. Sederberg and W. Tiller, for their close cooperation and their time and effort that made the conference and this book a success.
This volume contains a collection of papers presented at the NATO Advanced Study Institute on "Testing and Diagnosis of VLSI and ULSI," held at Villa Olmo, Como (Italy), June 22 to July 3, 1987. High-density technologies such as Very-Large Scale Integration (VLSI), Wafer Scale Integration (WSI) and the not-so-far promises of Ultra-Large Scale Integration (ULSI) have exacerbated the problems associated with the testing and diagnosis of these devices and systems. Traditional techniques are fast becoming obsolete due to unique requirements such as limited controllability and observability, increasing execution complexity for test vector generation and the high cost of fault simulation, to mention just a few. New approaches are imperative to achieve the highly sought goal of a "three months" turnaround cycle time for a state-of-the-art computer chip. Testing and diagnostic processes are of primary importance if costs are to be kept at acceptable levels. The objective of this NATO-ASI was to present, analyze and discuss the various facets of testing and diagnosis with respect to both theory and practice. The contents of this volume reflect the diversity of approaches currently available to reduce test and diagnosis time. These approaches are described in a concise, yet clear way by renowned experts of the field. Their contributions are aimed at a wide readership: the uninitiated researcher will find the tutorial chapters very rewarding. The expert will be introduced to advanced techniques in a very comprehensive manner.