Computer graphics, computer-aided design, and computer-aided manufacturing are tools that have become indispensable to a wide array of activities in contemporary society. Euclidean processing provides the basis for these computer-aided design systems although it contains elements that inevitably lead to an inaccurate, non-robust, and complex system. The primary cause of the deficiencies of Euclidean processing is the division operation, which becomes necessary if an n-space problem is to be processed in n-space. The difficulties that accompany the division operation may be avoided if processing is conducted entirely in (n+1)-space. The paradigm attained through the logical extension of this approach, totally four-dimensional processing, is the subject of this book. This book offers a new system of geometric processing techniques that attain accurate, robust, and compact computations, and allow the construction of a systematically structured CAD system.
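To make the (n+1)-space idea concrete, here is a minimal sketch (a generic illustration of homogeneous-coordinate processing, not code from the book): two 2-D lines are intersected entirely with cross products in 3-space, so no division occurs unless a Euclidean coordinate is explicitly extracted at the end.

```python
# Minimal sketch (not from the book): representing 2-D points and lines as
# homogeneous 3-vectors lets intersections be computed with cross products only,
# postponing division until a Euclidean coordinate is explicitly requested.

def cross(a, b):
    """Cross product of two homogeneous 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def line_through(p, q):
    """Homogeneous line through two homogeneous points (division-free)."""
    return cross(p, q)

def intersection(l1, l2):
    """Homogeneous intersection point of two homogeneous lines (division-free)."""
    return cross(l1, l2)

# Points (1, 1) and (3, 3) define one line; (1, 3) and (3, 1) define another.
p = intersection(line_through((1, 1, 1), (3, 3, 1)),
                 line_through((1, 3, 1), (3, 1, 1)))
print(p)                            # exact homogeneous result: a multiple of (2, 2, 1)
# Division appears only if a Euclidean answer is required:
print(p[0] / p[2], p[1] / p[2])     # 2.0 2.0
```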
Complex Automated Negotiations have been widely studied and are becoming an important, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since many factors characterize such negotiations. These factors include the number of issues, dependency between issues, representation of utility, negotiation protocol, negotiation form (bilateral or multi-party), time constraints, etc. Software agents can support automation or simulation of such complex negotiations on behalf of their owners, and can provide them with adequate bargaining strategies. In many multi-issue bargaining settings, negotiation becomes more than a zero-sum game, so bargaining agents have an incentive to cooperate in order to achieve efficient win-win agreements. Also, in a complex negotiation, there could be multiple issues that are interdependent. Thus, an agent's utility becomes more complex than a simple utility function. Further, negotiation forms and protocols may differ between bilateral and multi-party situations. To realize such complex automated negotiation, we have to incorporate advanced Artificial Intelligence technologies, including search, constraint satisfaction (CSP), graphical utility models, Bayes nets, auctions, utility graphs, and prediction and learning methods. Applications could include e-commerce tools, decision-making support tools, negotiation support tools, collaboration tools, etc. In this book, we solicit papers on all aspects of such complex automated negotiations in the field of Autonomous Agents and Multi-Agent Systems. In addition, this book includes papers on ANAC 2010 (the Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies, implemented by different developers, negotiate automatically in several negotiation domains. ANAC is one of the real testbeds in which strategies for automated negotiating agents are evaluated in a tournament style.
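As a rough illustration of why interdependent issues complicate an agent's utility (a hypothetical example, not taken from the book or from ANAC), compare a linear-additive utility with one that adds a synergy term between two issues:

```python
# Hypothetical sketch: a multi-issue offer scored by two utility functions. The
# linear-additive form is the "simple" case; the interdependency term shows why
# utilities in complex negotiations are no longer decomposable issue by issue.

def additive_utility(offer, weights):
    """Simple case: each issue contributes independently."""
    return sum(weights[i] * v for i, v in offer.items())

def interdependent_utility(offer, weights, synergy):
    """Issues interact: selected pairs of issues add value together."""
    u = additive_utility(offer, weights)
    for (i, j), bonus in synergy.items():
        u += bonus * offer[i] * offer[j]
    return u

offer = {"price": 0.6, "delivery": 0.8, "warranty": 0.3}   # issue values in [0, 1]
weights = {"price": 0.5, "delivery": 0.3, "warranty": 0.2}
synergy = {("delivery", "warranty"): 0.25}   # fast delivery is worth more with a warranty

print(additive_utility(offer, weights))                    # ~0.60
print(interdependent_utility(offer, weights, synergy))     # ~0.66
```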
Rapid increases in chip complexity, increasingly faster clocks, and the proliferation of portable devices have combined to make power dissipation an important design parameter. The power consumption of a digital system determines its heat dissipation as well as battery life. For some systems, power has become the most critical design constraint. Computer-Aided Design Techniques for Low Power Sequential Logic Circuits presents a methodology for low power design. The authors first present a survey of techniques for estimating the average power dissipation of a logic circuit. At the logic level, power dissipation is directly related to average switching activity. A symbolic simulation method that accurately computes the average switching activity in logic circuits is then described. This method is extended to handle sequential logic circuits by modeling correlation in time and by calculating the probabilities of present-state lines. Computer-Aided Design Techniques for Low Power Sequential Logic Circuits then presents a survey of methods to optimize logic circuits for low power dissipation that target reduced switching activity. Also described is a retiming method for sequential logic circuits in which registers are repositioned so that the overall glitching in the circuit is minimized. The authors then detail a powerful optimization method that is based on selectively precomputing the output logic values of a circuit one clock cycle before they are required, and using the precomputed values to reduce internal switching activity in the succeeding clock cycle. Presented next is a survey of methods that reduce switching activity in circuits described at the register-transfer and behavioral levels. Also described is a scheduling algorithm that reduces power dissipation by maximizing the inactivity period of the modules in a given circuit. Computer-Aided Design Techniques for Low Power Sequential Logic Circuits concludes with a summary and directions for future research.
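The link between switching activity and power can be sketched with the standard logic-level relations (textbook formulas, not the authors' symbolic simulation method): under a temporal-independence assumption, a node with signal probability p toggles with probability 2p(1-p) per cycle, and the toggle rates weight each node's capacitive contribution to dynamic power.

```python
# Standard textbook relations (not the book's algorithm): average dynamic power is
# tied to per-node switching activity. Node capacitances and probabilities below
# are made-up illustrative values.

def switching_activity(p_one):
    """Toggle probability per clock cycle for signal probability p_one (independence assumed)."""
    return 2.0 * p_one * (1.0 - p_one)

def dynamic_power(nodes, vdd=1.2, f_clk=500e6):
    """Sum 0.5 * C * Vdd^2 * f * alpha over all nodes; nodes = [(capacitance_F, p_one), ...]."""
    return sum(0.5 * c * vdd**2 * f_clk * switching_activity(p) for c, p in nodes)

# Three hypothetical nodes: load capacitance in farads and signal probability.
nodes = [(10e-15, 0.5), (25e-15, 0.1), (15e-15, 0.9)]
print(f"{dynamic_power(nodes):.3e} W")
```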
E. Kontizas, Astronomical Institute, National Observatory of Athens, P.O. Box 20048, Athens GR-11810, Greece. The international conference on "Wide-Field Spectroscopy" and its subject matter were agreed upon during the general assembly of the International Astronomical Union (IAU) in August 1994 by the Working Group of Commission 9, "Wide-Field Imaging". This meeting gave an opportunity to world experts on this subject to gather in Athens, in order to discuss the current exploitation and the impending opportunities that exist in the area of multi-object spectroscopy, with particular emphasis on: 1. Astronomical instruments, data acquisition, processing and analysis techniques. 2. Astrophysical problems best tackled through wide-field, multi-object spectroscopy. The new fibre optic technology offers an important tool for the advancement of basic research and the development of industrial applications. Astronomical spectroscopy is a field of astronomy which has contributed much to the advancement of fundamental physics. In the late 19th century, the spectra of hot stars were used to determine the well-known Balmer formula for the wavelengths of the hydrogen lines. Since then, spectroscopy has made enormous progress in stellar atmosphere studies, in kinematics, and in the detection of high redshifts in the Universe. The traditional techniques of obtaining wide-field spectroscopic data are based on slitless spectroscopy (objective prism). Several observatories, worldwide, make use of these techniques in order to obtain information on the spectral properties of objects in large areas of the sky.
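For reference, the Balmer formula mentioned above has the standard form (a well-known result, not specific to this volume):

\[
\frac{1}{\lambda} = R_\mathrm{H}\left(\frac{1}{2^{2}} - \frac{1}{n^{2}}\right), \qquad n = 3, 4, 5, \ldots
\]

where $\lambda$ is the wavelength of the emitted hydrogen line and $R_\mathrm{H} \approx 1.097 \times 10^{7}\,\mathrm{m^{-1}}$ is the Rydberg constant for hydrogen.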
With the ever-increasing speed of integrated circuits, violations of the performance specifications are becoming a major factor affecting the product quality level. The need for testing timing defects is further expected to grow with the current design trend of moving towards deep submicron devices. After a long period of prevailing belief that high stuck-at fault coverage is sufficient to guarantee high quality of shipped products, the industry is now forced to rethink other types of testing. Delay testing has been a topic of extensive research both in industry and in academia for more than a decade. As a result, several delay fault models and numerous testing methodologies have been proposed. Delay Fault Testing for VLSI Circuits presents a selection of existing delay testing research results. It combines introductory material with state-of-the-art techniques that address some of the current problems in delay testing. Delay Fault Testing for VLSI Circuits covers some basic topics such as fault modeling and test application schemes for detecting delay defects. It also presents summaries and conclusions of several recent case studies and experiments related to delay testing. A selection of delay testing issues and test techniques such as delay fault simulation, test generation, design for testability and synthesis for testability are also covered. Delay Fault Testing for VLSI Circuits is intended for use by CAD and test engineers, researchers, tool developers and graduate students. It requires a basic background in digital testing. The book can be used as supplementary material for a graduate-level course on VLSI testing.
This book is jointly authored by leading academic and industry researchers. The material is unique in that it spans IC interconnect topics ranging from IBM's revolutionary copper process to an in-depth exploration into interconnect-aware computer architectures.
High Speed CMOS Design Styles is written for the graduate-level student or practicing engineer who is primarily interested in circuit design. It is intended to provide practical reference, or `horse-sense', to mechanisms typically described with a more academic slant. This book is organized so that it can be used as a textbook or as a reference book. High Speed CMOS Design Styles provides a survey of design styles in use in industry, specifically in the high speed microprocessor design community. Logic circuit structures, I/O and interface, clocking, and timing schemes are reviewed and described. Characteristics, sensitivities and idiosyncrasies of each are highlighted. High Speed CMOS Design Styles also pulls together and explains contributors to performance variability that are associated with process, applications conditions and design. Rules of thumb and practical references are offered. Each of the general circuit families is then analyzed for its sensitivity and response to this variability. High Speed CMOS Design Styles is an excellent source of ideas and a compilation of observations that highlight how different approaches trade off critical parameters in design and process space.
3.2 Input Encoding Targeting Two-Level Logic
3.2.1 One-Hot Coding and Multiple-Valued Minimization
3.2.2 Input Constraints and Face Embedding
3.3 Satisfying Encoding Constraints
3.3.1 Definitions
3.3.2 Column-Based Constraint Satisfaction
3.3.3 Row-Based Constraint Satisfaction
3.3.4 Constraint Satisfaction Using Dichotomies
3.3.5 Simulated Annealing for Constraint Satisfaction
3.4 Input Encoding Targeting Multilevel Logic
3.4.1 Kernels and Kernel Intersections
3.4.2 Kernels and Multiple-Valued Variables
3.4.3 Multiple-Valued Factorization
3.4.4 Size Estimation in Algebraic Decomposition
3.4.5 The Encoding Step
3.5 Conclusion
4 Encoding of Symbolic Outputs
4.1 Heuristic Output Encoding Targeting Two-Level Logic
4.1.1 Dominance Relations
4.1.2 Output Encoding by the Derivation of Dominance Relations
4.1.3 Heuristics to Minimize the Number of Encoding Bits
4.1.4 Disjunctive Relationships
4.1.5 Summary
4.2 Exact Output Encoding Targeting Two-Level Logic
4.2.1 Generation of Generalized Prime Implicants
4.2.2 Selecting a Minimum Encodeable Cover
4.2.3 Dominance and Disjunctive Relationships to Satisfy Constraints
4.2.4 Constructing the Optimized Cover
4.2.5 Correctness of the Procedure
4.2.6 Multiple Symbolic Outputs
'Subdivision' is a way of representing smooth shapes in a computer. A curve or surface (both of which contain an infinite number of points) is described in terms of two objects. One object is a sequence of vertices, which we visualise as a polygon, for curves, or a network of vertices, which we visualise by drawing the edges or faces of the network, for surfaces. The other object is a set of rules for making denser sequences or networks. When the rules are applied repeatedly, the denser and denser sequences converge to a limit, which is the curve or surface that we want to represent. This book focusses on curves, both because the theory for curves is complete enough that a book claiming our understanding is complete is exactly what is needed to stimulate research proving that claim wrong, and because there are already a number of good books on subdivision surfaces. The way in which the limit curve relates to the polygon, and a lot of interesting properties of the limit curve, depend on the set of rules, and this book is about how one can deduce those properties from the set of rules, and how one can then use that understanding to construct rules which give the properties that one wants.
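A minimal sketch of one concrete rule set, Chaikin's corner-cutting scheme (a standard example used here for illustration, not one drawn specifically from this book): each refinement step replaces every edge of the control polygon with two new vertices at its 1/4 and 3/4 points, and the denser and denser polygons converge to a quadratic B-spline curve.

```python
# Chaikin's corner-cutting subdivision: a standard rule set, shown as an
# illustration of the "rules for making denser sequences" described above.

def chaikin_step(points, closed=False):
    """One subdivision step on a list of (x, y) control points."""
    refined = []
    last = len(points) if closed else len(points) - 1
    for i in range(last):
        (x0, y0), (x1, y1) = points[i], points[(i + 1) % len(points)]
        refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
        refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
    return refined

polygon = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
for _ in range(4):              # denser and denser polygons approaching the limit curve
    polygon = chaikin_step(polygon)
print(len(polygon), polygon[:3])
```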
Standardization of hardware description languages and the availability of synthesis tools have brought about a remarkable increase in the productivity of hardware designers. Yet design verification methods and tools lag behind and have difficulty in dealing with the increasing design complexity. This may get worse because more complex systems are now constructed by (re)using Intellectual Property blocks developed by third parties. To verify such designs, abstract models of the blocks and the system must be developed, in which separate concerns, such as interface communication, functionality, and timing, can be verified in an almost independent fashion. Standard Hardware Description Languages such as VHDL and Verilog are inspired by procedural `imperative' programming languages in which function and timing are inherently intertwined in the statements of the language. Furthermore, they are not conceived to state the intent of the design in a simple declarative way that contains provisions for design choices, for stating assumptions on the environment, and for indicating uncertainty in system timing. Hierarchical Annotated Action Diagrams: An Interface-Oriented Specification and Verification Method presents a description methodology that was inspired by Timing Diagrams and Process Algebras: the so-called Hierarchical Annotated Action Diagrams (HADDs). It is suitable for specifying systems with complex interface behaviors that govern the global system behavior. A HADD specification can be converted into a behavioral real-time model in VHDL and used to verify the surrounding logic, such as interface transducers. Also, function can be conservatively abstracted away and the interactions between interconnected devices can be verified using Constraint Logic Programming based on Relational Interval Arithmetic. Hierarchical Annotated Action Diagrams: An Interface-Oriented Specification and Verification Method is of interest to readers who are involved in defining methods and tools for system-level design specification and verification. The techniques for interface compatibility verification can be used by practicing designers, without any more sophisticated tool than a calculator.
Power supply current monitoring to detect CMOS IC defects during production testing quietly laid down its roots in the mid-1970s. Both Sandia Labs and RCA in the United States and Philips Labs in the Netherlands practiced this procedure on their CMOS ICs. At that time, this practice stemmed simply from an intuitive sense that CMOS ICs showing abnormal quiescent power supply current (IDDQ) contained defects. Later, this intuition was supported by data and analysis in the 1980s by Levi (RADC), Malaiya and Su (SUNY-Binghamton), Soden and Hawkins (Sandia Labs and the University of New Mexico), Jacomino and co-workers (Laboratoire d'Automatique de Grenoble), and Maly and co-workers (Carnegie Mellon University). Interest in IDDQ testing has advanced beyond the data reported in the 1980s and is now focused on applications and evaluations involving larger volumes of ICs that improve quality beyond what can be achieved by previous conventional means. In the conventional style of testing, one attempts to propagate the logic states of the suspected nodes to primary outputs. This is done for all or most nodes of the circuit. For sequential circuits, in particular, the complexity of finding suitable tests is very high. In comparison, the IDDQ test does not observe the logic states, but measures the integrated current that leaks through all gates. In other words, it is like measuring a patient's temperature to determine the state of health. Despite perceived advantages, during the years that followed its initial announcements, skepticism about the practicality of IDDQ testing prevailed. The idea, however, provided a great opportunity to researchers. New results on test generation, fault simulation, design for testability, built-in self-test, and diagnosis for this style of testing have since been reported. After a decade of research, we are definitely closer to practice.
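The contrast with conventional testing can be sketched in a few lines (a hypothetical illustration, not from the book): an IDDQ screen simply compares each die's measured quiescent supply current with a pass/fail limit, with no need to propagate logic values to primary outputs. The die names and the 1 uA limit below are assumptions.

```python
# Hypothetical IDDQ screening sketch: flag dice whose quiescent supply current
# exceeds a pass/fail limit. The limit and measurements are made-up values.

IDDQ_LIMIT_A = 1e-6   # assumed pass/fail threshold: 1 microampere

def iddq_screen(measurements):
    """measurements: {die_id: quiescent current in amperes for one test vector}."""
    return {die: ("PASS" if i_ddq <= IDDQ_LIMIT_A else "FAIL (possible defect)")
            for die, i_ddq in measurements.items()}

print(iddq_screen({"die_01": 4.0e-8, "die_02": 3.2e-5, "die_03": 9.0e-7}))
```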
Models in System Design tracks the general trend in electronics toward ever greater size, complexity and difficulty of maintenance. System design is by nature combined with prototyping, mixed-domain design, and verification, so it is no surprise that models and modeling techniques are today used at various levels of system design and verification. In order to deal with constraints induced by volume and complexity, new methods and techniques have been defined. Models in System Design provides an overview of the latest modeling techniques for use by system designers. The first part of the book considers system-level design, discussing such issues as abstraction, performance and trade-offs. There is also a section on automating system design. The second part of the book deals with some of the newest aspects of embedded system design. These include co-verification and prototyping. Finally, the book includes a section on the use of the MCSE methodology for hardware/software co-design. Models in System Design will help designers and researchers to understand these latest techniques in system design and as such will be of interest to all involved in embedded system design.
After long years of work that have seen little industrial application, high-level synthesis is finally on the verge of becoming a practical tool. The state of high-level synthesis today is similar to the state of logic synthesis ten years ago. At present, logic-synthesis tools are widely used in digital system design. In the future, high-level synthesis will play a key role in mastering design complexity and in truly exploiting the potential of ASICs and PLDs, which demand extremely short design cycles. Work on high-level synthesis began over twenty years ago. Since then, substantial progress has been made in understanding the basic problems involved, although no single universally accepted theoretical framework has yet emerged. There is a growing number of publications devoted to high-level synthesis, specialized workshops are held regularly, and tutorials on the topic are commonly held at major conferences. This book gives an extensive survey of the research and development in high-level synthesis. In Part I, a short tutorial explains the basic concepts used in high-level synthesis, and follows an example design throughout the synthesis process. In Part II, current high-level synthesis systems are surveyed.
May the Forcing Functions be with You: The Stimulating World of AIED and ITS Research. It is my pleasure to write the foreword for Advances in Intelligent Tutoring Systems. This collection, with contributions from leading researchers in the field of artificial intelligence in education (AIED), constitutes an overview of the many challenging research problems that must be solved in order to build a truly intelligent tutoring system (ITS). The book not only describes some of the approaches and techniques that have been explored to meet these challenges, but also some of the systems that have actually been built and deployed in this effort. As discussed in the Introduction (Chapter 1), the terms "AIED" and "ITS" are often used interchangeably, and there is a large overlap in the researchers devoted to exploring this common field. In this foreword, I will use the term "AIED" to refer to the research area, and the term "ITS" to refer to the particular kind of system that AIED researchers build. It has often been said that AIED is "AI-complete" in that producing a tutoring system as sophisticated and effective as a human tutor requires solving the entire gamut of artificial intelligence (AI) research problems.
Algorithms for VLSI Physical Design Automation, Second Edition is a core reference text for graduate students and CAD professionals. Based on the very successful First Edition, it provides a comprehensive treatment of the principles and algorithms of VLSI physical design, presenting the concepts and algorithms in an intuitive manner. Each chapter contains 3-4 algorithms that are discussed in detail. Additional algorithms are presented in a somewhat shorter format. References to advanced algorithms are presented at the end of each chapter. Algorithms for VLSI Physical Design Automation covers all aspects of physical design. In 1992, when the First Edition was published, the largest available microprocessor had one million transistors and was fabricated using three metal layers. Today, chips with 15 million transistors are fabricated using six metal layers, and designs are moving toward the 500-700 MHz frequency goal. These stunning developments have significantly altered the VLSI field: over-the-cell routing and early floorplanning have come to occupy a central place in the physical design flow. This Second Edition presents a realistic picture of the field, exposing the concerns facing the VLSI industry while maintaining the theoretical flavor of the First Edition. New material has been added to all chapters, new sections have been added to most chapters, and a few chapters have been completely rewritten. The textual material is supplemented and clarified by many helpful figures. Audience: An invaluable reference for professionals in layout, design automation and physical design.
The time has come for high-level synthesis. When research into synthesizing hardware from abstract, program-like descriptions started in the early 1970s, there was no automated path from the register-transfer design produced by high-level synthesis to a complete hardware implementation. As a result, it was very difficult to measure the effectiveness of high-level synthesis methods; it was also hard to justify to users the need to automate architecture design when low-level design had to be completed manually. Today's more mature CAD techniques help close the gap between an automatically synthesized design and a manufacturable design. Market pressures encourage designers to make use of any and all automated tools. Layout synthesis, logic synthesis, and specialized datapath generators make it feasible to quickly implement a register-transfer design in silicon, leaving designers more time to consider architectural improvements. As IC design becomes more automated, customers are increasing their demands; today's leading-edge designers using logic synthesis systems are training themselves to be tomorrow's consumers of high-level synthesis systems. The need for very fast turnaround, a competitive fabrication market which makes small-quantity ASIC manufacturing possible, and the ever-growing complexity of the systems being designed all make higher-level design automation inevitable.
In order to design and build computers that achieve and sustain high performance, it is essential that reliability issues be considered carefully. The problem has several aspects. Certainly, considering reliability implies that an engineer must be able to analyze how design decisions affect the incidence of failure. For instance, in order to design reliable integrated circuits, it is necessary to analyze how decisions regarding design rules affect the yield, i.e., the percentage of functional chips obtained by the manufacturing process. Of equal importance in producing reliable computers is the detection of failures in their Very Large Scale Integrated (VLSI) circuit components, caused by errors in the design specification, implementation, or manufacturing processes. Design verification involves checking the specification of a design for correctness prior to carrying out an implementation. Implementation verification ensures that the manual design or automatic synthesis process is correct, i.e., that the mask-level description correctly implements the specification. Manufacturing test involves checking the complex fabrication process for correctness, i.e., ensuring that there are no manufacturing defects in the integrated circuit. It should be noted that all the above verification mechanisms deal not only with verifying the functionality of the integrated circuit but also its performance.
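For reference, the relation between design-rule decisions and yield is often captured by simple defect-density models; the classic Poisson model (a standard formula, not taken from this book) gives

\[ Y = e^{-A D_0} \]

where $A$ is the die area and $D_0$ the defect density, so a 1 cm^2 die manufactured at 0.5 defects per cm^2 would be expected to yield about $e^{-0.5} \approx 61\%$ functional chips.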
Machine vision systems offer great potential in a large number of areas of manufacturing industry and are used principally for Automated Visual Inspection and Robot Vision. This publication presents the state of the art in image processing. It discusses techniques which have been developed for designing machines for use in industrial inspection and robot control, putting the emphasis on software and algorithms. A comprehensive set of image processing subroutines, which together form the basic vocabulary for the versatile image processing language IIPL, is presented. This language has proved to be extremely effective, working as a design tool, in solving numerous practical inspection problems. The merging of this language with Prolog provides an even more powerful facility which retains the benefits of human and machine intelligence. The authors bring together the practical experience and the picture material from a leading industrial research laboratory and the mathematical foundations necessary to understand and apply concepts in image processing. Interactive Image Processing is a self-contained reference book that can also be used in graduate level courses in electrical engineering, computer science and physics.
Computer-Aided Verification is a collection of papers that begins with a general survey of hardware verification methods. Ms. Gupta starts with the issue of verification itself and develops a taxonomy of verification methodologies, focusing especially upon recent advances. Although her emphasis is hardware verification, most of what she reports applies to software verification as well. Graphical presentation is coming to be a de facto requirement for a 'friendly' user interface. The second paper presents a generic format for graphical presentations of coordinating systems represented by automata. The last two papers, as a pair, present a variety of generic techniques for reducing the computational cost of computer-aided verification based upon explicit computational memory: the first of the two gives a time-space trade-off, while the second gives a technique which trades space for a (sometimes predictable) probability of error. Computer-Aided Verification is an edited volume of original research. This research work has also been published as a special issue of the journal Formal Methods in System Design, 1:2-3.
Short turnaround has become critical in the design of electronic systems. Software-programmable components such as microprocessors and digital signal processors have been used extensively in such systems since they allow rapid design revisions. However, the inherent performance limitations of software-programmable systems mean that they are inadequate for high-performance designs. Designers thus turned to gate arrays as a solution. User-programmable gate arrays (field-programmable gate arrays, FPGAs) have recently emerged and are changing the way electronic systems are designed and implemented. The growing complexity of the logic circuits that can be packed onto an FPGA chip means that it has become important to have automatic synthesis tools that implement logic functions on these architectures. Logic Synthesis for Field-Programmable Gate Arrays describes logic synthesis for both look-up table (LUT) and multiplexor-based architectures, with a balanced presentation of existing techniques together with algorithms and the system developed by the authors. Audience: A useful reference for VLSI designers, developers of computer-aided design tools, and anyone involved in or with FPGAs.
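The LUT target of such synthesis tools can be sketched very simply (an illustrative model, not the authors' algorithms): a k-input look-up table stores the complete truth table of a function in 2^k memory bits, so any Boolean function of k variables can be mapped onto a single LUT.

```python
# Illustrative LUT model (not from the book): a k-input LUT realizes any Boolean
# function of its k inputs by storing the function's truth table in memory bits.

def make_lut(truth_table):
    """truth_table: list of 2**k output bits, indexed by the k inputs read as a binary number."""
    def lut(*inputs):
        index = 0
        for bit in inputs:              # most significant input first
            index = (index << 1) | (bit & 1)
        return truth_table[index]
    return lut

# Example: a 3-input LUT programmed as a full-adder carry-out, c = ab + a*cin + b*cin.
carry_lut = make_lut([0, 0, 0, 1, 0, 1, 1, 1])
print(carry_lut(1, 0, 1))   # a=1, b=0, cin=1 -> carry 1
print(carry_lut(0, 1, 0))   # a=0, b=1, cin=0 -> carry 0
```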
The ever-increasing miniaturization of digital electronic components is hampering the conventional testing of Printed Circuit Boards (PCBs) by means of bed-of-nails fixtures. Basically, this is caused by the very high scale of integration of ICs, through which packages with hundreds of pins at very small pitches, down to a fraction of a millimetre, have become available. As a consequence, the trace distances between the copper tracks on a printed circuit board have come down to the same value. Not only have the required small physical dimensions of the test nails made conventional testing unfeasible, but the complexity of providing test signals for the many hundreds of test nails has also grown out of limits. Therefore a new board test methodology had to be invented. Following the evolution in IC test technology, Boundary-Scan testing has become the new approach to PCB testing. By taking precautions in the design of the IC (design for testability), testing at PCB level can be simplified to a great extent. This condition has been essential for the success of the introduction of Boundary-Scan Test (BST) at board level.
When it comes to frameworks, the familiar story of the elephant and the six blind philosophers seems to apply. As each philosopher encountered a separate part of the elephant, each pronounced his considered, but flawed, judgement. One blind philosopher felt a leg and thought it a tree. Another felt the tail and thought he held a rope. Another felt the elephant's flank and thought he stood before a wall. We're supposed to learn about snap judgements from this allegory, but its author might well have been describing design automation frameworks. For in the reality of today's product development requirements, a framework must be many things to many people. As the authors of this book note, framework design is an optimization problem. Somehow, it has to be both a superior rope for one and a tremendous tree for another. Somehow it needs to provide a standard environment for exploiting the full potential of computer-aided engineering tools. And, somehow, it has to make real such abstractions as interoperability and interchangeability. For years, we've talked about a framework as something that provides application-oriented services, just as an operating system provides system-level support. And for years, that simple statement has hid the tremendous complexity of actually providing those services.
VHDL Answers to Frequently Asked Questions is a follow-up to the author's book VHDL Coding Styles and Methodologies (ISBN 0-7923-9598-0). On completion of his first book, the author continued teaching VHDL and actively participated in the comp.lang.vhdl newsgroup. During his experiences, he was enlightened by the many interesting issues and questions relating to VHDL and synthesis. These pertained to: misinterpretations in the use of the language; methods for writing error-free, and simulation-efficient, code for testbench designs and for synthesis; and general principles and guidelines for design verification. As a result of this wealth of public knowledge contributed by a large VHDL community, the author decided to act as a facilitator of this information by collecting different classes of VHDL issues, and by elaborating on these topics through complete simulatable examples. This book is intended for those who are seeking an enhanced proficiency in VHDL. Its target audience includes: 1. Engineers. The book addresses a set of problems commonly experienced by real users of VHDL. It provides practical explanations to the questions, and suggests practical solutions to the raised issues. It also includes packages of common utilities that are useful in the generation of debug code and testbench designs. These packages include conversions to strings (the IMAGE package), generation of Linear Feedback Shift Registers (LFSR), Multiple Input Shift Registers (MISR), and random number generators.
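The book's utility packages are written in VHDL; purely as a conceptual analogue (an assumption-laden sketch, not the author's code), a Fibonacci linear feedback shift register of the kind used to generate pseudo-random testbench stimulus looks like this:

```python
# Conceptual analogue of an LFSR stimulus generator (not the book's VHDL package).
# The taps below correspond to the polynomial x^8 + x^6 + x^5 + x^4 + 1, chosen
# here as an assumed example of a maximal-length 8-bit configuration.

def lfsr8(seed=0x1, taps=(8, 6, 5, 4)):
    """Yield an endless stream of 8-bit LFSR states; seed must be non-zero."""
    state = seed & 0xFF
    while True:
        yield state
        feedback = 0
        for t in taps:                       # XOR the tapped bits (1-indexed)
            feedback ^= (state >> (t - 1)) & 1
        state = ((state << 1) | feedback) & 0xFF

gen = lfsr8(seed=0xA5)
print([hex(next(gen)) for _ in range(5)])    # first few pseudo-random states
```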
Improvement in the quality of integrated circuit designs and in a designer's productivity can be achieved by a combination of two factors: * Using more structured design methodologies for extensive reuse of existing components and subsystems. It seems that 70% of new designs correspond to existing components that cannot be reused because of a lack of methodologies and tools. * Providing higher-level design tools that allow designers to start from a higher level of abstraction. After the success and the widespread acceptance of logic and RTL synthesis, the next step is behavioral synthesis, commonly called architectural or high-level synthesis. Behavioral Synthesis and Component Reuse with VHDL provides methods and techniques for VHDL-based behavioral synthesis and component reuse. The goal is to develop VHDL modeling strategies for emerging behavioral synthesis tools. Special attention is given to structured and modular design methods allowing hierarchical behavioral specification and design reuse. The goal of this book is not to discuss behavioral synthesis in general or to discuss a specific tool, but to describe the specific issues related to behavioral synthesis of VHDL descriptions. This book targets designers who have to use behavioral synthesis tools or who wish to discover the real possibilities of this emerging technology. The book will also be of interest to teachers and students who wish to learn or to teach VHDL-based behavioral synthesis.
Formal Equivalence Checking and Design Debugging covers two major topics in design verification: logic equivalence checking and design debugging. The first part of the book reviews the design problems that require logic equivalence checking and describes the underlying technologies that are used to solve them. Some novel approaches to the problems of verifying design revisions after intensive sequential transformations such as retiming are described in detail. The second part of the book gives a thorough survey of previous and recent literature on design error diagnosis and design error correction. This part also provides an in-depth analysis of the algorithms used in two logic debugging software programs, ErrorTracer and AutoFix, developed by the authors. From the Foreword: `With the adoption of the static sign-off approach to verifying circuit implementations, the application-specific integrated circuit (ASIC) industry will experience the first radical methodological revolution since the adoption of logic synthesis. Equivalence checking is one of the two critical elements of this methodological revolution. This book is timely for either the designer seeking to better understand the mechanics of equivalence checking or the CAD researcher who wishes to investigate well-motivated research problems such as equivalence checking of retimed designs or error diagnosis in sequential circuits.' Kurt Keutzer, University of California, Berkeley
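At its core, combinational equivalence checking asks whether two implementations agree on every input vector. The brute-force sketch below (a toy illustration only; the methods described in the book scale far beyond this) makes the question concrete:

```python
# Toy equivalence check by exhaustive enumeration (illustrative only; practical
# equivalence checkers rely on symbolic and structural techniques instead).
from itertools import product

def equivalent(f, g, num_inputs):
    """Return (True, None) if f and g agree on all inputs, else (False, counterexample)."""
    for bits in product((0, 1), repeat=num_inputs):
        if f(*bits) != g(*bits):
            return False, bits            # a distinguishing input vector
    return True, None

# Two candidate implementations of the same 3-input function (hypothetical example):
spec = lambda a, b, c: (a & b) | (a & c) | (b & c)    # majority, written directly
impl = lambda a, b, c: (a & (b | c)) | (b & c)        # revised / optimized version

print(equivalent(spec, impl, 3))   # (True, None): the revision preserves the function
```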