Welcome to Loot.co.za!
The Language of Design articulates the theory that there is a language of design. Drawing upon insights from computational language processing, the language of design is modeled computationally through latent semantic analysis (LSA), lexical chain analysis (LCA), and sentiment analysis (SA). The statistical co-occurrence of semantics (LSA), semantic relations (LCA), and semantic modifiers (SA) in design text is used to illustrate how the reality-producing effect of language is itself an enactment of design, allowing a new understanding of the connections between creative behaviors. Computing over the language of design makes it possible to take direct measurements of creative behaviors which are distributed across social spaces and mediated through language. The book demonstrates how machine understanding of design texts based on computation over the language of design yields practical applications for design management.
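As an illustration of the kind of computation the book builds on, here is a minimal LSA sketch: a truncated SVD of a term-document co-occurrence matrix projects documents into a low-dimensional latent semantic space where similarity between design texts can be measured. The matrix and dimensions below are made-up toy values, not data from the book.

```python
import numpy as np

# Hypothetical term-document matrix: rows are terms, columns are design
# documents. The counts are illustrative, not taken from the book.
X = np.array([
    [2, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 2, 0, 1],
    [0, 0, 1, 2],
], dtype=float)

# LSA core step: truncated SVD keeps the k strongest latent dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dim vector per document

def cosine(a, b):
    """Cosine similarity between two latent-space vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Latent similarity of documents 0 and 1, despite sparse literal overlap.
sim = cosine(doc_vecs[0], doc_vecs[1])
```

In this latent space, documents that share few literal terms can still score as related when their terms co-occur across the corpus, which is the effect LSA exploits.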
Object-oriented systems have gained a great deal of popularity recently and their application to graphics has been very successful. This book documents a number of recent advances and indicates numerous areas of current research. The purpose of the book is:
- to demonstrate the extraordinary practical utility of object-oriented methods in computer graphics (including user interfaces, image synthesis, CAD),
- to examine outstanding research issues in the field of object-oriented graphics, and in particular to investigate extensions and shortcomings of the methodology when applied to computer graphics.
Papers included in the book extend existing object-oriented graphical techniques, such as Smalltalk's "model view controller" or "constraints," introduce the use of complex and persistent objects in graphics, and give approaches to direct manipulation interfaces. The reader is presented with an in-depth treatment of a number of significant existing graphics systems, both for user interfaces and for image synthesis. There are theoretical surveys and chapters pointing to new directions in the broad field of computer graphics. Computer language scientists will find a useful critique of object-oriented language constructs and suggested ways to extend object-oriented theory.
This volume contains the proceedings of a workshop on Analog Integrated Neural Systems held May 8, 1989, in connection with the International Symposium on Circuits and Systems. The presentations were chosen to encompass the entire range of topics currently under study in this exciting new discipline. Stringent acceptance requirements were placed on contributions: (1) each description was required to include detailed characterization of a working chip, and (2) each design was not to have been published previously. In several cases, the status of the project was not known until a few weeks before the meeting date. As a result, some of the most recent innovative work in the field was presented. Because this discipline is evolving rapidly, each project is very much a work in progress. Authors were asked to devote considerable attention to the shortcomings of their designs, as well as to the notable successes they achieved. In this way, other workers can now avoid stumbling into the same traps, and evolution can proceed more rapidly (and less painfully). The chapters in this volume are presented in the same order as the corresponding presentations at the workshop. The first two chapters are concerned with finding solutions to complex optimization problems under a predefined set of constraints. The first chapter reports what is, to the best of our knowledge, the first neural-chip design. In each case, the physics of the underlying electronic medium is used to represent a cost function in a natural way, using only nearest-neighbor connectivity.
Interest in product data exchange and interfaces in the CAD/CAM area is steadily growing. The rapidly increasing use of graphics applications in engineering and science has led to a great variety of heterogeneous hardware and software products. This has become a major obstacle in the progress of systems integration. To improve this situation, CAD/CAM users have called for the specification and implementation of standardized product data interfaces. These needs resulted in the definition of preliminary standards in this area. Since 1975, activities have been concentrated on developing standards for three major areas: computer graphics, sculptured surfaces, and data exchange for engineering drawings. The Graphical Kernel System (GKS) was accepted as an international standard for graphics programming in 1984, Y14.26M (IGES) was adopted as an American standard in 1981, and the VDA Surface Interface (VDAFS) has been accepted by the German National Standardization Institute (DIN NAM 96.4). Although considerable progress has been achieved, the complexity of the subject and the dynamics of CAD/CAM development still call for more generality and compatibility of the interfaces. This has resulted in an international discussion on further improvements of the standards. The major goal of this book is to bring together the different views and experiences in industry and university in the area of Product Data Interfaces, thereby contributing to the ongoing work in improving the state of the art.
The term "Office Automation" implies much and means little. The word "Office" is usually reserved for units in an organization that have a rather general function. They are supposed to support different activities, but it is notoriously difficult to determine what an office is supposed to do. Automation in this loose context may mean many different things. At one extreme, it is nothing more than giving people better tools than typewriters and telephones with which to do their work more efficiently and effectively. At the opposite extreme, it implies the replacement of people by machines which perform office procedures automatically. In this book we will take the approach that "Office Automation" is much more than just better tools, but falls significantly short of replacing every person in an office. It may reduce the need for clerks, it may take over some secretarial functions, and it may lessen the dependence of principals on support personnel. Office Automation will change the office environment. It will eliminate the more mundane and well understood functions and will highlight the decision-oriented activities in an office. The goal of this book is to provide some understanding of office activities and to evaluate the potential of Office Information Systems for office procedure automation. To achieve this goal, we need to explore concepts, elaborate on techniques, and outline tools.
These proceedings contain lectures presented at the NATO Advanced Study Institute on Concurrent Engineering Tools and Technologies for Mechanical System Design, held in Iowa City, Iowa, 25 May to 5 June 1992. Lectures were presented by leaders from Europe and North America in disciplines contributing to the emerging international focus on Concurrent Engineering of mechanical systems. Participants in the Institute were specialists from throughout NATO in disciplines constituting Concurrent Engineering, many of whom presented contributed papers during the Institute and all of whom participated actively in discussions on technical aspects of the subject. The proceedings are organized into the following five parts:
Part 1: Basic Concepts and Methods
Part 2: Application Sectors
Part 3: Manufacturing
Part 4: Design Sensitivity Analysis and Optimization
Part 5: Virtual Prototyping and Human Factors
Each part comprises papers that present state-of-the-art concepts and methods in fields contributing to Concurrent Engineering of mechanical systems. The lead-off papers in each part are based on invited lectures, followed by papers based on contributed presentations made by participants in the Institute.
Manufacturing contributes over 60% of the gross national product of the highly industrialized nations of Europe. The advances in mechanization and automation in manufacturing of international competitors are seriously challenging the market position of the European countries in different areas. Thus it becomes necessary to increase significantly the productivity of European industry. This has prompted many governments to support the development of new automation resources. Good engineers are also needed to develop the required automation tools and to apply these to manufacturing. It is the purpose of this book to discuss new research results in manufacturing with engineers who face the challenge of building tomorrow's factories. Early automation efforts were centered around mechanical gear-and-cam technology and hardwired electrical control circuits. Because of the decreasing life cycle of most new products and the enormous model diversification, factories cannot be automated efficiently any more by these conventional technologies. With the digital computer, its fast calculation speed and large memory capacity, a new tool was created which can substantially improve the productivity of manufacturing processes. The computer can directly control production and quality assurance functions and adapt itself quickly to changing customer orders and new products.
Computer-aided design systems have become a big business. Advances in technology have made it commercially feasible to place a powerful engineering workstation on every designer's desk. A major selling point for these workstations is the computer-aided design software they provide, rather than the actual hardware. The trade magazines are full of advertisements promising full menu design systems, complete with an integrated database (preferably "relational"). What does it all mean? This book focuses on the critical issues of managing the information about a large design project. While undeniably one of the most important areas of CAD, it is also one of the least understood. Merely glueing a database system to a set of existing tools is not a solution. Several additional system components must be built to create a true design management system. These are described in this book. The book has been written from the viewpoint of how and when to apply database technology to the problems encountered by builders of computer-aided design systems. Design systems provide an excellent environment for discovering how far we can generalize the existing database concepts for non-commercial applications. This has emerged as a major new challenge for database system research. We have attempted to avoid a "database egocentric" view by pointing out where existing database technology is inappropriate for design systems, at least given the current state of the database art.
Raster graphics differs from the more traditional vector or line graphics in the sense that images are not made up from line segments but from discrete elements orderly arranged in a two-dimensional rectangular region. There are two reasons for the growing popularity of raster graphics or bit-mapped displays: 1) the possibilities they offer to show extremely realistic pictures, and 2) the dropping prices of those displays and associated processors and memories. With the rise of raster graphics, all kinds of new techniques, methods, algorithms and data representations are associated, such as ray tracing, raster operations, and quadtrees, bringing with them a lot of fruitful research. As stated above, raster graphics allows the creation of extremely realistic (synthesized) pictures. There are important applications in such diverse areas as industrial design, flight simulation, education, image processing and animation. Unfortunately many applications are hampered by the fact that with the present state of the art they require an excessive amount of computing resources. Hence it is worthwhile to investigate methods and techniques which may help in reducing the computer costs associated with raster graphics applications. Since the choice of data structures influences the efficiency of algorithms in a crucial way, a workshop was set up in order to bring together a (limited) number of experienced researchers to discuss this topic. The workshop was held from 24 to 28 June 1985 at Steensel, a tiny village in the neighbourhood of Eindhoven, the Netherlands.
Fixtures are crucial to new manufacturing techniques and largely dictate the level of flexibility a manufacturing system can achieve. Advanced Fixture Design for FMS provides a systematic basis for the selection and design of fixturing systems. It gives a review of the current state of the art of flexible and reconfigurable fixturing systems. Recent developments in design methodology using CAD are analysed in depth. Fixture design is seen as an inseparable part of process planning. The primary objective of a fixture system is to ensure that the part being manufactured can be made consistently within the tolerance specified in the design. A new method of tolerance analysis is used to check the suitability of location surfaces and the sequence of operations and is explained in detail.
This book is the first issue of a EUROGRAPHICS publication series in the field of computer graphics, an important field of research and a versatile tool for various application areas. The availability of powerful hardware at an affordable price and the evolution of high standard software have led to a rapidly increasing expansion of computer graphics and the penetration of computer graphics techniques and systems into a wide range of application areas. This book series will cover state-of-the-art surveys as well as scientific contributions on specific areas of research and development. The first book in the series contains the Tutorial Notes of the EUROGRAPHICS '83 conference, held in Zagreb, Yugoslavia, in September 1983. It covers four major aspects of computer graphics today:
- The first part contains a detailed introduction into computer graphics, its concepts, its methods, its tools, and its devices. It gives easy access for the newcomer to the field and offers an overview of the state of the art in computer graphics.
- The second part is devoted to interactive techniques. This is currently one of the most important fields of research in computer graphics. Important aspects of this research and its current state are reported. From the developments described here, powerful, generally applicable user interface management systems are likely to evolve in the near future.
"Developments in Computer-Integrated Manufacturing" arose from the joint work of members of the IFIP Working Group 5.3 (Discrete Manufacturing) and other IFIP members. Within the Technical Committee 5 of the International Federation for Information Processing (IFIP), the aim of this Working Group is the advancement of computers and their application to the field of discrete part manufacturing. Capabilities will be expanded in the general areas of planning, selection, and control of manufacturing equipment and systems. Tools for problem solution include: mathematics, geometry, algorithms, computer techniques, and manufacturing technology. This technology will influence many industries: machine tool, automation, aircraft, appliance, and electronics, to name but a few. The Working Group undertook the following specific tasks:
1. To maintain liaison with other national and international organizations working in the same field, cooperating with them whenever desirable to further the common goal
2. To be responsible for IFIP's work in organizing and presenting the PROLAMAT Conferences
3. To conduct other working conferences and symposia as deemed appropriate in furthering its mission
4. To develop and sponsor research and industrial and social studies into the various aspects of its mission.
The book can be regarded as an attempt to underline the main aspects of technology from the point of view of its software and hardware realization. Because of limitations in size and the availability of literature, the problems of robotics and quality control are not described in detail.
This book constitutes the refereed proceedings of the 5th International Conference on Industrial Applications of Holonic and Multi-Agent Systems, HoloMAS 2011, held in Toulouse, France, August 29-31, 2011.
Logic Synthesis Using Synopsys (R), Second Edition is for anyone who hates reading manuals but would still like to learn logic synthesis as practised in the real world. Synopsys Design Compiler, the leading synthesis tool in the EDA marketplace, is the primary focus of the book. The contents of this book are specially organized to assist designers accustomed to schematic capture-based design to develop the required expertise to effectively use the Synopsys Design Compiler. Over 100 'Classic Scenarios' faced by designers when using the Design Compiler have been captured, discussed and solutions provided. These scenarios are based on both personal experiences and actual user queries. A general understanding of the problem-solving techniques provided should help the reader debug similar and more complicated problems. In addition, several examples and dc_shell scripts (Design Compiler scripts) have also been provided. Logic Synthesis Using Synopsys (R), Second Edition is an updated and revised version of the very successful first edition. The second edition covers several new and emerging areas, in addition to improvements in the presentation and contents in all chapters from the first edition. With the rapid shrinking of process geometries it is becoming increasingly important that 'physical' phenomena like clusters and wire loads be considered during the synthesis phase. The increasing demand for FPGAs has warranted a greater focus on FPGA synthesis tools and methodology. Finally, behavioral synthesis, the move to designing at a higher level of abstraction than RTL, is fast becoming a reality. These factors have resulted in the inclusion of separate chapters in the second edition to cover Links to Layout, FPGA Synthesis and Behavioral Synthesis, respectively. Logic Synthesis Using Synopsys (R), Second Edition has been written with the CAD engineer in mind.
A clear understanding of the synthesis tool concepts, its capabilities and the related CAD issues will help the CAD engineer formulate an effective synthesis-based ASIC design methodology. The intent is also to assist design teams to better incorporate and effectively integrate synthesis with their existing in-house design methodology and CAD tools.
Software Synthesis from Dataflow Graphs addresses the problem of generating efficient software implementations from applications specified as synchronous dataflow graphs for programmable digital signal processors (DSPs) used in embedded real-time systems. The advent of high-speed graphics workstations has made feasible the use of graphical block diagram programming environments by designers of signal processing systems. A particular subset of dataflow, called Synchronous Dataflow (SDF), has proven efficient for representing a wide class of unirate and multirate signal processing algorithms, and has been used as the basis for numerous DSP block diagram-based programming environments such as the Signal Processing Workstation from Cadence Design Systems, Inc., COSSAP from Synopsys (R) (both commercial tools), and the Ptolemy environment from the University of California at Berkeley. A key property of the SDF model is that static schedules can be determined at compile time. This removes the overhead of dynamic scheduling and is thus useful for real-time DSP programs where throughput requirements are often severe. Another constraint that programmable DSPs for embedded systems have is the limited amount of on-chip memory. Off-chip memory is not only expensive but is also slower and increases the power consumption of the system; hence, it is imperative that programs fit in the on-chip memory whenever possible. Software Synthesis from Dataflow Graphs reviews the state-of-the-art in constructing static, memory-optimal schedules for programs expressed as SDF graphs. Code size reduction is obtained by the careful organization of loops in the target code. Data buffering is optimized by constructing the loop hierarchy in provably optimal ways for many classes of SDF graphs. The central result is a uniprocessor scheduling framework that provably synthesizes the most compact looping structures, called single appearance schedules, for a certain class of SDF graphs.
In addition, algorithms and heuristics are presented that generate single appearance schedules optimized for data buffering usage. Numerous practical examples and extensive experimental data are provided to illustrate the efficacy of these techniques.
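The compile-time scheduling property of SDF rests on solving the balance equations: on every edge, the repetition counts must satisfy q[src] * produced = q[dst] * consumed, and the smallest positive integer solution fixes how many times each actor fires per schedule iteration. A minimal sketch on a made-up three-actor chain (not an example from the book):

```python
from fractions import Fraction
from math import lcm

# Toy SDF graph (illustrative): edges as (src, dst, tokens_produced, tokens_consumed).
edges = [("A", "B", 2, 3), ("B", "C", 1, 2)]
actors = ["A", "B", "C"]

# Solve the balance equations q[src]*prod = q[dst]*cons by propagation
# (the graph is assumed connected; an inconsistent graph would have no solution).
q = {actors[0]: Fraction(1)}
changed = True
while changed:
    changed = False
    for src, dst, prod, cons in edges:
        if src in q and dst not in q:
            q[dst] = q[src] * prod / cons
            changed = True
        elif dst in q and src not in q:
            q[src] = q[dst] * cons / prod
            changed = True

# Scale the rational solution to the smallest positive integer vector.
mult = lcm(*(f.denominator for f in q.values()))
reps = {a: int(q[a] * mult) for a in actors}
# Here A fires 3 times, B twice, C once per schedule iteration.
```

Because this vector is known before the program runs, the compiler can emit a fixed loop nest (e.g. a single appearance schedule such as (3A)(2B)(1C)) with no run-time scheduler.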
This monograph represents a summary of our work in the last two years in applying the method of simulated annealing to the solution of problems that arise in the physical design of VLSI circuits. Our study is experimental in nature, in that we are concerned with issues such as solution representations, neighborhood structures, cost functions, approximation schemes, and so on, in order to obtain good design results in a reasonable amount of computation time. We hope that our experiences with the techniques we employed, some of which indeed bear certain similarities for different problems, could be useful as hints and guides for other researchers in applying the method to the solution of other problems. Work reported in this monograph was partially supported by the National Science Foundation under grant MIP 87-03273, by the Semiconductor Research Corporation under contract 87-DP-109, by a grant from the General Electric Company, and by a grant from the Sandia Laboratories.
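As a rough sketch of the method (a toy formulation, not the authors' actual representations or cost functions), the following applies simulated annealing to a one-dimensional placement problem: the cost is the total span of each net, the neighborhood move swaps two cells, and uphill moves are accepted with probability exp(-delta/T) under a geometric cooling schedule.

```python
import math
import random

random.seed(0)

# Hypothetical 1-D placement instance: nets listed as groups of cell ids.
# Cost = total net span (a 1-D analogue of half-perimeter wirelength).
nets = [[0, 2], [1, 3], [0, 3], [2, 4], [1, 4]]
n_cells = 5

def cost(order):
    pos = {c: slot for slot, c in enumerate(order)}
    return sum(max(pos[c] for c in net) - min(pos[c] for c in net)
               for net in nets)

order = list(range(n_cells))
cur = cost(order)
T = 2.0
while T > 0.01:                                   # geometric cooling schedule
    i, j = random.sample(range(n_cells), 2)
    order[i], order[j] = order[j], order[i]       # neighborhood move: swap cells
    new = cost(order)
    if new <= cur or random.random() < math.exp((cur - new) / T):
        cur = new                                 # accept downhill, sometimes uphill
    else:
        order[i], order[j] = order[j], order[i]   # undo rejected move
    T *= 0.95
```

Accepting occasional uphill moves at high temperature is what lets the search escape local minima; as T falls, the process hardens into greedy improvement.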
Models in system design follow the general tendency in electronics in terms of size, complexity and difficulty of maintenance. While a model should be a manageable representation of a system, this increasing complexity sometimes forces current CAD-tool designers and model writers to apply modeling techniques to the model itself. Model writers are interested in instrumenting their model, so as to extract critical information before the model is complete. CAD-tool designers use internal representations of the design at various stages. The complexity has also led CAD-tool developers to develop formal tools, theories and methods to improve the relevance, completeness and consistency of those internal representations. Information modeling involves the representation of objects, their properties and relationships.
Performance Modeling
When it comes to design choices and trade-offs, performance is generally the final key. However, performance estimations have to be extracted at a very early stage in the system design. Performance modeling concerns the set of tools and techniques that allow or help the designer to capture metrics relating to future architectures. Performance modeling encompasses the whole system, including software modeling. It has a strong impact on all levels of design choices, from hardware/software partitioning to the final layout.
Information Modeling
Specification and formalism have in the past traditionally played little part in the design and development of EDA systems, their support environments, languages and processes. Instead, EDA system developers and EDA system users have seemed content to operate within environments that are often extremely complex and may be poorly tested and understood. This situation has now begun to change with the increasing use of techniques drawn from the domains of formal specification and database design. This section of this volume addresses aspects of the techniques being used.
In particular, it considers a specific formalism, called information modeling, which has gained increasing acceptance recently and is now a key part of many of the proposals in the EDA Standards Roadmap, which promises to be of significance to the EDA industry. In addition, the section looks at an example of a design system from the point of view of its underlying understanding of the design process rather than through a consideration of particular CAD algorithms. Meta-Modeling: Performance and Information Modeling contains papers describing the very latest techniques used in meta-modeling. It will be a valuable text for researchers, practitioners and students involved in Electronic Design Automation.
This book describes a new type of computer-aided VLSI design tool, called a VLSI System Planner, that is meant to aid designers during the early, or conceptual, stage of design. During this stage, the objective is to define a general design plan, or approach, that is likely to result in an efficient implementation satisfying the initial specifications, or to determine that the initial specifications are not realizable. A design plan is a collection of high-level design decisions. As an example, the conceptual design of digital filters involves choosing the type of algorithm to implement (e.g., finite impulse response or infinite impulse response), the type of polynomial approximation (e.g., equiripple or Chebyshev), the fabrication technology (e.g., CMOS or BiCMOS), and so on. Once a particular design plan is chosen, the detailed design phase can begin. It is during this phase that various synthesis, simulation, layout, and test activities occur to refine the conceptual design, gradually filling in more detail until the design is finally realized. The principal advantage of VLSI System Planning is that the increasingly expensive resources of the detailed design process are more efficiently managed. Costly redesigns are minimized because the detailed design process is guided by a more credible, consistent, and correct design plan.
Theory and practice of tolerances are very important for designing and manufacturing engineering artifacts on a rational basis. Tolerance specifies a degree of "discrepancy" between an idealized object and its physical realization. Such discrepancy inevitably enters our product realization processes because of practical cost considerations or our inability to fully control manufacturing processes. Major product and production characteristics which are affected by tolerances are product quality and cost. Achieving high-precision machines requires tight tolerance specifications, but this will normally increase product cost. In order to optimally compromise the conflicting requirements of quality and cost, it is essential to take into account the total product life cycle throughout product planning, design, manufacturing, maintenance and recycling. For example, in order to construct durable products under severe working conditions, low sensitivity of product functionality with respect to tolerances is required. In the future, re-use of components or parts will become important, and tolerance synthesis with respect to this aspect will be an interesting research topic.
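The quality/cost trade-off described above is commonly quantified with tolerance stack-up analysis; a minimal sketch comparing the worst-case sum with the statistical root-sum-square (RSS) estimate, using made-up tolerance values:

```python
import math

# Illustrative stack of four part tolerances (mm); the values are made up.
tolerances = [0.05, 0.10, 0.02, 0.08]

# Worst-case stack-up: every part deviates to its limit in the same direction.
worst_case = sum(tolerances)

# Statistical (RSS) stack-up: independent deviations partially cancel.
rss = math.sqrt(sum(t * t for t in tolerances))
```

The RSS estimate is always tighter than the worst-case bound, which is one reason statistical tolerancing can relax individual part tolerances without sacrificing assembly quality.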
In the last few years CMOS technology has become increasingly dominant for realizing Very Large Scale Integrated (VLSI) circuits. The popularity of this technology is due to its high density and low power requirement. The ability to realize very complex circuits on a single chip has brought about a revolution in the world of electronics and computers. However, the rapid advancements in this area pose many new problems in the area of testing. Testing has become a very time-consuming process. In order to ease the burden of testing, many schemes for designing the circuit for improved testability have been presented. These design for testability techniques have begun to catch the attention of chip manufacturers. The trend is towards placing increased emphasis on these techniques. Another byproduct of the increase in the complexity of chips is their higher susceptibility to faults. In order to take care of this problem, we need to build fault-tolerant systems. The area of fault-tolerant computing has steadily gained in importance. Today many universities offer courses in the areas of digital system testing and fault-tolerant computing. Due to the importance of CMOS technology, a significant portion of these courses may be devoted to CMOS testing. This book has been written as a reference text for such courses offered at the senior or graduate level. Familiarity with logic design and switching theory is assumed. The book should also prove to be useful to professionals working in the semiconductor industry.
In system design, generation of high-level abstract models that can be closely associated with evolving lower-level models provides designers with the ability to incrementally test an evolving design against a model of a specification. Such high-level models may deal with areas such as performance, reliability, availability, maintainability, and system safety. Abstract models also allow exploration of the hardware versus software design space in an incremental fashion as a fuller, detailed design unfolds, leaving behind the old practice of hardware-software binding too early in the design process. Such models may also allow the inclusion of non-functional aspects of design (e.g. space, power, heat) in a simulatable information model dealing with the system's operation. This book addresses Model Generation and Application specifically in the following domains:
- Specification modeling (linking object/data modeling, behavior modeling, and activity modeling)
- Operational specification modeling (modeling the way the system is supposed to operate, from a user's viewpoint)
- Linking non-functional parameters with specification models
- Hybrid modeling (linking performance and functional elements)
- Application of high-level modeling to hardware/software approaches
- Mathematical analysis techniques related to the modeling approaches
- Reliability modeling
- Applications of high-level modeling
- Reducing high-level modeling to practice
High-Level System Modeling: Specification and Design Methodologies describes the latest research and practice in the modeling of electronic systems and as such is an important update for all researchers, design engineers and technical managers working in design automation and circuit design.
Very large scale integration (VLSI) technologies are now maturing with a current emphasis toward submicron structures and sophisticated applications combining digital as well as analog circuits on a single chip. Abundant examples are found in today's advanced systems for telecommunications, robotics, automotive electronics, image processing, intelligent sensors, etc. Exciting new applications are being unveiled in the field of neural computing, where the massive use of analog/digital VLSI technologies will have a significant impact. To match such a fast technological trend towards single-chip analog/digital VLSI systems, researchers worldwide have long realized the vital need of producing advanced computer-aided tools for designing both digital and analog circuits and systems for silicon integration. Architecture and circuit compilation, device sizing and layout generation are but a few familiar tasks in the world of digital integrated circuit design which can be efficiently accomplished by mature computer-aided tools. In contrast, the art of tools for designing and producing analog or even analog/digital integrated circuits is quite primitive and still lacking the industrial penetration and acceptance already achieved by digital counterparts. In fact, analog design is commonly perceived to be one of the most knowledge-intensive design tasks, and analog circuits are still designed, largely by hand, by experts intimately familiar with nuances of the target application and integrated circuit fabrication process. The techniques needed to build good analog circuits seem to exist solely as expertise invested in individual designers.
Machine vision technology is becoming an indispensable part of the manufacturing industry. Biomedical and scientific applications of machine vision and imaging are becoming more and more sophisticated, and new applications continue to emerge. This book gives an overview of ongoing research in machine vision and presents the key issues of scientific and practical interest. A selected board of experts from the US, Japan and Europe provides an insight into some of the latest work done on machine vision systems and applications.
Object-oriented techniques and languages have been proven to significantly increase engineering efficiency in software development. Many benefits are expected from their introduction into electronic modeling. Among them are better support for model reusability and flexibility, more efficient system modeling, and more possibilities in design space exploration and prototyping. Object-Oriented Modeling explores the latest techniques in object-oriented methods, formalisms and hardware description language extensions. The seven chapters comprising this book provide an overview of the latest object-oriented techniques for designing systems and hardware. Many examples are given in C++, VHDL and real-time programming languages. The book further describes the use of object-oriented techniques in applications such as embedded systems, telecommunications and real-time systems. It is an essential guide for researchers, practitioners and students involved in software, hardware and system design.
Test generation is one of the most difficult tasks facing the designer of complex VLSI-based digital systems. Much of this difficulty is attributable to the almost universal use in testing of low, gate-level circuit and fault models that predate integrated circuit technology. It has long been recognized that the testing problem can be alleviated by the use of higher-level methods in which multigate modules or cells are the primitive components in test generation; however, the development of such methods has proceeded very slowly. To be acceptable, high-level approaches should be applicable to most types of digital circuits, and should provide fault coverage comparable to that of traditional, low-level methods. The fault coverage problem has, perhaps, been the most intractable, due to continued reliance in the testing industry on the single stuck-line (SSL) fault model, which is tightly bound to the gate level of abstraction. This monograph presents a novel approach to solving the foregoing problem. It is based on the systematic use of multibit vectors rather than single bits to represent logic signals, including fault signals. A circuit is viewed as a collection of high-level components such as adders, multiplexers, and registers, interconnected by n-bit buses. To match this high-level circuit model, we introduce a high-level bus fault that, in effect, replaces a large number of SSL faults and allows them to be tested in parallel. However, by reducing the bus size from n to one, we can obtain the traditional gate-level circuit and fault models.
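The core idea of testing many faults in parallel by widening each signal into a multibit vector can be illustrated with a minimal bit-parallel fault-simulation sketch. This is an illustration of the general technique only, not the monograph's actual algorithm; the names (`inject`, `AND`, `WIDTH`) are hypothetical:

```python
# Bit-parallel fault simulation sketch: each bit position of a Python int
# carries one "machine" -- bit 0 is the fault-free circuit, bits 1..WIDTH-1
# are copies of the circuit, each with one injected stuck-at fault.
WIDTH = 3  # one good machine + two faulty machines

def inject(value, faults):
    """Drive a signal: faults maps machine index -> forced (stuck-at) bit."""
    v = 0
    for m in range(WIDTH):
        v |= faults.get(m, value) << m
    return v

def AND(a, b):
    # One word-level operation evaluates the gate in every machine at once.
    return a & b

# Apply the test pattern x=1, y=1 to f = x AND y, with machine 1 modelling
# "x stuck-at-0" and machine 2 modelling "y stuck-at-0".
x = inject(1, {1: 0})
y = inject(1, {2: 0})
f = AND(x, y)

good = f & 1  # fault-free response in bit 0
detected = [m for m in range(1, WIDTH) if ((f >> m) & 1) != good]
print(detected)  # -> [1, 2]: both faults flip the output, so both are detected
```

One word-level AND here simulates the gate under two faults simultaneously; a bus fault in the monograph's sense plays an analogous role at the level of n-bit components rather than single gates.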