The rapid evolution and explosive growth of integrated circuit technology have impacted society more than any other technological development of the 20th century. Integrated circuits (ICs) are used universally, and the expanding use of IC technology requires more accurate circuit analysis methods and tools, prompting the introduction of computers into the design process. The goal of this book is to build a firm foundation in the use of computer-assisted techniques for IC device and process design. Both practical and analytical viewpoints are stressed to give the reader the background necessary to appreciate CAD tools and to feel comfortable with their use. Technology CAD - Computer Simulation of IC Processes and Devices presents a unified discourse on process and device CAD as interrelated subjects, building on a wide range of experiences and applications of the SUPREM program. Chapter 1 focuses on the motivation for coupled process and device CAD. In Chapter 2 SUPREM III is introduced, and process CAD is discussed in terms of ion implantation, impurity diffusion, and oxidation models. Chapter 3 introduces the Stanford device analysis program SEDAN III (SEmiconductor Device ANalysis). The next three chapters move into greater detail concerning device operating principles and analysis techniques. Chapter 4 reviews the classical formulation of pn junction theory and uses device analysis (SEDAN) both to evaluate some of the classical assumptions and to investigate the difficult problem of high-level injection. Chapter 5 returns to MOS devices, reviews the first-order MOS theory, and introduces some important second-order effects. Chapter 6 considers the bipolar transistor. Chapter 7 considers the application of process simulation and device analysis to technology design. The BiCMOS process is selected as a useful design vehicle for two reasons. First, it allows the reader to pull together concepts from the entire book. Second, the inherent nature of BiCMOS technology offers real constraints and hence trade-offs which must be understood and accounted for.
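To give a flavour of the process models mentioned above, here is a minimal sketch of the first-order Gaussian model commonly used for as-implanted impurity profiles. The dose, projected range and straggle values below are hypothetical, and a simulator such as SUPREM uses far more detailed physics; this is only an illustration of the kind of calculation involved.

```python
import math

def implanted_profile(x_cm, dose_cm2, projected_range_cm, straggle_cm):
    """First-order Gaussian model of an as-implanted impurity profile (atoms/cm^3)."""
    peak = dose_cm2 / (math.sqrt(2.0 * math.pi) * straggle_cm)
    return peak * math.exp(-((x_cm - projected_range_cm) ** 2) / (2.0 * straggle_cm ** 2))

# Hypothetical boron implant: 1e14 cm^-2 dose, Rp = 0.1 um, delta-Rp = 0.03 um
for depth_um in (0.0, 0.1, 0.2):
    n = implanted_profile(depth_um * 1e-4, 1e14, 0.1e-4, 0.03e-4)
    print(f"{depth_um:.2f} um: {n:.2e} cm^-3")
```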
The second half of this century will remain as the era of proliferation of electronic computers. They did exist before, but they were mechanical. During the next century they may undergo other mutations to become optical or molecular or even biological. Actually, all these aspects are only fancy dresses put on mathematical machines. This was always recognized to be true in the domain of software, where "machine" or "high level" languages are more or less rigorous, but immaterial, variations of the universally accepted mathematical language aimed at specifying elementary operations, functions, algorithms and processes. But even a mathematical machine needs a physical support, and this is what hardware is all about. The invention of hardware description languages (HDLs) in the early 1960s was an attempt to stay longer at an abstract level in the design process and to push the stage of physical implementation up to the moment when no more technology-independent decisions can be taken. It was also an answer to the continuous, exponential growth of complexity of systems to be designed. This problem is common to hardware and software and may explain why the syntax of hardware description languages has followed, with a reasonable delay of ten years, the evolution of the programming languages: at the end of the 1960s they were "Algol-like", a decade later "Pascal-like", and now they are "C- or Ada-like". They have also integrated the new concepts of advanced software specification languages.
In order to design and build computers that achieve and sustain high performance, it is essential that reliability issues be considered carefully. The problem has several aspects. Certainly, considering reliability implies that an engineer must be able to analyze how design decisions affect the incidence of failure. For instance, in order to design reliable integrated circuits, it is necessary to analyze how decisions regarding design rules affect the yield, i.e., the percentage of functional chips obtained by the manufacturing process. Of equal importance in producing reliable computers is the detection of failures in their Very Large Scale Integrated (VLSI) circuit components, caused by errors in the design specification, implementation, or manufacturing processes. Design verification involves the checking of the specification of a design for correctness prior to carrying out an implementation. Implementation verification ensures that the manual design or automatic synthesis process is correct, i.e., that the mask-level description correctly implements the specification. Manufacturing test involves the checking of the complex fabrication process for correctness, i.e., ensuring that there are no manufacturing defects in the integrated circuit. It should be noted that all the above verification mechanisms deal not only with verifying the functionality of the integrated circuit but also its performance.
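To make the yield figure concrete, here is a minimal sketch under the simple Poisson defect-density assumption; the die area and defect density below are hypothetical, and the yield models used in real design-rule analysis are considerably richer.

```python
import math

def poisson_yield(area_cm2: float, defect_density_per_cm2: float) -> float:
    """First-order Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-area_cm2 * defect_density_per_cm2)

# Hypothetical example: a 1.2 cm^2 die with 0.5 random defects per cm^2
print(f"Estimated yield: {poisson_yield(1.2, 0.5):.1%}")  # roughly 55%
```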
Machine vision systems offer great potential in a large number of areas of manufacturing industry and are used principally for Automated Visual Inspection and Robot Vision. This publication presents the state of the art in image processing. It discusses techniques which have been developed for designing machines for use in industrial inspection and robot control, putting the emphasis on software and algorithms. A comprehensive set of image processing subroutines, which together form the basic vocabulary for the versatile image processing language IIPL, is presented. This language has proved to be extremely effective, working as a design tool, in solving numerous practical inspection problems. The merging of this language with Prolog provides an even more powerful facility which retains the benefits of human and machine intelligence. The authors bring together the practical experience and the picture material from a leading industrial research laboratory and the mathematical foundations necessary to understand and apply concepts in image processing. Interactive Image Processing is a self-contained reference book that can also be used in graduate level courses in electrical engineering, computer science and physics.
Computer-Aided Verification is a collection of papers that begins with a general survey of hardware verification methods. Ms. Gupta starts with the issue of verification itself and develops a taxonomy of verification methodologies, focusing especially upon recent advances. Although her emphasis is hardware verification, most of what she reports applies to software verification as well. Graphical presentation is coming to be a de facto requirement for a 'friendly' user interface. The second paper presents a generic format for graphical presentations of coordinating systems represented by automata. The last two papers, as a pair, present a variety of generic techniques for reducing the computational cost of computer-aided verification based upon explicit computational memory: the first of the two gives a time-space trade-off, while the second gives a technique which trades space for a (sometimes predictable) probability of error. Computer-Aided Verification is an edited volume of original research. This research work has also been published as a special issue of the journal Formal Methods in System Design, 1:2-3.
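The "explicit computational memory" techniques mentioned above store visited states directly. A minimal sketch of explicit-state reachability over a toy transition relation follows; the state encoding and transition function are hypothetical stand-ins for the automata such tools actually analyze.

```python
from collections import deque

def successors(state):
    """Hypothetical transition relation: a 3-bit counter that can also reset to 0."""
    return {(state + 1) % 8, 0}

def reachable(initial):
    """Explicit-state reachability: every visited state is kept in memory."""
    visited = {initial}
    frontier = deque([initial])
    while frontier:
        s = frontier.popleft()
        for t in successors(s):
            if t not in visited:
                visited.add(t)
                frontier.append(t)
    return visited

print(sorted(reachable(0)))  # all 8 counter states are reachable from 0
```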
Application-Driven Architecture Synthesis describes the state of the art of architectural synthesis for complex real-time processing. In order to deal with the stringent timing requirements and the intricacies of complex real-time signal and data processing, target architecture styles and target application domains have been adopted to make the synthesis approach feasible. These approaches are also heavily application-driven, which is illustrated by many realistic demonstrations, used as examples in the book. The focus is on domains where application-specific solutions are attractive, such as significant parts of audio, telecom, instrumentation, speech, robotics, medical and automotive processing, image and video processing, TV, multi-media, radar, sonar. Application-Driven Architecture Synthesis is of interest to both academics and senior design engineers and CAD managers in industry. It provides an excellent overview of what capabilities to expect from future practical design tools, and includes an extensive bibliography.
One of the main applications of VHDL is the synthesis of electronic circuits. Circuit Synthesis with VHDL is an introduction to the use of VHDL register-transfer level (RTL) synthesis tools in circuit design. The modeling styles proposed are independent of specific market tools and focus on constructs widely recognized as synthesizable by synthesis tools. A statement of the prerequisites for synthesis is followed by a short introduction to the VHDL concepts used in synthesis. Circuit Synthesis with VHDL presents two possible approaches to synthesis: the first starts with VHDL features and derives hardware counterparts; the second starts from a given hardware component and derives several description styles. The book also describes how to introduce the synthesis design cycle into existing design methodologies and the standard synthesis environment. Circuit Synthesis with VHDL concludes with a case study providing a realistic example of the design flow from behavioral description down to the synthesized level. Circuit Synthesis with VHDL is essential reading for all students, researchers, design engineers and managers working with VHDL in a synthesis environment.
Short turnaround has become critical in the design of electronic systems. Software- programmable components such as microprocessors and digital signal processors have been used extensively in such systems since they allow rapid design revisions. However, the inherent performance limitations of software-programmable systems mean that they are inadequate for high-performance designs. Designers thus turned to gate arrays as a solution. User-programmable gate arrays (field-programmable gate arrays, FPGAs) have recently emerged and are changing the way electronic systems are designed and implemented. The growing complexity of the logic circuits that can be packed onto an FPGA chip means that it has become important to have automatic synthesis tools that implement logic functions on these architectures. Logic Synthesis for Field-Programmable Gate Arrays describes logic synthesis for both look-up table (LUT) and multiplexor-based architectures, with a balanced presentation of existing techniques together with algorithms and the system developed by the authors. Audience: A useful reference for VLSI designers, developers of computer-aided design tools, and anyone involved in or with FPGAs.
The ever-increasing miniaturization of digital electronic components is hampering the conventional testing of Printed Circuit Boards (PCBs) by means of bed-of-nails fixtures. Basically this is caused by the very high scale of integration of ICs, through which packages with hundreds of pins at very small pitches, down to a fraction of a millimetre, have become available. As a consequence the trace distances between the copper tracks on a printed circuit board have come down to the same value. Not only have the required small physical dimensions of the test nails made conventional testing unfeasible, but the complexity of providing test signals for the many hundreds of test nails has also grown beyond practical limits. Therefore a new board test methodology had to be invented. Following the evolution in IC test technology, Boundary-Scan testing has become the new approach to PCB testing. By taking precautions in the design of the IC (design for testability), testing at PCB level can be simplified to a great extent. This condition has been essential for the success of the introduction of Boundary-Scan Test (BST) at board level.
The intense drive for signal integrity has been at the forefront of rapid and new developments in CAD algorithms. With increasing demands for high signal speeds coupled with a decrease in feature size, interconnect effects such as signal delay, distortion and crosstalk become the dominant factor limiting overall performance of VLSI systems. Although SPICE is used on a daily basis by many engineers for analog simulation and general circuit analysis, current versions of SPICE do not handle adequately the new emerging challenges of interconnect effects. Moment-matching techniques, such as asymptotic waveform evaluation, have recently proven useful in the analysis of large interconnect structures containing elements such as lossy coupled transmission lines with linear or nonlinear terminations. At a CPU cost of a little more than one DC analysis, these techniques are 2-3 orders of magnitude faster than full simulation techniques such as FFT. Asymptotic Waveform Evaluation presents an overview of the diverse algorithms and applications of moment-matching techniques. The material is presented systematically and is supported by many examples. Issues such as sensitivity analysis and three-dimensional analysis are also covered. Asymptotic Waveform Evaluation will be of interest to engineers, students and researchers involved in the development and study of circuit simulation as well as interconnect analysis. It will also interest design engineers who are involved in dealing with high-speed issues, and graduate students who are active in the development of CAD tools for electronic systems.
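As a minimal illustration of the moment-matching idea, the sketch below computes the first moment of a uniform RC ladder, which is the Elmore delay and the simplest special case of the techniques the book covers. The section resistances and capacitances are hypothetical.

```python
def elmore_delay(resistances, capacitances):
    """First moment of an RC ladder: sum over sections of R_i times downstream C."""
    assert len(resistances) == len(capacitances)
    delay = 0.0
    for i, r in enumerate(resistances):
        downstream_c = sum(capacitances[i:])
        delay += r * downstream_c
    return delay

# Hypothetical 4-section interconnect: 100 ohms and 50 fF per section
R = [100.0] * 4
C = [50e-15] * 4
t_elmore = elmore_delay(R, C)
print(f"Elmore delay (first moment): {t_elmore * 1e12:.1f} ps")  # 50.0 ps
```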
Model Generation in Electronic Design covers a wide range of model applications and research. The book begins by describing a model generator to create component models. It goes on to discuss ASIC design and ASIC library generation. This section includes chapters on the requirements for developing an ASIC library, a case study in which VITAL is used to create such a library, and the analysis and description of the accuracy required in modeling interconnections in ASIC design. Other chapters describe the development of thermal models for electronic devices, the development of a set of model packages for VHDL floating point operations, techniques for model validation and verification, and a tool for model encryption. Model Generation in Electronic Design is an essential update for users, vendors, model producers, technical managers, designers and researchers working in electronic design.
The purpose of this book is to introduce VHSIC Hardware Description Language (VHDL) and its use for synthesis. VHDL is a hardware description language which provides a means of specifying a digital system at different levels of abstraction. It supports behavior specification during the early stages of a design process and structural specification during the later implementation stages. VHDL was originally introduced as a hardware description language that permitted the simulation of digital designs. It is now increasingly used for design specifications that are given as the input to synthesis tools which translate the specifications into netlists from which the physical systems can be built. One problem with this use of VHDL is that not all of its constructs are useful in synthesis. The specification of delay in signal assignments does not have a clear meaning in synthesis, where delays have already been determined by the implementation technology. VHDL has data structures such as files and pointers, useful for simulation purposes but not for actual synthesis. As a result synthesis tools accept only subsets of VHDL. This book tries to cover the synthesis aspect of VHDL, while keeping the simulation-specifics to a minimum. This book is suitable for working professionals as well as for graduate or undergraduate study. Readers can view this book as a way to get acquainted with VHDL and how it can be used in modeling of digital designs.
Table of contents (excerpt):
References 82
9 QUADRATIC 0-1 PROGRAMMING 85
9.1 Energy Minimization 86
9.2 Notation and Terminology 87
9.3 Minimization Technique 88
9.4 An Example 92
9.5 Accelerated Energy Minimization 94
9.5.1 Transitive Closure 94
9.5.2 Additional Pairwise Relationships 96
9.5.3 Path Sensitization 97
9.6 Experimental Results 98
9.7 Summary 100
References 100
10 TRANSITIVE CLOSURE AND TESTING 103
10.1 Background 104
10.2 Transitive Closure Definition 105
10.3 Implication Graphs 106
10.4 A Test Generation Algorithm 107
10.5 Identifying Necessary Assignments 112
10.5.1 Implicit Implication and Justification 113
10.5.2 Transitive Closure Does More Than Implication and Justification 115
10.5.3 Implicit Sensitization of Dominators 116
10.5.4 Redundancy Identification 117
10.6 Summary 119
References 119
11 POLYNOMIAL-TIME TESTABILITY 123
11.1 Background 124
11.1.1 Fujiwara's Result 125
11.1.2 Contribution of the Present Work 126
11.2 Notation and Terminology 127
11.3 A Polynomial Time Algorithm 128
11.3.1 Primary Output Fault 129
11.3.2 Arbitrary Single Fault 135
11.3.3 Multiple Faults 137
11.4 Summary 139
References 139
12 SPECIAL CASES OF HARD PROBLEMS 141
12.1 Problem Statement 142
12.2 Logic Simulation 143
12.3 Logic Circuit Modeling 146
12.3.1 Model for a Boolean Gate 147
12.3.2 Circuit Modeling 148
Practical Synthesis of High-Performance Analog Circuits presents a technique for automating the design of analog circuits. Market competition and the astounding pace of technological innovation exert tremendous pressure on circuit design engineers to turn ideas into products quickly and get them to market. In digital Application Specific Integrated Circuit (ASIC) design, computer aided design (CAD) tools have substantially eased this pressure by automating many of the laborious steps in the design process, thereby allowing the designer to maximize his design expertise. But the world is not solely digital. Cellular telephones, magnetic disk drives, neural networks and speech recognition systems are a few of the recent technological innovations that rely on a core of analog circuitry and exploit the density and performance of mixed analog/digital ASICs. To maximize profit, these mixed-signal ASICs must also make it to market as quickly as possible. However, although the engineer working on the digital portion of the ASIC can rely on sophisticated CAD tools to automate much of the design process, there is little help for the engineer working on the analog portion of the chip. With the exception of simulators to verify the circuit design when it is complete, there are almost no general purpose CAD tools that an analog design engineer can take advantage of to automate the analog design flow and reduce his time to market. Practical Synthesis of High-Performance Analog Circuits presents a new variation-tolerant analog synthesis strategy that is a significant step towards ending the wait for a practical analog synthesis tool. A new synthesis strategy is presented that can fully automate the path from a circuit topology and performance specifications to a sized variation-tolerant circuit schematic. This strategy relies on asymptotic waveform evaluation to predict circuit performance and simulated annealing to solve a novel non-linear infinite programming optimization formulation of the circuit synthesis problem via a sequence of smaller optimization problems. Practical Synthesis of High-Performance Analog Circuits will be of interest to analog circuit designers, CAD/EDA industry professionals, academics and students.
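A minimal sketch of the simulated-annealing step of such a flow is shown below, using a toy one-variable cost function in place of a real performance estimator such as asymptotic waveform evaluation; all names and constants are hypothetical and the move generator is deliberately simplistic.

```python
import math
import random

def cost(width):
    """Hypothetical stand-in for a circuit performance estimate (e.g. delay plus area penalty)."""
    return (width - 3.0) ** 2 + 0.5 * math.sin(5.0 * width)

def anneal(x0, t_start=5.0, t_end=1e-3, cooling=0.95, moves_per_temp=50):
    x, best = x0, x0
    t = t_start
    while t > t_end:
        for _ in range(moves_per_temp):
            candidate = x + random.gauss(0.0, 0.3)
            delta = cost(candidate) - cost(x)
            # Accept improvements always; accept uphill moves with Boltzmann probability
            if delta < 0 or random.random() < math.exp(-delta / t):
                x = candidate
                if cost(x) < cost(best):
                    best = x
        t *= cooling
    return best

random.seed(0)
w = anneal(x0=0.5)
print(f"Annealed width: {w:.3f}, cost: {cost(w):.4f}")
```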
Parallel CFD 2008, the twentieth in the high-level international series of meetings featuring different aspects of parallel computing in computational fluid dynamics and other modern scientific domains, was held May 19-22, 2008 in Lyon, France. The themes of the 2008 meeting included the traditional emphases of this conference, and experiences with contemporary architectures. Around 70 presentations were included in the conference program in the following sessions: parallel algorithms and solvers; parallel performance with contemporary architectures; structured and unstructured grid methods; boundary methods; software frameworks and component architectures; CFD applications (bio-fluids, environmental problems); Lattice Boltzmann method and SPH; optimisation in aerodynamics. This book presents an up-to-date overview of the state of the art in Parallel Computational Fluid Dynamics from Asia, Europe, and North America. These reviewed proceedings include about sixty percent of the oral lectures presented at the conference. The editors. Parallel CFD 2008 was organized by the Institut Camille Jordan of the University of Lyon 1 in collaboration with the Center for the Development of Parallel Scientific Computing. The Scientific Committee and Local Organizers of Parallel CFD 2008 are delighted to acknowledge the generous sponsorship of the following organizations, through financial or in-kind assistance. The assistance of our sponsors allowed us to organize the scientific as well as the social program of the conference.
It is a great honor to provide a few words of introduction for Dr. Georges Gielen's and Prof. Willy Sansen's book "Symbolic analysis for automated design of analog integrated circuits." The symbolic analysis method presented in this book represents a significant step forward in the area of analog circuit design. As demonstrated in this book, symbolic analysis opens up new possibilities for the development of computer-aided design (CAD) tools that can analyze an analog circuit topology and automatically size the components for a given set of specifications. Symbolic analysis even has the potential to improve the training of young analog circuit designers and to guide more experienced designers through second-order phenomena such as distortion. This book can also serve as an excellent reference for researchers in the analog circuit design area and creators of CAD tools, as it provides a comprehensive overview and comparison of various approaches for analog circuit design automation and an extensive bibliography. The world is essentially analog in nature, hence most electronic systems involve both analog and digital circuitry. As the number of transistors that can be integrated on a single integrated circuit (IC) substrate steadily increases over time, an ever increasing number of systems will be implemented with one, or a few, very complex ICs because of their lower production costs.
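To give a flavour of what symbolic analysis produces, here is a minimal sketch that derives the transfer function of a single-pole RC stage symbolically. The circuit is a deliberately trivial stand-in for the amplifier topologies handled in the book, and the sympy library is assumed to be available.

```python
import sympy as sp

s, R, C = sp.symbols('s R C', positive=True)

# Impedance divider: a series resistor R driving a shunt capacitor C
Zc = 1 / (s * C)
H = sp.simplify(Zc / (R + Zc))
print(H)        # 1/(C*R*s + 1)

# The symbolic pole follows directly from the closed-form expression
pole = sp.solve(sp.denom(H), s)[0]
print(pole)     # -1/(C*R)
```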
This book presents a powerful new language and methodology for programming complex reactive systems in a scenario-based manner. The language is live sequence charts (LSCs), a multimodal extension of sequence charts and UML's sequence diagrams, used in the past mainly for requirements. The methodology is play-in/play-out, an unusually convenient means for specifying inter-object scenario-based behavior directly from a GUI or an object model diagram, with the surprising ability to execute that behavior, or those requirements, directly. The language and methodology are supported by a fully implemented tool, the Play-Engine, which is attached to the book in CD form. Comments from experts in the field: The design of reactive systems is one of the most challenging problems in computer science. This book starts with a critical insight to explain the difficulty of this problem: there is a fundamental gap between the scenario-based way in which people think about such systems and the state-based way in which these systems are implemented. The book then offers a radical proposal to bridge this gap by means of playing scenarios. Systems can be specified by playing in scenarios and implemented by means of a Play-Engine that plays out scenarios. This idea is carried out and developed, lucidly, formally and playfully, to its fullest. The result is a compelling proposal, accompanied by a prototype software engine, for reactive systems design, which is bound to cause a splash in the software-engineering community. Moshe Y. Vardi, Rice University, Houston, Texas, USA. Scenarios are a primary exchange tool in explaining system behavior to others, but their limited expressive power never made them able to fully describe systems, thus limiting their use. The language of Live Sequence Charts (LSCs) presented in this beautifully written book achieves this goal, and the attached Play-Engine software makes these LSCs really come alive. This is undoubtedly a key breakthrough that will start long-awaited and exciting new directions in systems specification, synthesis, and analysis. Gerard Berry, Esterel Technologies and INRIA, Sophia-Antipolis, France. The approach of David Harel and Rami Marelly is a fascinating way of combining prototyping techniques with techniques for identifying behavior and user interfaces. Manfred Broy, Technical University of Munich, Germany.
The Verilog Programming Language Interface, commonly called the Verilog PLI, is one of the more powerful features of Verilog. The PLI provides a means for both hardware designers and software engineers to interface their own programs to commercial Verilog simulators. Through this interface, a Verilog simulator can be customized to perform virtually any engineering task desired. Just a few of the common uses of the PLI include interfacing Verilog simulations to C language models, adding custom graphical tools to a simulator, reading and writing proprietary file formats from within a simulation, performing test coverage analysis during simulation, and so forth. The applications possible with the Verilog PLI are endless. Intended audience: this book is written for digital design engineers with a background in the Verilog Hardware Description Language and a fundamental knowledge of the C programming language. It is expected that the reader: has a basic knowledge of hardware engineering, specifically digital design of ASIC and FPGA technologies; is familiar with the Verilog Hardware Description Language (HDL), and can write models of hardware circuits in Verilog, can write simulation test fixtures in Verilog, and can run at least one Verilog logic simulator; knows basic C-language programming, including the use of functions, pointers, structures and file I/O. Explanations of the concepts and terminology of digital
Digital Timing Macromodeling for VLSI Design Verification first of all provides an extensive history of the development of simulation techniques. It presents detailed discussion of the various techniques implemented in circuit, timing, fast-timing, switch-level timing, switch-level, and gate-level simulation. It also discusses mixed-mode simulation and interconnection analysis methods. The review in Chapter 2 gives an understanding of the advantages and disadvantages of the many techniques applied in modern digital macromodels. The book also presents a wide variety of techniques for performing nonlinear macromodeling of digital MOS subcircuits which address a large number of shortcomings in existing digital MOS macromodels. Specifically, the techniques address the device model detail, transistor coupling capacitance, effective channel length modulation, series transistor reduction, effective transconductance, input terminal dependence, gate parasitic capacitance, the body effect, the impact of parasitic RC-interconnects, and the effect of transmission gates. The techniques address major sources of errors in existing macromodeling techniques, which must be addressed if macromodeling is to be accepted in commercial CAD tools by chip designers. The techniques presented in Chapters 4-6 can be implemented in other macromodels, and are demonstrated using the macromodel presented in Chapter 3. The new techniques are validated over an extremely wide range of operating conditions: much wider than has been presented for previous macromodels, thus demonstrating the wide range of applicability of these techniques.
Recent years have seen rapid strides in the level of sophistication of VLSI circuits. On the performance front, there is a vital need for techniques to design fast, low-power chips with minimum area for increasingly complex systems, while on the economic side there is the vastly increased pressure of time-to-market. These pressures have made the use of CAD tools mandatory in designing complex systems. Timing Analysis and Optimization of Sequential Circuits describes CAD algorithms for analyzing and optimizing the timing behavior of sequential circuits with special reference to performance parameters such as power and area. A unified approach to performance analysis and optimization of sequential circuits is presented. The state of the art in timing analysis and optimization techniques is described for circuits using edge-triggered or level-sensitive memory elements. Specific emphasis is placed on two methods that are true sequential timing optimization techniques: retiming and clock skew optimization. Timing Analysis and Optimization of Sequential Circuits covers the following topics: * Algorithms for sequential timing analysis * Fast algorithms for clock skew optimization and their applications * Efficient techniques for retiming large sequential circuits * Coupling sequential and combinational optimizations. Timing Analysis and Optimization of Sequential Circuits is written for graduate students, researchers and professionals in the area of CAD for VLSI and VLSI circuit design.
History of the Book The last three decades have witnessed an explosive development in integrated circuit fabrication technologies. The complexities of current CMOS circuits are reaching beyond the 100 nanometer feature size and multi-hundred million transistors per integrated circuit. To fully exploit this technological potential, circuit designers use sophisticated Computer-Aided Design (CAD) tools. While supporting the talents of innumerable microelectronics engineers, these CAD tools have become the enabling factor responsible for the successful design and implementation of thousands of high performance, large scale integrated circuits. This research monograph originated from a body of doctoral dissertation research completed by the first author at the University of Rochester from 1994 to 1999 while under the supervision of Prof. Eby G. Friedman. This research focuses on issues in the design of the clock distribution network in large scale, high performance digital synchronous circuits and, particularly, on algorithms for non-zero clock skew scheduling. During the development of this research, it has become clear that incorporating timing issues into the successful integrated circuit design process is of fundamental importance, particularly in that advanced theoretical developments in this area have been slow to reach the designers' desktops.
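A minimal sketch of the constraint view behind non-zero clock skew scheduling: setup and hold limits between each register pair become difference constraints on the clock arrival times, whose feasibility can be checked with Bellman-Ford over a constraint graph. The path delays and clock period below are hypothetical, and real schedulers optimize the skews rather than merely finding a feasible set.

```python
def feasible_skews(n_regs, paths, period, t_hold=0.0):
    """paths: list of (i, j, d_min, d_max) register-to-register path delays.
    Setup: s_i - s_j <= period - d_max.  Hold: s_j - s_i <= d_min - t_hold.
    A difference constraint x_u - x_v <= w becomes an edge v -> u with weight w;
    a feasible skew assignment exists iff the constraint graph has no negative cycle."""
    edges = []
    for i, j, d_min, d_max in paths:
        edges.append((j, i, period - d_max))   # setup constraint
        edges.append((i, j, d_min - t_hold))   # hold constraint
    dist = [0.0] * n_regs                      # implicit zero-weight source to every node
    for _ in range(n_regs):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return None                        # negative cycle: no feasible schedule
    return dist                                # one valid set of clock arrival times

# Hypothetical 3-register loop with a 2.0 ns clock period
paths = [(0, 1, 0.4, 2.3), (1, 2, 0.3, 1.0), (2, 0, 0.2, 1.2)]
print(feasible_skews(3, paths, period=2.0))
# approximately [-0.3, 0.0, 0.0]: the register feeding the critical 2.3 ns path is clocked earlier
```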
Formal Equivalence Checking and Design Debugging covers two major topics in design verification: logic equivalence checking and design debugging. The first part of the book reviews the design problems that require logic equivalence checking and describes the underlying technologies that are used to solve them. Some novel approaches to the problems of verifying design revisions after intensive sequential transformations such as retiming are described in detail. The second part of the book gives a thorough survey of previous and recent literature on design error diagnosis and design error correction. This part also provides an in-depth analysis of the algorithms used in two logic debugging software programs, ErrorTracer and AutoFix, developed by the authors. From the Foreword: `With the adoption of the static sign-off approach to verifying circuit implementations the application-specific integrated circuit (ASIC) industry will experience the first radical methodological revolution since the adoption of logic synthesis. Equivalence checking is one of the two critical elements of this methodological revolution. This book is timely for either the designer seeking to better understand the mechanics of equivalence checking or for the CAD researcher who wishes to investigate well-motivated research problems such as equivalence checking of retimed designs or error diagnosis in sequential circuits.' Kurt Keutzer, University of California, Berkeley
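A minimal sketch of the core check behind combinational equivalence: compare two implementations on every input assignment (a miter) and report any vector on which they differ. The two toy netlists below are hypothetical, and exhaustive enumeration stands in for the BDD- or SAT-based engines a real equivalence checker would use.

```python
from itertools import product

def spec(a, b, c):
    """Reference implementation: a 2:1 multiplexer."""
    return (a and b) or ((not a) and c)

def revised(a, b, c):
    """Revised implementation after a (hypothetical) logic restructuring."""
    return (a and b) or ((not a) and c) or (b and c)  # consensus term added

def check_equivalence(f, g, n_inputs):
    """Miter check: return the first input vector on which f and g disagree, else None."""
    for assignment in product([False, True], repeat=n_inputs):
        if f(*assignment) != g(*assignment):
            return assignment          # counterexample
    return None                        # the two implementations are equivalent

print(check_equivalence(spec, revised, 3))  # None: the consensus term is redundant
```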
Integrated circuit technology is widely used for the full integration of electronic systems. In general, these systems are realized using digital techniques implemented in CMOS technology. The low power dissipation, high packing density, high noise immunity, ease of design and the relative ease of scaling are the driving forces of CMOS technology for digital applications. Parts of these systems cannot be implemented in the digital domain and will remain analog. In order to achieve complete system integration these analog functions are preferably integrated in the same CMOS technology. An important class of analog circuits that need to be integrated in CMOS are analog filters. This book deals with very high frequency (VHF) filters, which are filters with cut-off frequencies ranging from the low megahertz range to several hundreds of megahertz. Until recently the maximal cut-off frequencies of CMOS filters were limited to the low megahertz range. By applying the techniques presented in this book the limit could be pushed into the true VHF domain, and integrated VHF filters become feasible. Application of these VHF filters can be found in the field of communication, instrumentation and control systems. For example, pre- and post-filtering for high-speed AD and DA converters, signal reconstruction, signal decoding, etc. The general design philosophy used in this book is to allow only the absolute minimum of signal carrying nodes throughout the whole filter. This strategy starts at the filter synthesis level and is extended to the level of electronic circuitry. The result is a filter realization in which all capacitors (including parasitics) have a desired function. The advantage of this technique is that high frequency parasitic effects (parasitic poles/zeros) are minimally present. The book is a reference for engineers in research or development, and is suitable for use as a text for advanced courses on the subject.
The automation of layout synthesis design under stringent timing specifications is essential for state-of-the-art VLSI circuits and systems design. Especially, the timing-driven layout synthesis with optimal placement and routing of transistors with proper sizing is most critical in view of the chip area, interconnection parasitics, circuit delay and power dissipation. This book presents a systematic and unified view of the layout synthesis problem with a strong focus on CMOS technology. The criticality of RC parasitics in the interconnects and the optimal sizing of both p-channel and n-channel transistors are illustrated for motivation. Following the motivation, the problems of modeling circuit delays and transistor sizing are formulated and solved with mathematical rigor. Various delay models for CMOS circuits are discussed to account for realistic interconnection parasitics, the effect of transistor sizes, and also the input slew rates. Also many of the efficient transistor sizing algorithms are critically reviewed and the most recent transistor sizing algorithm based on convex programming techniques is introduced. For design automation of rigorous CMOS layout synthesis, an integrated system that employs a suite of functional modules is introduced for step-by-step illustration of the design optimization process that produces highly compact CMOS layouts that meet user-specified timing and logical netlist requirements. Through a rigorous discussion of the essential design automation process steps and important models and algorithms, this book presents a unified systems approach that can be practiced for high-performance CMOS VLSI designs. This book serves as an excellent reference, and can be used as a text in advanced courses covering VLSI design, especially design automation of physical design.
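A minimal sketch of the sizing trade-off the book formalises: with a simple RC delay model, widening a driver lowers its resistance but raises the load it presents to the previous stage, so the best width sits at a balance point. All device constants below are hypothetical, the model is a one-variable toy rather than a convex program over a full netlist, and scipy is assumed to be available.

```python
from scipy.optimize import minimize_scalar

# Hypothetical unit-device constants
R_UNIT = 10_000.0    # ohms, on-resistance of a unit-width driver
C_GATE = 2e-15       # farads, gate capacitance of a unit-width driver
C_LOAD = 200e-15     # farads, fixed load at the output

def path_delay(w):
    """Two-stage RC delay: a unit driver charges the sized gate, which drives the load."""
    stage1 = R_UNIT * (w * C_GATE)          # unit stage loaded by the sized gate
    stage2 = (R_UNIT / w) * C_LOAD          # sized stage driving the fixed load
    return stage1 + stage2

result = minimize_scalar(path_delay, bounds=(1.0, 100.0), method='bounded')
print(f"Optimal width: {result.x:.1f}x unit, delay: {path_delay(result.x) * 1e12:.1f} ps")
# The analytic optimum is sqrt(C_LOAD / C_GATE) = 10x, matching the numeric result
```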
This book deals with Web applications in product design and manufacture, thus filling an information gap in digital manufacturing in the Internet era. It helps both developers and users to appreciate the potentials, as well as difficulties, in developing and adopting Web applications. The objective is to equip potential users and practitioners of Web applications with a better appreciation of the technology. In addition, Web application developers and new researchers in this field will gain a clearer understanding of the selection of system architecture and design, development and implementation techniques, and deployment strategies. The book is divided into two main parts. The first part gives an overview of Web and Internet and the second explains eight typical Web applications.