Thermal and Power Management of Integrated Circuits addresses power and thermal management issues in integrated circuits under both normal and stress operating conditions. Thermal management in VLSI circuits is becoming an integral part of design, test, and manufacturing, and proper thermal management is key to achieving high performance, quality and reliability. Performance and reliability of integrated circuits are strong functions of the junction temperature: a small increase in junction temperature may result in a significant reduction in device lifetime. This book reviews the significance of the junction temperature as a reliability measure under nominal and burn-in conditions. The latest research in electro-thermal modeling of integrated circuits is also presented: recent models and associated CAD tools are covered, and various techniques at the circuit and system levels are reviewed. Subsequently, the authors provide insight into the concept of thermal runaway and how it may best be avoided. A section on low-temperature operation of integrated circuits concludes the book.
This volume provides a complete understanding of the fundamental causes of routing congestion in present-day and next-generation VLSI circuits, offers techniques for estimating and relieving congestion, and provides a critical analysis of the accuracy and effectiveness of these techniques. The book includes metrics and optimization techniques for routing congestion at various stages of the VLSI design flow. The subjects covered include an explanation of why the problem of congestion is important and how it will trend, plus definitions of metrics that are appropriate for measuring congestion, and descriptions of techniques for estimating and optimizing routing congestion issues in cell-/library-based VLSI circuits.
Data Access and Storage Management for Embedded Programmable Processors gives an overview of the state of the art in system-level data access and storage management for embedded programmable processors. The targeted application domain covers complex embedded real-time multimedia and communication applications. Many of these applications are data-dominated in the sense that their cost-related aspects, namely power consumption and footprint, are heavily influenced (if not dominated) by the data access and storage aspects. The material is mainly based on research at IMEC in this area in the period 1996-2001. In order to deal with the stringent timing requirements and the data-dominated characteristics of this domain, we have adopted a target architecture style that is compatible with modern embedded processors, and we have developed a systematic step-wise methodology to make the exploration and optimization of such applications feasible in a source-to-source precompilation approach.
Analog design is one of the more difficult aspects of electrical engineering. The main reason is the apparently vague decisions an experienced designer makes when optimizing a circuit. To enable new designers, such as electrical engineering students, to become acquainted with analog circuit design, structuring the analog design process is of utmost importance. Structured Electronic Design: Negative-Feedback Amplifiers presents a design methodology for negative-feedback amplifiers. The methodology makes it possible to synthesize a topology and, at the same time, optimize the performance of that topology. Key issues in the design methodology are orthogonalization, hierarchy and simple models. Orthogonalization enables the separate optimization of the three fundamental quality aspects: noise, distortion and bandwidth. Hierarchy ensures that the right decisions are made at the correct level of abstraction. The use of simple models results in simple calculations yielding maximum-performance indicators that can be used to reject unsuitable circuits relatively quickly. The methodology divides the design of negative-feedback amplifiers into six independent steps. In the first two steps, the feedback network is designed; during these steps, the active part is assumed to be a nullor, i.e. the performance with respect to noise, distortion and bandwidth is still ideal. In the subsequent four steps, an implementation of the active part is synthesized such that optimum performance is obtained. First, the input stage is designed with respect to noise performance. Second, the output stage is designed with respect to clipping distortion. Third, the bandwidth performance is designed, which may require the addition of an extra amplifying stage. Finally, the biasing circuitry for the amplifying stages is designed. By dividing the design into independent steps, one global optimization is reduced to several local optimizations, and the specific sequence of the design steps ensures that the local optimizations yield a circuit close to the global optimum. Moreover, because of the separate dedicated optimizations, the use of resources, such as power, is tracked clearly. The book presents the background and an overview of the design methodology in two chapters, after which six chapters treat the separate design steps in detail; each chapter includes several exercises. An additional chapter is dedicated to the design of the current and voltage sources required for biasing, and the final chapter presents a thoroughly described design example that clearly shows the benefits of the design methodology. In short, this book is valuable for M.Sc.-level electrical engineering students and, of course, for researchers and designers who want to further structure their knowledge of analog design.
Design for Manufacturability and Yield for Nano-Scale CMOS walks the reader through all aspects of manufacturability and yield in a nano-CMOS process and shows how to address each aspect at the proper design step: starting with the design and layout of standard cells and how to yield-grade libraries for critical area and lithography artifacts, through place and route, CMP model-based simulation and dummy-fill insertion, mask planning, simulation and manufacturing, and on to statistical design and statistical timing closure of the design. It alerts the designer to the pitfalls to watch for and to the good practices that can enhance a design's manufacturability and yield. This book is a must-read for the serious practicing IC designer and an excellent primer for any graduate student intent on a career in IC design or in EDA tool development.
Covers in detail promising solutions at the device, circuit, and architecture levels of abstraction, after first explaining from first principles the sensitivity of the various MOS leakage sources to these conditions. The resulting effects are also treated, so the reader understands the effectiveness of leakage power reduction solutions under these different conditions. Case studies supply real-world examples that reap the benefits of leakage power reduction solutions, as the book highlights the different device design choices that exist to mitigate increases in the leakage components as technology scales.
This book deals with the analysis and design of CMOS current-mode circuits for data communications. CMOS current-mode sampled-data networks, i.e. switched-current circuits, are excluded. Major subjects covered in the book include: a critical comparison of voltage-mode and current-mode circuits; the building blocks of current-mode circuits: design techniques; modeling of wire channels, electrical signaling for Gbps data communications; ESD protection for current-mode circuits and more. This book will appeal to IC design engineers, hardware system engineers and others.
New Algorithms, Architectures and Applications for Reconfigurable Computing consists of a collection of contributions from the authors of some of the best papers from the Field Programmable Logic conference (FPL'03) and the Design, Automation and Test in Europe conference (DATE'03). In all, seventy-nine authors from research teams all over the world were invited to present their latest research in the extended format permitted by this special volume. The result is a valuable book that is a unique record of the state of the art in research into field programmable logic and reconfigurable computing. The contributions are organized into twenty-four chapters grouped into three main categories: architectures, tools and applications. Within these three broad areas the most strongly represented themes are coarse-grained architectures; dynamically reconfigurable and multi-context architectures; tools for coarse-grained and reconfigurable architectures; and networking, security and encryption applications. Field programmable logic and reconfigurable computing are exciting research disciplines that span the traditional boundaries of electronic engineering and computer science. When the skills of both research communities are combined to address the challenges of a single research discipline, they serve as a catalyst for innovative research. The work reported in the chapters of this book captures the spirit of that innovation.
Constraint-Based Verification covers an emerging field in the functional verification of electronic designs, referred to as constraint-based verification. The topics are developed in the context of a wide range of dynamic and static verification approaches, including simulation, emulation, and formal methods. The goal is to show how constraints, or assertions, can be used to automate the generation of testbenches, resulting in a seamless verification flow. Topics such as verification coverage and the connection with assertion-based verification are also covered. The book targets verification engineers as well as researchers. It covers both methodological and technical issues, with particular stress on the latest advances in functional verification. The research community has witnessed a recent growth of interest in constraint-based functional verification, and various techniques have been developed. They are relatively new, but have reached a level of maturity such that they are appearing in commercial tools such as Vera and SystemVerilog.
This edited volume is targeted at presenting the latest state-of-the-art methodologies in "Hybrid Evolutionary Algorithms." The chapters deal with the theoretical and methodological aspects, as well as various applications to many real world problems from science, technology, business or commerce. Overall, the book has 14 chapters including an introductory chapter giving the fundamental definitions and some important research challenges. The contributions were selected on the basis of fundamental ideas/concepts rather than the thoroughness of techniques deployed.
A unified and systematic description of analysis and decision problems within a wide class of uncertain systems, described by traditional mathematical methods and by relational knowledge representations. Prof. Bubnicki takes a unique approach to stability and stabilization of uncertain systems.
The 33 papers presented in this book were selected from the 97 papers presented across 28 sessions at the sixth International Conference on Integrated Design and Manufacturing in Mechanical Engineering. The conference represents the state of the art in research in this field. Two keynote papers introduce the subject of the conference and are followed by the different themes highlighted during the conference.
The building blocks of today's and future embedded systems are complex intellectual property components, or cores, many of which are programmable processors. Traditionally, these embedded processors have mostly been programmed in assembly languages for efficiency reasons. This implies time-consuming programming, extensive debugging, and low code portability. The requirements of short time-to-market and dependability of embedded systems are obviously much better met by using high-level language (e.g. C) compilers instead of assembly. However, the use of C compilers frequently incurs a code quality overhead compared to manually written assembly programs. Because of the need for efficient embedded systems, this overhead must be very low in order to make compilers useful in practice. In turn, this requires new compiler techniques that take the specific constraints of embedded system design into account. Examples are the specialized architectures of recent DSP and multimedia processors, which are not yet sufficiently exploited by existing compilers.
This book provides insight into the behavior and design of power distribution systems for high speed, high complexity integrated circuits. Also presented are criteria for estimating minimum required on-chip decoupling capacitance. Techniques and algorithms for computer-aided design of on-chip power distribution networks are also described; however, the emphasis is on developing circuit intuition and understanding the principles that govern the design and operation of power distribution systems.
Communication between engineers, their managers, suppliers and customers relies on the existence of a common understanding of the meaning of terms. While this is not normally a problem, it has proved to be a significant roadblock in the EDA industry, where terms are created as required by any number of people, multiple terms are coined for the same thing, or, even worse, the same term is used for many different things. This taxonomy identifies all of the significant terms used by the industry and provides a structural framework in which those terms can be defined and their relationships to other terms identified. The origins of this work go back to 1995, with a government-sponsored program called RASSP. At the termination of that program, VSIA picked up the work and developed it further. Three new taxonomies were introduced by VSIA for additional facets of the system design and development process. Since the role of VSIA has now changed so that it no longer maintains these taxonomies, the baton is being passed on again through a group of interested people and manifested in this key reference work.
This book will help engineers write better Verilog/SystemVerilog design and verification code and deliver digital designs to market more quickly. It shows over 100 common coding mistakes that can be made with the Verilog and SystemVerilog languages. Each example explains in detail the symptoms of the error, the language rules that cover the error, and the correct coding style to avoid the error. The book helps digital design and verification engineers to recognize and avoid these common coding mistakes. Many of these errors are very subtle and can potentially cost hours or days of lost engineering time trying to find and debug them.
System Level Design Model with Reuse of System IP addresses system design by providing a framework for assessing and developing system design practices that observe and utilise reuse of system design know-how. The know-how accumulated in companies represents an intellectual asset, or property ('IP'). The current situation regarding system design in general is that the methods are insufficient, informally practised, and weakly supported by formal techniques and tools. Regarding system design reuse, the methods and tools for exchanging system design data and know-how within companies are ad hoc and insufficient, and since the means available inside companies are already insufficient, there are effectively no means of exchange between companies. To establish means for systematic reuse, the required system design concepts are identified through an analysis of existing design flows, and their definitions are catalogued in the form of a glossary and taxonomy. The System Design Conceptual Model (SDCM) formalises the concepts and their relationships by providing meta-models for both the system design process (SDPM) and the system under design (SUDM). The models are generic enough that they can be applied in various organisations and to various kinds of electronic systems. System design patterns are presented as example means for enhancing reuse. The characteristics of system-level IP, a list of heuristic criteria for system-IP reusability, and guidelines for assessing system-IP reusability within a particular design flow provide a pragmatic view of reuse. An analysis of selected languages and formalisms, and guidelines for the analysis of system-level languages, provide means for assessing how the expression and representation of system design concepts are supported by languages. The book describes both a theoretical framework and various practical means for improving reuse in the design of complex systems. The information can be used in various ways to enhance system design: understanding system design; analysing and assessing existing design flows, reuse practices and languages; instantiating design flows for new design paradigms; eliciting requirements for methods and tools; organising teams; and educating employees, partners and customers.
For the past decade or so, Computational Intelligence (CI) has been an extremely "hot" topic amongst researchers working in the fields of biomedicine and bioinformatics. There are many successful applications of CI in such areas as computational genomics, prediction of gene expression, protein structure and protein-protein interactions, modeling of evolution, or neuronal systems modeling and analysis. However, there still are many problems in biomedicine and bioinformatics that are in desperate need of advanced and efficient computational methodologies to deal with the tremendous amounts of data so prevalent in those kinds of research pursuits. Unfortunately, scientists in both these fields are very often unaware of the abundance of computational techniques that could be put to use to help them analyze and understand the data underlying their research inquiries. On the other hand, computational intelligence practitioners are often unfamiliar with the particular problems that their algorithms could be successfully applied to. The separation between the two worlds is partially caused by the use of different languages in these two spheres of science, but also by a relatively small number of publications devoted solely to the purpose of facilitating the exchange of new computational algorithms and methodologies on one hand, and the needs of the realms of biomedicine and bioinformatics on the other. In order to help fill the gap between the scientists on both sides of this spectrum, we have solicited contributions from researchers actively applying computational intelligence techniques to important problems in biomedicine and bioinformatics. The purpose of this book is to provide an overview of powerful state-of-the-art methodologies that are currently utilized for biomedicine- and/or bioinformatics-oriented applications, so that researchers working in those fields could learn of new methods to help them tackle their problems. On the other hand, we also hope that the CI community will find this book useful by discovering a new and intriguing area of applications.
Computer Aided Tolerancing (CAT) is an important topic in any field of design and production where parts move relative to one another and/or are assembled together. Geometric variations from specified dimensions and form always occur when parts are manufactured. Improvements in production systems can cause the amounts of the variations to become smaller, but their presence does not disappear. To shorten the time from concept to market of a product, it has been increasingly important to take clearances and the tolerancing of manufacturing variations into consideration right from the beginning, at the stage of design. Hence, geometric models are defined that represent both the complete array of geometric variations possible during manufacture and also the influence of geometry on the function of individual parts and on assemblies of them. The contents of this book originate from a collection of selected papers presented at the 9th CIRP International Seminar on CAT that was held from April 10-12, 2005 at Arizona State University, USA. The CIRP (College International pour la Recherche en Production or International Institution for Production Engineering Research) plans this seminar every two years, and the book is one in a series of Proceedings on CAT. The book is organized into seven parts: Models for Tolerance Representation and Specification, Tolerance Analysis, Tolerance Synthesis, Computational Metrology and Verification, Tolerances in Manufacturing, Applications to Machinery, and Incorporating Elasticity in Tolerance Models.
This book reviews the theoretical fundamentals of grey-box identification and puts the spotlight on MoCaVa, a MATLAB-compatible software tool for facilitating the procedure of effective grey-box identification. It demonstrates the application of MoCaVa using two case studies drawn from the paper and steel industries. In addition, the book answers common questions that will help in building accurate models for systems with unknown inputs.
Model Predictive Control System Design and Implementation Using MATLAB® proposes methods for the design and implementation of MPC systems using basis functions that confer the following advantages: continuous- and discrete-time MPC problems are solved in similar design frameworks; a parsimonious parametric representation of the control trajectory gives rise to computationally efficient algorithms and better on-line performance; and a more general discrete-time representation of MPC design becomes identical to the traditional approach for an appropriate choice of parameters. After the theoretical presentation, coverage is given to three industrial applications. The subject of quadratic programming, often associated with the core optimization algorithms of MPC, is also introduced and explained. The technical content of this book is mainly based on advances in MPC using state-space models and basis functions. This volume includes numerous analytical examples and problems, as well as MATLAB® programs and exercises.
Computer Aided Architectural Design is a particularly dynamic field that is developing through the actions of architects, software developers, researchers, technologists, users, and society alike. CAAD tools in the architectural office are no longer prominent outsiders, but have become ubiquitous tools for all professionals in the design disciplines. At the same time, techniques and tools from other fields and uses are entering the field of architectural design. This is exemplified by the tendency to speak of Information and Communication Technology (ICT) as a field in which CAAD is embedded. Exciting new combinations are possible for those who are firmly grounded in an understanding of architectural design and who have a clear vision of the potential use of ICT. CAAD Futures 2005, co-chaired by Bob Martens and Andre Brown, called for innovative and original papers in the field of Computer Aided Architectural Design that present rigorous, high-quality research and development work. Papers should point towards the future, while being based on a thorough understanding of the past and present.
Mass Customization and Footwear: Myth, Salvation or Reality is the only book dedicated to the application of mass customization in a particular industry. By showing examples of how a "mature" manufacturing sector like shoemaking can be thoroughly renovated in business and mentality by applying this paradigm, the book will appeal to practitioners in the footwear sector as well as postgraduates, researchers and lecturers in the area of mass customization.
This book introduces all the relevant information required to understand and put Model Driven Architecture (MDA) into industrial practice. It clearly explains which conceptual primitives should be present in a system specification, how to use UML to properly represent this subset of basic conceptual constructs, how to identify just those diagrams and modeling constructs that are actually required to create a meaningful conceptual schema, and how to accomplish the transformation process between the problem space and the solution space. The approach is fully supported by commercially available tools.
No other book has been published giving a single-volume introduction to and survey of production planning in distributed manufacturing networks; the published literature so far consists of conference proceedings only.