This volume provides a complete understanding of the fundamental causes of routing congestion in present-day and next-generation VLSI circuits, offers techniques for estimating and relieving congestion, and provides a critical analysis of the accuracy and effectiveness of these techniques. The book includes metrics and optimization techniques for routing congestion at various stages of the VLSI design flow. The subjects covered include an explanation of why the problem of congestion is important and how it will trend, plus definitions of metrics that are appropriate for measuring congestion, and descriptions of techniques for estimating and optimizing routing congestion issues in cell-/library-based VLSI circuits.
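Not from the book, but to make the notion of a congestion metric concrete: below is a minimal, RUDY-style sketch in Python, in which each net's expected wire demand (its half-perimeter wirelength) is spread uniformly over its bounding box and compared against per-tile routing capacity. The grid size, capacity, and nets are illustrative assumptions.

```python
# Minimal RUDY-style congestion map: each net's expected wiring demand
# (half-perimeter wirelength) is spread uniformly over its bounding box.
# Grid size, capacity, and the nets themselves are illustrative assumptions.
import numpy as np

GRID = 8                      # 8x8 routing grid (assumption)
CAPACITY = 4.0                # routing tracks available per tile (assumption)

# Nets as lists of (x, y) pin coordinates in grid units.
nets = [[(0, 0), (5, 3)], [(2, 2), (2, 7), (6, 6)], [(1, 5), (7, 5)]]

demand = np.zeros((GRID, GRID))
for pins in nets:
    xs, ys = zip(*pins)
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    w, h = (x1 - x0 + 1), (y1 - y0 + 1)
    hpwl = (x1 - x0) + (y1 - y0)                    # half-perimeter wirelength
    demand[x0:x1 + 1, y0:y1 + 1] += hpwl / (w * h)  # uniform spreading

congestion = demand / CAPACITY                      # > 1.0 flags likely overflow
print(np.round(congestion, 2))
```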
This book deals with the analysis and design of CMOS current-mode circuits for data communications. CMOS current-mode sampled-data networks, i.e. switched-current circuits, are excluded. Major subjects covered in the book include a critical comparison of voltage-mode and current-mode circuits; the building blocks of current-mode circuits and their design techniques; modeling of wire channels; electrical signaling for Gbps data communications; ESD protection for current-mode circuits; and more. This book will appeal to IC design engineers, hardware system engineers and others.
Design for Manufacturability and Yield for Nano-Scale CMOS walks the reader through all the aspects of manufacturability and yield in a nano-CMOS process, and shows how to address each aspect at the proper design step: starting with the design and layout of standard cells and how to yield-grade libraries for critical area and lithography artifacts, through place and route, CMP model-based simulation and dummy-fill insertion, mask planning, simulation and manufacturing, and on to statistical design and statistical timing closure of the design. It alerts the designer to the pitfalls to watch for and to the good practices that can enhance a design's manufacturability and yield. This book is a must-read for the serious practicing IC designer and an excellent primer for any graduate student intent on a career in IC design or in EDA tool development.
Covers in detail promising solutions at the device, circuit, and architecture levels of abstraction, after first explaining from first principles the sensitivity of the various MOS leakage sources to operating conditions. The resulting effects are also treated, so that the reader understands the effectiveness of leakage power reduction solutions under these different conditions. Case studies supply real-world examples that reap the benefits of leakage power reduction solutions, as the book highlights the different device design choices that exist to mitigate increases in the leakage components as technology scales.
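As a concrete anchor for the leakage-sensitivity discussion (illustrative, not the book's own derivation): the standard subthreshold model implies an exponential dependence of leakage on threshold voltage, sketched below with assumed values of the slope factor n and thermal voltage vT.

```python
# Subthreshold leakage grows exponentially as threshold voltage scales down:
# I_sub is proportional to exp((Vgs - Vth) / (n * vT)).  Values are illustrative.
import math

vT = 0.026          # thermal voltage kT/q at ~300 K (volts)
n = 1.5             # subthreshold slope factor (assumption)

def leakage_ratio(delta_vth):
    """Factor by which I_sub grows when Vth drops by delta_vth volts."""
    return math.exp(delta_vth / (n * vT))

# A 100 mV threshold reduction (roughly one technology generation)
print(f"{leakage_ratio(0.100):.1f}x more leakage")   # ~13x
```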
SystemC Kernel Extensions for Heterogeneous System Modeling is the result of an almost two-year endeavour on our part to understand how SystemC can be made useful for system-level modeling at higher levels of abstraction. Making it a truly heterogeneous modeling language and platform, for hardware/software co-design as well as complex embedded hardware designs, has been our focus in the work reported in this book.
IDT (Intelligent Decision Technologies) seeks an interchange of research on intelligent systems and intelligent technologies that enhance or improve decision making in industry, government and academia. The focus is interdisciplinary in nature, and includes research on all aspects of intelligent decision technologies, from fundamental development to applied systems. It is a great honor and pleasure for us to publish the work and new research results of scholars from the First KES International Symposium on Intelligent Decision Technologies (KES IDT'09), hosted and organized by the University of Hyogo in conjunction with KES International (Himeji, Japan, April 2009). The symposium was concerned with the theory, design, development, implementation, testing and evaluation of intelligent decision systems. Its topics included intelligent agents, fuzzy logic, multi-agent systems, artificial neural networks, genetic algorithms, expert systems, intelligent decision making support systems, information retrieval systems, geographic information systems, and knowledge management systems. These technologies have the potential to support decision making in many areas of management, international business, finance, accounting, marketing, healthcare, military applications, production, networks, traffic management, crisis response, and human interfaces.
This edited volume is targeted at presenting the latest state-of-the-art methodologies in "Hybrid Evolutionary Algorithms." The chapters deal with the theoretical and methodological aspects, as well as various applications to many real world problems from science, technology, business or commerce. Overall, the book has 14 chapters including an introductory chapter giving the fundamental definitions and some important research challenges. The contributions were selected on the basis of fundamental ideas/concepts rather than the thoroughness of techniques deployed.
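For readers new to the topic, a minimal sketch of what "hybrid" means here (a memetic algorithm: an evolutionary loop whose offspring are refined by local search); the toy objective and parameters are assumptions, not an example from the book.

```python
# Minimal "hybrid" (memetic) evolutionary algorithm: a plain evolutionary
# loop whose offspring are refined by a local hill-climbing step.
# Problem, rates, and population size are illustrative assumptions.
import random

def f(x):                       # objective to minimize (assumption)
    return (x - 3.14) ** 2

def local_search(x, step=0.05, iters=20):
    for _ in range(iters):      # simple hill climbing on one coordinate
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
    return x

pop = [random.uniform(-10, 10) for _ in range(20)]
for gen in range(50):
    pop.sort(key=f)
    parents = pop[:10]                                      # truncation selection
    children = [p + random.gauss(0, 0.5) for p in parents]  # mutation
    children = [local_search(c) for c in children]          # the hybrid step
    pop = parents + children

print(f"best x = {min(pop, key=f):.3f}")                    # ~3.14
```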
Many applications in science and engineering require a digital model of a real physical object. Advanced scanning technology has made it possible to scan such objects and generate point samples on their boundaries. This book, first published in 2007, shows how to compute a digital model from this point sample. After developing the basics of sampling theory and its connections to various geometric and topological properties, the author describes a suite of algorithms that have been designed for the reconstruction problem, including algorithms for surface reconstruction from dense samples, from samples that are not adequately dense and from noisy samples. Voronoi- and Delaunay-based techniques, implicit surface-based methods and Morse theory-based methods are covered. Scientists and engineers working in drug design, medical imaging, CAD, GIS, and many other areas will benefit from this first book on the subject.
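As a flavor of the Voronoi- and Delaunay-based machinery the book builds on, here is a minimal 2D sketch using SciPy; a real reconstruction algorithm (e.g., the crust family) would go on to filter these structures, which this sketch omits. The sampled curve is an assumption.

```python
# Delaunay triangulation and Voronoi diagram of a 2D point sample:
# the common starting point of the Voronoi/Delaunay-based reconstruction
# algorithms the book covers.  The sampled curve (an ellipse) is an assumption.
import numpy as np
from scipy.spatial import Delaunay, Voronoi

t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
points = np.c_[2 * np.cos(t), np.sin(t)]   # dense sample of an ellipse

tri = Delaunay(points)
vor = Voronoi(points)

# Collect unique Delaunay edges; a crust-style algorithm would now keep
# only the edges that pass a Voronoi-based filter.
edges = {tuple(sorted(e)) for s in tri.simplices
         for e in ((s[0], s[1]), (s[1], s[2]), (s[0], s[2]))}
print(len(edges), "candidate edges,", len(vor.vertices), "Voronoi vertices")
```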
The building blocks of today's and future embedded systems are complex intellectual property components, or cores, many of which are programmable processors. Traditionally, these embedded processors have mostly been programmed in assembly languages for efficiency reasons. This implies time-consuming programming, extensive debugging, and low code portability. The requirements of short time-to-market and dependability of embedded systems are obviously much better met by using high-level language (e.g. C) compilers instead of assembly. However, the use of C compilers frequently incurs a code quality overhead compared to manually written assembly programs. Due to the need for efficient embedded systems, this overhead must be very low in order to make compilers useful in practice. In turn, this requires new compiler techniques that take the specific constraints of embedded system design into account. Examples are the specialized architectures of recent DSP and multimedia processors, which are not yet sufficiently exploited by existing compilers.
This book provides insight into the behavior and design of power distribution systems for high speed, high complexity integrated circuits. Also presented are criteria for estimating minimum required on-chip decoupling capacitance. Techniques and algorithms for computer-aided design of on-chip power distribution networks are also described; however, the emphasis is on developing circuit intuition and understanding the principles that govern the design and operation of power distribution systems.
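A first-order version of the decoupling-capacitance criterion, included here purely as orientation (the book develops proper criteria; the numbers below are illustrative): charge balance requires the decap to supply the transient charge while the rail droops no more than the allowed amount, giving C >= I * dt / dV.

```python
# First-order decoupling-capacitance estimate from charge balance:
# the decap must supply the transient charge I * dt while the supply
# droops no more than dV, so C >= I * dt / dV.  Numbers are illustrative.
I = 1.0        # transient current demand (A)
dt = 1e-9      # duration of the current surge (s)
dV = 0.05      # allowed supply droop, e.g. 5% of a 1 V rail (V)

C_min = I * dt / dV
print(f"C_min = {C_min * 1e9:.0f} nF")   # 20 nF
```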
Mixed Reality is moving out of the research-labs into our daily lives. It plays an increasing role in architecture, design and construction. The combination of digital content with reality creates an exciting synergy that sets out to enhance engagement within architectural design and construction. State-of-the-art research projects on theories and applications within Mixed Reality are presented by leading researchers covering topics in architecture, design collaboration, construction and education. They discuss current projects and offer insight into the next wave of Mixed Reality possibilities.
For the past decade or so, Computational Intelligence (CI) has been an extremely "hot" topic amongst researchers working in the fields of biomedicine and bioinformatics. There are many successful applications of CI in such areas as computational genomics, prediction of gene expression, protein structure, and protein-protein interactions, modeling of evolution, or neuronal systems modeling and analysis. However, there still are many problems in biomedicine and bioinformatics that are in desperate need of advanced and efficient computational methodologies to deal with the tremendous amounts of data so prevalent in those kinds of research pursuits. Unfortunately, scientists in both these fields are very often unaware of the abundance of computational techniques that could be put to use to help them analyze and understand the data underlying their research inquiries. On the other hand, computational intelligence practitioners are often unfamiliar with the particular problems that their algorithms could be successfully applied to. The separation between the two worlds is partially caused by the use of different languages in these two spheres of science, but also by the relatively small number of publications devoted solely to the purpose of facilitating the exchange of new computational algorithms and methodologies on one hand, and the needs of the realms of biomedicine and bioinformatics on the other. In order to help fill the gap between the scientists on both sides of this spectrum, we have solicited contributions from researchers actively applying computational intelligence techniques to important problems in biomedicine and bioinformatics. The purpose of this book is to provide an overview of the powerful state-of-the-art methodologies that are currently utilized for biomedicine- and/or bioinformatics-oriented applications, so that researchers working in those fields can learn of new methods to help them tackle their problems. On the other hand, we also hope that the CI community will find this book useful by discovering a new and intriguing area of applications.
The roots of the project which culminates with the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at the University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area- and performance-effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.
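ESPRESSO's own heuristics are beyond a snippet, but the problem it attacks, collapsing a truth table into a compact sum-of-products cover, can be demonstrated with SymPy's exact two-level minimizer (this is not ESPRESSO, just the same problem on a toy function):

```python
# Two-level (sum-of-products) logic minimization, the problem ESPRESSO
# attacks heuristically, demonstrated here with SymPy's exact minimizer.
from sympy import symbols
from sympy.logic import SOPform

a, b, c = symbols("a b c")
# Truth-table rows (a, b, c) where the function is 1: the 3-input majority.
minterms = [[0, 1, 1], [1, 0, 1], [1, 1, 0], [1, 1, 1]]

print(SOPform([a, b, c], minterms))   # (a & b) | (a & c) | (b & c)
```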
Computer Aided Tolerancing (CAT) is an important topic in any field of design and production where parts move relative to one another and/or are assembled together. Geometric variations from specified dimensions and form always occur when parts are manufactured. Improvements in production systems can cause the amounts of the variations to become smaller, but their presence does not disappear. To shorten the time from concept to market of a product, it has become increasingly important to take clearances and the tolerancing of manufacturing variations into consideration right from the beginning, at the stage of design. Hence, geometric models are defined that represent both the complete array of geometric variations possible during manufacture and also the influence of geometry on the function of individual parts and on assemblies of them. The contents of this book originate from a collection of selected papers presented at the 9th CIRP International Seminar on CAT that was held from April 10-12, 2005 at Arizona State University, USA. The CIRP (Collège International pour la Recherche en Productique, or International Institution for Production Engineering Research) plans this seminar every two years, and the book is one in a series of Proceedings on CAT. The book is organized into seven parts: Models for Tolerance Representation and Specification, Tolerance Analysis, Tolerance Synthesis, Computational Metrology and Verification, Tolerances in Manufacturing, Applications to Machinery, and Incorporating Elasticity in Tolerance Models.
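A worked micro-example of tolerance analysis, the second of the book's seven parts (the dimensions are illustrative assumptions): comparing the worst-case stack-up of a dimension chain against the statistical root-sum-square (RSS) estimate.

```python
# Tolerance stack-up for a chain of dimensions: worst-case vs. the
# statistical root-sum-square (RSS) estimate.  Tolerances are illustrative.
import math

tolerances = [0.10, 0.05, 0.08, 0.02]   # mm, one per part in the stack

worst_case = sum(tolerances)
rss = math.sqrt(sum(t * t for t in tolerances))

print(f"worst case: +/-{worst_case:.3f} mm")   # +/-0.250 mm
print(f"RSS:        +/-{rss:.3f} mm")          # +/-0.139 mm
```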
This book reviews the theoretical fundamentals of grey-box identification and puts the spotlight on MoCaVa, a MATLAB-compatible software tool, for facilitating the procedure of effective grey-box identification. It demonstrates the application of MoCaVa using two case studies drawn from the paper and steel industries. In addition, the book answers common questions which will help in building accurate models for systems with unknown inputs.
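MoCaVa itself is MATLAB-compatible; purely as a language-neutral illustration of the grey-box idea, a fixed model structure whose physical parameters are fitted to data, here is a sketch with SciPy on synthetic data.

```python
# Grey-box identification in miniature: the model structure is known
# (a first-order step response), and only its physical parameters K and tau
# are estimated from data.  The "measured" data here is synthetic.
import numpy as np
from scipy.optimize import curve_fit

def step_response(t, K, tau):
    return K * (1.0 - np.exp(-t / tau))

t = np.linspace(0, 10, 50)
y = step_response(t, K=2.0, tau=1.5) + np.random.normal(0, 0.05, t.size)

(K_hat, tau_hat), _ = curve_fit(step_response, t, y, p0=[1.0, 1.0])
print(f"K = {K_hat:.2f}, tau = {tau_hat:.2f}")   # close to 2.0 and 1.5
```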
Bob Martens and Andre Brown, Co-conference Chairs, CAAD Futures 2005: Computer Aided Architectural Design is a particularly dynamic field that is developing through the actions of architects, software developers, researchers, technologists, users, and society alike. CAAD tools in the architectural office are no longer prominent outsiders, but have become ubiquitous tools for all professionals in the design disciplines. At the same time, techniques and tools from other fields and uses are entering the field of architectural design. This is exemplified by the tendency to speak of Information and Communication Technology (ICT) as a field in which CAAD is embedded. Exciting new combinations are possible for those who are firmly grounded in an understanding of architectural design and who have a clear vision of the potential use of ICT. CAAD Futures 2005 called for innovative and original papers in the field of Computer Aided Architectural Design that present rigorous, high-quality research and development work. Papers should point towards the future, but be based on a thorough understanding of the past and present.
Model Predictive Control System Design and Implementation Using MATLAB® proposes methods for the design and implementation of MPC systems using basis functions that confer the following advantages: continuous- and discrete-time MPC problems can be solved in similar design frameworks; a parsimonious parametric representation of the control trajectory gives rise to computationally efficient algorithms and better on-line performance; and a more general discrete-time representation of MPC design becomes identical to the traditional approach for an appropriate choice of parameters. After the theoretical presentation, coverage is given to three industrial applications. The subject of quadratic programming, often associated with the core optimization algorithms of MPC, is also introduced and explained. The technical contents of this book are mainly based on advances in MPC using state-space models and basis functions. The volume includes numerous analytical examples and problems, together with MATLAB® programs and exercises.
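Not the book's basis-function formulation, but a minimal dense, unconstrained receding-horizon loop for a scalar plant shows the mechanics that formulation builds on; the plant, horizon, and weights below are assumptions.

```python
# Minimal receding-horizon (MPC) loop for a scalar plant x+ = a x + b u.
# This is a plain dense formulation, not the book's basis-function one;
# plant, horizon, and weights are illustrative assumptions.
import numpy as np

a, b = 0.9, 0.5          # plant (assumption)
N, lam, r = 10, 0.1, 1.0 # horizon, input weight, setpoint

# Prediction over the horizon: X = F x0 + G U.
F = np.array([[a ** (i + 1)] for i in range(N)])
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = a ** (i - j) * b

x = 0.0
for k in range(30):
    # Unconstrained QP solution: min ||F x + G U - r||^2 + lam ||U||^2.
    U = np.linalg.solve(G.T @ G + lam * np.eye(N),
                        G.T @ (r - F.flatten() * x))
    u = U[0]                      # apply only the first move, then re-solve
    x = a * x + b * u

print(f"x after 30 steps: {x:.3f}")   # settles near the setpoint r = 1.0
```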
This book introduces all the relevant information required to understand and put Model Driven Architecture (MDA) into industrial practice. It clearly explains which conceptual primitives should be present in a system specification, how to use UML to properly represent this subset of basic conceptual constructs, how to identify just those diagrams and modeling constructs that are actually required to create a meaningful conceptual schema, and how to accomplish the transformation process between the problem space and the solution space. The approach is fully supported by commercially available tools.
As future generation information technology (FGIT) becomes specialized and fragmented, it is easy to lose sight that many topics in FGIT have common threads and, because of this, advances in one discipline may be transmitted to others. Presentation of recent results obtained in different disciplines encourages this interchange for the advancement of FGIT as a whole. Of particular interest are hybrid solutions that combine ideas taken from multiple disciplines in order to achieve something more significant than the sum of the individual parts. Through such a hybrid philosophy, a new principle can be discovered, which has the propensity to propagate throughout multifaceted disciplines. FGIT 2009 was the first mega-conference that attempted to follow the above idea of hybridization in FGIT in the form of multiple events related to particular disciplines of IT, conducted by separate scientific committees, but coordinated in order to expose the most important contributions. It included the following international conferences: Advanced Software Engineering and Its Applications (ASEA), Bio-Science and Bio-Technology (BSBT), Control and Automation (CA), Database Theory and Application (DTA), Disaster Recovery and Business Continuity (DRBC; published independently), Future Generation Communication and Networking (FGCN) that was combined with Advanced Communication and Networking (ACN), Grid and Distributed Computing (GDC), Multimedia, Computer Graphics and Broadcasting (MulGraB), Security Technology (SecTech), Signal Processing, Image Processing and Pattern Recognition (SIP), and u- and e-Service, Science and Technology (UNESST).
This book constitutes the thoroughly refereed post-conference proceedings of the 7th International Conference on Numerical Methods and Applications, NMA 2010, held in Borovets, Bulgaria, in August 2010. The 60 revised full papers presented together with 3 invited papers were carefully reviewed and selected from numerous submissions for inclusion in this book. The papers are organized in topical sections on Monte Carlo and quasi-Monte Carlo methods, environmental modeling, grid computing and applications, metaheuristics for optimization problems, and modeling and simulation of electrochemical processes.
The concept of CAST as Computer Aided Systems Theory was introduced by F. Pichler in the late 1980s to refer to computer theoretical and practical developments as tools for solving problems in system science. It was thought of as the third component (the other two being CAD and CAM) required to complete the path from computer and systems sciences to practical developments in science and engineering. Franz Pichler, of the University of Linz, organized the first CAST workshop in April 1988, which demonstrated the acceptance of the concepts by the scientific and technical community. Next, the University of Las Palmas de Gran Canaria joined the University of Linz to organize the first international meeting on CAST (Las Palmas, February 1989) under the name EUROCAST'89. This proved to be a very successful gathering of systems theorists, computer scientists and engineers from most European countries, North America and Japan. It was agreed that EUROCAST international conferences would be organized every two years, alternating between Las Palmas de Gran Canaria and a continental European location. From 2001 the conference has been held exclusively in Las Palmas. Thus, successive EUROCAST meetings took place in Krems (1991), Las Palmas (1993), Innsbruck (1995), Las Palmas (1997), Vienna (1999), Las Palmas (2001), Las Palmas (2003), Las Palmas (2005) and Las Palmas (2007), in addition to an extra-European CAST conference in Ottawa in 1994.
Assertion-based IP is much more than a comprehensive set of related assertions. It is a full-fledged reusable and configurable transaction-level verification component, which is used to detect both interesting and incorrect behaviors. Upon detecting interesting or incorrect behavior, the assertion-based IP alerts other verification components within a simulation environment, which are responsible for taking appropriate action. The focus of this book is to bring the assertion discussion up to a higher level and introduce a process for creating effective, reusable, assertion-based IP that easily integrates with the user's existing verification environment, in other words the testbench infrastructure, and it promotes a set of guiding principles for creating an assertion-based IP monitor.
A unique feature of this book is its fully worked out, detailed examples. The concepts presented in the book are drawn from the authors' experience developing assertion-based IP, as well as general assertion-based techniques. Creating Assertion-Based IP is an important resource for design and verification engineers. From the Foreword: Creating Assertion-Based IP "reduces to process the creation of one of the most valuable kinds of VIP: assertion-based VIP. This book will serve as a valuable reference for years to come."
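The book's IP targets HDL testbenches; purely to illustrate the kind of temporal rule such a monitor checks, here is a toy Python trace checker for an assumed request/acknowledge protocol.

```python
# Toy illustration of an assertion monitor: every 'req' must be followed
# by an 'ack' within MAX_LATENCY cycles.  A real assertion-based IP does
# this inside an HDL testbench; the trace and rule here are assumptions.
MAX_LATENCY = 3

def check_req_ack(trace):
    """trace: list of (req, ack) booleans, one pair per clock cycle."""
    pending = []                      # cycles of not-yet-acknowledged reqs
    for cycle, (req, ack) in enumerate(trace):
        if ack and pending:
            pending.pop(0)            # oldest outstanding request retires
        if req:
            pending.append(cycle)
        for t in pending:
            if cycle - t >= MAX_LATENCY:
                return f"VIOLATION: req at cycle {t} unacknowledged"
    return "PASS"

trace = [(1, 0), (0, 0), (0, 1), (1, 0), (0, 0), (0, 0), (0, 0)]
print(check_req_ack(trace))           # the req at cycle 3 is never acked
```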
Statistical timing analysis is an area of growing importance in nanometer technologies, as the uncertainties associated with process and environmental variations increase, and this chapter has captured some of the major efforts in this area. This remains a very active field of research, and there is likely to be a great deal of new research to be found in conferences and journals after this book is published. In addition to the statistical analysis of combinational circuits, a good deal of work has been carried out in analyzing the effect of variations on clock skew. Although we will not treat this subject in this book, the reader is referred to [LNPS00, HN01, JH01, ABZ03a] for details. 7 TIMING ANALYSIS FOR SEQUENTIAL CIRCUITS. 7.1 INTRODUCTION. A general sequential circuit is a network of computational nodes (gates) and memory elements (registers). The computational nodes may be conceptualized as being clustered together in an acyclic network of gates that forms a combinational logic circuit. A cyclic path in the direction of signal propagation is permitted in the sequential circuit only if it contains at least one register. In general, it is possible to represent any sequential circuit in terms of the schematic shown in Figure 7.1, which has I inputs, O outputs and M registers. The register outputs feed into the combinational logic which, in turn, feeds the register inputs. Thus, the combinational logic has I + M inputs and O + M outputs.
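The combinational core of the analysis described above can be sketched in a few lines: propagate arrival times through a gate-level DAG in topological order (here via memoized recursion). The netlist and delays are illustrative assumptions; real STA adds register setup/hold checks and, in the statistical setting, distributions instead of scalar delays.

```python
# The combinational core of static timing analysis: propagate arrival
# times through a gate-level DAG.  The netlist and delays are illustrative;
# a real STA also handles register setup/hold constraints.
from collections import defaultdict

# gate -> (delay, list of fanin nodes); primary inputs have no entry.
netlist = {
    "g1": (2.0, ["in1", "in2"]),
    "g2": (1.5, ["in2", "in3"]),
    "g3": (1.0, ["g1", "g2"]),    # output gate
}
arrival = defaultdict(float)       # primary inputs arrive at t = 0

def arrival_time(node):
    if node in netlist and node not in arrival:
        delay, fanins = netlist[node]
        arrival[node] = delay + max(arrival_time(f) for f in fanins)
    return arrival[node]

print(arrival_time("g3"))          # 1.0 + max(2.0, 1.5) = 3.0 (longest path)
```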
Sigma delta modulation has become a very useful and widely applied technique for high performance Analog-to-Digital (A/D) conversion of narrow band signals. Through the use of oversampling and negative feedback, the quantization errors of a coarse quantizer are suppressed in a narrow signal band in the output of the modulator. Bandpass sigma delta modulation is well suited for A/D conversion of narrow band signals modulated on a carrier, as occurs in communication systems such as AM/FM receivers and mobile phones. Due to the nonlinearity of the quantizer in the feedback loop, a sigma delta modulator may exhibit input signal dependent stability properties. The same combination of the nonlinearity and the feedback loop complicates the stability analysis. In Bandpass Sigma Delta Modulators, the describing function method is used to analyze the stability of the sigma delta modulator. The linear gain model commonly used for the quantizer fails to predict small signal stability properties and idle patterns accurately. In Bandpass Sigma Delta Modulators an improved model for the quantizer is introduced, extending the linear gain model with a phase shift. Analysis shows that the phase shift of a sampled quantizer is in fact a phase uncertainty. Stability analysis of sigma delta modulators using the extended model allows accurate prediction of idle patterns and calculation of small-signal stability boundaries for loop filter parameters. A simplified rule of thumb is derived and applied to bandpass sigma delta modulators. The stability properties have a considerable impact on the design of single-loop, one-bit, high-order continuous-time bandpass sigma delta modulators. The continuous-time bandpass loop filter structure should have sufficient degrees of freedom to implement the desired (small-signal stable) sigma delta modulator behavior. Bandpass Sigma Delta Modulators will be of interest to practicing engineers and researchers in the areas of mixed-signal and analog integrated circuit design.
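A minimal first-order (lowpass) sigma-delta loop illustrates the oversampling-plus-feedback principle described above; the bandpass modulators the book treats replace the integrator with a resonator. Signal and parameters below are illustrative.

```python
# A minimal first-order sigma-delta modulator: a discrete-time integrator
# inside a negative-feedback loop around a one-bit quantizer.  Oversampling
# lets the bitstream's local average track the input while quantization
# noise is pushed to high frequencies.  Signal and lengths are illustrative.
import math

def sigma_delta(samples):
    v, y, bits = 0.0, 0.0, []
    for x in samples:
        v += x - y                     # integrate the feedback error
        y = 1.0 if v >= 0 else -1.0    # one-bit quantizer
        bits.append(y)
    return bits

n = 256
x = [0.5 * math.sin(2 * math.pi * k / n) for k in range(n)]
bits = sigma_delta(x)

# Near the input's crest (~0.5) the local bit average should sit near 0.5.
window = bits[48:80]
print(sum(window) / len(window))
```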
Hierarchical design methods were originally introduced for the design of digital ICs, and they appeared to provide for significant advances in design productivity, time-to-market, and first-time-right design. These concepts have gained increasing importance in the semiconductor industry in recent years. In the course of time, the supportive quality of hierarchical methods and their advantages were confirmed. System Level Hardware/Software Co-design: An Industrial Approach demonstrates the applicability of hierarchical methods to hardware/software co-design, and to mixed analogue/digital design following a similar approach. Hierarchical design methods provide for high levels of design support, both in a qualitative and a quantitative sense. In the qualitative sense, the presented methods support all phases in the product life cycle of electronic products, ranging from requirements analysis to application support. Hierarchical methods furthermore allow for efficient digital hardware design, hardware/software co-design, and mixed analogue/digital design, on the basis of commercially available formalisms and design tools. In the quantitative sense, hierarchical methods have prompted a substantial increase in design productivity. System Level Hardware/Software Co-design: An Industrial Approach reports on a six-year study during which the number of square millimeters of normalized complexity an individual designer contributed every week rose by more than a factor of five. Hierarchical methods therefore enabled designers to keep track of the ever increasing design complexity, while effectively reducing the number of design iterations in the form of redesigns. System Level Hardware/Software Co-design: An Industrial Approach is the first book to provide a comprehensive, coherent system design methodology that has been proven to increase productivity in industrial practice. The book will be of interest to all managers, designers and researchers working in the semiconductor industry.
You may like...
Configuration Spaces - Geometry…
Filippo Callegaro, Frederick Cohen, …
Hardcover
Ergodic Theory - Finite and Infinite…
Mariusz Urbanski, Mario Roy, …
Hardcover
R4,484
Discovery Miles 44 840
Research in Shape Modeling - Los…
Kathryn Leonard, Sibel Tari
Hardcover