Computing increasingly happens somewhere, with that geographic location important to the computational process itself. Many new and evolving spatial technologies, such as geosensor networks and smartphones, embody this trend. Conventional approaches to spatial computing are centralized, and do not account for the inherently decentralized nature of "computing somewhere": the limited, local knowledge of individual system components, and the interaction between those components at different locations. On the other hand, despite being an established topic in distributed systems, decentralized computing is not concerned with geographical constraints to the generation and movement of information. In this context of (centralized) spatial computing and decentralized (non-spatial) computing, the key question becomes: "What makes decentralized spatial computing special?" In Part I of the book the author covers the foundational concepts, structures, and design techniques for decentralized computing with spatial and spatiotemporal information. In Part II he applies those concepts and techniques to the development of algorithms for decentralized spatial computing, stepping through a suite of increasingly sophisticated algorithms: from algorithms with minimal spatial information about their neighborhoods; to algorithms with access to more detailed spatial information, such as direction, distance, or coordinate location; to truly spatiotemporal algorithms that monitor environments that are dynamic, even using networks that are mobile or volatile. Finally, in Part III the author shows how decentralized spatial and spatiotemporal algorithms designed using the techniques explored in Part II can be simulated and tested. In particular, he investigates empirically the important properties of a decentralized spatial algorithm: its computational efficiency and its robustness to unavoidable uncertainty. Part III concludes with a survey of the opportunities for connecting decentralized spatial computing to ongoing research and emerging hot topics in related fields, such as biologically inspired computing, geovisualization, and stream computing. The book is written for students and researchers of computer science and geographic information science. Throughout the book the author's style is characterized by a focus on the broader message, explaining the process of decentralized spatial algorithm design rather than the technical details. Each chapter ends with review questions designed to test the reader's understanding of the material and to point to further work or research. The book includes short appendices on discrete mathematics and SQL. Simulation models written in NetLogo and associated source code for all the algorithms presented in the book can be found on the author's accompanying website.
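To give a concrete flavor of the "minimal spatial information" end of that spectrum, here is a small illustrative sketch (hypothetical, not taken from the book, whose own models are written in NetLogo): each node in a geosensor network knows only its immediate neighbors, yet simple flooding lets the network compute hop counts from a source node in a fully decentralized way.

```python
def flood_hop_counts(neighbors, source):
    """Decentralized hop-count flooding. `neighbors` maps each node to the
    set of nodes it can communicate with; only local knowledge is used."""
    hops = {source: 0}
    frontier = [source]
    while frontier:
        nxt = []
        for node in frontier:
            for nb in neighbors[node]:   # a node only ever talks to neighbors
                if nb not in hops:
                    hops[nb] = hops[node] + 1
                    nxt.append(nb)
        frontier = nxt
    return hops

# A five-node network with a hypothetical topology.
net = {1: {2, 3}, 2: {1, 4}, 3: {1, 4}, 4: {2, 3, 5}, 5: {4}}
print(flood_hop_counts(net, 1))   # {1: 0, 2: 1, 3: 1, 4: 2, 5: 3}
```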
The Verilog Hardware Description Language (Verilog-HDL) has long been the most popular language for describing complex digital hardware. It started life as a proprietary language but was donated by Cadence Design Systems to the design community to serve as the basis of an open standard. That standard was formalized in 1995 by the IEEE in standard 1364-1995. About that same time a group named Analog Verilog International formed with the intent of proposing extensions to Verilog to support analog and mixed-signal simulation. The first fruits of the labor of that group became available in 1996 when the language definition of Verilog-A was released. Verilog-A was not intended to work directly with Verilog-HDL. Rather it was a language with similar syntax and related semantics that was intended to model analog systems and be compatible with SPICE-class circuit simulation engines. The first implementation of Verilog-A soon followed: a version from Cadence that ran on their Spectre circuit simulator. As more implementations of Verilog-A became available, the group defining the analog and mixed-signal extensions to Verilog continued their work, releasing the definition of Verilog-AMS in 2000. Verilog-AMS combines both Verilog-HDL and Verilog-A, and adds additional mixed-signal constructs, providing a hardware description language suitable for analog, digital, and mixed-signal systems. Again, Cadence was first to release an implementation of this new language, in a product named AMS Designer that combines their Verilog and Spectre simulation engines.
Principles of Verifiable RTL Design: A Functional Coding Style Supporting Verification Processes in Verilog explains how you can write Verilog to describe chip designs at the RT-level in a manner that cooperates with verification processes. This cooperation can return an order of magnitude improvement in performance and capacity from tools such as simulators and equivalence checkers. It reduces the labor costs of coverage and formal model checking by facilitating communication between the design engineer and the verification engineer. It also orients the RTL style to provide more useful results from the overall verification process. The intended audience for Principles of Verifiable RTL Design: A Functional Coding Style Supporting Verification Processes in Verilog is engineers and students who need an introduction to various design verification processes and a supporting functional Verilog RTL coding style. A second intended audience is engineers who have been through introductory training in Verilog and now want to develop good RTL writing practices for verification. A third audience is Verilog language instructors who are using a general text on Verilog as the course textbook but want to enrich their lectures with an emphasis on verification. A fourth audience is engineers with substantial Verilog experience who want to improve their Verilog practice to work better with RTL Verilog verification tools. A fifth audience is design consultants searching for proven verification-centric methodologies. A sixth audience is EDA verification tool implementers who want some suggestions about a minimal Verilog verification subset. Principles of Verifiable RTL Design: A Functional Coding Style Supporting Verification Processes in Verilog is grounded in the reality of actual large-scale product design processes and tool experience.
Writing Testbenches: Functional Verification of HDL Models first introduces the necessary concepts and tools of verification, then describes a process for carrying out an effective functional verification of a design. This book also presents techniques for applying a stimulus and monitoring the response of a design by abstracting the operations using bus-functional models. The architecture of testbenches built around these bus-functional models is important for minimizing development and maintenance effort. Behavioral modeling is another important concept presented in this book. It is used to parallelize the implementation and verification of a design and to perform more efficient simulations. For many, behavioral modeling is synonymous with synthesizable or RTL modeling. In this book, the term 'behavioral' is used to describe any model that adequately emulates the functionality of a design, usually using non-synthesizable constructs and coding style. Writing Testbenches: Functional Verification of HDL Models focuses on the functional verification of hardware designs using either VHDL or Verilog. The reader should have at least a basic knowledge of one of the languages. Ideally, he or she should have experience in writing synthesizable models and be familiar with running a simulation using any of the available VHDL or Verilog simulators. From the Foreword: 'With gate counts and system complexity growing exponentially, engineers confront the most perplexing challenge in product design: functional verification. The bulk of the time consumed in the design of new ICs and systems is now spent on verification. New and interesting design technologies like physical synthesis and design reuse that create ever-larger designs only aggravate the problem. What the EDA tool industry has continuously failed to realize is that the real problem is not how to create a 12 million gate IC that runs at 600 MHz, but how to verify it. This text marks the first genuine effort at defining a verification methodology that is independent of both tools and applications. Engineers now have a true reference text for quickly and accurately verifying the functionality of their designs.' Michael Horne, President and CEO, Qualis Design Corporation
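As a rough illustration of the bus-functional model idea (a hypothetical sketch, not tied to any language or tool from the book): the testbench expresses intent as transaction-level write() calls, while only the BFM knows how to wiggle the pins of the assumed address/data/strobe interface.

```python
class SimpleBusBFM:
    """Bus-functional model for an imagined address/data/strobe bus
    (all signal names here are assumptions for illustration)."""

    def __init__(self, signals):
        self.s = signals              # pin-level signal values

    def write(self, addr, data):
        # All pin-level protocol detail is hidden inside the BFM;
        # the testbench only states *what* it wants done.
        self.s["addr"], self.s["wdata"], self.s["wr_stb"] = addr, data, 1
        self._cycle()
        self.s["wr_stb"] = 0
        self._cycle()

    def _cycle(self):
        pass                          # stand-in for advancing simulation time

# The test is written at the transaction level, so it survives protocol
# changes as long as the BFM is updated to match.
bfm = SimpleBusBFM({"addr": 0, "wdata": 0, "wr_stb": 0})
bfm.write(0x10, 0xCAFE)
```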
This book constitutes the refereed proceedings of the 11th International Conference on Cooperative Design, Visualization, and Engineering, CDVE 2014, held in Seattle, WA, USA, in September 2014. The 33 full and 10 short papers presented were carefully reviewed and selected from 78 submissions. The papers cover topics such as cloud technology; the use of cloud for manufacturing, resource selection, service evaluation, and control; methods for processing and visualizing big data created by social media, such as Twitter and Facebook; real-time data about human interaction; sentiment analysis; trend analysis; location-based crowdsourcing; effective teamwork; and cooperative visualization.
In the past decade, substrate noise has had a constant and significant impact on the design of analog and mixed-signal integrated circuits. Only recently, with advances in chip miniaturization and innovative circuit design, has substrate noise begun to plague fully digital circuits as well. To combat the effects of substrate noise, heavily over-designed structures are generally adopted, thus seriously limiting the advantages of innovative technologies. Substrate Noise: Analysis and Optimization for IC Design addresses the main problems posed by substrate noise from both an IC and a CAD designer perspective. The effects of substrate noise on performance in digital, analog, and mixed-signal circuits are presented, along with the mechanisms underlying noise generation, injection, and transport. Popular solutions to the substrate noise problem and the trade-offs often debated by designers are extensively discussed. Non-traditional approaches as well as semi-automated techniques to combat substrate noise are also addressed. Substrate Noise: Analysis and Optimization for IC Design will be of interest to researchers and professionals interested in signal integrity, as well as to mixed signal and RF designers.
Regular Fabrics in Deep Sub-Micron Integrated-Circuit Design discusses new approaches to better timing-closure and manufacturability of DSM Integrated Circuits. The key idea presented is the use of regular circuit and interconnect structures such that area/delay can be predicted with high accuracy. The co-design of structures and algorithms allows great opportunities for achieving better final results, thus closing the gap between IC and CAD designers. The regularities also provide simpler and possibly better manufacturability. In this book we present not only algorithms for solving particular sub-problems but also systematic ways of organizing different algorithms in a flow to solve the design problem as a whole. A timing-driven chip design flow is developed based on the new structures and their design algorithms, which produces faster chips in a shorter time.
System designers, computer scientists and engineers have continuously invented and employed notations for modeling, specifying, simulating, documenting, communicating, teaching, verifying and controlling the designs of digital systems. Initially these systems were represented via electronic and fabrication details. Following C. E. Shannon's revelation of 1948, logic diagrams and Boolean equations were used to represent digital systems in a fashion that de-emphasized electronic and fabrication detail while revealing logical behavior. A small number of circuits were made available to remove the abstraction of these representations when it was desirable to do so. As system complexity grew, block diagrams, timing charts, sequence charts, and other graphic and symbolic notations were found to be useful in summarizing the gross features of a system and describing how it operated. In addition, it always seemed necessary or appropriate to augment these documents with lengthy verbal descriptions in a natural language. While each notation was, and still is, a perfectly valid means of expressing a design, lack of standardization, conciseness, and formal definitions interfered with communication and the understanding between groups of people using different notations. This problem was recognized early and formal languages began to evolve in the 1950s when I. S. Reed discovered that flip-flop input equations were equivalent to a register transfer equation, and that registers could be described in a vector-like notation. Expanding these concepts Reed developed a notation that became known as a Register Transfer Language (RTL).
Differential equations is a subject of wide applicability, and knowledge of differential equations topics permeates all areas of study in engineering and applied mathematics. Some differential equations are susceptible to analytic means of solution, while others require the generation of numerical solution trajectories to see the behavior of the system under study. For both situations, the software package Maple can be used to advantage. To the student: Making effective use of differential equations requires facility in recognizing and solving standard "tractable" problems, as well as having the background in the subject to make use of tools for dealing with situations that are not amenable to simple analytic approaches.
As integrated circuit (IC) feature sizes scaled below a quarter of a micron, thereby defining the deep submicron (DSM) era, there began a gradual shift in the impact on performance due to the metal interconnections among the active circuit components. Once viewed as merely parasitics in terms of their relevance to the overall circuit behavior, the interconnect can now have a dominant impact on the IC area and performance. Beginning in the late 1980s there was significant research toward better modeling and characterization of the resistance, capacitance and ultimately the inductance of on-chip interconnect. IC Interconnect Analysis covers the state-of-the-art methods for modeling and analyzing IC interconnect based on the past fifteen years of research. This is done at a level suitable for most practitioners who work in the semiconductor and electronic design automation fields, but also includes significant depth for the research professionals who will ultimately extend this work into other areas and applications. IC Interconnect Analysis begins with an in-depth coverage of delay metrics, including the ubiquitous Elmore delay and its many variations. This is followed by an outline of moment matching methods, calculating moments efficiently, and Krylov subspace methods for model order reduction. The final two chapters describe how to interface these reduced-order models to circuit simulators and gate-level timing analyzers respectively. IC Interconnect Analysis is written for CAD tool developers, IC designers and graduate students.
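For readers unfamiliar with the Elmore metric the book opens with, a minimal sketch may help (hypothetical code, not from the book): the Elmore delay from the root of an RC tree to a sink is the sum, over every capacitor in the tree, of that capacitance times the resistance shared between the root-to-sink and root-to-capacitor paths.

```python
def path_to_root(parent, node):
    """Nodes on the path between `node` and the root of the tree."""
    path = []
    while node is not None:
        path.append(node)
        node = parent[node]
    return path

def elmore_delay(parent, r_edge, cap, sink):
    """First-order Elmore delay from the root of an RC tree to `sink`.
    parent[n]: parent of node n (None at the root);
    r_edge[n]: resistance of the segment feeding n; cap[n]: capacitance at n."""
    on_sink_path = set(path_to_root(parent, sink))
    delay = 0.0
    for node in cap:
        # Resistance shared by the root->sink and root->node paths.
        shared = sum(r_edge[n] for n in path_to_root(parent, node)
                     if n in on_sink_path and parent[n] is not None)
        delay += shared * cap[node]
    return delay

# Tiny example with assumed values: 100/50/80 ohm segments, 0.1-0.2 pF loads.
parent = {"root": None, "a": "root", "b": "a", "c": "a"}
r_edge = {"root": 0.0, "a": 100.0, "b": 50.0, "c": 80.0}
cap    = {"root": 0.0, "a": 1e-13, "b": 2e-13, "c": 1e-13}
print(elmore_delay(parent, r_edge, cap, "b"))   # 5e-11 s, i.e. 50 ps
```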
Analog-to-digital converters represent one half of the link between the world we live in - analog - and the digital world of computers, which can handle the computations required in digital signal processing. These devices are mathematically very complex due to their nonlinear behavior and thus fairly difficult to analyze without the use of simulation tools. High Speed A/D Converters: Understanding Data Converters Through SPICE presents the subject from the practicing engineer's point of view rather than from the academic's point of view. A practical approach is emphasized. High Speed A/D Converters: Understanding Data Converters Through SPICE is intended as a learning tool by providing building blocks that can be stacked on top of each other to build higher order systems. The book provides a guide to understanding the various topologies used in A/D converters by suggesting simple methods for the blocks used in an A/D converter. The converters discussed throughout the book constitute a class of devices called undersampled or Nyquist converters. The tools used in deriving the results presented are: * TopSpice(R) by Penzar - a mixed mode SPICE simulator - version 5.90. The files included in Appendix A were written for this tool. However, most circuit files need only minor adjustments to be used on other SPICE simulators such as PSpice, Hspice, IS_Spice and Micro-Cap IV; * Mathcad 2000 Professional by Mathsoft. This tool is very useful in performing FFT analysis as well as drawing some of the graphs. Again, the Mathcad files are included to help the user analyze the data. High Speed A/D Converters: Understanding Data Converters Through SPICE not only supplies the SPICE models for the A/D converters but also describes the physical reasons for the converter's performance.
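The FFT analysis mentioned above is typically used to extract a converter's signal-to-noise-and-distortion ratio (SNDR) and effective number of bits (ENOB). Here is a minimal Python sketch of that computation (an illustration assuming coherent sampling, not a file from the book, which uses Mathcad):

```python
import numpy as np

def sndr_enob(samples, signal_bin):
    """SNDR (dB) and ENOB from a coherently sampled ADC output record."""
    spectrum = np.abs(np.fft.rfft(samples))**2
    signal_power = spectrum[signal_bin]
    noise_power = spectrum[1:].sum() - signal_power   # exclude the DC bin
    sndr_db = 10 * np.log10(signal_power / noise_power)
    return sndr_db, (sndr_db - 1.76) / 6.02

# Ideal 8-bit quantization of a coherently sampled sine (assumed setup:
# 31 cycles in 1024 samples keeps the tone on a single FFT bin).
n, cycles, bits = 1024, 31, 8
t = np.arange(n)
sine = 0.5 + 0.5 * np.sin(2 * np.pi * cycles * t / n)
code = np.round(sine * (2**bits - 1)) / (2**bits - 1)
print(sndr_enob(code, cycles))   # roughly (49.9 dB, 8.0 bits)
```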
This volume is a welcome effort towards improving some of the practices in chip design today. The authors provide a comprehensive reference work on Automatic Layout Modification which will be valuable to VLSI courses at universities, and to CAD and circuit engineers and engineering managers.
This is the first book to cover verification strategies and methodologies for SoC designs, from system-level verification to design sign-off. All the verification aspects in this exciting new book are illustrated with a single reference design for a Bluetooth application.
From the reviews: "[...] a welcome addition to the literature. [...] This book promises to make a valuable contribution to the education of graduate students in electrical and computer engineering, and a very useful addition to the library of the maturer investigator in SoC designs or related fields." Microelectronics Reliability
The current trend towards the realization of complex and versatile Systems on a Chip requires the combined efforts and attention of experts in a wide range of areas including microsystems, embedded hardware/software systems, dedicated ASIC and programmable logic hardware, reconfigurable computing, wireless communications and RF issues, video and image processing, memory systems, low power design techniques, design, test and verification algorithms, modeling and simulation, logic synthesis, and interconnect analysis. Thus, the contributions presented herein address a wide range of Systems on a Chip problems. VLSI: Systems on a Chip comprises the selected proceedings of the Tenth International Conference on Very Large Scale Integration (VLSI '99), which was sponsored by the International Federation for Information Processing (IFIP) and was held in Lisbon, Portugal, in December 1999. The volume is organized around two themes, in which the following topics are addressed: VLSI Systems Design and Applications * Analog Systems Design * Analog Modeling and Design * Image Processing * Reconfigurable Computing * Memory and System Design * Low Power Design VLSI Design Methods and CAD * Test and Verification * Analog CAD and Interconnect * Fundamental CAD Algorithms * Verification and Simulation * CAD for Physical Design * High-Level Synthesis and Verification of Embedded Systems VLSI: Systems on a Chip is essential reading for researchers working on system integration, design, and CAD.
Analog Behavioral Modeling With The Verilog-A Language provides the IC designer with an introduction to the methodologies and uses of analog behavioral modeling with the Verilog-A language. In doing so, an overview of Verilog-A language constructs as well as applications using the language are presented. In addition, the book is accompanied by the Verilog-A Explorer IDE (Integrated Development Environment), a limited capability Verilog-A enhanced SPICE simulator for further learning and experimentation with the Verilog-A language. This book assumes a basic level of understanding of the usage of SPICE-based analog simulation and the Verilog HDL language, although any programming language background and a little determination should suffice. From the Foreword: 'Verilog-A is a new hardware design language (HDL) for analog circuit and systems design. Since the mid-eighties, Verilog HDL has been used extensively in the design and verification of digital systems. However, there have been no analogous high-level languages available for analog and mixed-signal circuits and systems. Verilog-A provides a new dimension of design and simulation capability for analog electronic systems. Previously, analog simulation has been based upon the SPICE circuit simulator or some derivative of it. Digital simulation is primarily performed with a hardware description language such as Verilog, which is popular since it is easy to learn and use. Making Verilog more worthwhile is the fact that several tools exist in the industry that complement and extend Verilog's capabilities ... Behavioral Modeling With the Verilog-A Language provides a good introduction and starting place for students and practicing engineers with interest in understanding this new level of simulation technology. This book contains numerous examples that enhance the text material and provide a helpful learning tool for the reader. The text and the simulation program included can be used for individual study or in a classroom environment ...' Dr. Thomas A. DeMassa, Professor of Engineering, Arizona State University
Computer Aided Design (CAD) technology plays a key role in today's advanced manufacturing environment. To reduce the time to market, achieve zero defect quality the first time, and use available production and logistics resources effectively, product and design process knowledge covering the whole product life-cycle must be used throughout product design. Once generated, this intensive design knowledge should be made available to later life-cycle activities. Due to the increasing concern about global environmental issues and rapidly changing economical situation worldwide, design must exhibit high performance not only in quality and productivity, but also in life-cycle issues, including extended producer's liability. These goals require designers and engineers to use various kinds of design knowledge intensively during product design and to generate design information for use in later stages of the product life-cycle such as production, distribution, operation, maintenance, reclamation, and recycling. Therefore, future CAD systems must incorporate product and design process knowledge, which are not explicitly dealt with in the current systems, in their design tools and design object models.
Cell-based design methodologies have dominated layout generation of digital circuits. Unfortunately, the growing demands for transparent process portability, increased performance, and low-level device sizing for timing/power are poorly handled in a fixed cell library. Direct Transistor-Level Layout For Digital Blocks proposes a direct transistor-level layout approach for small blocks of custom digital logic as an alternative that better accommodates demands for device-level flexibility. This approach captures essential shape-level optimizations, yet scales easily to netlists with thousands of devices, and incorporates timing optimization during layout. The key idea is early identification of essential diffusion-merged MOS device groups, and their preservation in an uncommitted geometric form until the very end of detailed placement. Roughly speaking, essential groups are extracted early from the transistor-level netlist, placed globally, optimized locally, and then finally committed each to a specific shape-level form while concurrently optimizing for both density and routability. The essential flaw in prior efforts is an over-reliance on geometric assumptions from large-scale cell-based layout algorithms. Individual transistors may seem simple, but they do not pack as gates do. Algorithms that ignore these shape-level issues suffer the consequences when thousands of devices are poorly packed. The approach described in this book can pack devices much more densely than a typical cell-based layout. Direct Transistor-Level Layout For Digital Blocks is a comprehensive reference work on device-level layout optimization, which will be valuable to CAD tool and circuit designers.
The manufacturing industry will reap significant benefits from encouraging the development of digital manufacturing science and technology. Digital Manufacturing Science uses theorems, illustrations and tables to introduce the definition, theory architecture, main content, and key technologies of digital manufacturing science. Readers will be able to develop an in-depth understanding of the emergence and the development, the theoretical background, and the techniques and methods of digital manufacturing science. Furthermore, they will also be able to use the basic theories and key technologies described in Digital Manufacturing Science to solve practical engineering problems in modern manufacturing processes. Digital Manufacturing Science is aimed at advanced undergraduate and postgraduate students, academic researchers and researchers in the manufacturing industry. It allows readers to integrate the theories and technologies described with their own research works, and to propose new ideas and new methods to improve the theory and application of digital manufacturing science.
I am glad to see this new book on the e language and on verification. I am especially glad to see a description of the e Reuse Methodology (eRM). The main goal of verification is, after all, finding more bugs quicker using given resources, and verification reuse (module-to-system, old-system-to-new-system, etc.) is a key enabling component. This book offers a fresh approach in teaching the e hardware verification language within the context of coverage driven verification methodology. I hope it will help the reader understand the many important and interesting topics surrounding hardware verification. Yoav Hollander, Founder and CTO, Verisity Inc. From the Preface: This book provides a detailed coverage of the e hardware verification language (HVL), state of the art verification methodologies, and the use of e HVL as a facilitating verification tool in implementing a state of the art verification environment. It includes comprehensive descriptions of the new concepts introduced by the e language, e language syntax, and its associated semantics. This book also describes the architectural views and requirements of verification environments (randomly generated environments, coverage driven verification environments, etc.), verification blocks in the architectural views (i.e. generators, initiators, collectors, checkers, monitors, coverage definitions, etc.) and their implementations using the e HVL. Moreover, the e Reuse Methodology (eRM), the motivation for defining such a guideline, and step-by-step instructions for building an eRM compliant e Verification Component (eVC) are also discussed.
This book introduces 'functional networks', a novel neural-based paradigm, and shows that functional network architectures can be efficiently applied to solve many interesting practical problems. Included is an introduction to neural networks, a description of functional networks, examples of applications, and computer programs in the Mathematica and Java languages implementing the various algorithms and methodologies. Special emphasis is given to applications in several areas such as: * Box-Jenkins AR(p), MA(q), ARMA(p, q), and ARIMA (p, d, q) models with application to real-life economic problems such as the consumer price index, electric power consumption and international airlines' passenger data. Random time series and chaotic series are considered in relation to the Henon, Lozi, Holmes and Burger maps, as well as the problems of noise reduction and information masking. * Learning differential equations from data and deriving the corresponding equivalent difference and functional equations. Examples of a mass supported by two springs and a viscous damper or dashpot, and a loaded beam, are used to illustrate the concepts. * The problem of obtaining the most general family of implicit, explicit and parametric surfaces as used in Computer Aided Design (CAD). * Applications of functional networks to obtain general nonlinear regression models are given and compared with standard techniques. Functional Networks with Applications: A Neural-Based Paradigm will be of interest to individuals who work in computer science, physics, engineering, applied mathematics, statistics, economics, and other neural networks and data analysis related fields.
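As a hint of what the time-series material involves, here is a toy sketch (hypothetical Python, whereas the book's own programs are in Mathematica and Java) fitting a Box-Jenkins AR(p) model x[t] = a1*x[t-1] + ... + ap*x[t-p] + e[t] by least squares:

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares estimates of the AR(p) coefficients a_1..a_p."""
    # Each row holds the p previous samples, most recent first.
    rows = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
    coeffs, *_ = np.linalg.lstsq(rows, x[p:], rcond=None)
    return coeffs

# Synthetic AR(2) series with known coefficients (illustrative only).
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal(scale=0.1)
print(fit_ar(x, 2))   # close to [0.6, -0.3]
```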
IFIP Working Group 5.2 has organized a series of workshops extending the concept of intelligent CAD to the concept of knowledge intensive engineering. The concept advocates that intensive life-cycle knowledge regarding products and design processes must be incorporated in the center of the CAD architecture. It focuses on the systematization and sharing of knowledge across the life-cycle stages and organizational boundaries. From Knowledge Intensive CAD to Knowledge Intensive Engineering comprises the Proceedings of the Fourth Workshop on Knowledge Intensive CAD, which was sponsored by the International Federation for Information Processing (IFIP) and held in Parma, Italy in May 2000. This workshop looked at the evolution of knowledge intensive design for the product life cycle moving towards knowledge intensive engineering. The 18 selected papers present an overview of the state-of-the-art in knowledge intensive engineering, discussing theoretical aspects and also practical systems and experiences gained in this area. An invited speaker paper is also included, discussing the role of knowledge in product and process innovation and technology for processing semantic knowledge. Main issues discussed in the book are: * Architectures for knowledge intensive CAD; * Tools for knowledge intensive CAD; * Methodologies for knowledge intensive CAD; * Implementation of knowledge intensive CAD; * Applications of knowledge intensive CAD; * Evolution of knowledge intensive design for the life-cycle; * Formal methods. The volume is essential reading for researchers, graduate and postgraduate students, systems developers of advanced computer-aided design and manufacturing systems, and engineers involved in industrial applications.
by Kurt Keutzer. Those looking for a quick overview of the book should fast-forward to the Introduction in Chapter 1. What follows is a personal account of the creation of this book. The challenge from Earl Killian, formerly an architect of the MIPS processors and at that time Chief Architect at Tensilica, was to explain the significant performance gap between ASICs and custom circuits designed in the same process generation. The relevance of the challenge was amplified shortly thereafter by Andy Bechtolsheim, founder of Sun Microsystems and ubiquitous investor in the EDA industry. At a dinner talk at the 1999 International Symposium on Physical Design, Andy stated that the greatest near-term opportunity in CAD was to develop tools to bring the performance of ASIC circuits closer to that of custom designs. There seemed to be some synchronicity that two individuals so different in concern and character would be preoccupied with the same problem. Intrigued by Earl and Andy's comments, the game was afoot. Earl Killian and other veterans of microprocessor design were helpful with clues as to the sources of the performance discrepancy: layout, circuit design, clocking methodology, and dynamic logic. I soon realized that I needed help in tracking down clues. Only at a wonderful institution like the University of California at Berkeley could I so easily commandeer an able-bodied graduate student like David Chinnery with a knowledge of architecture, circuits, computer-aided design and algorithms.
This text addresses the design methodologies and CAD tools available for the systematic design and design automation of analogue integrated circuits. The two complementary approaches discussed increase analogue design productivity, as demonstrated throughout by the design times of the different design experiments undertaken.
The modern wireless communication industry has put great demands on circuit designers for smaller, cheaper transceivers in the gigahertz frequency range. One tool which has assisted designers in satisfying these requirements is the use of on-chip inductive elements (inductors and transformers) in silicon (Si) radio-frequency (RF) integrated circuits (ICs). These elements allow greatly improved levels of performance in Si monolithic low-noise amplifiers, power amplifiers, up-conversion and down-conversion mixers and local oscillators. Inductors can be used to improve the intermodulation distortion performance and noise figure of small-signal amplifiers and mixers. In addition, the gain of amplifier stages can be enhanced and the realization of low-cost on-chip local oscillators with good phase noise characteristics is made feasible. In order to reap these benefits, it is essential that the IC designer be able to predict and optimize the characteristics of on-chip inductive elements. Accurate knowledge of inductance values, quality factor (Q) and the influence of adjacent elements (on-chip proximity effects) and substrate losses is essential. In this book the analysis, modeling and application of on-chip inductive elements is considered. Using analyses based on Maxwell's equations, an accurate and efficient technique is developed to model these elements over a wide frequency range. Energy loss to the conductive substrate is modeled through several mechanisms, including electrically induced displacement and conduction currents and by magnetically induced eddy currents. These techniques have been compiled in a user-friendly software tool ASITIC (Analysis and Simulation of Inductors and Transformers for Integrated Circuits).
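To make the quality-factor notion concrete, here is a back-of-the-envelope sketch with assumed component values (not ASITIC itself): for a one-port inductor, Q is commonly taken as Im(Z)/Re(Z) of the input impedance, so a simple series R-L model gives

```python
import numpy as np

def inductor_q(freq_hz, inductance_h, series_r_ohm):
    """Q = Im(Z)/Re(Z) for a simple series R-L inductor model (assumed)."""
    z = series_r_ohm + 1j * 2 * np.pi * freq_hz * inductance_h
    return z.imag / z.real

# A 5 nH spiral with 4 ohms of effective series resistance at 2.4 GHz
# (all values hypothetical).
print(inductor_q(2.4e9, 5e-9, 4.0))   # Q of roughly 19
```

In practice, the substrate-loss mechanisms described above pull Q well below this ideal series-model value, which is precisely why field-based modeling tools like ASITIC are needed.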