Knowledge exists: you only have to find it.

VLSI design has come to an important inflection point with the appearance of large manufacturing variations as semiconductor technology has moved to 45 nm feature sizes and below. If we ignore the random variations in the manufacturing process, simulation-based design essentially becomes useless, since its predictions will be far from the reality of manufactured ICs. On the other hand, using design margins based on some traditional notion of worst-case scenarios can force us to sacrifice too much in terms of power consumption or manufacturing cost, to the extent of making the design goals infeasible. We need to account explicitly for the statistics of this random variability, so that design margins are accurate and we can find the optimum balance between yield loss and design cost. This discontinuity in design practice has led many researchers to develop effective methods of statistical design, where the designer can simulate not just the behavior of the nominal design, but the expected statistics of that behavior across manufactured ICs.

Memory circuits tend to be the hardest hit by these random variations because of their high replication count on any single chip, which demands a very high statistical quality from the product. Requirements of 5-6σ (0.
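To make the replication argument concrete, here is a minimal back-of-the-envelope sketch (not from the book; the 10 Mb array size and the assumption of independent, normally distributed cell failures are illustrative only) showing how per-cell tail requirements translate into chip yield:

```python
import math

def cell_failure_prob(sigma_level: float) -> float:
    """One-sided tail probability of a standard normal beyond sigma_level,
    i.e. the chance that one cell's parameter falls past the failure point."""
    return 0.5 * math.erfc(sigma_level / math.sqrt(2.0))

def chip_yield(sigma_level: float, n_cells: int) -> float:
    """Probability that all n_cells replicated cells work, assuming
    independent, identically distributed cell failures (an idealization)."""
    return (1.0 - cell_failure_prob(sigma_level)) ** n_cells

for s in (3, 4, 5, 6):
    p = cell_failure_prob(s)
    print(f"{s} sigma: per-cell failure ~ {p:.2e}, "
          f"yield of a 10 Mb array ~ {chip_yield(s, 10_000_000):.4f}")
```

Under these assumptions a 10 Mb array yields only a few percent at 5σ but over 99% at 6σ, which is why memory designers must characterize such extreme tails of the statistical distribution.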
As VLSI technology moves to the nanometer scale for transistor feature sizes, the impact of manufacturing imperfections results in large variations in circuit performance. Traditional CAD tools are not well-equipped to handle this scenario: they do not model the statistical nature of the circuit parameters and performances, or, where they do, the existing techniques tend to be over-simplified or intractably slow. Novel Algorithms for Fast Statistical Analysis of Scaled Circuits draws upon ideas for attacking analogous problems in other technical fields, such as computational finance, machine learning and actuarial risk, and synthesizes them with innovative attacks for the problem domain of integrated circuits. The result is a set of novel solutions to problems of efficient statistical analysis of circuits in the nanometer regime.
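As a rough illustration of the "intractably slow" point (my numbers, not the book's): the relative standard error of a plain Monte Carlo estimate of a failure probability p from n samples is roughly sqrt((1-p)/(np)), so the required sample count grows like 1/p as the event gets rarer.

```python
def mc_samples_needed(p_fail: float, rel_error: float = 0.1) -> float:
    """Approximate number of plain Monte Carlo samples needed to estimate
    a failure probability p_fail to a given relative standard error:
    rel_error ~ sqrt((1 - p) / (n * p))  =>  n ~ (1 - p) / (p * rel_error**2)."""
    return (1.0 - p_fail) / (p_fail * rel_error**2)

for p in (1e-3, 1e-6, 1e-9):
    print(f"p = {p:.0e}: ~{mc_samples_needed(p):.1e} circuit simulations "
          f"for 10% relative error")
```

Estimating a failure rate of 1e-9 to 10% accuracy this way would take on the order of 10^11 circuit simulations, which is the kind of cost the book's faster statistical techniques aim to avoid.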
You may like...
Heart Of A Strong Woman - From Daveyton… by Xoliswa Nduneni-Ngema, Fred Khumalo (Paperback)
Comparative Corporate Governance by Afra Afsharipour, Martin Gelter (Hardcover) - R6,752 (Discovery Miles 67 520)