Offers users the first resource guide that combines both the methodology and basics of SystemVerilog. Addresses how all these pieces fit together and how they should be used to verify complex chips rapidly and thoroughly. Unique in its broad coverage of SystemVerilog, advanced functional verification, and the combination of the two.
This book provides a broad overview of current research in optical interconnect technologies and architectures. Introductory chapters on high-performance computing and the associated issues in conventional interconnect architectures, and on the fundamental building blocks for integrated optical interconnect, provide the foundations for the bulk of the book, which brings together leading experts in the field of optical interconnect architectures for data communication. Particular emphasis is given to the ways in which the photonic components are assembled into architectures to address the needs of data-intensive on-chip communication, and to the performance evaluation of such architectures for specific applications.
As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being widely adopted for more flexible design. Reconfigurable computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the post-fabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a computation but also about the software flow that supports the design process. The goal of this book is to help designers become comfortable with these issues, and thus be able to exploit the vast opportunities possible with reconfigurable logic.
This monograph is motivated by the challenges faced in designing reliable VLSI systems in modern VLSI processes. The reliable operation of integrated circuits (ICs) has become increasingly difficult to achieve in the deep submicron (DSM) era. With continuously decreasing device feature sizes, combined with lower supply voltages and higher operating frequencies, the noise immunity of VLSI circuits is decreasing alarmingly. Thus, VLSI circuits are becoming more vulnerable to noise effects such as crosstalk, power supply variations, and radiation-induced soft errors. Among these noise sources, soft errors (errors caused by radiation particle strikes) have become an increasingly troublesome issue for memory arrays as well as combinational logic circuits. Also, in the DSM era, process variations are increasing at a significant rate, making it more difficult to design reliable VLSI circuits. Hence, it is important to efficiently design robust VLSI circuits that are resilient to radiation particle strikes and process variations. The work presented in this research monograph presents several analysis and design techniques with the goal of realizing VLSI circuits which are radiation and process variation tolerant.
Internationally refereed papers present the state of the art in computer-aided architectural design research. These papers reflect the theme of the 12th International Conference of CAADFutures, Integrating Technologies for Computer-Aided Design. Collectively, they provide the technological foundation for new ways of thinking about using computers to design. In addition, they address the education of designers themselves.
The Verilog hardware description language (HDL) provides the ability to describe digital and analog systems. This ability spans the range from descriptions that express conceptual and architectural design to detailed descriptions of implementations in gates and transistors. Verilog was originally developed at Gateway Design Automation Corporation during the mid-eighties, and tools to verify designs expressed in Verilog were implemented and marketed at the same time. Verilog is now an open IEEE standard, number 1364. Verilog HDL is used universally for digital designs in ASIC, FPGA, microprocessor, DSP and many other kinds of design centers, and is supported by most EDA companies. Research and education conducted in many universities also rely on Verilog. This book introduces the Verilog hardware description language and describes it in a comprehensive manner. Verilog HDL was originally developed and specified with the intent of use with a simulator, and the semantics of the language had not been fully described until now. In this book, each feature of the language is described using a semantic introduction, syntax and examples. Chapter 4 leads to the full semantics of the language by providing definitions of terms and explaining data structures and algorithms. The book is written with the approach that Verilog is not only a simulation or synthesis language, or a formal method of describing design, but a complete language addressing all of these aspects. This book covers many aspects of Verilog HDL that are essential parts of any design process.
Simulation of computer architectures has made rapid progress recently. The primary application areas are hardware/software performance estimation and optimization as well as functional and timing verification. Recent, innovative technologies such as retargetable simulator generation, dynamic binary translation, and sampling simulation have enabled widespread use of processor and system-on-chip (SoC) simulation tools in the semiconductor and embedded system industries. At the same time, processor and SoC simulation remains a very active research area, aiming, for example, at higher simulation speed, greater flexibility, and better accuracy/speed trade-offs. This book presents and discusses the principal technologies and the state of the art in high-level hardware architecture simulation, both at the processor and the system-on-chip level.
Uncertainty in key parameters, both within a chip and between different chips, plays an increasingly important role in the deep submicron era. As a result, manufacturing process spreads need to be considered during the design process, and a quantitative methodology is needed to ensure faultless functionality during product development despite process variations within given bounds. This book presents the technological, physical, and mathematical fundamentals for a design paradigm shift, from a deterministic process to a probability-oriented design process for microelectronic circuits. Readers will learn to evaluate the different sources of variation in the design flow in order to establish different design variants, while applying appropriate methods and tools to evaluate and optimize their designs.
This concise reference helps readers avoid the most commonplace errors in generating or interpreting engineering drawings. Applicable across multiple disciplines, Hanifan's lucid treatment of such essential skills as understanding and conveying data in a drawing, exacting precision in dimension and tolerance notations, and selecting the most appropriate drawing type for a particular engineering situation makes "Perfecting Engineering and Technical Drawing" a valuable resource for practicing engineers, engineering technologists, and students. It provides a straightforward explanation of the requirements for all common engineering drawing types; maximizes reader understanding of engineering drawing requirements by differentiating the types of drawings and their particular characteristics; elucidates electrical reference designation requirements, geometric dimensioning, and tolerancing errors; and explains the entire engineering documentation process from concept to delivery.
This book provides an in-depth overview of on-chip instrumentation technologies and the various approaches to adding instrumentation to System-on-Chip (ASIC, ASSP, FPGA, etc.) designs that are collectively becoming known as Design for Debug (DfD). On-chip instruments are hardware-based blocks added to a design for the specific purpose of improving the visibility of internal or embedded portions of the design (a specific instruction flow in a processor or a transaction on an on-chip bus, for example) in order to improve the analysis or optimization capabilities for an SoC. DfD is the methodology and infrastructure that surrounds the instrumentation. Coverage includes specific design examples and a discussion of implementations and DfD trade-offs in deciding to design or select instrumentation, or SoCs that include instrumentation. Although the focus is on hardware implementations, software and tools are also discussed in some detail.
The Future of Design Methodology gives a holistic overview of perspectives for design methodology, addresses trends for developing a powerful methodical support for design practice and provides a starting point for future design research. The chapters are written by leading scientists from around the world, who have great expertise in design methodology, as well as the farsightedness needed to develop design methodology further. The Future of Design Methodology is a detailed contribution to consolidated design methodology and design research. Instead of articulating the views of one scientist, it provides a comprehensive collection of perspectives and visions. The editor highlights the substantial deficiencies and problems of the current design methodology and summarizes the authors' findings to draw future-oriented conclusions. The comprehensive overview of the status of design methodology given in The Future of Design Methodology will help enhance the individual scientific development of junior researchers, while the authoritative perspectives on future design methodology will challenge the views of experts. It is suitable for readers working in a wide range of design fields, such as design methodology, engineering design and industrial design.
SystemC provides a robust set of extensions to the C++ language that enables rapid development of complex models of hardware and software systems. The authors focus on practical use of the language for modeling real systems, showing a step-by-step build-up of syntax, code examples for each concept, over 8,000 lines of downloadable code examples, updates to reflect the SystemC standard (IEEE 1666), why features are as they are, many resource references, and how SystemC fits into an ESL methodology. This new edition of an industry best seller is updated to reflect the standardization of SystemC as IEEE 1666 and other improvements based on feedback from readers of the first edition. The wide-ranging feedback also includes suggestions from the editors of the Japanese and Korean language translations, professors and students, and computer engineers from a broad industrial and geographical spectrum, all of whom have successfully used the first edition. New chapters have been added on the SystemC Verification Library, Transaction Level Modeling, and proposed changes to the current SystemC standard. David Black and Jack Donovan, well-known consultants in the EDA industry, have teamed with Bill Bunton and Anna Keist, experienced SystemC modeling engineers, to write the second edition of this highly popular classic. As a team, the authors bring over 100 years of ASIC and system design experience together to make a very readable introduction to SystemC.
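For readers unfamiliar with what a SystemC (IEEE 1666) model of the kind described above looks like, here is a minimal, self-contained sketch of a clocked counter. It is not taken from the book; the module, port, and signal names (Counter, tick, clk, reset, count) are invented for this illustration.

```cpp
#include <systemc.h>
#include <iostream>

// Illustrative only: an 8-bit counter modeled as a SystemC module.
SC_MODULE(Counter) {
    sc_in<bool>          clk;
    sc_in<bool>          reset;
    sc_out<sc_uint<8> >  count;

    void tick() {
        if (reset.read()) value = 0;
        else              value = value + 1;
        count.write(value);
    }

    SC_CTOR(Counter) : value(0) {
        SC_METHOD(tick);           // run tick() on every rising clock edge
        sensitive << clk.pos();
        dont_initialize();
    }

  private:
    sc_uint<8> value;
};

int sc_main(int, char*[]) {
    sc_clock               clk("clk", 10, SC_NS);   // 10 ns clock period
    sc_signal<bool>        reset;
    sc_signal<sc_uint<8> > count;

    Counter counter("counter");
    counter.clk(clk);
    counter.reset(reset);
    counter.count(count);

    reset = true;                  // hold reset for two clock cycles
    sc_start(20, SC_NS);
    reset = false;
    sc_start(100, SC_NS);

    std::cout << "final count = " << count.read() << std::endl;
    return 0;
}
```

Compiled against a SystemC installation, sc_main instantiates the module, binds its ports to signals, drives the clock and reset, and runs the simulation for a fixed number of nanoseconds.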
This book reviews the algorithms for processing geometric data, with a practical focus on important techniques not covered by traditional courses on computer vision and computer graphics. Features: presents an overview of the underlying mathematical theory, covering vector spaces, metric spaces, affine spaces, differential geometry, and finite difference methods for derivatives and differential equations; reviews geometry representations, including polygonal meshes, splines, and subdivision surfaces; examines techniques for computing curvature from polygonal meshes; describes algorithms for mesh smoothing, mesh parametrization, and mesh optimization and simplification; discusses point location databases and convex hulls of point sets; investigates the reconstruction of triangle meshes from point clouds, including methods for registration of point clouds and surface reconstruction; provides additional material at a supplementary website; includes self-study exercises throughout the text.
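As a flavour of the mesh-processing algorithms mentioned above (smoothing in particular), the following is a minimal sketch of one Laplacian smoothing pass; the adjacency representation, function name, and damping factor are assumptions made for this example rather than material from the book.

```cpp
#include <array>
#include <cstddef>
#include <vector>

using Vec3 = std::array<double, 3>;

// Illustrative sketch: one pass of Laplacian smoothing.
// Each vertex moves a fraction lambda towards the centroid
// of its one-ring neighbours.
std::vector<Vec3> laplacian_smooth(const std::vector<Vec3>& verts,
                                   const std::vector<std::vector<std::size_t> >& one_ring,
                                   double lambda = 0.5) {
    std::vector<Vec3> out = verts;
    for (std::size_t i = 0; i < verts.size(); ++i) {
        if (one_ring[i].empty()) continue;   // isolated vertex: leave in place
        Vec3 centroid{0.0, 0.0, 0.0};
        for (std::size_t j : one_ring[i])
            for (int k = 0; k < 3; ++k) centroid[k] += verts[j][k];
        for (int k = 0; k < 3; ++k) {
            centroid[k] /= static_cast<double>(one_ring[i].size());
            out[i][k] = verts[i][k] + lambda * (centroid[k] - verts[i][k]);
        }
    }
    return out;
}
```

Repeating the pass shrinks high-frequency noise on the mesh; more elaborate variants (cotangent weights, Taubin smoothing) follow the same structure.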
This book collects together several of the tutorials held at EUROGRAPHICS'89 in Hamburg. The conference was held under the motto "Integration, Visualisation, Interaction" and the tutorials reflect the conference theme. The Springer series EurographicSeminars with the volumes "Advances in Computer Graphics" regularly provides a professional update on current mainstream topics in the field. These publications give readers the opportunity to inform themselves thoroughly on the topics covered. The success of the series is mainly based on the expertise of the contributing authors, who are recognized professionals in their field. Starting out with one of the conference's main topics, the chapter "Visualization of Scientific Data" gives an overview of methods for displaying scientific results in an easily surveyable and comprehensible form. It presents algorithms and methods utilized to achieve visualization results in a form adequate for humans. User interfaces for such systems are also explored, and practical conclusions are drawn. The chapter "Color in Computer Graphics" describes the problems of manipulating and matching color in the real world. After some fundamental statements about color models and their relationships, the main emphasis is placed on the problem of objective color specification for computer graphics systems. It is very hard to match colors between devices such as scanners, printers and displays. Some suggestions on the effective use of color for graphics are also made.
Modern electronics depend on nanoscaled technologies that present new challenges in terms of testing and diagnostics. Memories are particularly prone to defects since they exploit the technology limits to get the highest density. This book is an invaluable guide to the testing and diagnostics of the latest generation of SRAM, one of the most widely applied types of memory. Classical methods for testing memory are designed to handle the so-called "static faults," but these test solutions are not sufficient for faults that are emerging in the latest Very Deep Sub-Micron (VDSM) technologies. These new fault models, referred to as "dynamic faults", are not covered by classical test solutions and require the dedicated test sequences presented in this book.
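For orientation, the sketch below walks a simple bit-array memory stand-in through March C-, a well-known example of the classical tests aimed at the "static faults" that the book contrasts with dedicated dynamic-fault sequences. The memory model and function names are assumptions of this illustration, not the book's test algorithms.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// March C-: {up(w0); up(r0,w1); up(r1,w0); down(r0,w1); down(r1,w0); any(r0)}
bool march_c_minus(std::vector<bool>& mem) {
    const std::size_t n = mem.size();
    bool ok = true;
    auto expect = [&](std::size_t a, bool v) { if (mem[a] != v) ok = false; };

    for (std::size_t a = 0; a < n; ++a) mem[a] = false;                        // up(w0)
    for (std::size_t a = 0; a < n; ++a) { expect(a, false); mem[a] = true;  }  // up(r0,w1)
    for (std::size_t a = 0; a < n; ++a) { expect(a, true);  mem[a] = false; }  // up(r1,w0)
    for (std::size_t a = n; a-- > 0;  ) { expect(a, false); mem[a] = true;  }  // down(r0,w1)
    for (std::size_t a = n; a-- > 0;  ) { expect(a, true);  mem[a] = false; }  // down(r1,w0)
    for (std::size_t a = 0; a < n; ++a) expect(a, false);                      // any(r0)
    return ok;
}

int main() {
    std::vector<bool> memory(1024, false);   // fault-free memory stand-in
    std::cout << (march_c_minus(memory) ? "pass" : "fail") << std::endl;
    return 0;
}
```

Dynamic faults, by contrast, are only sensitized by specific back-to-back operation sequences, which is why they require the dedicated test sequences the book presents.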
Many problems in scientific computing are intractable with classical numerical techniques. These fail, for example, in the solution of high-dimensional models due to the exponential increase in the number of degrees of freedom. Recently, the authors of this book and their collaborators have developed a novel technique, called Proper Generalized Decomposition (PGD), that has proven to be a significant step forward. By means of a successive enrichment strategy, the PGD builds a numerical approximation of the unknown fields in a separated form. Although first introduced and successfully demonstrated in the context of high-dimensional problems, the PGD allows for a completely new approach to addressing more standard problems in science and engineering. Indeed, many challenging problems can be efficiently cast into a multi-dimensional framework, thus opening entirely new solution strategies in the PGD framework. For instance, the material parameters and boundary conditions appearing in a particular mathematical model can be regarded as extra coordinates of the problem in addition to the usual coordinates such as space and time. In the PGD framework, this enriched model is solved only once to yield a parametric solution that includes all particular solutions for specific values of the parameters. The PGD has now attracted the attention of a large number of research groups worldwide. The present text is the first available book describing the PGD. It provides a very readable and practical introduction that allows the reader to quickly grasp the main features of the method. Throughout the book, the PGD is applied to problems of increasing complexity, and the methodology is illustrated by means of carefully selected numerical examples. Moreover, the reader has free access to the Matlab(c) software used to generate these examples.
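To make the "separated form" concrete: the PGD approximates a field defined over many coordinates as a finite sum of products of one-dimensional (or low-dimensional) functions. In generic notation, chosen here for illustration rather than quoted from the book:

```latex
u(x_1, x_2, \dots, x_d) \;\approx\; \sum_{i=1}^{N} F_i^{1}(x_1)\, F_i^{2}(x_2) \cdots F_i^{d}(x_d)
```

Each enrichment step adds one further product term to the sum, and extra coordinates such as material parameters or boundary-condition data simply appear as additional factors in each term, which is what yields the parametric solution described above.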
Innovation in Product Design gives an overview of the research fields and achievements in the development of methods and tools for product design and innovation. It presents contributions from experts in many different fields covering a variety of research topics related to product development and innovation. Product lifecycle management, knowledge management, product customization, topological optimization, product virtualization, systematic innovation, virtual humans, design and engineering, and rapid prototyping are the key research areas described in the book. It also details successful case studies developed with industrial companies. Innovation in Product Design is written for academic researchers, graduate students and professionals in product development disciplines who are interested in understanding how novel methodologies and technologies can make the product development process more efficient.
New imaging technology and more sophisticated image processing systems will have a profound effect on those areas of medicine which are concerned with imaging for diagnosis and therapy planning. Digitally formatted data will form the basis of an increasing number of medical imaging modalities. Before the diagnostic imaging department of the future can become largely digital, many problems still have to be solved as regards image quality, costs, and ease of use. The computer and other methods derived from information science will contribute towards solving many of the problems in these areas, and an information-science-driven evolution in imaging for radiology and related departments is widely expected. Computer assistance may be applied to image generation (e.g. CT, MRI, DR and DSR), to the storing and transferring of images, and to the viewing, analysing and interpreting of images. The application of computers to these activities, which characterise radiological departments, may be defined as Computer Assisted Radiology (CAR). In the main, CAR will promote the transition from analog imaging systems to digital systems, the integration of digital imaging modalities through Picture Archiving and Communication Systems (PACS), and the graduated employment of Medical Work Stations (MWS) for diagnosis and therapy planning. It will transform geographically, organisationally and/or mentally isolated imaging activities into fully integrated multi-imaging-modality diagnostic departments. This development will have a considerable impact on patient management, on the medical profession and on the health care system.
Ontologies are increasingly recognized as essential tools in information science. Although the concepts are well understood theoretically, the practical implementation of ontologies remains challenging. In this book, researchers in computer science, information systems, ontology engineering, urban planning and design, civil and building engineering, and architecture present an interdisciplinary study of ontology engineering and its application in urban development projects. The first part of the book introduces the general notion of ontology, describing variations in abstraction level, coverage, and formality. It also discusses the use of ontologies to achieve interoperability, and to represent multiple points of view and multilingualism. This is illustrated with examples from the urban domain. The second part is specific to urban development. It covers spatial and geographical knowledge representation, the creation of urban ontologies from various knowledge sources, the interconnection of urban models and the interaction between standards and domain models. The third part presents case studies of the development of ontologies for urban mobility, urban morphological processes, road systems, and cultural heritage. Other cases report on the use of ontologies to solve urban development problems, in construction business models, building regulations and urban regeneration. It concludes with a discussion of key challenges for the future deployment of ontologies in this domain. This book bridges the gap between urban practitioners and computer scientists. As the essence of most urban projects lies in making connections between worldviews, ontology development has an important role to play, in promoting interoperability between data sources, both formal (urban databases, Building Integrated Models, Geographical Information Systems etc.) and less formal (thesauri, text records, web sources etc.). This volume offers a comprehensive introduction to ontology engineering for urban development. It is essential reading for practitioners and ontology designers working in urban development.
Boundary-Scan, formally known as IEEE/ANSI Standard 1149.1-1990, is a collection of design rules applied principally at the Integrated Circuit (IC) level that allow software to alleviate the growing cost of designing, producing and testing digital systems. A fundamental benefit of the standard is its ability to transform extremely difficult printed circuit board testing problems that could only be attacked with ad-hoc testing methods into well-structured problems that software can easily deal with. IEEE standards, when embraced by practicing engineers, are living entities that grow and change quickly. The Boundary-Scan Handbook, Second Edition: Analog and Digital is intended to describe these standards in simple English rather than the strict and pedantic legalese encountered in the standards. The 1149.1 standard is now over eight years old and has a large infrastructure of support in the electronics industry. Today, the majority of custom ICs and programmable devices contain 1149.1. New applications for the 1149.1 protocol have been introduced, most notably the `In-System Configuration' (ISC) capability for Field Programmable Gate Arrays (FPGAs). The Boundary-Scan Handbook, Second Edition: Analog and Digital updates the information about IEEE Std. 1149.1, including the 1993 supplement that added new silicon functionality and the 1994 supplement that formalized the BSDL language definition. In addition, the new second edition presents completely new information about the newly approved 1149.4 standard often termed `Analog Boundary-Scan'. Along with this is a discussion of Analog Metrology needed to make use of 1149.1. This forms a toolset essential for testing boards and systems of the future.
This book constitutes the refereed proceedings of the 10th International Conference on Cooperative Design, Visualization, and Engineering, CDVE 2013, held in Palma de Mallorca, Spain, in September 2013. The 34 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers cover all the topics of cooperative engineering, basic theories, methods and technologies that support CDVE, cooperative design, visualization and applications. There are special contributions dealing with the cooperative issues brought by the Internet of things - such as the situation in the ambient assisted living systems. Other papers in the volume cover a wide range of cooperative application topics such as cooperative e-learning, cooperative decision making and cooperative simulation etc.
This book publishes the peer-reviewed proceedings of the third Design Modeling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine the computational processes within this field, with the aim of developing a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches developed and studied over recent years. The outcome is a set of new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided by responsibility towards both the processes and the consequences they initiate.
This book proposes a new approach to circuit simulation that is still in its infancy. The reason for publishing this work as a monograph at this time is to quickly distribute these ideas to the research community for further study. The book is based on a doctoral dissertation undertaken at MIT between 1982 and 1985. In 1982 the author joined a research group that was applying bounding techniques to simple VLSI timing analysis models. The conviction that bounding analysis could also be successfully applied to sophisticated digital MOS circuit models led to the research presented here. Acknowledgments: The author would like to acknowledge many helpful discussions and much support from his research group at MIT, including Lance Glasser, John Wyatt, Jr., and Paul Penfield, Jr. Many others have also contributed to this work in some way, including Albert Ruehli, Mark Horowitz, Rich Zippel, Chris Terman, Jacob White, Mark Matson, Bob Armstrong, Steve McCormick, Cyrus Bamji, John Wroclawski, Omar Wing, Gary Dare, Paul Bassett, and Rick LaMaire. The author would like to give special thanks to his wife, Deborra, for her support and many contributions to the presentation of this research. The author would also like to thank his parents for their encouragement, and IBM for its financial support of this project through a graduate fellowship. THE BOUNDING APPROACH TO VLSI CIRCUIT SIMULATION. 1. INTRODUCTION: The VLSI revolution of the 1970s has created a need for new circuit analysis techniques.
From simple cases such as the hook and latch attachments found in Velcro to articulated-wing flying vehicles, biology has often been used to inspire many creative design ideas. The scientific challenge now is to transform the paradigm into a repeatable and scalable methodology. Biologically Inspired Design explores computational techniques and tools that can help integrate the method into design practice. By exploring these fundamental theories, techniques and tools for supporting biologically inspired design, this volume provides a comprehensive resource for design practitioners wishing to explore the paradigm, an invaluable guide to design educators interested in teaching the method, and preliminary reading for design researchers wanting to investigate bioinspired design.
You may like...
Thin-Walled Structures - Research and…, by J.Y.Richard Liew, V. Thevendran, … (Hardcover): R6,092 (Discovery Miles 60 920)
Computer Modelling of Microporous…, by C.Richard A. Catlow, Berend Smit, … (Hardcover): R4,306 (Discovery Miles 43 060)
Nanofluid Applications for Advanced…, by Shriram S. Sonawane, Mohsen Sharifpur (Paperback): R3,922 (Discovery Miles 39 220)
New Perspectives on Information Systems…, by Antonio Miguel Rosado da Cruz, Maria Estrela Ferreira da Cruz (Hardcover): R5,853 (Discovery Miles 58 530)
Human Resource Information Systems…, by Michael J Kavanagh, Richard D. Johnson (Paperback): R2,036 (Discovery Miles 20 360)