In 1998-99, at the dawn of the SoC Revolution, we wrote Surviving the SOC Revolution: A Guide to Platform Based Design. In that book, we focused on presenting guidelines and best practices to aid engineers beginning to design complex System-on-Chip devices (SoCs). Now, in 2003, facing the mid-point of that revolution, we believe that it is time to focus on winning. In this book, Winning the SoC Revolution: Experiences in Real Design, we gather the best practical experiences in how to design SoCs from the most advanced design groups, while setting the issues and techniques in the context of SoC design methodologies. As an edited volume, this book has contributions from the leading design houses that are winning in SoCs - Altera, ARM, IBM, Philips, TI, UC Berkeley, and Xilinx. These chapters present the many facets of SoC design - the platform-based approach, how best to utilize IP, verification, FPGA fabrics as an alternative to ASICs, and next-generation process technology issues. We also include observations from Ron Wilson of CMP Media on best practices for SoC design team collaboration. We hope that by utilizing this book, you, too, will win the SoC Revolution.
The design process of embedded systems has changed substantially in recent years. One of the main reasons for this change is the pressure to shorten time-to-market when designing digital systems. To shorten the product cycles, programmable processors are used to implement more and more functionality of the embedded system. Therefore, nowadays, embedded systems are very often implemented by heterogeneous systems consisting of ASICs, processors, memories and peripherals. As a consequence, the research topic of hardware/software co-design, dealing with the problems of designing these heterogeneous systems, has gained great importance. Hardware/Software Co-design for Data Flow Dominated Embedded Systems introduces the different tasks of hardware/software co-design, including system specification, hardware/software partitioning, co-synthesis and co-simulation. The book summarizes and classifies state-of-the-art co-design tools and methods for these tasks. In addition, the co-design tool COOL is presented, which solves the co-design tasks for the class of data-flow dominated embedded systems. In Hardware/Software Co-design for Data Flow Dominated Embedded Systems the primary emphasis is on the hardware/software partitioning and co-synthesis phases and their coupling. In contrast to many other publications in this area, a mathematical formulation of the hardware/software partitioning problem is given. This problem formulation supports target architectures consisting of multiple processors and multiple ASICs. Several novel approaches are presented and compared for solving the partitioning problem, including an MILP approach, a heuristic solution and an approach based on genetic algorithms. The co-synthesis phase is based on the idea of controlling the system by means of a static run-time scheduler implemented in hardware. New algorithms are introduced which generate a complete set of hardware and software specifications required to implement heterogeneous systems. All of these techniques are described in detail and exemplified. Hardware/Software Co-design for Data Flow Dominated Embedded Systems is intended to serve students and researchers working on hardware/software co-design. At the same time, the variety of presented techniques automating the design tasks of hardware/software systems will be of interest to industrial engineers and designers of digital systems. From the foreword by Peter Marwedel: "Niemann's method should be known by all persons working in the field. Hence, I recommend this book for everyone who is interested in hardware/software co-design."
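To give a concrete flavour of the partitioning problem described above, the following Python sketch shows a toy greedy heuristic of the kind co-design tools build on: tasks offering the best speedup per unit of hardware area are moved into hardware until an area budget is exhausted. This is an illustrative simplification, not the book's COOL tool, its MILP formulation or its genetic algorithm; the task names and cost numbers are invented.

```python
# Toy hardware/software partitioning heuristic (illustrative only).
# Each task has a software execution time, a hardware execution time
# and a hardware area cost; tasks with the best speedup per unit of
# area are moved into hardware until the area budget is spent.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    sw_time: float   # execution time if mapped to a processor
    hw_time: float   # execution time if mapped to an ASIC
    hw_area: float   # area cost of a hardware implementation

def greedy_partition(tasks, area_budget):
    """Return (hardware tasks, software tasks) under an area budget."""
    hw, sw = [], list(tasks)
    # Rank tasks by speedup gained per unit of hardware area.
    ranked = sorted(tasks, key=lambda t: (t.sw_time - t.hw_time) / t.hw_area,
                    reverse=True)
    used = 0.0
    for t in ranked:
        if t.sw_time > t.hw_time and used + t.hw_area <= area_budget:
            hw.append(t)
            sw.remove(t)
            used += t.hw_area
    return hw, sw

if __name__ == "__main__":
    tasks = [Task("fir", 9.0, 1.0, 4.0),
             Task("fft", 20.0, 2.0, 10.0),
             Task("ctrl", 1.5, 1.2, 6.0)]
    hw, sw = greedy_partition(tasks, area_budget=12.0)
    print("hardware:", [t.name for t in hw])
    print("software:", [t.name for t in sw])
```

A real partitioner of the kind the book discusses would also account for communication cost between processing elements and for scheduling feasibility, rather than a single area budget.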
The building blocks of today's embedded systems-on-a-chip are complex IP components and programmable processor cores. This means that more and more system functionality is implemented in software rather than in custom hardware. In turn, this indicates a growing need for high-level language compilers, capable of generating efficient code for embedded processors. However, traditional compiler technology hardly keeps pace with new developments in embedded processor architectures. Many existing compilers for DSPs and multimedia processors therefore produce code of insufficient quality with respect to performance and/or code size, and a large part of software for embedded systems is still being developed in assembly languages. As both embedded software and processor architectures are getting more and more complex, assembly programming clearly violates the demands for a short time-to-market and high dependability in embedded system design. The goal of this book is to provide new methods and techniques to software and compiler developers that help to make the necessary step from assembly programming to the use of compilers in embedded system design. Code Optimization Techniques for Embedded Processors discusses the state-of-the-art in the area of compilers for embedded processors. It presents a collection of new code optimization techniques, dedicated to DSP and multimedia processors. These include: compiler support for DSP address generation units, efficient mapping of data flow graphs to irregular architectures, exploitation of SIMD and conditional instructions, as well as function inlining under code size constraints. Comprehensive experimental evaluations are given for real-life processors, indicating the code quality improvements that can be achieved compared to earlier techniques. In addition, C compiler frontend issues are discussed from a practical viewpoint. Code Optimization Techniques for Embedded Processors is intended for researchers and engineers active in software development for embedded systems, and for compiler developers in academia and industry.
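As a flavour of one of the optimizations listed above, the sketch below shows a toy version of function inlining under a code-size constraint: call sites are ranked by estimated benefit per byte of code growth and inlined greedily within a budget. It is an assumption-laden illustration rather than an algorithm from the book; all names and numbers are made up.

```python
# Toy function-inlining pass under a code-size budget (illustrative only).
# Each call site has an estimated speed benefit if its callee is inlined
# and an estimated code-size growth; the most profitable call sites are
# inlined until the allowed growth budget is spent.
from dataclasses import dataclass

@dataclass
class CallSite:
    callee: str
    benefit: float      # estimated cycles saved by inlining
    size_growth: int    # extra bytes of code after inlining

def select_inlines(call_sites, size_budget):
    """Pick call sites to inline, greedily by benefit per byte of growth."""
    chosen, used = [], 0
    for cs in sorted(call_sites, key=lambda c: c.benefit / c.size_growth,
                     reverse=True):
        if used + cs.size_growth <= size_budget:
            chosen.append(cs)
            used += cs.size_growth
    return chosen, used

if __name__ == "__main__":
    sites = [CallSite("saturate", 120.0, 48),
             CallSite("dot_product", 300.0, 256),
             CallSite("log_error", 5.0, 96)]
    chosen, used = select_inlines(sites, size_budget=320)
    print("inline:", [c.callee for c in chosen], "growth:", used, "bytes")
```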
Algorithms for VLSI Physical Design Automation, Third Edition covers all aspects of physical design. The book is a core reference for graduate students and CAD professionals. For students, concepts and algorithms are presented in an intuitive manner. For CAD professionals, the material presents a balance of theory and practice. An extensive bibliography is provided which is useful for finding advanced material on a topic. At the end of each chapter, exercises are provided, which range in complexity from simple to research level. Algorithms for VLSI Physical Design Automation, Third Edition provides a comprehensive background in the principles and algorithms of VLSI physical design. The goal of this book is to serve as a basis for the development of introductory-level graduate courses in VLSI physical design automation. It provides self-contained material for teaching and learning algorithms of physical design. All algorithms which are considered basic have been included, and are presented in an intuitive manner. Yet, at the same time, enough detail is provided so that readers can actually implement the algorithms given in the text and use them. The first three chapters provide the background material, while each of the remaining chapters focuses on a phase of the physical design cycle. In addition, newer topics such as physical design automation of FPGAs and MCMs have been included. The basic purpose of the third edition is to investigate the new challenges presented by interconnect and process innovations. In 1995, when the second edition of this book was prepared, a six-layer process and 15-million-transistor microprocessors were in advanced stages of design. In 1998, six-metal processes and 20-million-transistor designs are in production. Two new chapters have been added and new material has been included in almost all other chapters. A new chapter on process innovation and its impact on physical design has been added. Another focus of the third edition is to promote use of the Internet as a resource, so wherever possible URLs have been provided for further investigation. Algorithms for VLSI Physical Design Automation, Third Edition is an important core reference work for professionals as well as an advanced-level textbook for students.
Embedded computer systems use both off-the-shelf microprocessors and application-specific integrated circuits (ASICs) to implement specialized system functions. Examples include the electronic systems inside laser printers, cellular phones, microwave ovens, and automobile anti-lock brake controllers. Embedded computing is unique because it is a co-design problem - the hardware engine and application software architecture must be designed simultaneously. Hardware-Software Co-Synthesis of Distributed Embedded Systems proposes new techniques such as fixed-point iterations, phase adjustment, and separation analysis to efficiently estimate tight bounds on the delay required for a set of multi-rate processes preemptively scheduled on a real-time reactive distributed system. Based on the delay bounds, a gradient-search co-synthesis algorithm with new techniques such as sensitivity analysis, priority prediction, and idle-processing-element elimination is developed to select the number and types of processing elements in a distributed engine, and to determine the allocation and scheduling of processes to processing elements. New communication modeling is also presented to analyze communication delay under the interaction of computation and communication, allocate interprocessor communication links, and schedule communication. Hardware-Software Co-Synthesis of Distributed Embedded Systems is the first book to describe techniques for the design of distributed embedded systems, which have arbitrary hardware and software topologies. The book will be of interest to: academic researchers for personal libraries and advanced-topics courses in co-design, as well as industrial designers who are building high-performance, real-time embedded systems with multiple processors.
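The fixed-point flavour of such delay-bound estimation can be illustrated with the classical response-time recurrence for preemptive fixed-priority scheduling on a single processing element, sketched below in Python. This is a textbook simplification, not the multi-rate distributed analysis developed in the book; the execution times and periods are invented, and deadlines are assumed equal to periods.

```python
# Classical fixed-point response-time analysis (illustrative only).
# Each process i has a worst-case execution time C[i] and a period T[i];
# processes are listed in decreasing priority order.  The response time
# of process i is the smallest fixed point of
#   R = C[i] + sum over higher-priority j of ceil(R / T[j]) * C[j].
import math

def response_time(C, T, i, max_iters=1000):
    """Return a delay bound for process i, or None if it misses its deadline."""
    R = C[i]
    for _ in range(max_iters):
        interference = sum(math.ceil(R / T[j]) * C[j] for j in range(i))
        R_next = C[i] + interference
        if R_next == R:          # fixed point reached: R is the delay bound
            return R
        if R_next > T[i]:        # deadline (assumed == period) exceeded
            return None
        R = R_next
    return None

if __name__ == "__main__":
    C = [1, 2, 3]        # worst-case execution times (invented numbers)
    T = [4, 8, 16]       # periods, one per process
    for i in range(len(C)):
        print(f"process {i}: delay bound = {response_time(C, T, i)}")
```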
The purpose of this book is to introduce VHSIC Hardware Description Language (VHDL) and its use for synthesis. VHDL is a hardware description language which provides a means of specifying a digital system over different levels of abstraction. It supports behavior specification during the early stages of a design process and structural specification during the later implementation stages. VHDL was originally introduced as a hardware description language that permitted the simulation of digital designs. It is now increasingly used for design specifications that are given as the input to synthesis tools which translate the specifications into netlists from which the physical systems can be built. One problem with this use of VHDL is that not all of its constructs are useful in synthesis. The specification of delay in signal assignments does not have a clear meaning in synthesis, where delays have already been determined by the implementation technology. VHDL has data structures such as files and pointers, useful for simulation purposes but not for actual synthesis. As a result, synthesis tools accept only subsets of VHDL. This book tries to cover the synthesis aspect of VHDL, while keeping the simulation-specifics to a minimum. This book is suitable for working professionals as well as for graduate or undergraduate study. Readers can view this book as a way to get acquainted with VHDL and how it can be used in modeling of digital designs.
This book grants the reader a comprehensive overview of the state-of-the-art in system-level memory management (data transfer and storage) related issues for complex data-dominated real-time signal and data processing applications. The authors introduce their own system-level data transfer and storage exploration methodology for data-dominated video applications. This methodology tackles the power and area reduction cost components in the architecture for this target domain, namely the system-level busses and the background memories. For the most critical tasks in the methodology, prototype tools have been developed to reduce the design time. The approach is also very heavily application-driven, as illustrated by several realistic demonstrators, partly used as red-thread examples in the book. The quite general applicability and effectiveness have been substantiated for several industrial data-dominated applications, including H.263 video conferencing decoding and medical computer tomography (CT) back projection. To the researcher, the book will serve as an excellent reference source, both for the overall description of the methodology and for the detailed descriptions of the system-level methodologies and synthesis techniques and algorithms. To design engineers and CAD managers, it offers an invaluable insight into the anticipated evolution of commercially available design tools, as well as allowing them to utilize the book's concepts in their own research and development.
Co-design is the set of emerging techniques which allows for the simultaneous design of hardware and software. In many cases where the application is very demanding in terms of various performances (time, area, power consumption), trade-offs between dedicated hardware and dedicated software are becoming increasingly difficult to decide upon in the early stages of a design. Verification techniques - such as simulation or proof techniques - that have proven necessary in hardware design must be dramatically adapted to the simultaneous verification of software and hardware. Describing the latest tools available for both co-design and co-verification of systems, Hardware/Software Co-Design and Co-Verification offers a complete look at this evolving set of procedures for CAD environments. The book considers all the trade-offs that have to be made when co-designing a system. Several models are presented for determining the optimum solution to any co-design problem, including partitioning, architecture synthesis and code generation. When deciding on trade-offs, one of the main factors to be considered is the flow of communication, especially to and from the outside world. This involves the modeling of communication protocols. An approach to the synthesis of interface circuits in the context of co-design is presented. Other chapters present a co-design-oriented flexible component database and retrieval methods; a case study of an Ethernet bridge designed using LOTOS and co-design methodologies; and finally a programmable user interface based on monitors. Hardware/Software Co-Design and Co-Verification will help designers and researchers to understand these latest techniques in system design, and as such will be of interest to all involved in embedded system design.
This is the first book to cover verification strategies and methodologies for SoC verification, from system-level verification to design sign-off. All the verification aspects in this exciting new book are illustrated with a single reference design for a Bluetooth application.
Object-oriented techniques and languages have been proven to significantly increase engineering efficiency in software development. Many benefits are expected from their introduction into electronic modeling. Among them are better support for model reusability and flexibility, more efficient system modeling, and more possibilities in design space exploration and prototyping. Object-Oriented Modeling explores the latest techniques in object-oriented methods, formalisms and hardware description language extensions. The seven chapters comprising this book provide an overview of the latest object-oriented techniques for designing systems and hardware. Many examples are given in C++, VHDL and real-time programming languages. Object-Oriented Modeling further describes the use of object-oriented techniques in applications such as embedded systems, telecommunications and real-time systems, using the very latest techniques in object-oriented modeling. It is an essential guide for researchers, practitioners and students involved in software, hardware and system design.
From the reviews: "This book crystallizes what may become a defining moment in the electronics industry - the shift to platform-based design. It provides the first comprehensive guidebook for those who will build, and use, the integration platforms that may soon drive the system-on-chip revolution." Electronic Engineering Times
Models in system design follow the general tendency in electronics in terms of size, complexity and difficulty of maintenance. While a model should be a manageable representation of a system, this increasing complexity sometimes forces current CAD-tool designers and model writers to apply modeling techniques to the model itself. Model writers are interested in instrumenting their models, so as to extract critical information before the model is complete. CAD-tool designers use internal representations of the design at various stages. The complexity has also led CAD-tool developers to develop formal tools, theories and methods to improve the relevance, completeness and consistency of those internal representations. Information modeling involves the representation of objects, their properties and relationships.

Performance Modeling: When it comes to design choices and trade-offs, performance is generally the final key. However, performance estimations have to be extracted at a very early stage in the system design. Performance modeling concerns the set of tools and techniques that allow or help the designer to capture metrics relating to future architectures. Performance modeling encompasses the whole system, including software modeling. It has a strong impact on all levels of design choices, from hardware/software partitioning to the final layout.

Information Modeling: Specification and formalism have in the past traditionally played little part in the design and development of EDA systems, their support environments, languages and processes. Instead, EDA system developers and EDA system users have seemed content to operate within environments that are often extremely complex and may be poorly tested and understood. This situation has now begun to change with the increasing use of techniques drawn from the domains of formal specification and database design. This section of the volume addresses aspects of the techniques being used. In particular, it considers a specific formalism, called information modeling, which has gained increasing acceptance recently and is now a key part of many of the proposals in the EDA Standards Roadmap, which promises to be of significance to the EDA industry. In addition, the section looks at an example of a design system from the point of view of its underlying understanding of the design process rather than through a consideration of particular CAD algorithms.

Meta-Modeling: Performance and Information Modeling contains papers describing the very latest techniques used in meta-modeling. It will be a valuable text for researchers, practitioners and students involved in Electronic Design Automation.
ConieD is the biannual Congress on Computers in Education, organised by the Spanish Association for the Development of Computers in Education (ADIE). The last Congress, held in Puertollano (Ciudad Real), brought together researchers in different areas, ranging from web applications, educational environments and Human-Computer Interaction to Artificial Intelligence in Education. The common leitmotif of most of the lectures was the World Wide Web. In particular, the focus was on the real possibilities that this medium presents in order to make the access of students to educational resources possible anywhere and anytime. This fact was highlighted in the Conclusions of the Congress, which follow this Preface as the Introduction. From the 92 full papers presented to the Programme Committee, we have selected the best 24 papers, which we present in this book. The selection of papers was a very difficult process, taking into account that the papers presented at the Congress (60) were all good enough to appear in this book. Only the restrictions on the length of this book have limited the number of papers to 24. These papers represent the current high-quality contributions of Spanish research groups in Computers in Education. Manuel Ortega Cantero and Jose Bravo Rodriguez, Editors. Introduction: ConieD'99 (1st National Congress on Computers in Education) brought together a very important group of Spanish and Latin American researchers devoted to studying the application and use of computers in education.
The VITAL specification addresses the issues of interoperability, back-annotation and high-performance simulation for sign-off-quality ASIC libraries in VHDL. VITAL provides modeling guidelines and a set of pre-defined packages (containing pre-defined routines for modeling functionality and timing) to facilitate the acceleration of designs which use cells from a VITAL library. The VITAL Level-1 guidelines constrain the modeling capabilities provided by VHDL in order to facilitate higher performance (Figure 1: VHDL and VITAL). Even within the Level-1 guidelines, there are several ways in which a model can be written. In this chapter, we highlight the various modeling trade-offs and provide guidelines which can be used for developing efficient models. We will also discuss the techniques that can be used by tool developers to accelerate the simulation of VITAL-based designs. Overview of a VITAL Level-1 architecture: the VITAL specification is versatile enough to support several modeling styles, e.g., distributed delay style, pin-to-pin delay style, etc. In general, a VITAL Level-1 model can have the structure illustrated in Figure 2.
Several aspects of informatics curricula and teaching methods at the university level are reported in this volume. This book contains a selection of the papers presented at the Working Conference on Informatics Curricula, Teaching Methods and Best Practice (ICTEM 2002), which was sponsored by the International Federation for Information Processing (IFIP) Working Group 3.2, and held in Florianópolis, Brazil, in July 2002. The working groups were organized in three parallel tracks. Working Group 1 discussed the "Directions and Challenges in Informatics Education." The focus of Working Group 2 was "Teaching Programming and Problem Solving." Working Group 3 discussed "Computing: The Shape of an Evolving Discipline." Each working group worked actively and prepared a report with the results of the discussions; these reports are included as the second part of this book.
This text describes the advanced concepts and techniques used for ASIC chip synthesis, formal verification and static timing analysis, using the Synopsys suite of tools. In addition, the entire ASIC design flow methodology targeted for VDSM (Very-Deep-Sub-Micron) technologies is covered in detail. The emphasis of this book is on the real-time application of Synopsys tools used to combat various problems seen at VDSM geometries. Readers are exposed to an effective design methodology for handling complex, sub-micron ASIC designs. Significance is placed on HDL coding styles, synthesis and optimization, dynamic simulation, formal verification, DFT scan insertion, links to layout, and static timing analysis. At each step, problems related to each phase of the design flow are identified, with solutions and work-arounds described in detail. In addition, crucial issues related to layout, which include clock tree synthesis and back-end integration (links to layout), are also discussed at length. The book is intended for anyone who is involved in the ASIC design methodology, from RTL synthesis to final tape-out. Target audiences for this book are practicing ASIC design engineers and graduate students undertaking advanced courses in ASIC chip design and DFT techniques.
Digital Timing Macromodeling for VLSI Design Verification first of all provides an extensive history of the development of simulation techniques. It presents detailed discussion of the various techniques implemented in circuit, timing, fast-timing, switch-level timing, switch-level, and gate-level simulation. It also discusses mixed-mode simulation and interconnection analysis methods. The review in Chapter 2 gives an understanding of the advantages and disadvantages of the many techniques applied in modern digital macromodels. The book also presents a wide variety of techniques for performing nonlinear macromodeling of digital MOS subcircuits which address a large number of shortcomings in existing digital MOS macromodels. Specifically, the techniques address the device model detail, transistor coupling capacitance, effective channel length modulation, series transistor reduction, effective transconductance, input terminal dependence, gate parasitic capacitance, the body effect, the impact of parasitic RC-interconnects, and the effect of transmission gates. The techniques address major sources of errors in existing macromodeling techniques, which must be addressed if macromodeling is to be accepted in commercial CAD tools by chip designers. The techniques presented in Chapters 4-6 can be implemented in other macromodels, and are demonstrated using the macromodel presented in Chapter 3. The new techniques are validated over an extremely wide range of operating conditions: much wider than has been presented for previous macromodels, thus demonstrating the wide range of applicability of these techniques.
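As a minimal illustration of what a timing macromodel does, the sketch below replaces a transistor-level simulation of a gate with a small lookup table over input slew and output load, interpolated bilinearly. This table-lookup style is only a stand-in for the nonlinear MOS macromodels the book develops; the characterisation grid and delay numbers are invented for illustration.

```python
# Toy gate-delay macromodel of the table-lookup kind (illustrative only).
# Delay is characterised once over a grid of input slews and output loads
# and then estimated for arbitrary operating points by bilinear
# interpolation, instead of re-simulating the subcircuit each time.
import bisect

SLEWS = [0.05, 0.10, 0.20]          # ns, characterisation grid (invented)
LOADS = [0.01, 0.02, 0.04]          # pF
DELAY = [[0.08, 0.11, 0.17],        # DELAY[s][l] in ns for SLEWS[s], LOADS[l]
         [0.10, 0.13, 0.19],
         [0.14, 0.17, 0.23]]

def _bracket(grid, x):
    """Return indices (i, i+1) of the grid cell containing x (clamped)."""
    i = bisect.bisect_right(grid, x) - 1
    i = max(0, min(i, len(grid) - 2))
    return i, i + 1

def gate_delay(slew, load):
    """Estimate delay by bilinear interpolation over the lookup table."""
    s0, s1 = _bracket(SLEWS, slew)
    l0, l1 = _bracket(LOADS, load)
    ts = (slew - SLEWS[s0]) / (SLEWS[s1] - SLEWS[s0])
    tl = (load - LOADS[l0]) / (LOADS[l1] - LOADS[l0])
    top = DELAY[s0][l0] * (1 - tl) + DELAY[s0][l1] * tl
    bot = DELAY[s1][l0] * (1 - tl) + DELAY[s1][l1] * tl
    return top * (1 - ts) + bot * ts

if __name__ == "__main__":
    print(f"delay at slew=0.15 ns, load=0.03 pF: {gate_delay(0.15, 0.03):.3f} ns")
```

The techniques surveyed in the book address exactly the physical effects (coupling capacitance, body effect, series transistors, RC interconnect) that a naive table like this one ignores.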
SECIII - Social, Ethical and Cognitive Issues of Informatics and ICT. Welcome to the post-conference book of SECIII, the IFIP Open Conference on Social, Ethical and Cognitive Issues of Informatics and ICT (Information and Communication Technology), which took place from July 22-26, 2002 at the University of Dortmund, Germany, in co-operation with the German computer society (Gesellschaft für Informatik). Unlike most international conferences, those organised within the IFIP education community are active events. This wasn't a dry academic conference - teachers, lecturers and curriculum experts, policy makers, researchers and manufacturers mingled and worked together to explore, reflect and discuss social, ethical and cognitive issues. The added value lies in what they, the participants, took away in new ideas for future research and practice, and in the new networks that were formed, both virtual and real. In addition to Keynote Addresses and Paper Presentations from international authors, there were Provocative Paper sessions, Case Studies, Focussed Debates and Creative Exchange sessions, as well as professional Working Groups who debated particular themes. The Focussed Debate sessions helped to stimulate the sense of engagement among conference participants. A Market Place with follow-up Working Groups was a positive highlight and galvanised participants to produce interesting reports. These were presented to the conference on its last day. Cross-fertilisation between the papers generated some surprising and useful cross-referencing, and a plethora of social, ethical and cognitive issues emerged in the discussions that followed the paper presentations.