This volume contains a collection of papers presented at the NATO Advanced Study Institute on "Testing and Diagnosis of VLSI and ULSI" held at Villa Olmo, Como (Italy), June 22 - July 3, 1987. High-density technologies such as Very Large Scale Integration (VLSI), Wafer Scale Integration (WSI) and the not-so-far promises of Ultra Large Scale Integration (ULSI) have exacerbated the problems associated with the testing and diagnosis of these devices and systems. Traditional techniques are fast becoming obsolete due to unique requirements such as limited controllability and observability, increasing execution complexity for test vector generation and the high cost of fault simulation, to mention just a few. New approaches are imperative to achieve the highly sought goal of a "three months" turn-around cycle time for a state-of-the-art computer chip. Testing and diagnostic processes are of primary importance if costs must be kept at acceptable levels. The objective of this NATO ASI was to present, analyze and discuss the various facets of testing and diagnosis with respect to both theory and practice. The contents of this volume reflect the diversity of approaches currently available to reduce test and diagnosis time. These approaches are described in a concise, yet clear way by renowned experts in the field. Their contributions are aimed at a wide readership: the uninitiated researcher will find the tutorial chapters very rewarding, while the expert will be introduced to advanced techniques in a very comprehensive manner.
The advent of computer-aided design and the proliferation of computer-aided design tools have been instrumental in furthering the state of the art in integrated circuitry. Continuing this progress, however, demands an emphasis on creating user-friendly environments that facilitate the interaction between the designer and the CAD tool. The realization of this fact has prompted investigations into the appropriateness for CAD of a number of user-interface technologies. One type of interface that has hitherto not been considered is the natural language interface. It is our contention that natural language interfaces could solve many of the problems posed by the increasing number and sophistication of CAD tools. This thesis represents the first step in a research effort directed towards the eventual development of a natural language interface for the domain of computer-aided design. The breadth and complexity of the CAD domain renders the task of developing a natural language interface for the complete domain beyond the scope of a single doctoral thesis. Hence, we have initially focused on a sub-domain of CAD. Specifically, we have developed a natural language interface, named Cleopatra, for circuit-simulation post-processing. In other words, with Cleopatra a circuit designer can extract and manipulate, in English, values from the output of a circuit simulator (currently SPICE) without manually having to go through the output files produced by the simulator.
There is a growing social interest in developing vision-based vehicle guidance systems for improving traffic safety, efficiency and the environment. Examples of vision-based vehicle guidance systems include collision warning systems, steering control systems for tracking painted lane marks, and speed control systems for preventing rear-end collisions. Like other guidance systems for aircraft and trains, these systems are expected to increase traffic safety significantly. For example, the safety of aircraft landing processes after the introduction of automatic guidance systems has been reported to be 100 times better than prior to their installation. Although the safety of human lives is beyond price, the cost of automatic guidance could be compensated by decreased insurance costs. It is becoming more important to increase traffic safety by decreasing the human driver's load in our society, especially with an increasing population of senior people who continue to drive. The second potential social benefit is the improvement of traffic efficiency by decreasing the spacing between vehicles without sacrificing safety. It is reported, for example, that four times the efficiency is expected if the spacing between cars is controlled automatically at 90 cm with a speed of 100 km/h, compared to today's typical manual driving. Although there are a lot of technical, psychological, and social issues to be solved before realizing the high-density/high-speed traffic systems described here, highly efficient highways are becoming more important because of increasing traffic congestion.
This book, and the research it describes, resulted from a simple observation we made sometime in 1986. Put simply, we noticed that many VLSI design tools looked "alike." That is, at least at the overall software architecture level, the algorithms and data structures required to solve problem X looked much like those required to solve problem X'. Unfortunately, this resemblance is often of little help in actually writing the software for problem X' given the software for problem X. In the VLSI CAD world, technology changes rapidly enough that design software must continually strive to keep up. And of course, VLSI design software, and engineering design software in general, is often exquisitely sensitive to some aspects of the domain (technology) in which it operates. Modest changes in functionality have an unfortunate tendency to require substantial (and time-consuming) internal software modifications. Now, observing that large engineering software systems are technology dependent is not particularly clever. However, we believe that our approach to dealing with this problem took an interesting new direction. We chose to investigate the extent to which automatic programming ideas could be used to synthesize such software systems from high-level specifications. This book is one of the results of that effort.
Hardware Component Modeling highlights the current status of the modeling of electronic components. It includes contributions from many of the leading researchers and practitioners in the field. The contents focus on four important topics. Standards: Three chapters describe current developments in employing standards for the use of component libraries. A major part of these chapters provides an excellent introduction to VITAL (an IEEE standard), its application and some of the issues in using and implementing it. There are, however, other standards with a role to play and these are also covered. Data Types: One chapter describes the latest techniques for using data types in modeling and simulation. Model Generation: One chapter describes a model generator for reusable component models and another describes a generator which takes actual physical data as its source and generates a functional model. Quality Assurance: Two chapters are devoted to improving the quality of models. One introduces a method for quantifying aspects of model quality and the other introduces quality concepts which can lead to an increase in model value through reuse and robustness. Hardware Component Modeling is a valuable reference for researchers and practitioners involved in the process of modeling electronic components.
'Symbolic Boolean manipulation using binary decision diagrams (BDDs) has been successfully applied to a wide variety of tasks, particularly in very large scale integration (VLSI) computer-aided design (CAD). The concept of decision graphs as an abstract representation of Boolean functions dates back to the early work by Lee and Akers. In the last ten years, BDDs have found widespread use as a concrete data structure for symbolic Boolean manipulation. With BDDs, functions can be constructed, manipulated, and compared by simple and efficient graph algorithms. Since Boolean functions can represent not just digital circuit functions, but also such mathematical domains as sets and relations, a wide variety of CAD problems can be solved using BDDs. Binary Decision Diagrams and Applications for VLSI CAD provides valuable information for both those who are new to BDDs as well as to long time aficionados.' - from the Foreword by Randal E. Bryant. 'Over the past ten years ... BDDs have attracted the attention of many researchers because of their suitability for representing Boolean functions. They are now widely used in many practical VLSI CAD systems. ... this book can serve as an introduction to BDD techniques and ... it presents several new ideas on BDDs and their applications. ... many computer scientists and engineers will be interested in this book since Boolean function manipulation is a fundamental technique not only in digital system design but also in exploring various problems in computer science.' - from the Preface by Shin-ichi Minato.
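To make the graph-based manipulation described above concrete, here is a minimal sketch of a reduced ordered BDD in Python: hash-consing of nodes gives canonicity, and a memoized apply operation combines functions bottom-up. The class and helper names are ours, not from the book, and real packages add refinements such as complement edges, garbage collection and variable reordering.

```python
# A minimal sketch of a reduced ordered BDD (ROBDD) over a fixed variable
# order 0 < 1 < ... < n-1.  Names are hypothetical; illustrative only.

class BDD:
    def __init__(self):
        self.unique = {}       # (var, low, high) -> node id (hash-consing)
        self.node_info = {}    # node id -> (var, low, high); 0 and 1 are terminals
        self.next_id = 2
        self.apply_cache = {}

    def mk(self, var, low, high):
        if low == high:                      # reduction rule: redundant test
            return low
        key = (var, low, high)
        if key not in self.unique:           # sharing: isomorphic subgraphs coincide
            self.unique[key] = self.next_id
            self.node_info[self.next_id] = key
            self.next_id += 1
        return self.unique[key]

    def var(self, i):
        return self.mk(i, 0, 1)

    def _var_of(self, u):
        return self.node_info[u][0] if u in self.node_info else float("inf")

    def apply(self, op, u, v):
        """Combine two BDDs with a Boolean operator op: {0,1} x {0,1} -> {0,1}."""
        key = (op, u, v)
        if key in self.apply_cache:
            return self.apply_cache[key]
        if u in (0, 1) and v in (0, 1):
            result = op(u, v)
        else:
            top = min(self._var_of(u), self._var_of(v))
            u0, u1 = self.node_info[u][1:] if self._var_of(u) == top else (u, u)
            v0, v1 = self.node_info[v][1:] if self._var_of(v) == top else (v, v)
            result = self.mk(top, self.apply(op, u0, v0), self.apply(op, u1, v1))
        self.apply_cache[key] = result
        return result

bdd = BDD()
a, b, c = bdd.var(0), bdd.var(1), bdd.var(2)
AND = lambda x, y: x & y
OR = lambda x, y: x | y
f = bdd.apply(OR, bdd.apply(AND, a, b), c)   # f = (a AND b) OR c
g = bdd.apply(OR, c, bdd.apply(AND, b, a))   # same function, built differently
print(f == g)                                # True: canonicity => pointer equality
```

Because equal functions reduce to the same node, the final comparison shows equivalence checking collapsing to an identity test, which is the property that makes BDDs so useful in verification.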
Circuit simulation has been a topic of great interest to the integrated circuit design community for many years. It is a difficult, and interesting, problem because circuit simulators are very heavily used, consuming thousands of computer hours every year, and therefore the algorithms must be very efficient. In addition, circuit simulators are heavily relied upon, with millions of dollars being gambled on their accuracy, and therefore the algorithms must be very robust. At the University of California, Berkeley, a great deal of research has been devoted to the study of both the numerical properties and the efficient implementation of circuit simulation algorithms. Research efforts have led to several programs, starting with CANCER in the 1960's and the enormously successful SPICE program in the early 1970's, to MOTIS-C, SPLICE, and RELAX in the late 1970's, and finally to SPLICE2 and RELAX2 in the 1980's. Our primary goal in writing this book was to present some of the results of our current research on the application of relaxation algorithms to circuit simulation. As we began, we realized that a large body of mathematical and experimental results had been amassed over the past twenty years by graduate students, professors, and industry researchers working on circuit simulation. It became a secondary goal to try to find an organization of this mass of material that was mathematically rigorous, had practical relevance, and still retained the natural intuitive simplicity of the circuit simulation subject.
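As a toy illustration of the relaxation idea (not the book's algorithms), the sketch below applies Gauss-Seidel iteration to the linear nodal equations G*v = i of a small resistive network: each node voltage is repeatedly recomputed from its neighbors until the updates fall below a tolerance. The network and its values are made up for the example.

```python
# A toy sketch of Gauss-Seidel relaxation on nodal equations G*v = i.
# Instead of factoring G, each node voltage is updated from the latest
# values of the other nodes until the iteration converges.

def gauss_seidel(G, i, tol=1e-9, max_iter=1000):
    n = len(G)
    v = [0.0] * n
    for _ in range(max_iter):
        delta = 0.0
        for k in range(n):
            # solve row k for v[k] using the most recent neighbor voltages
            s = sum(G[k][j] * v[j] for j in range(n) if j != k)
            new = (i[k] - s) / G[k][k]
            delta = max(delta, abs(new - v[k]))
            v[k] = new
        if delta < tol:
            break
    return v

# Nodal conductance matrix of a small ladder of 1-ohm resistors driven by a
# 1 A current source at node 0 (ground node eliminated); values are made up.
G = [[ 2.0, -1.0,  0.0],
     [-1.0,  2.0, -1.0],
     [ 0.0, -1.0,  2.0]]
i = [1.0, 0.0, 0.0]
print(gauss_seidel(G, i))   # converges to about [0.75, 0.5, 0.25] volts
```

The matrix here is diagonally dominant, so the iteration converges; the appeal of relaxation in circuit simulation is that such node-by-node (or subcircuit-by-subcircuit) updates can exploit latency and avoid solving one large system at every time point.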
Over the years there has been a large increase in the functionality available on a single integrated circuit. This has been mainly achieved by a continuous drive towards smaller feature sizes, larger dies, and better packing efficiency. However, this greater functionality has also resulted in substantial increases in the capital investment needed to build fabrication facilities. Given such a high level of investment, it is critical for IC manufacturers to reduce manufacturing costs and get a better return on their investment. The most obvious method of reducing the manufacturing cost per die is to improve manufacturing yield. Modern VLSI research and engineering (which includes design, manufacturing and testing) encompasses a very broad range of disciplines such as chemistry, physics, material science, circuit design, mathematics and computer science. Due to this diversity, the VLSI arena has become fractured into a number of separate sub-domains with little or no interaction between them. This is the case with the relationship between testing and manufacturing. From Contamination to Defects, Faults and Yield Loss: Simulation and Applications focuses on the core of the interface between manufacturing and testing, i.e., the contamination-defect-fault relationship. Understanding this relationship can lead to better solutions of many manufacturing and testing problems. Failure mechanism models are developed and presented which can be used to accurately estimate the probability of different failures for a given IC. This information is critical in solving key yield-related applications such as failure analysis, fault modeling and design manufacturing.
Test generation is one of the most difficult tasks facing the designer of complex VLSI-based digital systems. Much of this difficulty is attributable to the almost universal use in testing of low, gate-level circuit and fault models that predate integrated circuit technology. It has long been recognized that the testing problem can be alleviated by the use of higher-level methods in which multigate modules or cells are the primitive components in test generation; however, the development of such methods has proceeded very slowly. To be acceptable, high-level approaches should be applicable to most types of digital circuits, and should provide fault coverage comparable to that of traditional, low-level methods. The fault coverage problem has, perhaps, been the most intractable, due to continued reliance in the testing industry on the single stuck-line (SSL) fault model, which is tightly bound to the gate level of abstraction. This monograph presents a novel approach to solving the foregoing problem. It is based on the systematic use of multibit vectors rather than single bits to represent logic signals, including fault signals. A circuit is viewed as a collection of high-level components such as adders, multiplexers, and registers, interconnected by n-bit buses. To match this high-level circuit model, we introduce a high-level bus fault that, in effect, replaces a large number of SSL faults and allows them to be tested in parallel. However, by reducing the bus size from n to one, we can obtain the traditional gate-level circuit and fault models.
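A gate-level cousin of this bit-vector idea is classical word-parallel stuck-at simulation, sketched below under assumed names: each signal is packed into a machine word whose bit k carries the signal's value in the circuit copy containing fault k (bit 0 is the fault-free copy), so one pass of bitwise operations evaluates many single-stuck-line faults at once. The tiny circuit y = (a AND b) OR c and its fault list are invented for illustration and are not from the monograph.

```python
# A small sketch of word-parallel stuck-at simulation: signals are integers,
# bit k holds the value in the circuit copy containing fault k, bit 0 is the
# fault-free copy.  Circuit, nets and fault list are hypothetical.

MASK = (1 << 64) - 1          # up to 64 circuit copies per pass

def replicate(bit):
    """Fault-free input value copied into every circuit copy."""
    return MASK if bit else 0

def inject(word, fault_bit, stuck_value):
    """Force the signal in copy `fault_bit` to its stuck value."""
    if stuck_value:
        return word | (1 << fault_bit)
    return word & ~(1 << fault_bit) & MASK

def simulate(a, b, c, faults):
    # faults: list of (net_name, stuck_value); copy k+1 carries fault k
    nets = {"a": replicate(a), "b": replicate(b), "c": replicate(c)}
    for k, (net, sv) in enumerate(faults, start=1):
        if net in nets:
            nets[net] = inject(nets[net], k, sv)
    ab = nets["a"] & nets["b"]        # internal nets could be faulted the same way
    y = ab | nets["c"]
    good = y & 1                      # fault-free response
    return [faults[k - 1] for k in range(1, len(faults) + 1)
            if ((y >> k) & 1) != good]

faults = [("a", 0), ("b", 0), ("c", 1)]
# Test pattern a=1, b=1, c=0 detects a stuck-at-0 and b stuck-at-0, not c stuck-at-1.
print(simulate(1, 1, 0, faults))      # [('a', 0), ('b', 0)]
```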
Representations of Discrete Functions is an edited volume containing 13 chapter contributions from leading researchers with a focus on the latest research results. The first three chapters are introductions and contain many illustrations to clarify concepts presented in the text. It is recommended that these chapters be read first. The book then deals with the following topics: binary decision diagrams (BDDs), multi-terminal binary decision diagrams (MTBDDs), edge-valued binary decision diagrams (EVBDDs), functional decision diagrams (FDDs), Kronecker decision diagrams (KDDs), binary moment diagrams (BMDs), spectral transform decision diagrams (STDDs), ternary decision diagrams (TDDs), spectral transformation of logic functions, other transformations of logic functions, EXOR-based two-level expressions, FPRM minimization with TDDs and MTBDDs, complexity theories on FDDs, multi-level logic synthesis, and complexity of three-level logic networks. Representations of Discrete Functions is designed for CAD researchers and engineers and will also be of interest to computer scientists who are interested in combinatorial problems. Exercises prepared by the editors help make this book useful as a graduate-level textbook.
Recently there has been increased interest in the development of computer-aided design programs to support the system-level designer of integrated circuits more actively. Such design tools hold the promise of raising the level of abstraction at which an integrated circuit is designed, thus releasing the current designers from many of the details of logic and circuit level design. The promise further suggests that a whole new group of designers in neighboring engineering and science disciplines, with far less understanding of integrated circuit design, will also be able to increase their productivity and the functionality of the systems they design. This promise has been made repeatedly as each new higher level of computer-aided design tool is introduced and has repeatedly fallen short of fulfillment. This book presents the results of research aimed at introducing yet higher levels of design tools that will inch the integrated circuit design community closer to the fulfillment of that promise. 1.1. SYNTHESIS OF INTEGRATED CIRCUITS: In the integrated circuit (IC) design process, a behavior that meets certain specifications is conceived for a system, the behavior is used to produce a design in terms of a set of structural logic elements, and these logic elements are mapped onto physical units. The design process is impacted by a set of constraints as well as technological information (i.e., the logic elements and physical units used for the design).
The roots of the project which culminates with the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at the University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area- and performance-effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.
Very large scale integration (VLSI) technologies are now maturing with a current emphasis toward submicron structures and sophisticated applications combining digital as well as analog circuits on a single chip. Abundant examples are found in today's advanced systems for telecommunications, robotics, automotive electronics, image processing, intelligent sensors, etc. Exciting new applications are being unveiled in the field of neural computing, where the massive use of analog/digital VLSI technologies will have a significant impact. To match such a fast technological trend towards single-chip analog/digital VLSI systems, researchers worldwide have long realized the vital need of producing advanced computer-aided tools for designing both digital and analog circuits and systems for silicon integration. Architecture and circuit compilation, device sizing and layout generation are but a few familiar tasks in the world of digital integrated circuit design which can be efficiently accomplished by mature computer-aided tools. In contrast, the art of tools for designing and producing analog or even analog/digital integrated circuits is quite primitive and still lacking the industrial penetration and acceptance already achieved by digital counterparts. In fact, analog design is commonly perceived to be one of the most knowledge-intensive design tasks, and analog circuits are still designed, largely by hand, by experts intimately familiar with the nuances of the target application and integrated circuit fabrication process. The techniques needed to build good analog circuits seem to exist solely as expertise invested in individual designers.
This volume, which contains 15 contributions, is based on a minicourse held at the 1987 IEEE Plasma Science Meeting. The purpose of the lectures in the course was to acquaint the students with the multidisciplinary nature of computational techniques and the breadth of research areas in plasma science in which computation can address important physics and engineering design issues. These involve: electric and magnetic fields, MHD equations, chemistry, radiation, ionization etc. The contents of the contributions, written subsequent to the minicourse, stress important aspects of computer applications. They are: 1) the numerical methods used; 2) the range of applicability; 3) how the methods are actually employed in research and in the design of devices; and, as a compendium, 4) the multiplicity of approaches possible for any one problem. The materials in this book are organized by both subject and applications which display some of the richness in computational plasma physics.
Logic Synthesis Using Synopsys®, Second Edition is for anyone who hates reading manuals but would still like to learn logic synthesis as practised in the real world. Synopsys Design Compiler, the leading synthesis tool in the EDA marketplace, is the primary focus of the book. The contents of this book are specially organized to assist designers accustomed to schematic capture-based design to develop the required expertise to effectively use the Synopsys Design Compiler. Over 100 'Classic Scenarios' faced by designers when using the Design Compiler have been captured, discussed and solutions provided. These scenarios are based on both personal experiences and actual user queries. A general understanding of the problem-solving techniques provided should help the reader debug similar and more complicated problems. In addition, several examples and dc_shell scripts (Design Compiler scripts) have also been provided. Logic Synthesis Using Synopsys®, Second Edition is an updated and revised version of the very successful first edition. The second edition covers several new and emerging areas, in addition to improvements in the presentation and contents in all chapters from the first edition. With the rapid shrinking of process geometries it is becoming increasingly important that 'physical' phenomena like clusters and wire loads be considered during the synthesis phase. The increasing demand for FPGAs has warranted a greater focus on FPGA synthesis tools and methodology. Finally, behavioral synthesis, the move to designing at a higher level of abstraction than RTL, is fast becoming a reality. These factors have resulted in the inclusion of separate chapters in the second edition to cover Links to Layout, FPGA Synthesis and Behavioral Synthesis, respectively. Logic Synthesis Using Synopsys®, Second Edition has been written with the CAD engineer in mind. A clear understanding of the synthesis tool concepts, its capabilities and the related CAD issues will help the CAD engineer formulate an effective synthesis-based ASIC design methodology. The intent is also to assist design teams to better incorporate and effectively integrate synthesis with their existing in-house design methodology and CAD tools.
This book constitutes the refereed proceedings of the 5th International Conference on Industrial Applications of Holonic and Multi-Agent Systems, HoloMAS 2011, held in Toulouse, France, August 29-31, 2011.
Software Synthesis from Dataflow Graphs addresses the problem of generating efficient software implementations from applications specified as synchronous dataflow graphs for programmable digital signal processors (DSPs) used in embedded real-time systems. The advent of high-speed graphics workstations has made feasible the use of graphical block diagram programming environments by designers of signal processing systems. A particular subset of dataflow, called Synchronous Dataflow (SDF), has proven efficient for representing a wide class of unirate and multirate signal processing algorithms, and has been used as the basis for numerous DSP block diagram-based programming environments such as the Signal Processing Workstation from Cadence Design Systems, Inc., COSSAP from Synopsys® (both commercial tools), and the Ptolemy environment from the University of California at Berkeley. A key property of the SDF model is that static schedules can be determined at compile time. This removes the overhead of dynamic scheduling and is thus useful for real-time DSP programs where throughput requirements are often severe. Another constraint that programmable DSPs for embedded systems have is the limited amount of on-chip memory. Off-chip memory is not only expensive but is also slower and increases the power consumption of the system; hence, it is imperative that programs fit in the on-chip memory whenever possible. Software Synthesis from Dataflow Graphs reviews the state-of-the-art in constructing static, memory-optimal schedules for programs expressed as SDF graphs. Code size reduction is obtained by the careful organization of loops in the target code. Data buffering is optimized by constructing the loop hierarchy in provably optimal ways for many classes of SDF graphs. The central result is a uniprocessor scheduling framework that provably synthesizes the most compact looping structures, called single appearance schedules, for a certain class of SDF graphs. In addition, algorithms and heuristics are presented that generate single appearance schedules optimized for data buffering usage. Numerous practical examples and extensive experimental data are provided to illustrate the efficacy of these techniques.
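For a flavor of the compile-time analysis involved, the following sketch (our own, assuming an acyclic, delay-free SDF graph with invented actor names) solves the balance equations q[src]*produced = q[dst]*consumed for the repetitions vector and prints a flat single appearance schedule in topological order; real schedulers such as those surveyed in the book also handle delays, cycles and buffer-memory optimization.

```python
# A small sketch of SDF balance-equation analysis for an acyclic, delay-free
# graph given as edges (src, dst, tokens_produced, tokens_consumed).

from fractions import Fraction
from math import lcm, gcd
from collections import defaultdict

def repetitions_vector(edges):
    # Propagate relative firing rates across the (connected) graph.
    adj = defaultdict(list)
    for src, dst, prod, cons in edges:
        adj[src].append((dst, Fraction(prod, cons)))
        adj[dst].append((src, Fraction(cons, prod)))
    rate = {next(iter(adj)): Fraction(1)}
    stack = list(rate)
    while stack:
        u = stack.pop()
        for v, ratio in adj[u]:
            if v not in rate:
                rate[v] = rate[u] * ratio
                stack.append(v)
    # Scale to the smallest positive integer vector.
    scale = lcm(*(r.denominator for r in rate.values()))
    vec = {a: int(r * scale) for a, r in rate.items()}
    g = gcd(*vec.values())
    return {a: v // g for a, v in vec.items()}

edges = [("A", "B", 3, 2), ("B", "C", 1, 2)]      # A -3/2-> B -1/2-> C
q = repetitions_vector(edges)
print(q)                                          # {'A': 4, 'B': 6, 'C': 3}
# For a delay-free chain, firing each actor q times in topological order is a
# valid flat single appearance schedule:
print("schedule:", " ".join(f"({q[a]}{a})" for a in ("A", "B", "C")))   # (4A) (6B) (3C)
```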
This book contains the proceedings of the International Workshop on 3D Process Simulation which was held at the Campus of the University Erlangen-Nuremberg in Erlangen on September 5, 1995, in conjunction with the 6th International Conference on Simulation of Semiconductor Devices and Processes (SISDEP 95). Whereas two-dimensional semiconductor process simulation has achieved a certain degree of maturity, three-dimensional process simulation is a newly emerging field in which most efforts are dedicated to necessary basic developments. Research in this area is promoted by the growing demand to obtain reliable information on device geometries and dopant distributions needed for three-dimensional device simulation, and challenged by the great algorithmic problems caused by moving interfaces and by the requirement to limit computation times and memory requirements. This workshop provided a forum to discuss the industrial needs, technical problems, and solutions being developed in the field of three-dimensional semiconductor process simulation. Invited presentations from leading semiconductor companies and research Centers of Excellence from Japan, the USA, and Europe outlined novel numerical algorithms, physical models, and applications in this rapidly emerging field.
In the last few decades, multiscale algorithms have become a dominant trend in large-scale scientific computation. Researchers have successfully applied these methods to a wide range of simulation and optimization problems. This book gives a general overview of multiscale algorithms; applications to general combinatorial optimization problems such as graph partitioning and the traveling salesman problem; and VLSI CAD applications, including circuit partitioning, placement, and VLSI routing. Additional chapters discuss optimization in reconfigurable computing, convergence in multilevel optimization, and model problems with PDE constraints. Audience: Written at the graduate level, the book is intended for engineers and mathematical and computational scientists studying large-scale optimization in electronic design automation.
Models in system design follow the general tendency in electronics in terms of size, complexity and difficulty of maintenance. While a model should be a manageable representation of a system, this increasing complexity sometimes forces current CAD-tool designers and model writers to apply modeling techniques to the model itself. Model writers are interested in instrumenting their model, so as to extract critical information before the model is complete. CAD-tool designers use internal representations of the design at various stages. The complexity has also led CAD-tool developers to develop formal tools, theories and methods to improve relevance, completeness and consistency of those internal representations. Information modeling involves the representation of objects, their properties and relationships.

Performance Modeling: When it comes to design choices and trade-offs, performance is generally the final key. However, performance estimations have to be extracted at a very early stage in the system design. Performance modeling concerns the set of tools and techniques that allow or help the designer to capture metrics relating to future architectures. Performance modeling encompasses the whole system, including software modeling. It has a strong impact on all levels of design choices, from hardware/software partitioning to the final layout.

Information Modeling: Specification and formalism have in the past traditionally played little part in the design and development of EDA systems, their support environments, languages and processes. Instead, EDA system developers and EDA system users have seemed to be content to operate within environments that are often extremely complex and may be poorly tested and understood. This situation has now begun to change with the increasing use of techniques drawn from the domains of formal specification and database design. This section of this volume addresses aspects of the techniques being used. In particular, it considers a specific formalism, called information modeling, which has gained increasing acceptance recently and is now a key part of many of the proposals in the EDA Standards Roadmap, which promises to be of significance to the EDA industry. In addition, the section looks at an example of a design system from the point of view of its underlying understanding of the design process rather than through a consideration of particular CAD algorithms.

Meta-Modeling: Performance and Information Modeling contains papers describing the very latest techniques used in meta-modeling. It will be a valuable text for researchers, practitioners and students involved in Electronic Design Automation.
In system design, generation of high-level abstract models that can be closely associated with evolving lower-level models provides designers with the ability to 'incrementally test' an evolving design against a model of a specification. Such high-level models may deal with areas such as performance, reliability, availability, maintainability, and system safety. Abstract models also allow exploration of the hardware versus software design space in an incremental fashion as a fuller, detailed design unfolds, leaving behind the old practice of hardware-software binding too early in the design process. Such models may also allow the inclusion of non-functional aspects of design (e.g. space, power, heat) in a simulatable information model dealing with the system's operation. This book addresses model generation and application specifically in the following domains: specification modeling (linking object/data modeling, behavior modeling, and activity modeling); operational specification modeling (modeling the way the system is supposed to operate, from a user's viewpoint); linking non-functional parameters with specification models; hybrid modeling (linking performance and functional elements); application of high-level modeling to hardware/software approaches; mathematical analysis techniques related to the modeling approaches; reliability modeling; applications of high-level modeling; and reducing high-level modeling to practice. High-Level System Modeling: Specification and Design Methodologies describes the latest research and practice in the modeling of electronic systems and as such is an important update for all researchers, design engineers and technical managers working in design automation and circuit design.
In the last few years CMOS technology has become increasingly dominant for realizing Very Large Scale Integrated (VLSI) circuits. The popularity of this technology is due to its high density and low power requirement. The ability to realize very complex circuits on a single chip has brought about a revolution in the world of electronics and computers. However, the rapid advancements in this area pose many new problems in the area of testing. Testing has become a very time-consuming process. In order to ease the burden of testing, many schemes for designing the circuit for improved testability have been presented. These design for testability techniques have begun to catch the attention of chip manufacturers. The trend is towards placing increased emphasis on these techniques. Another byproduct of the increase in the complexity of chips is their higher susceptibility to faults. In order to take care of this problem, we need to build fault-tolerant systems. The area of fault-tolerant computing has steadily gained in importance. Today many universities offer courses in the areas of digital system testing and fault-tolerant computing. Due to the importance of CMOS technology, a significant portion of these courses may be devoted to CMOS testing. This book has been written as a reference text for such courses offered at the senior or graduate level. Familiarity with logic design and switching theory is assumed. The book should also prove to be useful to professionals working in the semiconductor industry.
Theory and practice of tolerances are very important for designing and manufacturing engineering artifacts on a rational basis. A tolerance specifies a degree of "discrepancy" between an idealized object and its physical realization. Such discrepancy inevitably enters our product realization processes because of practical cost considerations or our inability to fully control manufacturing processes. Major product and production characteristics which are affected by tolerances are product quality and cost. For achieving high-precision machines, tight tolerance specification is necessary, but this will normally increase product cost. In order to optimally compromise the conflicting requirements of quality and cost, it is essential to take into account the total product life cycle throughout product planning, design, manufacturing, maintenance and recycling. For example, in order to construct durable products under severe working conditions, low sensitivity of product functionality with respect to tolerances is required. In the future, re-use of components or parts will become important, and tolerance synthesis with respect to this aspect will be an interesting research topic.
The switching network is an important, classic research area in telecommunication and computer networks. Its importance stems from both theory and practice. In fact, some open problems, such as the Benes conjecture on shuffle-exchange networks and the Chung-Ross conjecture on multirate rearrangeability, still attract many researchers, and further developments in optical networks require advanced technology in optical switching networks. In 1997, we held a workshop on switching networks at the NSF Science and Technology Center in Discrete Mathematics and Theoretical Computer Science (DIMACS), at Princeton University. This workshop was very successful. Many participants wished to have a similar activity every two or three years. This book is a result of such a wish. We are putting together some important developments in this area during the last several years, including articles on fault tolerance, rearrangeability, non-blocking, optical networks, random permutation generation, and layout complexity. Some of those articles are research papers and some are surveys. All articles were reviewed. We would like to mention two special problems studied in those articles.
The area of computer graphics is characterized by rapid evolution. New techniques in hardware and software developments, e.g., new rendering methods, have led to new applications and broader acceptance of graphics in fields such as scientific visualization, multi-media applications, computer aided design, and virtual reality systems. The evolving functionality and the growing complexity of graphics algorithms and systems make it more difficult for the application programmer to take full advantage of these systems. Conventional programming methods are no longer suited to manage the increasing complexity, so new programming paradigms and system architectures are required. One important step in this direction is the introduction and use of object-oriented methods. Intuition tells us that visible graphical entities are objects, and experience has indeed shown that object-oriented software techniques are quite useful for graphics. The expressiveness of object-oriented languages compared to pure procedural languages gives the graphics application programmer much better support when transforming his mental intentions into computer code. Moreover, object-oriented software development is a well-founded technology, allowing software to be built from reusable and extensible components. This book contains selected, reviewed and thoroughly revised versions of papers submitted to and presented at the Fourth Eurographics Workshop on Object-Oriented Graphics, held on May 9-11, 1994 in Sintra, Portugal.