Collaborative virtual environments (CVEs) are multi-user virtual realities which actively support communication and co-operation. This book offers a comprehensive reference volume on the state of the art in design studies in CVEs. It is an excellent mix of contributions from over 25 leading researchers and experts in multiple disciplines from academia and industry, providing up-to-date insight into the current research topics in this field as well as the latest technological advancements and the best working examples. Many of these results and ideas are also applicable to other areas, such as CVEs for design education. Overall, this book serves as an excellent reference for postgraduate students, researchers and practitioners who need a comprehensive approach to studying design behaviours in CVEs. It is also a useful and informative source of material for those interested in learning more about using and developing CVEs to support design and design collaboration.
This volume is a record of the Workshop on Window Management held at the Rutherford Appleton Laboratory's Cosener's House between 29 April and 1 May 1985. The main impetus for the Workshop came from the Alvey Programme's Man-Machine Interface Director, who was concerned at the lack of a formal definition of window management and the lack of focus for research activities in this area. Window management per se is not the complete problem in understanding interaction. However, the appearance of bitmap displays from a variety of vendors, enabling an operator to work simultaneously with a number of applications on a single display, has focused attention on what the overall architecture for such a system should be and also on what the interfaces to both the application and the operator should be. The format of the Workshop was to spend the first day with presentations from a number of invited speakers. The aim was to make the participants aware of the current state of the art and to highlight the main outstanding issues. On the second day the Workshop participants split into three groups and discussed specific issues in depth. Plenary sessions helped to keep the individual groups working on similar lines. The third day concentrated on the individual groups presenting their results and interacting with the other groups to identify the main areas of consensus and a framework for future work.
The summer school on VLSI CAD Tools and Applications was held from July 21 through August 1, 1986 at Beatenberg in the beautiful Bernese Oberland in Switzerland. The meeting was given under the auspices of IFIP WG 10.6 VLSI, and it was sponsored by the Swiss Federal Institute of Technology Zurich, Switzerland. Eighty-one professionals were invited to participate in the summer school, including 18 lecturers. The 81 participants came from the following countries: Australia (1), Denmark (1), Federal Republic of Germany (12), France (3), Italy (4), Norway (1), South Korea (1), Sweden (5), United Kingdom (1), United States of America (13), and Switzerland (39). Our goal in the planning for the summer school was to introduce the audience to the realities of CAD tools and their applications to VLSI design. This book contains articles by all 18 invited speakers who lectured at the summer school. The reader should realize that it was not intended to publish a textbook. However, the chapters in this book are more or less self-contained treatments of the particular subjects. Chapters 1 and 2 give a broad introduction to VLSI design. Simulation tools and their algorithmic foundations are treated in Chapters 3 to 5 and 17. Chapters 6 to 9 provide an excellent treatment of modern layout tools. The use of CAD tools and trends in the design of 32-bit microprocessors are the topics of Chapters 10 through 16. Important aspects in VLSI testing and testing strategies are given in Chapters 18 and 19.
Mixed-Signal Layout Generation Concepts covers important physical-design issues that exist in contemporary analog and mixed-signal design flows. Due to the increasing pressure on time-to-market, the steep increase in chip fabrication costs, and the increasing design complexity, it becomes even more challenging to produce a first-time-right IC layout. The fundamental issues in creating a layout are placement and routing. Although these coupled problems have been investigated for many decades, no satisfactory automated solution has emerged yet. Fortunately, supported by modern computing power and results of new research that further improve computation efficiency, significant steps forward have been taken.
The computer interpretation of line drawings is a classic problem in artificial intelligence (AI) which has inspired the development of some fundamental AI tools, including constraint propagation, probabilistic relaxation, the characterization of tractable constraint classes and, most recently, the propagation of soft constraints in finite-domain optimization problems. Line drawing interpretation has many distinct applications on the borderline of computer vision and computer graphics, including sketch interpretation, the input of 3D object models and the creation of 2 1/2 D illustrations in electronic documents. I hope I have made this fascinating topic accessible not only to computer scientists but also to mathematicians, psychologists and cognitive scientists and, indeed, to anyone who is intrigued by optical illusions and impossible or ambiguous figures. This book could not have been written without the support of the CNRS, the French Centre National de Recherche Scientifique, who financed my one-year break from teaching at the University of Toulouse III. The UK Engineering and Physical Sciences Research Council also financed several extended visits to the Oxford University Computing Laboratory. Section 9.1 is just a brief summary of the results on tractable constraints that have come out of this very productive joint research programme with David Cohen, Peter Jeavons and Andrei Krokhin. The various soft arc consistency techniques described in Chapter 8 were developed in collaboration with Thomas Schiex and Simon de Givry at INRA, Toulouse. I am also grateful to Ralph Martin and Peter Varley for their comments on the line-labelling constraints presented in Chapter 3.
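To give a flavour of the constraint propagation mentioned above, here is a minimal sketch of arc-consistency (AC-3 style) pruning of line labels. The function name ac3, the label set, and the toy "junction" constraint are illustrative assumptions, not the actual Huffman-Clowes catalogue or any algorithm from the book.

# Minimal sketch of arc-consistency constraint propagation for line labelling.
# The constraint below (two lines meeting may not both be concave '-') is a toy
# fragment for illustration only, not the real junction catalogue.
from collections import deque

def ac3(domains, constraints):
    """Prune label domains until every binary constraint is arc-consistent.

    domains:     {variable: set(labels)}
    constraints: {(x, y): set of allowed (label_x, label_y) pairs}, both directions
    """
    queue = deque(constraints.keys())
    while queue:
        x, y = queue.popleft()
        allowed = constraints[(x, y)]
        # Remove labels of x that have no supporting label left in y's domain.
        pruned = {lx for lx in domains[x]
                  if not any((lx, ly) in allowed for ly in domains[y])}
        if pruned:
            domains[x] -= pruned
            if not domains[x]:
                return False          # inconsistent (impossible) drawing
            queue.extend(arc for arc in constraints if arc[1] == x)
    return True

labels = {'+', '-', '>'}              # convex, concave, occluding
domains = {'L1': set(labels), 'L2': {'-'}}   # L2 already known to be concave
pairs = {(a, b) for a in labels for b in labels if not (a == '-' and b == '-')}
constraints = {('L1', 'L2'): pairs, ('L2', 'L1'): {(b, a) for a, b in pairs}}
print(ac3(domains, constraints), domains)    # L1 loses the '-' label

Running the sketch shows the key idea: fixing one line's label propagates through the junction constraints and prunes the labels of its neighbours.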
The purpose of this book is to discuss the state of the art and future trends in the field of computerized production management systems. It is composed of a number of independent papers, each presented in a chapter. Some of the widely recognized experts in the field around the world have been asked to contribute. I owe each of them my sincere gratitude for their kind cooperation. I am also grateful to Peter Falster and Jim Browne for their kind support in helping me to review topics to be covered and to select the authors. This book is a result of the professional work done in the International Federation for Information Processing Technical Committee IFIP TC5 "Computer Applications in Technology" and especially in the Working Group WG5.7 "Computer-Aided Production Management." This group was established in 1978 with the aim of promoting and encouraging the advancement of the field of computer systems for the production management of manufacturing, offshore, construction, electronic and similar and related industries. The scope of the work includes, but is not limited to, the following topics: 1) design and implementation of new production planning and control systems taking into account new technology and management philosophy; 2) CAPM in a CIM environment including interfaces to CAD and CAM; 3) project management and cost engineering; 4) knowledge engineering in CAPM; 5) CAPM for Flexible Manufacturing Systems (FMS) and Flexible Assembly Systems (FAS); 6) methods and concepts in CAPM; 7) economic and social implications of CAPM.
This book is a result of the lectures and discussions during the conference "Theory and Practice of Geometric Modeling." The event was organized by the Wilhelm-Schickard-Institut für Informatik, Universität Tübingen, and took place at the Heinrich-Fabri-Institut in Blaubeuren from October 3 to 7, 1988. The conference brought together leading experts from academic and industrial research institutions, CAD system developers and experienced users to exchange their ideas and to discuss new concepts and future directions in geometric modeling. The main intention was to bridge the gap between theoretical results, the performance of existing CAD systems and the real problems of users. The contents are structured in five parts: A Algorithmic Aspects; B Surface Intersection, Blending, Ray Tracing; C Geometric Tools; D Different Representation Schemes in Solid Modeling; E Product Modeling in High Level Specifications. The material presented in this book reflects the current state of the art in geometric modeling and should therefore be of interest not only to university and industry researchers, but also to system developers and practitioners who wish to keep up to date on recent advances and new concepts in this rapidly expanding field. The editors express their sincere appreciation to the contributing authors, and to the members of the program committee, W. Boehm, J. Hoschek, A. Massabo, H. Nowacki, M. Pratt, J. Rossignac, T. Sederberg and W. Tiller, for their close cooperation and their time and effort that made the conference and this book a success.
"During the last two decades, research on structural optimization became increasingly concerned with two aspects: the application of general numeri- cal methods of optimization to structural design of complex real structures, and the analytical derivation of necessary and sufficient conditions for the optimality of broad classes of comparatively simple and more or less ideal- ized structures. Both kinds of research are important: the first for obvious reasons; the second, because it furnishes information that is useful in testing the validity, accuracy and convergence of numerical methods and in assess- ing the efficiency of practical designs. " (Prager and Rozvany, 1977a) The unexpected death of William Prager in March 1980 marked, in a sense, the end of an era in structural mechanics, but his legacy of ideas will re- main a source of inspiration for generations of researchers to come. Since his nominal retirement in the early seventies, Professor and Mrs. Prager lived in Savognin, an isolated alpine village and ski resort surrounded by some of Switzerland's highest mountains. It was there that the author's close as- sociation with Prager developed through annual pilgrimages from Australia and lengthy discussions which pivoted on Prager's favourite topic of struc- tural optimization. These exchanges took place in the picturesque setting of Graubunden, on the terrace of an alpine restaurant overlooking snow-capped peaks, on ski-lifts or mountain walks, or during evening meals in the cosy hotels of Savognin, Parsonz and Riom.
I am indebted to my thesis advisor, Michael Genesereth, for his guidance, inspiration, and support, which have made this research possible. As a teacher and a sounding board for new ideas, Mike was extremely helpful in pointing out flaws and suggesting new directions to explore. I would also like to thank Harold Brown for introducing me to the application of artificial intelligence to reasoning about designs, and for his many valuable comments as a reader of this thesis. Significant contributions by the other members of my reading committee, Mark Horowitz and Allen Peterson, have greatly improved the content and organization of this thesis by forcing me to communicate my ideas more clearly. I am extremely grateful to the other members of the Logic Group at the Heuristic Programming Project for being a sounding board for my ideas and providing useful comments. In particular, I would like to thank Matt Ginsberg, Vineet Singh, Devika Subramanian, Richard Trietel, Dave Smith, Jock Mackinlay, and Glenn Kramer for their pointed criticisms. This research was supported by Schlumberger Palo Alto Research (previously the Fairchild Laboratory for Artificial Intelligence). I am grateful to Peter Hart, the former head of the AI lab, and his successor Marty Tenenbaum for providing an excellent environment for performing this research.
This book introduces 'functional networks', a novel neural-based paradigm, and shows that functional network architectures can be efficiently applied to solve many interesting practical problems. Included is an introduction to neural networks, a description of functional networks, examples of applications, and computer programs in the Mathematica and Java languages implementing the various algorithms and methodologies. Special emphasis is given to applications in several areas such as: * Box-Jenkins AR(p), MA(q), ARMA(p, q), and ARIMA(p, d, q) models, with application to real-life economic problems such as the consumer price index, electric power consumption and international airlines' passenger data. Random time series and chaotic series are considered in relation to the Henon, Lozi, Holmes and Burgers maps, as well as the problems of noise reduction and information masking. * Learning differential equations from data and deriving the corresponding equivalent difference and functional equations. Examples of a mass supported by two springs and a viscous damper or dashpot, and a loaded beam, are used to illustrate the concepts. * The problem of obtaining the most general family of implicit, explicit and parametric surfaces as used in Computer Aided Design (CAD). * Applications of functional networks to obtain general nonlinear regression models, compared with standard techniques. Functional Networks with Applications: A Neural-Based Paradigm will be of interest to individuals who work in computer science, physics, engineering, applied mathematics, statistics, economics, and other neural network and data analysis related fields.
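As a point of reference for the Box-Jenkins family mentioned above, here is a minimal sketch of fitting an AR(p) model by ordinary least squares. It is written in Python with NumPy for brevity (the book's own programs are in Mathematica and Java); fit_ar and the synthetic series are illustrative assumptions, not code from the book.

# Minimal sketch: estimate AR(p) coefficients x[t] = c + a1*x[t-1] + ... + ap*x[t-p]
# by least squares on a synthetic series.  Real applications would substitute
# measured data such as a price index or power-consumption record.
import numpy as np

def fit_ar(series, p):
    """Return [c, a1, ..., ap] fitted by ordinary least squares."""
    x = np.asarray(series, dtype=float)
    # Each row of the design matrix holds a constant plus the p previous values.
    rows = [np.r_[1.0, x[t - p:t][::-1]] for t in range(p, len(x))]
    X, y = np.array(rows), x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

rng = np.random.default_rng(0)
xs = [0.0, 0.0]
for _ in range(500):                   # simulate x[t] = 0.6*x[t-1] - 0.2*x[t-2] + noise
    xs.append(0.6 * xs[-1] - 0.2 * xs[-2] + rng.normal(scale=0.1))
print(fit_ar(xs, p=2))                 # roughly [0, 0.6, -0.2]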
Representations of Discrete Functions is an edited volume containing 13 chapter contributions from leading researchers with a focus on the latest research results. The first three chapters are introductions and contain many illustrations to clarify concepts presented in the text. It is recommended that these chapters be read first. The book then deals with the following topics: binary decision diagrams (BDDs), multi-terminal binary decision diagrams (MTBDDs), edge-valued binary decision diagrams (EVBDDs), functional decision diagrams (FDDs), Kronecker decision diagrams (KDDs), binary moment diagrams (BMDs), spectral transform decision diagrams (STDDs), ternary decision diagrams (TDDs), spectral transformation of logic functions, other transformations of logic functions, EXOR-based two-level expressions, FPRM minimization with TDDs and MTBDDs, complexity theories on FDDs, multi-level logic synthesis, and the complexity of three-level logic networks. Representations of Discrete Functions is designed for CAD researchers and engineers and will also be of interest to computer scientists who are interested in combinatorial problems. Exercises prepared by the editors help make this book useful as a graduate-level textbook.
Over the years there has been a large increase in the functionality available on a single integrated circuit. This has been mainly achieved by a continuous drive towards smaller feature sizes, larger dies, and better packing efficiency. However, this greater functionality has also resulted in substantial increases in the capital investment needed to build fabrication facilities. Given such a high level of investment, it is critical for IC manufacturers to reduce manufacturing costs and get a better return on their investment. The most obvious method of reducing the manufacturing cost per die is to improve manufacturing yield. Modern VLSI research and engineering (which includes design, manufacturing and testing) encompasses a very broad range of disciplines such as chemistry, physics, materials science, circuit design, mathematics and computer science. Due to this diversity, the VLSI arena has become fractured into a number of separate sub-domains with little or no interaction between them. This is the case with the relationship between testing and manufacturing. From Contamination to Defects, Faults and Yield Loss: Simulation and Applications focuses on the core of the interface between manufacturing and testing, i.e., the contamination-defect-fault relationship. Understanding this relationship can lead to better solutions of many manufacturing and testing problems. Failure mechanism models are developed and presented which can be used to accurately estimate the probability of different failures for a given IC. This information is critical in solving key yield-related applications such as failure analysis, fault modeling and design manufacturing.
This volume contains a collection of papers presented at the NATO Advanced Study Institute on "Testing and Diagnosis of VLSI and ULSI" held at Villa Olmo, Como (Italy), June 22 - July 3, 1987. High-density technologies such as Very-Large-Scale Integration (VLSI), Wafer-Scale Integration (WSI) and the not-so-far promises of Ultra-Large-Scale Integration (ULSI) have exacerbated the problems associated with the testing and diagnosis of these devices and systems. Traditional techniques are fast becoming obsolete due to unique requirements such as limited controllability and observability, increasing execution complexity for test vector generation and the high cost of fault simulation, to mention just a few. New approaches are imperative to achieve the highly sought goal of a three-month turnaround cycle time for a state-of-the-art computer chip. Testing and diagnostic processes are of primary importance if costs must be kept at acceptable levels. The objective of this NATO ASI was to present, analyze and discuss the various facets of testing and diagnosis with respect to both theory and practice. The contents of this volume reflect the diversity of approaches currently available to reduce test and diagnosis time. These approaches are described in a concise, yet clear way by renowned experts in the field. Their contributions are aimed at a wide readership: the uninitiated researcher will find the tutorial chapters very rewarding, while the expert will be introduced to advanced techniques in a very comprehensive manner.
The advent of computer-aided design and the proliferation of computer-aided design tools have been instrumental in furthering the state of the art in integrated circuitry. Continuing this progress, however, demands an emphasis on creating user-friendly environments that facilitate the interaction between the designer and the CAD tool. The realization of this fact has prompted investigations into the appropriateness for CAD of a number of user-interface technologies. One type of interface that has hitherto not been considered is the natural language interface. It is our contention that natural language interfaces could solve many of the problems posed by the increasing number and sophistication of CAD tools. This thesis represents the first step in a research effort directed towards the eventual development of a natural language interface for the domain of computer-aided design. The breadth and complexity of the CAD domain renders the task of developing a natural language interface for the complete domain beyond the scope of a single doctoral thesis. Hence, we have initially focused on a sub-domain of CAD. Specifically, we have developed a natural language interface, named Cleopatra, for circuit-simulation post-processing. In other words, with Cleopatra a circuit designer can extract and manipulate, in English, values from the output of a circuit simulator (currently SPICE) without manually having to go through the output files produced by the simulator.
There is a growing social interest in developing vision-based vehicle guidance systems for improving traffic safety, efficiency and the environment. Examples of vision-based vehicle guidance systems include collision warning systems, steering control systems for tracking painted lane marks, and speed control systems for preventing rear-end collisions. Like other guidance systems for aircraft and trains, these systems are expected to increase traffic safety significantly. For example, safety improvements of aircraft landing processes after the introduction of automatic guidance systems have been reported to be 100 times better than prior to installation. Although the safety of human lives is beyond price, the cost of automatic guidance could be compensated by decreased insurance costs. It is becoming more important to increase traffic safety by decreasing the human driver's load in our society, especially with an increasing population of senior people who continue to drive. The second potential social benefit is the improvement of traffic efficiency by decreasing the spacing between vehicles without sacrificing safety. It is reported, for example, that four times the efficiency is expected if the spacing between cars is controlled automatically at 90 cm with a speed of 100 km/h compared to today's typical manual driving. Although there are a lot of technical, psychological, and social issues to be solved before realizing the high-density/high-speed traffic systems described here, highly efficient highways are becoming more important because of increasing traffic congestion.
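A back-of-the-envelope calculation of the kind behind such efficiency figures is sketched below. The vehicle length and manual-driving gap are assumptions chosen only for illustration, not values from the report quoted above; with these numbers the ratio comes out nearer six than four, and it scales directly with the manual spacing one assumes.

# Lane-throughput comparison: steady-state capacity is speed divided by the
# road length each vehicle occupies (its own length plus the gap ahead of it).
# Vehicle length (4.5 m) and the manual gap (~1 s headway) are illustrative assumptions.
def vehicles_per_hour(speed_kmh, gap_m, car_length_m=4.5):
    metres_per_hour = speed_kmh * 1000.0
    return metres_per_hour / (gap_m + car_length_m)

manual = vehicles_per_hour(100, gap_m=28.0)   # assumed manual spacing at 100 km/h
auto   = vehicles_per_hour(100, gap_m=0.9)    # 90 cm automatic spacing
print(f"manual ~{manual:.0f}/h, automatic ~{auto:.0f}/h, ratio ~{auto/manual:.1f}x")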
This book, and the research it describes, resulted from a simple observation we made sometime in 1986. Put simply, we noticed that many VLSI design tools looked "alike." That is, at least at the overall software architecture level, the algorithms and data structures required to solve problem X looked much like those required to solve problem X'. Unfortunately, this resemblance is often of little help in actually writing the software for problem X' given the software for problem X. In the VLSI CAD world, technology changes rapidly enough that design software must continually strive to keep up. And of course, VLSI design software, and engineering design software in general, is often exquisitely sensitive to some aspects of the domain (technology) in which it operates. Modest changes in functionality have an unfortunate tendency to require substantial (and time-consuming) internal software modifications. Now, observing that large engineering software systems are technology dependent is not particularly clever. However, we believe that our approach to dealing with this problem took an interesting new direction. We chose to investigate the extent to which automatic programming ideas could be used to synthesize such software systems from high-level specifications. This book is one of the results of that effort.
'Symbolic Boolean manipulation using binary decision diagrams (BDDs) has been successfully applied to a wide variety of tasks, particularly in very large scale integration (VLSI) computer-aided design (CAD). The concept of decision graphs as an abstract representation of Boolean functions dates back to the early work by Lee and Akers. In the last ten years, BDDs have found widespread use as a concrete data structure for symbolic Boolean manipulation. With BDDs, functions can be constructed, manipulated, and compared by simple and efficient graph algorithms. Since Boolean functions can represent not just digital circuit functions, but also such mathematical domains as sets and relations, a wide variety of CAD problems can be solved using BDDs. Binary Decision Diagrams and Applications for VLSI CAD provides valuable information for both those who are new to BDDs as well as to longtime aficionados.' - from the Foreword by Randal E. Bryant. 'Over the past ten years ... BDDs have attracted the attention of many researchers because of their suitability for representing Boolean functions. They are now widely used in many practical VLSI CAD systems. ... this book can serve as an introduction to BDD techniques and ... it presents several new ideas on BDDs and their applications. ... many computer scientists and engineers will be interested in this book since Boolean function manipulation is a fundamental technique not only in digital system design but also in exploring various problems in computer science.' - from the Preface by Shin-ichi Minato.
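To illustrate the graph-based Boolean manipulation described above, here is a minimal sketch of a reduced ordered BDD with a generic apply operation. It is an illustrative toy, not the data structures of any particular BDD package discussed in the book; real packages add complement edges, garbage collection and far better hashing. The names mk, var and apply_op are assumptions of this sketch.

# Minimal reduced ordered BDD.  Terminals are 0 and 1; every internal node is
# uniquely identified by (variable index, low child, high child), so two
# equivalent Boolean functions end up as the same node.
from functools import lru_cache
import operator

FALSE, TRUE = 0, 1
_unique = {}                           # (level, low, high) -> node id
_nodes = {}                            # node id -> (level, low, high)

def mk(level, low, high):
    """Create (or reuse) a node; redundant nodes with equal children are skipped."""
    if low == high:
        return low
    key = (level, low, high)
    if key not in _unique:
        node = len(_nodes) + 2         # ids 0 and 1 are reserved for the terminals
        _unique[key] = node
        _nodes[node] = key
    return _unique[key]

def var(i):
    """BDD for the single variable x_i (variables ordered by index)."""
    return mk(i, FALSE, TRUE)

@lru_cache(maxsize=None)
def apply_op(op, u, v):
    """Combine two BDDs with a Boolean operator, memoising shared subresults."""
    if u in (0, 1) and v in (0, 1):
        return int(op(bool(u), bool(v)))
    iu = _nodes[u][0] if u > 1 else float('inf')
    iv = _nodes[v][0] if v > 1 else float('inf')
    i = min(iu, iv)                    # expand on the topmost variable
    u0, u1 = (_nodes[u][1], _nodes[u][2]) if iu == i else (u, u)
    v0, v1 = (_nodes[v][1], _nodes[v][2]) if iv == i else (v, v)
    return mk(i, apply_op(op, u0, v0), apply_op(op, u1, v1))

x0, x1, x2 = var(0), var(1), var(2)
f = apply_op(operator.or_, apply_op(operator.and_, x0, x1), x2)   # (x0 & x1) | x2
g = apply_op(operator.or_, x2, apply_op(operator.and_, x1, x0))   # same function
print(f == g)                          # True: equivalence check is node identity

Because the representation is canonical for a fixed variable order, comparing two functions reduces to comparing two node identifiers, which is the property that makes BDDs attractive for verification and synthesis tasks.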
Circuit simulation has been a topic of great interest to the integrated circuit design community for many years. It is a difficult, and interesting, problem because circuit simulators are very heavily used, consuming thousands of computer hours every year, and therefore the algorithms must be very efficient. In addition, circuit simulators are heavily relied upon, with millions of dollars being gambled on their accuracy, and therefore the algorithms must be very robust. At the University of California, Berkeley, a great deal of research has been devoted to the study of both the numerical properties and the efficient implementation of circuit simulation algorithms. Research efforts have led to several programs, starting with CANCER in the 1960's and the enormously successful SPICE program in the early 1970's, to MOTIS-C, SPLICE, and RELAX in the late 1970's, and finally to SPLICE2 and RELAX2 in the 1980's. Our primary goal in writing this book was to present some of the results of our current research on the application of relaxation algorithms to circuit simulation. As we began, we realized that a large body of mathematical and experimental results had been amassed over the past twenty years by graduate students, professors, and industry researchers working on circuit simulation. It became a secondary goal to try to find an organization of this mass of material that was mathematically rigorous, had practical relevance, and still retained the natural intuitive simplicity of the circuit simulation subject.
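For readers unfamiliar with the relaxation idea mentioned above, here is a minimal sketch of Gauss-Seidel relaxation applied to the nodal equations of a tiny resistive network. It shows only the general flavour of relaxation methods, not the SPICE, SPLICE or RELAX2 algorithms themselves; the circuit values and function name are illustrative assumptions.

# Minimal sketch: solve the nodal equations G @ v = i of a small resistive
# circuit by Gauss-Seidel relaxation.  Circuit assumed for illustration:
# a 1 A source into node 1, 1-ohm resistors node1-node2 and node2-ground,
# and a 2-ohm resistor node1-ground.
import numpy as np

def gauss_seidel(G, i, iters=50):
    """Sweep the nodes repeatedly, solving each row for its own voltage
    while the other node voltages are held at their latest values."""
    v = np.zeros_like(i, dtype=float)
    for _ in range(iters):
        for k in range(len(i)):
            v[k] = (i[k] - G[k, :k] @ v[:k] - G[k, k + 1:] @ v[k + 1:]) / G[k, k]
    return v

G = np.array([[1/1 + 1/2, -1/1],       # node 1: conductances to node 2 and ground
              [-1/1, 1/1 + 1/1]])      # node 2: conductances to node 1 and ground
i = np.array([1.0, 0.0])               # 1 A injected into node 1
v = gauss_seidel(G, i)
print(v, np.allclose(G @ v, i, atol=1e-6))   # ~[1.0, 0.5] volts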
Recently there has been increased interest in the development of computer-aided design programs to support the system-level designer of integrated circuits more actively. Such design tools hold the promise of raising the level of abstraction at which an integrated circuit is designed, thus releasing the current designers from many of the details of logic- and circuit-level design. The promise further suggests that a whole new group of designers in neighboring engineering and science disciplines, with far less understanding of integrated circuit design, will also be able to increase their productivity and the functionality of the systems they design. This promise has been made repeatedly as each new higher level of computer-aided design tool is introduced and has repeatedly fallen short of fulfillment. This book presents the results of research aimed at introducing yet higher levels of design tools that will inch the integrated circuit design community closer to the fulfillment of that promise. 1.1 SYNTHESIS OF INTEGRATED CIRCUITS. In the integrated circuit (IC) design process, a behavior that meets certain specifications is conceived for a system, the behavior is used to produce a design in terms of a set of structural logic elements, and these logic elements are mapped onto physical units. The design process is impacted by a set of constraints as well as technological information (i.e., the logic elements and physical units used for the design).
This volume, which contains 15 contributions, is based on a minicourse held at the 1987 IEEE Plasma Science Meeting. The purpose of the lectures in the course was to acquaint the students with the multidisciplinary nature of computational techniques and the breadth of research areas in plasma science in which computation can address important physics and engineering design issues. These involve electric and magnetic fields, MHD equations, chemistry, radiation, ionization, etc. The contents of the contributions, written subsequent to the minicourse, stress important aspects of computer applications. They are: 1) the numerical methods used; 2) the range of applicability; 3) how the methods are actually employed in research and in the design of devices; and, as a compendium, 4) the multiplicity of approaches possible for any one problem. The material in this book is organized by both subject and application, displaying some of the richness in computational plasma physics.
The roots of the project which culminates in the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at the University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area- and performance-effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.
Direct Engineering (DE) unifies the product development cycle into a single, integrated process. The design process in most industries is an evolutionary one (i.e., incremental changes to some existing design). DE is a manufacturing approach that seeks to improve the design process by providing complete archival documentation of existing designs. It uses three-dimensional geometric models with integrated manufacturing information throughout the design process. DE reduces the design cycle, and the variety and number of engineering changes. This process decreases the design cycle time, increases productivity, and provides a higher quality product. The required technologies and methodologies that will support the development of the DE environment are: (1) product representation using feature-based modeling; (2) knowledge-based applications that will support the entire product development cycle; (3) an engineering environment implemented around distributed computing and object-oriented systems; and (4) direct manufacturing techniques using rapid prototyping. Direct Engineering: Toward Intelligent Manufacturing addresses the following recent topics related to the development, implementation, and integration of the DE environment: (1) the current scope of research in intelligent manufacturing; (2) the results of the technologies and tools developed for integrated product and process design; and (3) examination of the methodologies and algorithms used for the implementation of direct engineering.
Embedded systems are becoming one of the major driving forces in computer science. Furthermore, it is the impact of embedded information technology that dictates the pace in most engineering domains. Nearly all technical products above a certain level of complexity are not only controlled but increasingly even dominated by their embedded computer systems. Traditionally, such embedded control systems have been implemented in a monolithic, centralized way. Recently, distributed solutions have been gaining importance. In this approach, the control task is carried out by a number of controllers distributed over the entire system and connected by some interconnect network, such as fieldbuses. Such a distributed embedded system may consist of a few controllers or up to several hundred, as in today's top-range automobiles. Distribution and parallelism in embedded systems design increase the engineering challenges and require new development methods and tools. This book is the result of the International Workshop on Distributed and Parallel Embedded Systems (DIPES'98), organized by the International Federation for Information Processing (IFIP) Working Groups 10.3 (Concurrent Systems) and 10.5 (Design and Engineering of Electronic Systems). The workshop took place in October 1998 in Schloss Eringerfeld, near Paderborn, Germany, and the resulting book reflects the most recent points of view of experts from Brazil, Finland, France, Germany, Italy, Portugal, and the USA. The book is organized in six chapters: `Formalisms for Embedded System Design': IP-based system design and various approaches to multi-language formalisms. `Synthesis from Synchronous/Asynchronous Specification': synthesis techniques based on Message Sequence Charts (MSC), StateCharts, and Predicate/Transition Nets. `Partitioning and Load-Balancing': application in simulation models and target systems. `Verification and Validation': formal techniques for precise verification and more pragmatic approaches to validation. `Design Environments' for distributed embedded systems and their impact on the industrial state of the art. `Object-Oriented Approaches': the impact of OO techniques on distributed embedded systems. This volume will be essential reading for computer science researchers and application developers.
This book contains the proceedings of the International Workshop on 3D Process Simulation, which was held on the campus of the University of Erlangen-Nuremberg in Erlangen on September 5, 1995, in conjunction with the 6th International Conference on Simulation of Semiconductor Devices and Processes (SISDEP 95). Whereas two-dimensional semiconductor process simulation has achieved a certain degree of maturity, three-dimensional process simulation is a newly emerging field in which most efforts are dedicated to necessary basic developments. Research in this area is promoted by the growing demand to obtain reliable information on device geometries and dopant distributions needed for three-dimensional device simulation, and challenged by the great algorithmic problems caused by moving interfaces and by the requirement to limit computation times and memory requirements. This workshop provided a forum to discuss the industrial needs, technical problems, and solutions being developed in the field of three-dimensional semiconductor process simulation. Invited presentations from leading semiconductor companies and research centers of excellence from Japan, the USA, and Europe outlined novel numerical algorithms, physical models, and applications in this rapidly emerging field.
Geometric algebra has established itself as a powerful and valuable mathematical tool for solving problems in computer science, engineering, physics, and mathematics. The articles in this volume, written by experts in various fields, reflect an interdisciplinary approach to the subject, and highlight a range of techniques and applications. Relevant ideas are introduced in a self-contained manner and only a knowledge of linear algebra and calculus is assumed. Features and topics: * The mathematical foundations of geometric algebra are explored * Applications in computational geometry include models of reflection and ray-tracing and a new and concise characterization of the crystallographic groups * Applications in engineering include robotics, image geometry, control-pose estimation, inverse kinematics and dynamics, control and visual navigation * Applications in physics include rigid-body dynamics, elasticity, and electromagnetism * Chapters dedicated to quantum information theory deal with multi-particle entanglement, MRI, and relativistic generalizations. Practitioners, professionals, and researchers working in computer science, engineering, physics, and mathematics will find a wide range of useful applications in this state-of-the-art survey and reference book. Additionally, advanced graduate students interested in geometric algebra will find the most current applications and methods discussed.