Our purpose in writing this book was twofold. First, we wanted to compile a chronology of the research in the field of mixed-mode simulation over the last ten to fifteen years. A substantial amount of work was done during this period, but most of it was published in archival form in master's theses and Ph.D. dissertations. Since interest in mixed-mode simulation is growing, and a thorough review of the state of the art in the area was not readily available, we thought it appropriate to publish the information in the form of a book. Secondly, we wanted to provide enough information to the reader so that a prototype mixed-mode simulator could be developed using the algorithms in this book. The SPLICE family of programs is based on the algorithms and techniques described in this book, so it can also serve as documentation for these programs. ACKNOWLEDGEMENTS: The authors would like to dedicate this book to Prof. D. O. Pederson for inspiring this research work and for providing many years of support and encouragement. The authors enjoyed many fruitful discussions and collaborations with Jim Kleckner, Young Kim, Alberto Sangiovanni-Vincentelli, and Jacob White, and we thank them for their contributions. We also thank the countless others who participated in the research work and read early versions of this book. Lillian Beck provided many useful suggestions to improve the manuscript. Yuncheng Ju did the artwork for the illustrations.
This volume presents the proceedings of the 7th International Conference of the Computer Graphics Society, CG International '89, held at the University of Leeds, UK, June 27-30, 1989. Since 1982 this conference has continued to attract high-quality research papers in all aspects of computer graphics and its applications. Originally the conference was held in Japan (1982-1987), but in 1988 it was held in Geneva, Switzerland. Future conferences are planned for Singapore in 1990, the USA in 1991, Japan in 1992, and Canada in 1993. Recent developments in computer graphics have concentrated on the following: greater sophistication of image generation techniques; advances in hardware and emphasis on the exploitation of parallelism; integration of robotics and AI techniques for animation; greater integration of CAD and CAM in CIM; use of powerful computer graphics techniques to represent complex physical processes (visualization); advances in computational geometry and in the representation and modelling of complex physical and mathematical objects; and improved tools and methods for HCI. These trends and advances are reflected in the present volume. A number of papers deal with important research aspects in many of these areas.
A number of fundamental topics in the field of high performance clock distribution networks are covered in this book. High Performance Clock Distribution Networks is composed of ten contributions from authors at academic and industrial institutions. Topically, these contributions can be grouped within three primary areas. The first topic area deals with exploiting the localized nature of clock skew. The second topic area deals with the implementation of these clock distribution networks, while the third topic area considers more long-range aspects of next-generation clock distribution networks. High Performance Clock Distribution Networks presents a number of interesting strategies for designing and building high performance clock distribution networks. Many aspects of the ideas presented in these contributions are being developed and applied today in next-generation high-performance microprocessors.
The aim of computer-aided surgery (CAS) is to advance the utilization of computers in the development of new technologies for medical services. The Asian Conference on Computer Aided Surgery (ACCAS) series provides a forum for academic researchers, clinical scientists, surgeons, and industrial partners to exchange new ideas, techniques, and the latest developments in the field. The ACCAS brings together researchers from all fields related to medical activity visualization, simulation and modeling, virtual reality for CAS, image-guided diagnosis and therapies, CAS for minimally invasive intervention, medical robotics and instrumentation, surgical navigation, clinical application of CAS, telemedicine and telesurgery, and CAS education. The ACCAS is also interested in promoting collaboration among people from different disciplines and different countries in Asia and the world. This volume helps to achieve that goal and is a valuable resource for researchers and clinicians in the field.
This book constitutes the refereed proceedings of the 8th International Workshop on Self-Organizing Maps, WSOM 2011, held in Espoo, Finland, in June 2011. The 36 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers are organized in topical sections on plenaries; financial and societal applications; theory and methodology; applications of data mining and analysis; language processing and document analysis; and visualization and image processing.
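The self-organizing map at the heart of the WSOM series can be summarized in a few lines: each input is matched to its best-matching unit on a neuron grid, and that unit and its neighbours are pulled toward the input. A minimal sketch follows; the grid size, decay rates, and function names are illustrative choices, not taken from the proceedings:

```python
import numpy as np

def train_som(data, grid_h=4, grid_w=4, epochs=20, lr=0.5, sigma=1.5, seed=0):
    """Train a small rectangular SOM on the rows of `data` (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates of every neuron, used by the neighbourhood function.
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    coords = np.stack([ys, xs], axis=-1).astype(float)
    for _ in range(epochs):
        for x in data:
            # Best-matching unit: the neuron whose weight vector is closest to x.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood centred on the BMU.
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
            # Pull the BMU and its neighbours toward the input.
            weights += lr * g[..., None] * (x - weights)
        lr *= 0.95      # decay learning rate
        sigma *= 0.95   # shrink neighbourhood radius

    return weights

# illustrative usage: map 30 random 2-D points onto a 4x4 grid
rng = np.random.default_rng(1)
w = train_som(rng.random((30, 2)))
```

After training, nearby neurons on the grid hold similar weight vectors, which is what makes the map useful for the visualization tasks the workshop papers address.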
CAD (Computer Aided Design) technology is now crucial for every division of modern industry, from the viewpoint of higher productivity and better products. As technologies advance, the amount of information and knowledge that engineers have to deal with is constantly increasing. This results in seeking more advanced computer technology to achieve higher functionality, flexibility, and efficient performance of CAD systems. Knowledge engineering, or more broadly artificial intelligence, is considered a primary candidate technology for building a new generation of CAD systems. Since design is a very intellectual human activity, this approach seems to make sense. The ideas of intelligent CAD systems (ICAD) are now increasingly discussed everywhere, and we can observe many conferences and workshops reporting a number of research efforts on this particular subject. Researchers come from computer science, artificial intelligence, mechanical engineering, electronic engineering, civil engineering, architectural science, control engineering, etc. But still we cannot see the direction of this concept; at the least, there is no widely accepted concept of ICAD. What can designers expect from these future-generation CAD systems? In which direction must developers proceed? The situation is somewhat confusing.
Adopting new fabrication technologies not only provides higher integration and enhances performance, but also increases the types of manufacturing defects. With design sizes in millions of gates and working frequencies in the GHz range, timing-related defects have become a high proportion of the total chip defects. For nanometer technology designs, the stuck-at fault test alone cannot ensure a high quality level of chips. At-speed tests using the transition fault model have become a requirement in technologies below 180nm. Traditional at-speed test methods cannot guarantee high-quality test results as they face many new challenges. Supply noise (including IR-drop, ground bounce, and Ldi/dt) effects on chip performance, high test pattern volume, low fault/defect coverage, small delay defect test pattern generation, high cost of test implementation and application, and utilizing low-cost testers are among these challenges. This book discusses these challenges in detail and proposes new techniques and methodologies to improve the overall quality of the transition fault test.
Powerful new technology has been made available to researchers by an increasingly competitive workstation market. Papers from Canada, Japan, Italy, Germany, and the U.S., to name a few of the countries represented in this volume, discuss how workstations are used in experiments and what impact this new technology will have on experiments. As usual for IFIP workshops, the emphasis in this volume is on the formulation of strategies for future research, the determination of new market areas, and the identification of new areas for workstation research. This is the first volume of a book series reporting the work of IFIP WG 5.10. The mission of this IFIP working group is to promote, develop and encourage advancement of the field of computer graphics as a basic tool, as an enabling technology and as an important part of various application areas.
The development of the 'factory of the future' by major international corporations such as General Motors, IBM, Westinghouse, etc., now involves many practising engineers. This book is an attempt to identify and describe some of the building blocks required for computer aided engineering for manufacture. It begins with numerical control and the infrastructure required for the automation of individual 'islands' within existing factories. Computer aided design and computer aided manufacture are then discussed in detail, together with their integration to improve manufacturing efficiency and flexibility. Robotics and flexible manufacturing systems are examined, as well as the management of these systems required for production optimization. Finally, there is an overview of the relatively new field of artificial intelligence, which is being increasingly used in most aspects of computer aided engineering for manufacture. There are many topics which could have been included or expanded upon with advantage, but the authors have attempted to strike a balance so that the reader can obtain the maximum usefulness from a reasonably concise volume.
Multi-objective optimization deals with the simultaneous optimization of two or more objectives which are normally in conflict with each other. Since multi-objective optimization problems are relatively common in real-world applications, this area has become a very popular research topic since the 1970s. However, the use of bio-inspired metaheuristics for solving multi-objective optimization problems started in the mid-1980s and became popular only in the mid-1990s. Nevertheless, the effectiveness of multi-objective evolutionary algorithms has made them very popular in a variety of domains. Swarm intelligence refers to certain population-based metaheuristics that are inspired by the behavior of groups of entities (i.e., living beings) interacting locally with each other and with their environment. Such interactions produce an emergent behavior that is modelled in a computer in order to solve problems. The two most popular metaheuristics within swarm intelligence are particle swarm optimization (which simulates a flock of birds seeking food) and ant colony optimization (which simulates the behavior of colonies of real ants that leave their nest looking for food). These two metaheuristics have become very popular in the last few years, and have been widely used in a variety of optimization tasks, including some related to data mining and knowledge discovery in databases. However, such work has been mainly focused on single-objective optimization models. The use of multi-objective extensions of swarm intelligence techniques in data mining has been relatively scarce, in spite of their great potential, which constituted the main motivation to produce this book.
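As a concrete point of reference, the basic single-objective particle swarm optimizer that such multi-objective extensions build on fits in a few dozen lines: each particle is pulled toward its own best position and the swarm's best. The coefficients and bounds below are common textbook defaults, not values prescribed by the book:

```python
import random

def pso(f, dim, n_particles=30, iters=100, lo=-5.0, hi=5.0, seed=42):
    """Minimise f over [lo, hi]^dim with a basic particle swarm (single-objective sketch)."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and cognitive/social acceleration coefficients
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]             # each particle's best-known position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best-known position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# illustrative usage: the sphere function, whose minimum is 0 at the origin
best, val = pso(lambda p: sum(x * x for x in p), dim=2)
```

A multi-objective variant replaces the single `gbest` with an archive of non-dominated solutions, which is the kind of extension the book surveys.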
System-on-a-Chip (SOC) integrated circuits composed of embedded cores are now commonplace. Nevertheless, there remain several roadblocks to rapid and efficient system integration. Test development is seen as a major bottleneck in SOC design and manufacturing capabilities. Testing SOCs is especially challenging in the absence of standardized test structures, test automation tools, and test protocols. In addition, long interconnects, high density, and high-speed designs lead to new types of faults involving crosstalk and signal integrity. SOC (System-on-a-Chip) Testing for Plug and Play Test Automation is an edited work containing thirteen contributions that address various aspects of SOC testing. SOC (System-on-a-Chip) Testing for Plug and Play Test Automation is a valuable reference for researchers and students interested in various aspects of SOC testing.
To derive rational and convincing solutions to practical decision-making problems in complex and hierarchical human organizations, the decision-making problems are formulated as relevant mathematical programming problems which are solved by developing optimization techniques so as to exploit characteristics or structural features of the formulated problems. In particular, for resolving conflict in decision making in hierarchical managerial or public organizations, the multilevel formulation of the mathematical programming problems has often been employed together with the solution concept of Stackelberg equilibrium. However, we conceive that the pairing of the conventional formulation and the solution concept is not always sufficient to cope with a large variety of decision-making situations in actual hierarchical organizations. The following issues should be taken into consideration in the expression and formulation of decision-making problems. In the formulation of mathematical programming problems, it is tacitly supposed that decisions are made by a single person, while game theory deals with the economic behavior of multiple decision makers with fully rational judgment. Because two-level mathematical programming problems are interpreted as static Stackelberg games, multilevel mathematical programming is relevant to noncooperative game theory; in conventional multilevel mathematical programming models employing the solution concept of Stackelberg equilibrium, it is assumed that there is no communication among decision makers, or that they do not make any binding agreement even if such communication exists. However, for decision-making problems in, for example, decentralized large firms with divisional independence, it is quite natural to suppose that there exists communication and some cooperative relationship among the decision makers.
Artificial intelligence provides an environmentally rich paradigm within which design research based on computational constructions can be carried out. This has been one of the foundations for the developing field called "design computing." Recently, there has been a growing interest in what designers do when they design and how they use computational tools. This forms the basis of a newly emergent field called "design cognition" that draws partly on cognitive science. This new conference series aims to provide a bridge between the two fields of "design computing" and "design cognition." The papers in this volume are from the "First International Conference on Design Computing and Cognition" (DCC'04) held at the Massachusetts Institute of Technology, USA. They represent state-of-the-art research and development in design computing and cognition. They are of particular interest to researchers, developers and users of advanced computation in design and those who need to gain a better understanding of designing.
Routing of VLSI chips is an important, time-consuming, and difficult problem. The difficulty of the problem is attributed to the large number of often conflicting factors that affect the routing quality. Traditional techniques have approached routing by ignoring some of these factors and imposing unnecessary constraints in order to make routing tractable. In addition to imposing these restrictions, which simplify the problem to a degree but at the same time reduce the routing quality, traditional approaches use brute force. They often transform the problem into mathematical or graph problems and completely ignore the specific knowledge about the routing task that can greatly help the solution. This thesis overcomes some of the above problems and presents a system that performs routing close to what human designers do. In other words, it heavily capitalizes on the knowledge of human experts in this area, it does not impose unnecessary constraints, it considers all the different factors that affect the routing quality, and most importantly it allows constant user interaction throughout the routing process. To achieve the above, this thesis presents background about some representative techniques for routing and summarizes their characteristics. It then studies in detail the different factors (such as minimum area, number of vias, wire length, etc.) that affect the routing quality, and the different criteria (such as vertical/horizontal constraint graph, merging, minimal rectilinear Steiner tree, etc.) that can be used to optimize these factors.
Machine Vision technology is becoming an indispensable part of the manufacturing industry. Biomedical and scientific applications of machine vision and imaging are becoming more and more sophisticated, and new applications continue to emerge. This book gives an overview of ongoing research in machine vision and presents the key issues of scientific and practical interest. A selected board of experts from the US, Japan and Europe provides an insight into some of the latest work done on machine vision systems and applications.
The NATO Advanced Research Workshop on Signal Processing and Pattern Recognition in Nondestructive Evaluation (NDE) of Materials was held August 19-22, 1987 at the Manoir St-Castin, Lac Beauport, Quebec, Canada. Modern signal processing, pattern recognition and artificial intelligence have been playing an increasingly important role in improving nondestructive evaluation and testing techniques. The cross-fertilization of the two major areas can lead to major advances in NDE as well as presenting a new research area in signal processing. With this in mind, the Workshop provided a good review of progress and comparison of potential techniques, as well as constructive discussions and suggestions for effective use of modern signal processing to improve flaw detection, classification and prediction, as well as material characterization. This Proceedings volume includes most presentations given at the Workshop. This publication, like the meeting itself, is unique in the sense that it provides extensive interactions among the interrelated areas of NDE. The book starts with research advances on inverse problems and then covers different aspects of digital waveform processing in NDE and eddy current signal analysis. These are followed by four papers on pattern recognition and AI in NDE, and five papers on image processing and reconstruction in NDE. The last two papers deal with parameter estimation problems. Though the list of papers is not extensive, as the field of NDE signal processing is very new, the book has an excellent collection of both tutorial and research papers in this exciting new field.
Raster graphics differs from the more traditional vector or line graphics in the sense that images are not made up from line segments but from discrete elements orderly arranged in a two-dimensional rectangular region. There are two reasons for the growing popularity of raster graphics or bit-mapped displays: 1) the possibilities they offer to show extremely realistic pictures, and 2) the dropping prices of those displays and associated processors and memories. With the rise of raster graphics, all kinds of new techniques, methods, algorithms and data representations are associated - such as ray tracing, raster operations, and quadtrees - bringing with them a lot of fruitful research. As stated above, raster graphics allows the creation of extremely realistic (synthesized) pictures. There are important applications in such diverse areas as industrial design, flight simulation, education, image processing and animation. Unfortunately many applications are hampered by the fact that, with the present state of the art, they require an excessive amount of computing resources. Hence it is worthwhile to investigate methods and techniques which may help in reducing the computer costs associated with raster graphics applications. Since the choice of data structures influences the efficiency of algorithms in a crucial way, a workshop was set up in order to bring together a (limited) number of experienced researchers to discuss this topic. The workshop was held from 24 to 28 June 1985 at Steensel, a tiny village in the neighbourhood of Eindhoven, the Netherlands.
Wireless networking enables two or more computers to communicate using standard network protocols without network cables. Since their emergence in the 1970s, wireless networks have become increasingly popular in the computing industry. In the past decade, wireless networks have enabled true mobility. There are currently two versions of mobile wireless networks. An infrastructure network contains a wired backbone with the last hop being wireless. The cellular phone system is an example of an infrastructure network. A multihop ad hoc wireless network has no infrastructure and is thus entirely wireless. A wireless sensor network is an example of a multihop ad hoc wireless network. Ad hoc wireless networking is a technique to support robust and efficient operation in mobile wireless networks by incorporating routing functionality into mobile hosts. This technique will be used to realize the dream of "anywhere and anytime computing," which is termed mobile computing. Mobile computing is a new paradigm of computing in which users carrying portable devices have access to shared infrastructure in any location at any time. Mobile computing is a very challenging topic for scientists in computer science and electrical engineering. The representative system for ad hoc wireless networking is called MANET, an acronym for "Mobile Ad hoc NETworks." MANET is an autonomous system consisting of mobile hosts connected by wireless links which can be quickly deployed.
The purpose of this book is to combine different approaches to the study of arm movement in space, in order to create new synergy between domains of research which tend to be developed independently. It is from these synergies that a new understanding of the control of arm and hand movement can emerge. Previous books have been devoted to artificial neural networks for sensorimotor control (for example, Advanced Neural Computers, R. Eckmiller, ed., Elsevier). The present book is the first to propose a precise and direct comparison between current computational developments and new experimental results of neurophysiology and neurophysics. The book covers different levels of neural control: spinal cord, red nucleus, premotor cortex, motor cortex, parietal cortex, thalamus and cerebellum. An important place is devoted to the problems of muscle coordination and internal representations of movement variables in different nervous regions, and to the problem of coordinate transformations underlying reaching and manipulation. For the physiologist, the book proposes not only a comprehensive picture of new experimental results but also a theoretical basis for a better understanding of central coding of movement by neuronal populations. For neural networks and robotics students, this book provides very rich knowledge of the way the brain controls arm movements using visual information, and it can offer them new concepts and ideas to generate more efficient artificial systems, having in mind the powerful capacity of the human brain.
This book contains 26 papers presented at the NATO Advanced Research Workshop on "CAD Based Programming for Sensory Robots," held in Il Ciocco, Italy, July 4-6, 1988. CAD based robot programming is considered to be the process where CAD (computer-based) models are used to develop robot programs. If the program is generated, at least partially, by a programmer interacting, for example, with a computer graphic display of the robot and its workcell environment, the process is referred to as graphical off-line programming. On the other hand, if the robot program is generated automatically, for example, by a computer, then the process is referred to as automatic robot programming. The key element here is the use of CAD models both for interactive and automatic generation of robot programs. CAD based programming, therefore, brings together computer based modeling and robot programming and as such cuts across several disciplines including geometric modeling, robot programming, kinematic and dynamic modeling, artificial intelligence, sensory monitoring and so on.
The goal of the research out of which this monograph grew was to make annealing as much as possible a general-purpose optimization routine. At first glance this may seem a straightforward task, for the formulation of its concept suggests applicability to any combinatorial optimization problem. All that is needed to run annealing on such a problem is a unique representation for each configuration, a procedure for measuring its quality, and a neighbor relation. Much more is needed, however, for obtaining acceptable results consistently in a reasonably short time. It is even doubtful whether the problem can be formulated such that annealing becomes an adequate approach for all instances of an optimization problem. Questions such as what is the best formulation for a given instance, and how should the process be controlled, have to be answered. Although much progress has been made in the years after the introduction of the concept into the field of combinatorial optimization in 1981, some important questions still do not have a definitive answer. In this book the reader will find the foundations of annealing in a self-contained and consistent presentation. Although the physical analogue from which the concept emanated is mentioned in the first chapter, all theory is developed within the framework of Markov chains. To achieve a high degree of instance independence, adaptive strategies are introduced.
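The three ingredients the blurb lists - a configuration representation, a quality measure, and a neighbor relation - are exactly the parameters of a generic annealing loop. A minimal sketch follows; the geometric cooling schedule and all constants are illustrative defaults, not the adaptive strategies the book develops:

```python
import math
import random

def anneal(cost, neighbor, state, t0=10.0, alpha=0.95, steps_per_t=100, t_min=1e-3, seed=1):
    """Generic simulated annealing: needs only a cost function, a neighbor move,
    and an initial configuration (illustrative sketch)."""
    rng = random.Random(seed)
    t = t0
    cur, cur_c = state, cost(state)
    best, best_c = cur, cur_c
    while t > t_min:
        for _ in range(steps_per_t):
            cand = neighbor(cur, rng)
            cand_c = cost(cand)
            # Metropolis acceptance: always take improvements,
            # accept worsenings with probability exp(-delta / t).
            if cand_c <= cur_c or rng.random() < math.exp((cur_c - cand_c) / t):
                cur, cur_c = cand, cand_c
                if cur_c < best_c:
                    best, best_c = cur, cur_c
        t *= alpha  # geometric cooling
    return best, best_c

# toy instance: minimise x^2 over the integers with +/-1 neighbor moves
best, c = anneal(lambda x: x * x, lambda x, r: x + r.choice([-1, 1]), state=50)
```

The same loop runs unchanged on any combinatorial problem once the three callbacks are supplied, which is precisely the "general purpose" appeal, and the hard part the book addresses: getting good results consistently depends on how those callbacks and the control schedule are chosen.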
Iterative Learning Control for Deterministic Systems is part of the new Advances in Industrial Control series, edited by Professor M.J. Grimble and Dr. M.A. Johnson of the Industrial Control Unit, University of Strathclyde. The material presented in this book addresses the analysis and design of learning control systems. It begins with an introduction to the concept of learning control, including a comprehensive literature review. The text follows with a complete and unifying analysis of the learning control problem for linear LTI systems using a system-theoretic approach which offers insight into the nature of the solution of the learning control problem. Additionally, several design methods are given for LTI learning control, incorporating a technique based on parameter estimation and a one-step learning control algorithm for finite-horizon problems. Further chapters focus upon learning control for deterministic nonlinear systems, and a time-varying learning controller is presented which can be applied to a class of nonlinear systems, including the models of typical robotic manipulators. The book concludes with the application of artificial neural networks to the learning control problem. Three specific ways to apply neural nets for this purpose are discussed, including two methods which use backpropagation training and reinforcement learning. The appendices in the book are particularly useful because they serve as a tutorial on artificial neural networks.
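The core idea of iterative learning control - reuse the entire error signal from one trial to update the input applied on the next trial - can be sketched in a few lines. The P-type update law below is one of the simplest schemes in the literature; the static plant and the gain value are illustrative assumptions, not examples from the book:

```python
import numpy as np

def ilc(plant, ref, gain=0.8, trials=30):
    """P-type iterative learning control: u_{k+1}(t) = u_k(t) + gain * e_k(t)."""
    u = np.zeros_like(ref)
    e = ref.copy()
    for _ in range(trials):
        y = plant(u)        # run one complete trial with the current input
        e = ref - y         # record the tracking error over the whole trial
        u = u + gain * e    # learning update: reuse the full error signal
    return u, e

# illustrative static plant y(t) = 0.5 * u(t); the error contracts by a
# factor |1 - 0.5 * gain| = 0.6 per trial, so tracking improves every run
ref = np.sin(np.linspace(0.0, 2.0 * np.pi, 50))
u, e = ilc(lambda u: 0.5 * u, ref)
```

For this toy plant the error shrinks geometrically across trials; the system-theoretic analysis in the book is about establishing such convergence conditions for genuinely dynamic (and nonlinear) plants.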
In the early days of VLSI, the design of the power distribution for an integrated circuit was rather simple. Power distribution -- the design of the geometric topology for the network of wires that connect the various power supplies, the widths of the individual segments for each of these wires, the number and location of the power I/O pins around the periphery of the chip -- was simple because the chips were simpler. Few available wiring layers forced floorplans that allowed simple, planar (non-overlapping) power networks. Lower speeds and circuit density made the choice of the wire widths easier: we made them just fat enough to avoid resistive voltage drops due to switching currents in the supply network. And we just didn't need enormous numbers of power and ground pins on the package for the chips to work. It's not so simple any more. Increased integration has forced us to focus on reliability concerns such as metal electromigration, which affects wire sizing decisions in the power network. Extra metal layers have allowed more flexibility in the topological layout of the power networks.
This book presents the results of an international workshop on Modelling and Analysis of Arms Control Problems held in Spitzingsee near Munich in October 1985 under the joint sponsorship of NATO's Scientific Affairs Division and the Volkswagen Foundation. The idea for this workshop evolved in 1983, as a consequence of discussions in the annual Systems Science Seminar at the Computer Science Department of the Federal Armed Forces University Munich on the topic of Quantitative Assessment in Arms Control. There was wide agreement among the contributors to that seminar and its participants that those efforts to assess the potential contributions of systems and decision sciences, as well as systems analysis and mathematical modelling, to arms control issues should be expanded and a forum should be provided for this activity. It was further agreed that such a forum should include political scientists and policy analysts working in the area of arms control.
The finite difference method (FDM) has been used to solve differential equation systems for centuries. The FDM works well for problems of simple geometry and was widely used before the invention of the much more efficient, robust finite element method (FEM). FEM is now widely used in handling problems with complex geometry. Currently, we are using and developing even more powerful numerical techniques aiming to obtain more accurate approximate solutions in a more convenient manner for even more complex systems. The meshfree or meshless method is one such phenomenal development of the past decade, and is the subject of this book. There are many MFree methods proposed so far for different applications. Currently, three monographs on MFree methods have been published. Mesh Free Methods: Moving Beyond the Finite Element Method by G. R. Liu (2002) provides a systematic discussion of basic theories and fundamentals for MFree methods, especially MFree weak-form methods. It provides a comprehensive record of well-known MFree methods and wide coverage of applications of MFree methods to problems of solid mechanics (solids, beams, plates, shells, etc.) as well as fluid mechanics. The Meshless Local Petrov-Galerkin (MLPG) Method by Atluri and Shen (2002) provides detailed discussions of the meshfree local Petrov-Galerkin (MLPG) method and its variations. Formulations and applications of MLPG are well addressed in their book.