The Language of Design articulates the theory that there is a language of design. Drawing upon insights from computational language processing, the language of design is modeled computationally through latent semantic analysis (LSA), lexical chain analysis (LCA), and sentiment analysis (SA). The statistical co-occurrence of semantics (LSA), semantic relations (LCA), and semantic modifiers (SA) in design text is used to illustrate how the reality-producing effect of language is itself an enactment of design, allowing a new understanding of the connections between creative behaviors. The computation of the language of design makes it possible to directly measure creative behaviors which are distributed across social spaces and mediated through language. The book demonstrates how machine understanding of design texts, based on computation over the language of design, yields practical applications for design management.
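To make the LSA component concrete, here is a minimal sketch (not the book's own implementation) of latent semantic analysis via truncated SVD on a toy term-document matrix; the terms, counts, and rank k are invented for illustration:

```python
import numpy as np

# Toy term-document count matrix: rows are terms, columns are design documents.
# (Invented counts; a real corpus would be much larger.)
terms = ["sketch", "prototype", "user", "test"]
A = np.array([
    [3., 1., 0.],   # "sketch"
    [2., 2., 0.],   # "prototype"
    [0., 1., 3.],   # "user"
    [0., 2., 2.],   # "test"
])

# LSA: truncated SVD keeps k latent "semantic" dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]          # term coordinates in the latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Terms that co-occur across documents end up close in the latent space.
print(cosine(term_vecs[0], term_vecs[1]))  # "sketch" vs "prototype": high
print(cosine(term_vecs[0], term_vecs[2]))  # "sketch" vs "user": much lower
```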
This book analyzes the causes of failures in computing systems, their consequences, as well as the existing solutions to manage them. The domain is tackled in a progressive and educational manner with two objectives: 1. The mastering of the basics of the dependability domain at system level, that is to say independently of the technology used (hardware or software) and of the domain of application. 2. The understanding of the fundamental techniques available to prevent, to remove, to tolerate, and to forecast faults in hardware and software technologies. The first objective leads to the presentation of the general problem, the fault models and degradation mechanisms which are at the origin of the failures, and finally the methods and techniques which permit the faults to be prevented, removed or tolerated. This study concerns logical systems in general, independently of the hardware and software technologies put in place. This knowledge is indispensable for two reasons: * A large part of a product's development is independent of the technological means (expression of requirements, specification and most of the design stage). * Very often, the development team does not possess this basic knowledge; hence, the dependability requirements are considered only during the technological implementation. Such an approach is expensive and inefficient. Indeed, the removal of a preliminary design fault can be very difficult (if possible at all) if this fault is detected during the product's final testing.
Computer Methods for Analysis of Mixed-Mode Switching Circuits provides an in-depth treatment of the principles and implementation details of computer methods and numerical algorithms for analysis of mixed-mode switching circuits. Major topics include:
The development of any software-intensive (industrial) system, e.g. critical embedded software, requires both different notations and a strong development process. Different notations are mandatory because different aspects of the software system have to be tackled. A strong development process is mandatory as well, because without a strong organization we cannot guarantee the system will meet its requirements. Unfortunately, much more is needed! The different notations that can be used must all possess at least one property: formality. The development process must also have important properties: an exhaustive coverage of the development phases, and a set of well-integrated support tools. In Computer Science it is now widely accepted that only formal notations can guarantee a perfectly defined meaning. This becomes a more and more important issue since software systems tend to be distributed, in large systems (for instance in safe public transportation systems) and in small ones (for instance the numerous processors in luxury cars). Distribution increases the complexity of embedded software while safety criteria become harder to meet. On the other hand, during the past decade Software Engineering techniques have improved a lot, and are now routinely used to conduct the systematic and rigorous development of large software systems. UML has become the de facto standard notation for documenting Software Engineering projects. UML is supported by many CASE tools that offer graphical means for the UML notation.
When I attended college we studied vacuum tubes in our junior year. At that time an average radio had five vacuum tubes and better ones even seven. Then transistors appeared in the 1960s. A good radio was judged to be one with more than ten transistors. Later good radios had 15-20 transistors, and after that everyone stopped counting transistors. Today modern processors running personal computers have over 10 million transistors, and more millions will be added every year. The difference between 20 and 20M is in complexity, methodology and business models. Designs with 20 transistors are easily generated by design engineers without any tools, whilst designs with 20M transistors cannot be done by humans in reasonable time without the help of automation. This difference in complexity introduced a paradigm shift which required sophisticated methods and tools, and introduced design automation into design practice. Through the decomposition of the design process into many tasks and abstraction levels, the methodology of designing chips or systems has also evolved. Similarly, the business model has changed from vertical integration, in which one company did all the tasks from product specification to manufacturing, to globally distributed, client-server production in which most of the design and manufacturing tasks are outsourced.
IDT (Intelligent Decision Technologies) seeks an interchange of research on intelligent systems and intelligent technologies which enhance or improve decision making in industry, government and academia. The focus is interdisciplinary in nature, and includes research on all aspects of intelligent decision technologies, from fundamental development to applied systems. It is a great honor and pleasure for us to publish the works and new research results of scholars from the First KES International Symposium on Intelligent Decision Technologies (KES IDT'09), hosted and organized by the University of Hyogo in conjunction with KES International (Himeji, Japan, April 2009). The symposium was concerned with the theory, design, development, implementation, testing and evaluation of intelligent decision systems. Its topics included intelligent agents, fuzzy logic, multi-agent systems, artificial neural networks, genetic algorithms, expert systems, intelligent decision making support systems, information retrieval systems, geographic information systems, and knowledge management systems. These technologies have the potential to support decision making in many areas of management, international business, finance, accounting, marketing, healthcare, military applications, production, networks, traffic management, crisis response, and human interfaces.
Knowledge discovery is today a significant area of study and research. In finding answers to many research questions in this area, the ultimate hope is that knowledge can be extracted from the various forms of data around us. This book covers recent advances in unsupervised and supervised data analysis methods in Computational Intelligence for knowledge discovery. In its first part the book provides a collection of recent research on distributed clustering, self-organizing maps and their recent extensions. If labeled data or data with known associations are available, we may be able to use supervised data analysis methods, such as classifying neural networks, fuzzy rule-based classifiers, and decision trees. The book therefore also presents a collection of important methods of supervised data analysis. "Classification and Clustering for Knowledge Discovery" also includes a variety of applications of knowledge discovery in health, safety, commerce, mechatronics, sensor networks, and telecommunications.
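As a minimal illustration of the supervised side (a sketch assuming scikit-learn is available; the data and fault-detection framing are invented, not drawn from the book):

```python
from sklearn.tree import DecisionTreeClassifier

# Tiny labeled dataset: features = (temperature, vibration), label = condition.
# (Made-up data purely for illustration.)
X = [[20, 0.10], [22, 0.20], [80, 0.90], [75, 0.80], [25, 0.15], [78, 0.85]]
y = ["normal", "normal", "fault", "fault", "normal", "fault"]

# Fit a shallow decision tree and classify an unseen sample.
clf = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(clf.predict([[77, 0.70]]))  # -> ['fault'] on this toy data
```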
This book approaches the realisation of digital terrain and landscape data through clear and practical examples. From data provision and the creation of revealing analyses to realistic depictions for presentation purposes, the reader is led through the world of digital 3-D graphics. The authors' deep knowledge of the scientific fundamentals and many years of experience in 3-D visualization enable them to lead the reader through a complex subject and shed light on previously murky virtual landscapes.
Over the years, rough set theory has earned a well-deserved reputation as a sound methodology for dealing with imperfect knowledge in a simple though mathematically sound way. This edited volume aims to continue stressing the benefits of applying rough sets in many real-life situations, while still keeping an eye on topological aspects of the theory and strengthening its linkage with other soft computing paradigms. The volume comprises 11 chapters and is organized into three parts. Part 1 deals with theoretical contributions while Parts 2 and 3 focus on several real-world data mining applications. Chapters authored by pioneers were selected on the basis of fundamental ideas and concepts rather than the thoroughness of the techniques deployed. Academics, scientists and engineers working in the rough set, computational intelligence, soft computing and data mining research areas will find the comprehensive coverage of this book invaluable.
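For readers new to the theory, its core construction is easy to state in code: given the indiscernibility (equivalence) classes of a universe, a target set is bracketed by a lower approximation (objects certainly in the set) and an upper approximation (objects possibly in it). A self-contained toy sketch, with invented data:

```python
# Rough-set lower/upper approximation of a target set X, given a partition
# of the universe into indiscernibility classes. (Toy data for illustration.)
universe = {1, 2, 3, 4, 5, 6}
classes = [{1, 2}, {3, 4}, {5, 6}]   # equivalence classes of the relation
X = {1, 2, 3}                        # target concept

lower = set().union(*[c for c in classes if c <= X])   # surely in X
upper = set().union(*[c for c in classes if c & X])    # possibly in X
boundary = upper - lower                               # the "rough" part

print(lower)     # {1, 2}
print(upper)     # {1, 2, 3, 4}
print(boundary)  # {3, 4}: X cannot be described exactly with these classes
```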
In its updated second edition, this book has been extensively revised on a chapter-by-chapter basis. The book accurately reflects the syntax and semantic changes to the SystemVerilog language standard, making it an essential reference for systems professionals who need the latest version information. In addition, the second edition features a new chapter explaining SystemVerilog "packages," a new appendix that summarizes the synthesis guidelines presented throughout the book, and all of the code examples have been updated to the final syntax and rerun using the latest versions of the Synopsys, Mentor, and Cadence tools.
The world we live in is pervaded with uncertainty and imprecision. Is it likely to rain this afternoon? Should I take an umbrella with me? Will I be able to find parking near the campus? Should I go by bus? Such simple questions are a common occurrence in our daily lives. Less simple examples: What is the probability that the price of oil will rise sharply in the near future? Should I buy Chevron stock? What are the chances that a bailout of GM, Ford and Chrysler will not succeed? What will be the consequences? Note that the examples in question involve both uncertainty and imprecision. In the real world, this is the norm rather than the exception. There is a deep-seated tradition in science of employing probability theory, and only probability theory, to deal with uncertainty and imprecision. The monopoly of probability theory came to an end when fuzzy logic made its debut. However, this is by no means a widely accepted view. The belief persists, especially within the probability community, that probability theory is all that is needed to deal with uncertainty. To quote a prominent Bayesian, Professor Dennis Lindley: "The only satisfactory description of uncertainty is probability."
Statistical timing analysis is an area of growing importance in nanometer technologies, as the uncertainties associated with process and environmental variations increase, and this chapter has captured some of the major efforts in this area. This remains a very active field of research, and there is likely to be a great deal of new research to be found in conferences and journals after this book is published. In addition to the statistical analysis of combinational circuits, a good deal of work has been carried out in analyzing the effect of variations on clock skew. Although we will not treat this subject in this book, the reader is referred to [LNPS00, HN01, JH01, ABZ03a] for details. 7 TIMING ANALYSIS FOR SEQUENTIAL CIRCUITS. 7.1 INTRODUCTION. A general sequential circuit is a network of computational nodes (gates) and memory elements (registers). The computational nodes may be conceptualized as being clustered together in an acyclic network of gates that forms a combinational logic circuit. A cyclic path in the direction of signal propagation is permitted in the sequential circuit only if it contains at least one register. In general, it is possible to represent any sequential circuit in terms of the schematic shown in Figure 7.1, which has I inputs, O outputs and M registers. The register outputs feed into the combinational logic which, in turn, feeds the register inputs. Thus, the combinational logic has I + M inputs and O + M outputs.
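As a much-simplified sketch of what timing analysis computes over this model (invented gate delays, and deterministic rather than statistical propagation), worst-case arrival times can be propagated through the acyclic combinational core in topological order; the register-to-register path then bounds the clock period:

```python
from graphlib import TopologicalSorter

# Toy combinational timing graph for the sequential-circuit model above:
# sources are chip inputs (I) plus register outputs (M); sinks are chip
# outputs (O) plus register inputs (M). Edge delays are invented.
edges = {                        # node -> [(successor, edge delay)]
    "in1":  [("g1", 2.0)],
    "regQ": [("g1", 1.5), ("g2", 1.0)],
    "g1":   [("g2", 3.0), ("out1", 1.0)],
    "g2":   [("regD", 2.5)],
    "out1": [], "regD": [],
}

# Invert to predecessor lists and visit nodes in topological order.
preds = {n: [] for n in edges}
for u, outs in edges.items():
    for v, d in outs:
        preds[v].append((u, d))

order = TopologicalSorter({n: [u for u, _ in preds[n]] for n in preds}).static_order()

arrival = {}
for n in order:
    # Latest (worst-case) arrival: max over all fan-in paths; sources start at 0.
    arrival[n] = max((arrival[u] + d for u, d in preds[n]), default=0.0)

print(arrival["regD"])  # 7.5: the register-to-register path bounds the clock
print(arrival["out1"])  # 3.0
```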
This volume reflects the theme of the INFORMS 2004 Meeting in Denver: Back to OR Roots. Operations Research (OR) emerged as a quantitative approach to problem-solving in World War II; its founders were physicists, mathematicians, and engineers who quickly found peace-time uses for it. It is fair to say that OR was born in the same incubator as computer science, and it has spawned many new disciplines, such as systems engineering, health care management, and transportation science. Although people from many disciplines routinely use OR methods, many scientific researchers, engineers, and others do not understand basic OR tools and how they can help them. Disciplines ranging from finance to bioengineering are the beneficiaries of what we do - we take an interdisciplinary approach to problem-solving. Our strengths are modeling, analysis, and algorithm design. We provide a quantitative foundation for a broad spectrum of problems, from economics to medicine, from environmental control to sports, from e-commerce to computational geometry. We are both producers and consumers, because the mainstream of OR is in the interfaces. As part of this effort to recognize and extend OR roots in future problem-solving, we organized a set of tutorials designed for people who have heard of a topic and want to decide whether to learn it. The 90 minutes were spent addressing the questions: What is this about, in a nutshell? Why is it important? Where can I learn more? In total, we had 14 tutorials, and eight of them are published here.
"CAAD Futures" is a bi-annual conference that aims to promote the advancement of computer-aided architectural design in the service of those concerned with the quality of the built environment. The conferences are organized under the auspices of the CAAD Futures Foundation, which has its secretariat at the Eindhoven University of Technology in the Netherlands. This book contains papers prepared for the 10th CAAD Futures conference that took place at the National Cheng Kung University, 28 to 30 April, 2003. The chapters provide an overview of the state-of-the-art in research on computer-aided architectural design at that time. Information on the CAAD Futures Foundation and its conferences can be found at http: //www.caadfutures.arch.tue.nl
Synthesis of Finite State Machines: Functional Optimization is one of two monographs devoted to the synthesis of Finite State Machines (FSMs). This volume addresses functional optimization, whereas the second addresses logic optimization. By functional optimization here we mean the body of techniques that: compute all permissible sequential functions for a given topology of interconnected FSMs, and select a 'best' sequential function out of the permissible ones. The result is a symbolic description of the FSM representing the chosen sequential function. By logic optimization here we mean the steps that convert a symbolic description of an FSM into a hardware implementation, with the goal of optimizing objectives like area, testability, performance and so on. Synthesis of Finite State Machines: Functional Optimization is divided into three parts. The first part presents some preliminary definitions, theories and techniques related to the exploration of behaviors of FSMs. The second part presents an implicit algorithm for exact state minimization of incompletely specified finite state machines (ISFSMs), and an exhaustive presentation of explicit and implicit algorithms for the binate covering problem. The third part addresses the computation of permissible behaviors at a node of a network of FSMs and the related minimization problems of non-deterministic finite state machines (NDFSMs). Key themes running through the book are the exploration of behaviors contained in a non-deterministic FSM (NDFSM), and the representation of combinatorial problems arising in FSM synthesis by means of Binary Decision Diagrams (BDDs). Synthesis of Finite State Machines: Functional Optimization will be of interest to researchers and designers in logic synthesis, CAD and design automation.
On Optimal Interconnections for VLSI describes, from a geometric perspective, algorithms for high-performance, high-density interconnections during the global and detailed routing phases of circuit layout. First, the book addresses area minimization, with a focus on near-optimal approximation algorithms for minimum-cost Steiner routing. In addition to practical implementations of recent methods, the implications of recent results on spanning tree degree bounds and the method of Zelikovsky are discussed. Second, the book addresses delay minimization, starting with a discussion of accurate, yet algorithmically tractable, delay models. Recent minimum-delay constructions are highlighted, including provably good cost-radius tradeoffs, critical-sink routing algorithms, Elmore delay-optimal routing, graph Steiner arborescences, non-tree routing, and wiresizing. Third, the book addresses skew minimization for clock routing and prescribed-delay routing formulations. The discussion starts with early matching-based constructions and goes on to treat zero-skew routing with provably minimum wirelength, as well as planar clock routing. Finally, the book concludes with a discussion of multiple (competing) objectives, i.e., how to optimize area, delay, skew, and other objectives simultaneously. These techniques are useful when the routing instance has heterogeneous resources or is highly congested, as in FPGA routing, multi-chip packaging, and very dense layouts. Throughout the book, the emphasis is on practical algorithms and a complete, self-contained development. On Optimal Interconnections for VLSI will be of use both to circuit designers (CAD tool users) and to researchers and developers in the area of performance-driven physical design.
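Since the Elmore delay model underpins much of the delay-minimization material, a short self-contained sketch may help: in an RC tree, the Elmore delay from source to sink is the sum, over each resistance on the source-to-sink path, of that resistance times the total capacitance downstream of it. The tree topology and values below are invented:

```python
# Elmore delay in an RC tree: each node has a wire resistance from its parent
# and a lumped capacitance. (Invented values, purely for illustration.)
tree = {                 # node -> (parent, R from parent [ohm], C at node [F])
    "root": (None,   0.0, 0.0),
    "a":    ("root", 100.0, 1e-12),
    "b":    ("a",     50.0, 2e-12),
    "c":    ("a",     80.0, 1e-12),
}

children = {n: [] for n in tree}
for n, (p, _, _) in tree.items():
    if p is not None:
        children[p].append(n)

def downstream_cap(n):
    """Total capacitance at node n plus everything in its subtree."""
    _, _, c = tree[n]
    return c + sum(downstream_cap(ch) for ch in children[n])

def elmore(sink):
    """Elmore delay root -> sink: sum of R_edge * downstream C along the path."""
    delay, n = 0.0, sink
    while tree[n][0] is not None:
        _, r, _ = tree[n]
        delay += r * downstream_cap(n)
        n = tree[n][0]
    return delay

print(elmore("b"))  # 100*(4e-12) + 50*(2e-12) = 5e-10 s
```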
One of the foundations for change in our society comes from designing. Its genesis is the notion that the world around us either is unsuited to our needs or can be improved. The need for designing is driven by a society's view that it can improve or add value to human existence well beyond simple subsistence. As a consequence of designing, the world we inhabit is increasingly a designed rather than a naturally occurring one. In that sense it is an "artificial" world. Designing is a fundamental precursor to manufacturing, fabrication, construction or implementation. Design research aims to develop an understanding of designing and to produce models of designing that can be used to aid designing. Artificial intelligence has provided an environmental paradigm within which design research based on computational constructions can be carried out. Design research can be carried out in a variety of ways. It can be viewed as a largely empirical endeavour in which experiments are designed and executed in order to test some hypothesis about a design phenomenon or design behaviour. This is the approach adopted in cognitive science. It often manifests itself through the use of protocol studies of designers. The results of such research form the basis of a computational model. A second view is that design research can be carried out by positing axioms and then deriving consequences from them.
This book contains the revised contributions of the 18 tutorial speakers at the seventh AACD '98 in Copenhagen, April 28-30, 1998. The conference was organized by Ole Olesen of the Technical University of Denmark. The program committee consisted of Johan H. Huijsing from Delft University of Technology, The Netherlands, Willy Sansen from the Katholieke Universiteit Leuven, Belgium, and Rudy J. van de Plassche, Philips Research, The Netherlands. The program was concentrated around three important topics in analog circuit design, each covered by six papers; each of the three chapters of this book contains the six papers of one topic. The three topics are: 1-Volt Electronics; Design and Implementation of Mixed-Mode Systems; Low-Noise and RF Power Amplifiers for Telecommunication. Other topics which have been covered in this series before are: 1992: OpAmps, ADCs, Analog CAD. 1993: Mixed-Mode A/D Design, Sensor Interfaces, Communication Circuits. 1994: Low-Power Low-Voltage, Integrated Filters, Smart Power. 1995: Low-Noise Low-Power Low-Voltage, Mixed-Mode with CAD Tools, Voltage, Current and Time References. 1996: RF CMOS Circuit Design, Bandpass Sigma-Delta and Other Converters, Translinear Circuits. 1997: RF A-D Converters, Sensor and Actuator Interfaces, Low-Noise Oscillators, PLLs and Synthesizers. We hope to serve the analog design community with this series of books and plan to continue the series in the future.
The design of computer systems to be embedded in critical real-time applications is a complex task. Such systems must not only guarantee to meet hard real-time deadlines imposed by their physical environment, they must guarantee to do so dependably, despite both physical faults (in hardware) and design faults (in hardware or software). A fault-tolerance approach is mandatory for these guarantees to be commensurate with the safety and reliability requirements of many life- and mission-critical applications. This book explains the motivations and the results of a collaborative project whose objective was to significantly decrease the lifecycle costs of such fault-tolerant systems. The end-user companies participating in this project already deploy fault-tolerant systems in critical railway, space and nuclear-propulsion applications. However, these are proprietary systems whose architectures have been tailored to meet domain-specific requirements. This has led to very costly, inflexible, and often hardware-intensive solutions that, by the time they are developed, validated and certified for use in the field, can already be out of date in terms of their underlying hardware and software technology.
In VLSI CAD, difficult optimization problems have to be solved on a constant basis. Various optimization techniques have been proposed in the past. While some of these methods have been shown to work well in applications and have become somewhat established over the years, other techniques have been ignored. Recently, there has been a growing interest in optimization algorithms based on principles observed in nature, termed Evolutionary Algorithms (EAs). Evolutionary Algorithms in VLSI CAD presents the basic concepts of EAs and considers their application in VLSI CAD. It is the first book to show how EAs can be used to improve IC design tools and processes. Several successful applications from different areas of circuit design, like logic synthesis, mapping and testing, are described in detail. Evolutionary Algorithms in VLSI CAD consists of two parts. The first part discusses basic principles of EAs and provides some easy-to-understand examples. Furthermore, a theoretical model for multi-objective optimization is presented. In the second part a software implementation of EAs is supplied together with detailed descriptions of several EA applications. These applications cover a wide range of VLSI CAD, and different methods for using EAs are described. Evolutionary Algorithms in VLSI CAD is intended for CAD developers and researchers, as well as those working on evolutionary algorithms and techniques supporting modern design tools and processes.
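To give a flavor of the basic EA machinery the first part covers (a generic sketch with an invented toy objective, not one of the book's VLSI applications), here is a minimal genetic algorithm with tournament selection, one-point crossover, and bitwise mutation:

```python
import random

random.seed(0)

def fitness(bits):
    # Toy "one-max" objective: count of 1-bits. A CAD application would
    # instead score, e.g., estimated area or delay of a decoded circuit.
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=50, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Tournament selection: best of three random individuals.
            return max(random.sample(pop, 3), key=fitness)
        nxt = []
        while len(nxt) < pop_size:
            a, b = pick(), pick()
            cut = random.randrange(1, n_bits)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), best)
```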
Networks on Chip presents a variety of topics, problems and approaches with the common theme of systematically organizing on-chip communication in the form of a regular, shared communication network on chip, an NoC for short. As the number of processor cores and IP blocks integrated on a single chip steadily grows, a systematic approach to designing the communication infrastructure becomes necessary. Different variants of packet-switched on-chip networks have been proposed by several groups during the past two years. This book summarizes the state of the art of these efforts and discusses the major issues, from physical integration to architecture to operating systems and application interfaces. It also provides a guideline and vision about the direction in which this field is moving. Moreover, the book outlines the consequences of adopting design platforms based on packet-switched networks. The consequences may in fact be far-reaching, because many of the topics of distributed systems, distributed real-time systems, fault-tolerant systems, parallel computer architecture and parallel programming, as well as traditional system-on-chip issues, will appear relevant but within the constraints of a single-chip VLSI implementation. The book is organized in three parts. The first deals with system design and methodology issues. The second presents problems and solutions concerning the hardware and the basic communication infrastructure. Finally, the third part covers operating systems, embedded software and applications. However, communication from the physical to the application level is a central theme throughout the book. The book serves as an excellent reference source and may be used as a text for advanced courses on the subject.
By virtue of their special algebraic structures, Pythagorean-hodograph (PH) curves offer unique advantages for computer-aided design and manufacturing, robotics, motion control, path planning, computer graphics, animation, and related fields. This book offers a comprehensive and self-contained treatment of the mathematical theory of PH curves, including algorithms for their construction and examples of their practical applications. It emphasizes the interplay of ideas from algebra and geometry and their historical origins and includes many figures, worked examples, and detailed algorithm descriptions.
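One concrete instance of that algebraic structure, for the planar case: squaring a complex polynomial z(t) = u(t) + i v(t) yields a hodograph r'(t) = z(t)^2 with components (u^2 - v^2, 2uv), whose magnitude u^2 + v^2 is itself a polynomial, so arc length integrates in closed form. A minimal sketch with invented coefficients (this is one standard construction, not the book's full algorithmic treatment):

```python
import numpy as np

# Planar PH cubic via the complex-square construction: pick a linear complex
# preimage z(t) and set r'(t) = z(t)**2; the parametric speed
# |r'(t)| = |z(t)|**2 = u(t)**2 + v(t)**2 is then a polynomial (PH property).
z0, z1 = 1.0 + 0.5j, 0.5 + 1.5j          # invented coefficients

def zp(t):
    return (1 - t) * z0 + t * z1          # z(t), degree 1 -> PH cubic r(t)

# Exact arc length on [0, 1]: integrate sigma(t) = |z(t)|^2 in closed form.
# With z(t) = z0 + t*d, sigma(t) = |d|^2 t^2 + 2 Re(z0 conj(d)) t + |z0|^2.
d = z1 - z0
a, b, c = abs(d) ** 2, 2 * (z0 * d.conjugate()).real, abs(z0) ** 2
exact_length = a / 3 + b / 2 + c

# Cross-check against numerical quadrature of |r'(t)| = |z(t)**2|.
t = np.linspace(0.0, 1.0, 100001)
y = np.abs(zp(t) ** 2)
numeric_length = float(np.sum((y[1:] + y[:-1]) * np.diff(t) / 2))
print(exact_length, numeric_length)       # agree to quadrature accuracy
```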
This book provides engineering insight into how to provide a scalable and robust verification solution in the face of ever-increasing design complexity and size. It describes SAT-based model checking approaches and gives engineering details on what makes model checking practical. The book brings together various emerging scalable SAT-based technologies and shows how the techniques covered can be synergistically combined into a scalable solution.
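To give a flavor of the SAT queries that bounded model checking issues (a hand-rolled toy sketch, not the book's techniques): unroll a one-bit toggle circuit for two steps, conjoin the CNF transition relation with a "bad state" property, and test satisfiability. The brute-force check below stands in for a real SAT solver:

```python
from itertools import product

# Variables 1..3 are the state bit at times 0, 1, 2. The circuit toggles:
# s[t+1] = not s[t]. Clauses are lists of signed ints (DIMACS-style).
def step(t):
    # s[t] XOR s[t+1], encoded as two clauses over vars t+1 and t+2.
    return [[t + 1, t + 2], [-(t + 1), -(t + 2)]]

cnf = [[-1]] + step(0) + step(1)      # init: s0 = 0, then two transitions

def solve(cnf, n_vars):
    """Brute-force SAT check; returns a satisfying assignment or None."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in cnf):
            return bits
    return None

# Property query: is the bad state "s = 1" reachable within the bound?
print(solve(cnf + [[2]], 3))   # bad at step 1: SAT -> counterexample found
print(solve(cnf + [[3]], 3))   # bad at step 2: None (UNSAT within this bound)
```

A real flow would hand the same clauses to an off-the-shelf SAT solver and grow the unrolling depth until a counterexample or a completeness bound is reached.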
This practical guide and introduction to the design of key RF building blocks used in high data rate transmitters emphasizes CMOS circuit techniques applicable to oscillators and upconvertors. The book is written in an easily accessible manner, without losing detail on the technical side.
Memory Architecture Exploration for Programmable Embedded Systems addresses efficient exploration of alternative memory architectures, assisted by a "compiler-in-the-loop" that allows effective matching of the target application to the processor-memory architecture. This new approach for memory architecture exploration replaces the traditional black-box view of the memory system and allows for aggressive co-optimization of the programmable processor together with a customized memory system.