For the past decade or so, Computational Intelligence (CI) has been an extremely "hot" topic amongst researchers working in the fields of biomedicine and bioinformatics. There are many successful applications of CI in such areas as computational genomics, prediction of gene expression, protein structure, and protein-protein interactions, modeling of evolution, or neuronal systems modeling and analysis. However, there still are many problems in biomedicine and bioinformatics that are in desperate need of advanced and efficient computational methodologies to deal with the tremendous amounts of data so prevalent in those kinds of research pursuits. Unfortunately, scientists in both these fields are very often unaware of the abundance of computational techniques that could be put to use to help them analyze and understand the data underlying their research inquiries. On the other hand, computational intelligence practitioners are often unfamiliar with the particular problems that their algorithms could be successfully applied to. The separation between the two worlds is partially caused by the use of different languages in these two spheres of science, but also by the relatively small number of publications devoted solely to facilitating the exchange of new computational algorithms and methodologies on one hand, and the needs of the realms of biomedicine and bioinformatics on the other. In order to help fill the gap between the scientists on both sides of this spectrum, we have solicited contributions from researchers actively applying computational intelligence techniques to important problems in biomedicine and bioinformatics. The purpose of this book is to provide an overview of powerful state-of-the-art methodologies that are currently utilized for biomedicine- and/or bioinformatics-oriented applications, so that researchers working in those fields can learn of new methods to help them tackle their problems. On the other hand, we also hope that the CI community will find this book useful by discovering a new and intriguing area of applications.
Combinatorial optimisation is a ubiquitous discipline whose usefulness spans vast application domains. The intrinsic complexity of most combinatorial optimisation problems makes classical methods unaffordable in many cases. Acquiring practical solutions to these problems requires the use of metaheuristic approaches that trade completeness for pragmatic effectiveness. Such approaches are able to provide optimal or quasi-optimal solutions to a plethora of difficult combinatorial optimisation problems. The application of metaheuristics to combinatorial optimisation is an active field in which new theoretical developments, new algorithmic models, and new application areas are continuously emerging. This volume presents recent advances in the area of metaheuristic combinatorial optimisation, with a special focus on evolutionary computation methods. Moreover, it addresses local search methods and hybrid approaches. In this sense, the book includes cutting-edge theoretical, methodological, algorithmic and applied developments in the field, from respected experts and with a sound perspective.
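To give a concrete flavour of the completeness-for-effectiveness trade-off described above, here is a minimal, illustrative Python sketch of one classic metaheuristic, simulated annealing with 2-opt moves on a random travelling-salesman instance. The instance, cooling schedule and parameters are invented for illustration and are not taken from the volume.

```python
# Simulated annealing sketch for a toy travelling-salesman instance.
# Illustrative only: problem data, schedule, and parameters are made up.
import math
import random

random.seed(0)
n = 20
cities = [(random.random(), random.random()) for _ in range(n)]

def tour_length(tour):
    """Total length of a closed tour through all cities."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % n]])
               for i in range(n))

def anneal(steps=20000, t0=1.0, cooling=0.9995):
    tour = list(range(n))
    random.shuffle(tour)
    cur_len = tour_length(tour)
    best, best_len, temp = tour[:], cur_len, t0
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j][::-1] + tour[j:]   # 2-opt move
        cand_len = tour_length(cand)
        # Accept improvements always, worsenings with a temperature-
        # dependent probability: no optimality guarantee, but the search
        # can escape local optima.
        if cand_len < cur_len or random.random() < math.exp((cur_len - cand_len) / temp):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        temp *= cooling
    return best, best_len

tour, length = anneal()
print(f"best tour length found: {length:.3f}")
```

Accepting occasional worsening moves is precisely where completeness is traded away: the method samples the search space pragmatically instead of enumerating it.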
Over the years, rough set theory has earned a well-deserved reputation as a methodology for dealing with imperfect knowledge in a simple yet mathematically sound way. This edited volume aims to continue stressing the benefits of applying rough sets in many real-life situations, while still keeping an eye on topological aspects of the theory and strengthening its linkage with other soft computing paradigms. The volume comprises 11 chapters and is organized into three parts. Part 1 deals with theoretical contributions, while Parts 2 and 3 focus on several real-world data mining applications. Chapters authored by pioneers were selected on the basis of fundamental ideas/concepts rather than the thoroughness of the techniques deployed. Academics, scientists and engineers working in the rough set, computational intelligence, soft computing and data mining research areas will find the comprehensive coverage of this book invaluable.
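As a taste of the core construct the theory builds on, the following toy Python sketch computes rough-set lower and upper approximations of a concept from the indiscernibility classes of a small decision table. The table and attribute values are invented for illustration and do not come from the volume.

```python
# Rough-set lower/upper approximations on invented toy data.
from collections import defaultdict

# Objects described by condition attributes (as hashable tuples).
table = {
    "o1": (("colour", "red"),  ("size", "big")),
    "o2": (("colour", "red"),  ("size", "big")),
    "o3": (("colour", "blue"), ("size", "small")),
    "o4": (("colour", "blue"), ("size", "big")),
}
target = {"o1", "o3"}   # the concept we try to approximate

# Indiscernibility classes: objects with identical attribute values.
blocks = defaultdict(set)
for obj, desc in table.items():
    blocks[desc].add(obj)

lower = {o for b in blocks.values() if b <= target for o in b}
upper = {o for b in blocks.values() if b & target for o in b}
print("lower (certainly in concept):", sorted(lower))
print("upper (possibly in concept): ", sorted(upper))
print("boundary:", sorted(upper - lower))
```

Here o1 and o2 are indiscernible yet only o1 belongs to the concept, so both land in the boundary region: exactly the kind of imperfect knowledge that rough sets make precise.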
Functional Design Errors in Digital Circuits Diagnosis covers a wide spectrum of innovative methods to automate the debugging process throughout the design flow: from Register-Transfer Level (RTL) all the way to the silicon die. In particular, this book describes: (1) techniques for bug trace minimization that simplify debugging; (2) an RTL error diagnosis method that identifies the root cause of errors directly; (3) a counterexample-guided error-repair framework to automatically fix errors in gate-level and RTL designs; (4) a symmetry-based rewiring technology for fixing electrical errors; (5) an incremental verification system for physical synthesis; and (6) an integrated framework for post-silicon debugging and layout repair. The solutions provided in this book can greatly reduce debugging effort, enhance design quality, and ultimately enable the design and manufacture of more reliable electronic devices.
This is the first book to focus on emerging technologies for distributed intelligent decision-making in process planning and dynamic scheduling. It has two sections: a review of several key areas of research, and an in-depth treatment of particular techniques. Each chapter addresses a specific problem domain and offers practical solutions to solve it. The book provides a better understanding of the present state and future trends of research in this area.
Many real systems are composed of multi-state components with different performance levels and several failure modes, all of which affect the whole system's performance. Most books on reliability theory cover binary models that allow a system only to function perfectly or fail completely. "The Universal Generating Function in Reliability Analysis and Optimization" is the first book to give a comprehensive description of the universal generating function technique and its applications in binary and multi-state system reliability analysis. This monograph will be of value to anyone interested in system reliability, performance analysis and optimization in industrial, electrical and nuclear engineering.
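For readers unfamiliar with the technique, a minimal sketch of the universal generating function (UGF) idea follows. The dictionary representation, the choice of structure operator and the two-component example are our own illustration, not material from the book.

```python
# Universal generating function sketch: a component's UGF is a
# probability mass over its performance levels, represented here
# as {performance: probability}.
from itertools import product

def compose(u1, u2, op):
    """Combine two UGFs with a structure operator (e.g. min for a
    series flow-transmission system, sum for parallel capacity)."""
    out = {}
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        g = op(g1, g2)
        out[g] = out.get(g, 0.0) + p1 * p2
    return out

# Two multi-state components with several performance levels each.
c1 = {0: 0.1, 5: 0.3, 10: 0.6}
c2 = {0: 0.05, 8: 0.95}

series = compose(c1, c2, min)   # series: capacity limited by the worst
availability = sum(p for g, p in series.items() if g >= 5)
print("series system UGF:", series)
print("P(system delivers >= 5):", round(availability, 4))
```

The appeal of the technique is visible even at this scale: once each component is a small polynomial-like object, system-level reliability measures reduce to mechanical composition and summation.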
Many new topologies and circuit design techniques have emerged recently to improve the performance of active inductors, but a comprehensive treatment of the theory, topology, characteristics, and design constraints of CMOS active inductors and transformers has not been available. This book is an attempt to provide an in-depth examination and a systematic presentation of the operation principles and implementation details of CMOS active inductors and transformers, together with a detailed examination of their emerging applications in high-speed analog signal processing and data communications over wire and wireless channels. The content of the book is drawn from recently published research papers that are not otherwise available in a single, cohesive volume. Equal emphasis is given to the theory of CMOS active inductors and transformers and to their emerging applications. Major subjects covered in the book include: inductive characteristics in high-speed analog signal processing and data communications; spiral inductors and transformers, their modeling and limitations; a historical perspective of device synthesis; the topology, characterization, and implementation of CMOS active inductors and transformers; and the application of CMOS active inductors and transformers in high-speed analog and digital signal processing and data communications.
The authors have consolidated their research work in this volume titled Soft Computing for Data Mining Applications. The monograph gives an insight into research in the field of data mining in combination with soft computing methodologies. These days, data continues to grow exponentially. Much of the data is implicitly or explicitly imprecise. Database discovery seeks to discover noteworthy, unrecognized associations between data items in an existing database. The potential of discovery comes from the realization that alternate contexts may reveal additional valuable information. The rate at which data is stored is growing at a phenomenal rate. As a result, traditional ad hoc mixtures of statistical techniques and data management tools are no longer adequate for analyzing this vast collection of data. Domains where large volumes of data are stored in centralized or distributed databases include applications in electronic commerce, bioinformatics, computer security, Web intelligence, intelligent learning database systems, finance, marketing, healthcare, telecommunications, and other fields. Efficient tools and algorithms for knowledge discovery in large data sets have been devised in recent years. These methods exploit the capability of computers to search huge amounts of data in a fast and effective manner. However, the data to be analyzed is imprecise and afflicted with uncertainty. In the case of heterogeneous data sources such as text and video, the data might moreover be ambiguous and partly conflicting. Besides, patterns and relationships of interest are usually approximate. Thus, in order to make the information mining process more robust, it requires tolerance toward imprecision, uncertainty and exceptions.
In recent years, the issue of linkage in genetic and evolutionary algorithms (GEAs) has garnered greater attention and recognition from researchers. Conventional approaches that rely heavily on ad hoc tweaking of parameters to control the search, by balancing the levels of exploitation and exploration, are grossly inadequate. As shown in the work reported here, such parameter-tweaking approaches have their limits; they can be easily fooled by cases of triviality or peculiarity in the class of problems that the algorithms are designed to handle. Furthermore, these approaches are usually blind to the interactions between the decision variables, thereby disrupting the partial solutions that are being built up along the way.
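The disruption argument can be illustrated with a small experiment of our own (a hypothetical setup, not taken from the book): a building block of two co-adapted loci survives one-point crossover only when the loci happen to sit close together, while uniform crossover is equally disruptive at any distance.

```python
# How linkage-blind crossover disrupts partial solutions.
# Parent A carries a two-locus building block (both loci set to 1);
# parent B does not. We measure how often a child inherits the block
# intact, for adjacent versus far-apart loci. Illustrative only.
import random

random.seed(1)
L = 32   # chromosome length

def uniform_cross(a, b):
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

def one_point_cross(a, b):
    cut = random.randrange(1, L)
    return a[:cut] + b[cut:]

def survival(cross, i, j, trials=20_000):
    """Fraction of children inheriting 1s at both linked loci i and j."""
    a = [0] * L
    a[i] = a[j] = 1          # parent with the building block
    b = [0] * L              # parent without it
    return sum(c[i] == c[j] == 1
               for c in (cross(a, b) for _ in range(trials))) / trials

print("uniform, any distance :", survival(uniform_cross, 0, 1))    # ~0.25
print("one-point, adjacent   :", survival(one_point_cross, 0, 1))  # ~0.97
print("one-point, far apart  :", survival(one_point_cross, 0, 31)) # 0.0
```

Neither operator knows which loci interact; a linkage-aware method would instead learn such dependencies and recombine the block as a unit.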
Condition modelling and control is a technique used to enable decision-making in manufacturing processes, of interest to researchers and practising engineers. Condition Monitoring and Control for Intelligent Manufacturing will appeal to researchers and graduate students in manufacturing, control and engineering, as well as practising engineers in industries such as automotive and packaging manufacturing.
In this book we describe new methods and applications of hybrid intelligent systems using soft computing techniques. Soft Computing (SC) consists of several intelligent computing paradigms, including fuzzy logic, neural networks, and evolutionary algorithms, which can be used to produce powerful hybrid intelligent systems. The book is organized in five main parts, each of which contains a group of papers around a similar subject. The first part consists of papers with the main theme of intelligent control, which are basically papers that use hybrid systems to solve particular problems of control. The second part contains papers with the main theme of pattern recognition, which are basically papers using soft computing techniques for achieving pattern recognition in different applications. The third part contains papers with the themes of intelligent agents and social systems, which are papers that apply the ideas of agents and social behavior to solve real-world problems. The fourth part contains papers that deal with the hardware implementation of intelligent systems for solving particular problems. The fifth part contains papers that deal with modeling, simulation and optimization for real-world applications.
Evolutionary algorithms are sophisticated search methods that have been found to be very efficient and effective in solving complex real-world multi-objective problems where conventional optimization tools fail to work well. Despite the tremendous amount of work done in the development of these algorithms in the past decade, many researchers assume that the optimization problems are deterministic and uncertainties are rarely examined. The primary motivation of this book is to provide a comprehensive introduction on the design and application of evolutionary algorithms for multi-objective optimization in the presence of uncertainties. In this book, we hope to expose the readers to a range of optimization issues and concepts, and to encourage a greater degree of appreciation of evolutionary computation techniques and the exploration of new ideas that can better handle uncertainties. "Evolutionary Multi-Objective Optimization in Uncertain Environments: Issues and Algorithms" is intended for a wide readership and will be a valuable reference for engineers, researchers, senior undergraduates and graduate students who are interested in the areas of evolutionary multi-objective optimization and uncertainties.
This monograph is devoted to theoretical and experimental study of inhibitory decision and association rules. Inhibitory rules contain on the right-hand side a relation of the kind "attribute ≠ value". The use of inhibitory rules instead of deterministic (standard) ones allows us to describe more completely the information encoded in decision or information systems and to design classifiers of high quality. The most important feature of this monograph is that it includes an advanced mathematical analysis of problems on inhibitory rules. We consider algorithms for the construction of inhibitory rules, bounds on the minimal complexity of inhibitory rules, and algorithms for the construction of the set of all minimal inhibitory rules. We also discuss results of experiments with standard and lazy classifiers based on inhibitory rules. These results show that inhibitory decision and association rules can be used in data mining and knowledge discovery both for knowledge representation and for prediction. Inhibitory rules can also be used in the analysis and design of concurrent systems. The results obtained in the monograph can be useful for researchers in such areas as machine learning, data mining and knowledge discovery, especially for those who are working in rough set theory, test theory, and logical analysis of data (LAD). The monograph can be used in the creation of courses for graduate students and for Ph.D. studies. The authors of this book extend an expression of gratitude to Professor Janusz Kacprzyk, to Dr. Thomas Ditzinger and to the Studies in Computational Intelligence staff at Springer for their support in making this book possible.
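To make the rule format concrete, here is a small illustrative Python sketch of classification by elimination with inhibitory rules. The attributes, rules and decision values are invented for this example and are not the book's method in detail.

```python
# Classification with inhibitory rules, whose right-hand side
# excludes a decision value (d != v) instead of asserting one.
# Each rule: (conditions, excluded_decision), where conditions
# map attribute -> required value. Toy data, invented.
rules = [
    ({"temp": "high"},                  "healthy"),
    ({"temp": "normal", "cough": "no"}, "flu"),
    ({"cough": "yes"},                  "healthy"),
]
decisions = {"healthy", "flu", "cold"}

def classify(obj):
    """Eliminate every decision excluded by a matching inhibitory rule."""
    remaining = set(decisions)
    for conds, excluded in rules:
        if all(obj.get(a) == v for a, v in conds.items()):
            remaining.discard(excluded)
    return remaining

print(classify({"temp": "high", "cough": "yes"}))   # {'flu', 'cold'}
print(classify({"temp": "normal", "cough": "no"}))  # {'healthy', 'cold'}
```

Because each rule only rules decisions out, a set of inhibitory rules can express distinctions that a deterministic rule of the form "conditions imply d = v" cannot capture, which is the completeness advantage the monograph studies.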
This book is the first in a series on novel low power design architectures, methods and design practices. It results from a large European project started in 1997, whose goal is to promote the further development and the faster and wider industrial use of advanced design methods for reducing the power consumption of electronic systems. Low power design became crucial with the wide spread of portable information and communication terminals, where a small battery has to last for a long period. High performance electronics, in addition, suffers from a permanent increase of the dissipated power per square millimetre of silicon, due to increasing clock rates, which causes cooling and reliability problems or otherwise limits the performance. The European Union's Information Technologies Programme 'Esprit' did therefore launch a 'Pilot action for Low Power Design', which eventually grew to 19 R&D projects and one coordination project, with an overall budget of 14 million euros. It is meanwhile known as the European Low Power Initiative for Electronic System Design (ESD-LPD) and will be completed by the end of 2001. It involves 30 major European companies and 20 well-known institutes. The R&D projects aim to develop or demonstrate new design methods for power reduction, while the coordination project takes care that the methods, experiences and results are properly documented and publicised.
I am honored and delighted to write the foreword to this very first book about SystemC. It is now an excellent time to summarize what SystemC really is and what it can be used for. The main message in the area of design in the 2001 International Technology Roadmap for Semiconductors (ITRS) is that "cost of design is the greatest threat to the continuation of the semiconductor roadmap." This recent revision of the ITRS describes the major productivity improvements of the last few years as "small block reuse," "large block reuse," and "IC implementation tools." In order to continue to reduce design cost, the required future solutions will be "intelligent test benches" and "embedded system-level methodology." As the new system-level specification and design language, SystemC directly contributes to these two solutions. These will have the biggest impact on future design technology and will reduce system implementation cost. It took SystemC less than two years to emerge as the leader among the many new and well-discussed system-level design languages. In my opinion, this is due to the fact that SystemC adopted object-oriented system-level design, the most promising method already applied by the majority of firms during the last couple of years. Even before the introduction of SystemC, many system designers have attempted to develop executable specifications in C++. These executable functional specifications are then refined to the well-known transaction level, to model the communication of system-level processes.
Evolutionary algorithms are general-purpose search procedures based on the mechanisms of natural selection and population genetics. They are appealing because they are simple, easy to interface, and easy to extend. This volume is concerned with applications of evolutionary algorithms and associated strategies in engineering. It will be useful for engineers, designers, developers, and researchers in any scientific discipline interested in the applications of evolutionary algorithms. The volume consists of five parts, each with four or five chapters. The topics are chosen to emphasize application areas in different fields of engineering. Each chapter can be used for self-study or as a reference by practitioners to help them apply evolutionary algorithms to problems in their engineering domains.
The design of computer systems to be embedded in critical real-time applications is a complex task. Such systems must not only guarantee to meet hard real-time deadlines imposed by their physical environment, they must guarantee to do so dependably, despite both physical faults (in hardware) and design faults (in hardware or software). A fault-tolerance approach is mandatory for these guarantees to be commensurate with the safety and reliability requirements of many life- and mission-critical applications. This book explains the motivations and the results of a collaborative project whose objective was to significantly decrease the lifecycle costs of such fault-tolerant systems. The end-user companies participating in this project already deploy fault-tolerant systems in critical railway, space and nuclear-propulsion applications. However, these are proprietary systems whose architectures have been tailored to meet domain-specific requirements. This has led to very costly, inflexible, and often hardware-intensive solutions that, by the time they are developed, validated and certified for use in the field, can already be out-of-date in terms of their underlying hardware and software technology.
Data Access and Storage Management for Embedded Programmable Processors gives an overview of the state-of-the-art in system-level data access and storage management for embedded programmable processors. The targeted application domain covers complex embedded real-time multimedia and communication applications. Many of these applications are data-dominated in the sense that their cost-related aspects, namely power consumption and footprint, are heavily influenced (if not dominated) by the data access and storage aspects. The material is mainly based on research at IMEC in this area in the period 1996-2001. In order to deal with the stringent timing requirements and the data-dominated characteristics of this domain, we have adopted a target architecture style that is compatible with modern embedded processors, and we have developed a systematic step-wise methodology to make the exploration and optimization of such applications feasible in a source-to-source precompilation approach.
This volume provides a complete understanding of the fundamental causes of routing congestion in present-day and next-generation VLSI circuits, offers techniques for estimating and relieving congestion, and provides a critical analysis of the accuracy and effectiveness of these techniques. The book includes metrics and optimization techniques for routing congestion at various stages of the VLSI design flow. The subjects covered include an explanation of why the problem of congestion is important and how it will trend, plus definitions of metrics that are appropriate for measuring congestion, and descriptions of techniques for estimating and optimizing routing congestion issues in cell-/library-based VLSI circuits.
This book is the first in a series of three dedicated to advanced topics in Mixed-Signal IC design methodologies. It is one of the results achieved by the Mixed-Signal Design Cluster, an initiative launched in 1998 as part of the TARDIS project, funded by the European Commission within the ESPRIT-IV Framework. This initiative aims to promote the development of new design and test methodologies for Mixed-Signal ICs, and to accelerate their adoption by industrial users. As Microelectronics evolves, Mixed-Signal techniques are gaining significant importance due to the wide range of applications where an analog front-end is needed to drive a complex digital-processing subsystem. In this sense, Analog and Mixed-Signal circuits are recognized as a bottleneck for the market acceptance of Systems-on-Chip, because of the inherent difficulties involved in the design and test of these circuits. In particular, problems arising from the use of a common substrate for analog and digital components are a main limiting factor. The Mixed-Signal Cluster has been formed by a group of 11 Research and Development projects, plus a specific action to promote the dissemination of design methodologies, techniques, and supporting tools developed within the Cluster projects. The whole action, ending in July 2002, has been assigned an overall budget of more than 8 million euros.
This book contains the extended and revised editions of all the talks of the ninth AACD Workshop, held in Hotel Bachmair, April 11-13, 2000, in Rottach-Egern, Germany. The local organization was managed by Rudolf Koch of Infineon Technologies AG, Munich, Germany. The program consisted of six tutorials per day during three days. Experts in the field presented these tutorials, communicating state-of-the-art information. At the end of the workshop, the audience selects program topics for the following workshop. The program committee, consisting of Johan Huijsing of Delft University of Technology, Willy Sansen of Katholieke Universiteit Leuven and Rudy van de Plassche of Broadcom Netherlands BV, Bunnik, elaborates the selected topics into a three-day program and selects experts in the field for presentation. Each AACD Workshop has given rise to the publication of a book by Kluwer entitled "Analog Circuit Design." A series of nine books in a row provides valuable information and good overviews of all analog circuit techniques concerning design, CAD, simulation and device modeling. These books can be seen as a reference for those people involved in analog and mixed-signal design. The aim of the workshop is to brainstorm on new and valuable design ideas in the area of analog circuit design. It is the hope of the program committee that this ninth book continues the tradition of valuable contributions to the design of analog and mixed-signal systems in Europe and the rest of the world.
"As chip size and complexity continues to grow exponentially, the
challenges of functional verification are becoming a critical issue
in the electronics industry. It is now commonly heard that logical
errors missed during functional verification are the most common
cause of chip re-spins, and that the costs associated with
functional verification are now outweighing the costs of chip
design. To cope with these challenges engineers are increasingly
relying on new design and verification methodologies and languages.
Transaction-based design and verification, constrained random
stimulus generation, functional coverage analysis, and
assertion-based verification are all techniques that advanced
design and verification teams routinely use today. Engineers are
also increasingly turning to design and verification models based
on C/C++ and SystemC in order to build more abstract, higher
performance hardware and software models and to escape the
limitations of RTL HDLs. This new book, Advanced Verification
Techniques, provides specific guidance for these advanced
verification techniques. The book includes realistic examples and
shows how SystemC and SCV can be applied to a variety of advanced
design and verification tasks."
In high speed communications and signal processing applications, random electrical noise that emanates from devices has a direct impact on critical high-level specifications, for instance, system bit error rate or signal-to-noise ratio. Hence, predicting noise in RF systems at the design stage is extremely important. Additionally, with the growing complexity of modern RF systems, a flat transistor-level noise analysis for the entire system is becoming increasingly difficult. Hence, accurate modelling at the component level and behavioural-level simulation techniques are also becoming increasingly important. In this book, we concentrate on developing noise simulation techniques for RF circuits.
Introduction to Hardware-Software Co-Design presents a number of issues of fundamental importance for the design of integrated hardware software products such as embedded, communication, and multimedia systems. This book is a comprehensive introduction to the fundamentals of hardware/software co-design. Co-design is still a new field but one which has substantially matured over the past few years. This book, written by leading international experts, covers all the major topics including: fundamental issues in co-design; hardware/software co-synthesis algorithms; prototyping and emulation; target architectures; compiler techniques; specification and verification; system-level specification. Special chapters describe in detail several leading-edge co-design systems including Cosyma, LYCOS, and Cosmos. Introduction to Hardware-Software Co-Design contains sufficient material for use by teachers and students in an advanced course of hardware/software co-design. It also contains extensive explanation of the fundamental concepts of the subject and the necessary background to bring practitioners up-to-date on this increasingly important topic.
A look at important new tools and algorithms for future product modeling systems, based on a seminar at the International Conference and Research Center for Computer Science, Schloss Dagstuhl, Germany, presented by internationally recognised experts in CAD technology.