Co-design is a set of emerging techniques that allow the simultaneous design of hardware and software. In many cases where the application is very demanding in terms of performance (timing, silicon area, power consumption), trade-offs between dedicated hardware and dedicated software are becoming increasingly difficult to decide upon in the early stages of a design. Verification techniques - such as simulation or proof techniques - that have proven necessary in hardware design must be substantially adapted to the simultaneous verification of software and hardware. Describing the latest tools available for both co-design and co-verification of systems, Hardware/Software Co-Design and Co-Verification offers a complete look at this evolving set of procedures for CAD environments. The book considers the trade-offs that have to be made when co-designing a system. Several models are presented for determining the optimum solution to any co-design problem, including partitioning, architecture synthesis and code generation. When deciding on trade-offs, one of the main factors to be considered is the flow of communication, especially to and from the outside world; this involves the modeling of communication protocols. An approach to the synthesis of interface circuits in the context of co-design is presented. Other chapters present a co-design-oriented flexible component database and retrieval methods; a case study of an Ethernet bridge designed using LOTOS and co-design methodologies; and a programmable user interface based on monitors. Hardware/Software Co-Design and Co-Verification will help designers and researchers to understand these latest techniques in system design and will be of interest to all involved in embedded system design.
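To make the partitioning trade-off concrete, here is a small hedged sketch, not taken from the book: a greedy hardware/software split that moves tasks to hardware in order of speedup per unit area until an area budget is exhausted. All task names and numbers are invented for illustration.

```python
# Illustrative sketch of greedy hardware/software partitioning (invented data).
def partition(tasks, area_budget):
    """tasks: list of dicts with 'name', 'sw_time', 'hw_time', 'area'."""
    # Rank candidates by execution time saved per unit of silicon area.
    ranked = sorted(tasks,
                    key=lambda t: (t["sw_time"] - t["hw_time"]) / t["area"],
                    reverse=True)
    hw, sw, used = [], [], 0.0
    for t in ranked:
        if t["sw_time"] > t["hw_time"] and used + t["area"] <= area_budget:
            hw.append(t["name"])
            used += t["area"]
        else:
            sw.append(t["name"])
    return hw, sw

tasks = [
    {"name": "fft",    "sw_time": 9.0, "hw_time": 1.0, "area": 4.0},
    {"name": "parser", "sw_time": 2.0, "hw_time": 1.5, "area": 3.0},
    {"name": "crc",    "sw_time": 3.0, "hw_time": 0.5, "area": 1.0},
]
print(partition(tasks, area_budget=5.0))   # ['crc', 'fft'] go to hardware
```

Real co-design flows use far richer cost models (communication, power, scheduling); the sketch only shows the shape of the decision.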
Computer Methods for Analysis of Mixed-Mode Switching Circuits provides an in-depth treatment of the principles and implementation details of computer methods and numerical algorithms for analysis of mixed-mode switching circuits. Major topics include:
Power consumption is a key limitation in many high-speed and high-data-rate electronic systems today, ranging from mobile telecom to portable and desktop computing systems, especially when moving to nanometer technologies. Ultra Low-Power Electronics and Design offers the reader easy, integrated access to a mix of tutorial material and advanced research results, contributed by leading scientists from academia and industry and covering the most topical issues in the design of ultra low-power devices, systems and applications.
Functional Design Errors in Digital Circuits Diagnosis covers a wide spectrum of innovative methods to automate the debugging process throughout the design flow: from Register-Transfer Level (RTL) all the way to the silicon die. In particular, this book describes: (1) techniques for bug trace minimization that simplify debugging; (2) an RTL error diagnosis method that identifies the root cause of errors directly; (3) a counterexample-guided error-repair framework to automatically fix errors in gate-level and RTL designs; (4) a symmetry-based rewiring technology for fixing electrical errors; (5) an incremental verification system for physical synthesis; and (6) an integrated framework for post-silicon debugging and layout repair. The solutions provided in this book can greatly reduce debugging effort, enhance design quality, and ultimately enable the design and manufacture of more reliable electronic devices.
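As a rough illustration of the bug trace minimization idea mentioned above (not the book's own algorithm), the sketch below greedily drops stimulus steps from a failing trace as long as the bug still reproduces, leaving a shorter trace for debugging. The `fails` callback stands in for re-running simulation and is purely hypothetical.

```python
# Hedged sketch of bug trace minimization: shrink a failing stimulus trace.
def minimize_trace(trace, fails):
    """trace: list of stimulus steps; fails(trace) -> True if the bug reproduces."""
    assert fails(trace), "start from a failing trace"
    reduced = list(trace)
    changed = True
    while changed:
        changed = False
        for i in range(len(reduced)):
            candidate = reduced[:i] + reduced[i + 1:]
            if candidate and fails(candidate):   # bug still present without step i
                reduced = candidate
                changed = True
                break
    return reduced

# Toy check: the bug reproduces whenever steps 'a' and 'c' are both present.
bug = lambda t: "a" in t and "c" in t
print(minimize_trace(list("abcdc"), bug))   # ['a', 'c']
```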
This is the first book to focus on emerging technologies for distributed intelligent decision-making in process planning and dynamic scheduling. It has two sections: a review of several key areas of research, and an in-depth treatment of particular techniques. Each chapter addresses a specific problem domain and offers practical solutions to solve it. The book provides a better understanding of the present state and future trends of research in this area.
Many real systems are composed of multi-state components with different performance levels and several failure modes, all of which affect the performance of the whole system. Most books on reliability theory cover binary models that allow a system only to function perfectly or fail completely. "The Universal Generating Function in Reliability Analysis and Optimization" is the first book to give a comprehensive description of the universal generating function technique and its applications in binary and multi-state system reliability analysis. This monograph will be of value to anyone interested in system reliability, performance analysis and optimization in industrial, electrical and nuclear engineering.
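A minimal sketch of the universal generating function (UGF) idea, under the assumption of a flow-type system: each component's UGF is a set of (probability, performance) pairs, and components are combined by multiplying probabilities while mapping performances through a composition operator (for example, sum for parallel capacities). The component data below is invented for illustration.

```python
# Hedged UGF sketch: combine two multi-state components and read off reliability.
from collections import defaultdict

def compose(u1, u2, op):
    """u1, u2: lists of (probability, performance); op combines performances."""
    out = defaultdict(float)
    for p1, g1 in u1:
        for p2, g2 in u2:
            out[op(g1, g2)] += p1 * p2
    return sorted((p, g) for g, p in out.items())

pump_a = [(0.1, 0), (0.9, 50)]               # fails completely or delivers 50 units
pump_b = [(0.05, 0), (0.25, 30), (0.7, 60)]  # three performance levels

parallel = compose(pump_a, pump_b, lambda a, b: a + b)   # capacities add
demand = 60
print(sum(p for p, g in parallel if g >= demand))   # P(performance >= demand) = 0.925
```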
Many new topologies and circuit design techniques have emerged recently to improve the performance of active inductors, but a comprehensive treatment of the theory, topology, characteristics, and design constraints of CMOS active inductors and transformers, together with a detailed examination of their emerging applications in high-speed analog signal processing and data communications over wired and wireless channels, has not been available. This book is an attempt to provide an in-depth examination and a systematic presentation of the operation principles and implementation details of CMOS active inductors and transformers, and a detailed examination of those emerging applications. The content of the book is drawn from recently published research papers and is not otherwise available in a single, cohesive volume. Equal emphasis is given to the theory of CMOS active inductors and transformers and to their emerging applications. Major subjects covered in the book include: inductive characteristics in high-speed analog signal processing and data communications, spiral inductors and transformers - modeling and limitations, a historical perspective of device synthesis, the topology, characterization, and implementation of CMOS active inductors and transformers, and the application of CMOS active inductors and transformers in high-speed analog and digital signal processing and data communications.
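For orientation, a back-of-the-envelope sketch (not from the book) of the canonical gyrator-C active inductor: two back-to-back transconductors gm1 and gm2 with a capacitance C at the internal node synthesize an equivalent inductance L = C / (gm1 * gm2). The numerical values below are arbitrary illustrative assumptions.

```python
# Hedged sketch: equivalent inductance of a gyrator-C active inductor.
import math

def gyrator_c_inductance(gm1, gm2, c):
    return c / (gm1 * gm2)

gm1 = gm2 = 1e-3        # 1 mS transconductors (assumed)
c = 100e-15             # 100 fF internal node capacitance (assumed)
L = gyrator_c_inductance(gm1, gm2, c)
f = 5e9                 # check the reactance at 5 GHz
print(f"L = {L * 1e9:.1f} nH, |Z| at 5 GHz = {2 * math.pi * f * L:.1f} ohm")
```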
The authors have consolidated their research work in this volume titled Soft Computing for Data Mining Applications. The monograph gives an insight into research in the field of data mining in combination with soft computing methodologies. These days, data continues to grow exponentially, and much of it is implicitly or explicitly imprecise. Database discovery seeks to discover noteworthy, unrecognized associations between the data items in an existing database. The potential of discovery comes from the realization that alternate contexts may reveal additional valuable information. The rate at which data is stored is growing at a phenomenal rate. As a result, traditional ad hoc mixtures of statistical techniques and data management tools are no longer adequate for analyzing this vast collection of data. Domains where large volumes of data are stored in centralized or distributed databases include electronic commerce, bioinformatics, computer security, Web intelligence, intelligent learning database systems, finance, marketing, healthcare, telecommunications, and other fields. Efficient tools and algorithms for knowledge discovery in large data sets have been devised in recent years. These methods exploit the capability of computers to search huge amounts of data in a fast and effective manner. However, the data to be analyzed is imprecise and afflicted with uncertainty. In the case of heterogeneous data sources such as text and video, the data might moreover be ambiguous and partly conflicting. Besides, patterns and relationships of interest are usually approximate. Thus, to make the information mining process more robust, it requires tolerance toward imprecision, uncertainty and exceptions.
We describe in this book, new methods and applications of hybrid intelligent systems using soft computing techniques. Soft Computing (SC) consists of several intelligent computing paradigms, including fuzzy logic, neural networks, and evolutionary algorithms, which can be used to produce powerful hybrid intelligent systems. The book is organized in five main parts, which contain a group of papers around a similar subject. The first part consists of papers with the main theme of intelligent control, which are basically papers that use hybrid systems to solve particular problems of control. The second part contains papers with the main theme of pattern recognition, which are basically papers using soft computing techniques for achieving pattern recognition in different applications. The third part contains papers with the themes of intelligent agents and social systems, which are papers that apply the ideas of agents and social behavior to solve real-world problems. The fourth part contains papers that deal with the hardware implementation of intelligent systems for solving particular problems. The fifth part contains papers that deal with modeling, simulation and optimization for real-world applications.
This monograph is devoted to the theoretical and experimental study of inhibitory decision and association rules. Inhibitory rules contain on the right-hand side a relation of the kind "attribute ≠ value". The use of inhibitory rules instead of deterministic (standard) ones allows us to describe more completely the information encoded in decision or information systems and to design classifiers of high quality. The most important feature of this monograph is that it includes an advanced mathematical analysis of problems on inhibitory rules. We consider algorithms for the construction of inhibitory rules, bounds on the minimal complexity of inhibitory rules, and algorithms for the construction of the set of all minimal inhibitory rules. We also discuss results of experiments with standard and lazy classifiers based on inhibitory rules. These results show that inhibitory decision and association rules can be used in data mining and knowledge discovery both for knowledge representation and for prediction. Inhibitory rules can also be used in the analysis and design of concurrent systems. The results obtained in the monograph can be useful for researchers in such areas as machine learning, data mining and knowledge discovery, especially for those working in rough set theory, test theory, and logical analysis of data (LAD). The monograph can be used in the creation of courses for graduate students and for Ph.D. studies. The authors of this book extend an expression of gratitude to Professor Janusz Kacprzyk, to Dr. Thomas Ditzinger and to the Studies in Computational Intelligence staff at Springer for their support in making this book possible.
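A small sketch (not from the monograph) of what an inhibitory rule looks like over a decision table: the right-hand side excludes a decision value ("decision ≠ d") rather than asserting one. The toy table and attribute names are invented.

```python
# Hedged sketch: check whether an inhibitory rule holds on a decision table.
def rule_holds(table, conditions, excluded_decision):
    """True if every row matching all (attribute, value) conditions
    has a decision different from excluded_decision."""
    matching = [row for row in table
                if all(row[a] == v for a, v in conditions.items())]
    return bool(matching) and all(row["decision"] != excluded_decision
                                  for row in matching)

table = [
    {"outlook": "sunny", "windy": False, "decision": "play"},
    {"outlook": "sunny", "windy": True,  "decision": "stay"},
    {"outlook": "rainy", "windy": True,  "decision": "stay"},
]

# Inhibitory rule: (outlook = sunny) and (windy = True) -> decision != play
print(rule_holds(table, {"outlook": "sunny", "windy": True}, "play"))   # True
```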
In high-speed communications and signal processing applications, random electrical noise that emanates from devices has a direct impact on critical high-level specifications, for instance system bit error rate or signal-to-noise ratio. Hence, predicting noise in RF systems at the design stage is extremely important. Additionally, with the growing complexity of modern RF systems, a flat transistor-level noise analysis for the entire system is becoming increasingly difficult. Hence accurate modelling at the component level and behavioural-level simulation techniques are also becoming increasingly important. In this book, we concentrate on developing noise simulation techniques for RF circuits.
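To illustrate how noise propagates to a system-level specification, here is a standard textbook relation (not taken from this book): for ideal BPSK over an additive white Gaussian noise channel, BER = Q(sqrt(2 * Eb/N0)). The Eb/N0 values below are arbitrary.

```python
# Hedged sketch: bit error rate of ideal BPSK versus signal-to-noise ratio.
import math

def q_function(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_ber(ebn0_db):
    ebn0 = 10 ** (ebn0_db / 10)       # convert dB to a linear ratio
    return q_function(math.sqrt(2 * ebn0))

for ebn0_db in (4, 8, 12):
    print(f"Eb/N0 = {ebn0_db} dB -> BER ~ {bpsk_ber(ebn0_db):.2e}")
```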
This book contains the extended and revised editions of all the talks of the ninth AACD Workshop, held in Hotel Bachmair, April 11-13, 2000, in Rottach-Egern, Germany. The local organization was managed by Rudolf Koch of Infineon Technologies AG, Munich, Germany. The program consisted of six tutorials per day over three days. Experts in the field presented these tutorials and communicated state-of-the-art information. At the end of the workshop the audience selects program topics for the following workshop. The program committee, consisting of Johan Huijsing of Delft University of Technology, Willy Sansen of Katholieke Universiteit Leuven and Rudy van de Plassche of Broadcom Netherlands BV, Bunnik, elaborates the selected topics into a three-day program and selects experts in the field for presentation. Each AACD Workshop has given rise to the publication of a book by Kluwer entitled "Analog Circuit Design." A series of nine books in a row provides valuable information and good overviews of all analog circuit techniques concerning design, CAD, simulation and device modeling. These books can be seen as a reference for those involved in analog and mixed-signal design. The aim of the workshop is to brainstorm on new and valuable design ideas in the area of analog circuit design. It is the hope of the program committee that this ninth book continues the tradition of contributing to the design of analog and mixed-signal systems in Europe and the rest of the world.
"As chip size and complexity continues to grow exponentially, the
challenges of functional verification are becoming a critical issue
in the electronics industry. It is now commonly heard that logical
errors missed during functional verification are the most common
cause of chip re-spins, and that the costs associated with
functional verification are now outweighing the costs of chip
design. To cope with these challenges engineers are increasingly
relying on new design and verification methodologies and languages.
Transaction-based design and verification, constrained random
stimulus generation, functional coverage analysis, and
assertion-based verification are all techniques that advanced
design and verification teams routinely use today. Engineers are
also increasingly turning to design and verification models based
on C/C++ and SystemC in order to build more abstract, higher
performance hardware and software models and to escape the
limitations of RTL HDLs. This new book, Advanced Verification
Techniques, provides specific guidance for these advanced
verification techniques. The book includes realistic examples and
shows how SystemC and SCV can be applied to a variety of advanced
design and verification tasks."
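A language-neutral sketch of constrained random stimulus generation, written here in Python rather than SystemC/SCV to avoid guessing at library APIs: random transactions are drawn, only those satisfying the constraints are kept, and a simple functional coverage bin is tracked. The packet fields and constraints are invented for illustration.

```python
# Hedged sketch: constrained random stimulus with a toy coverage model.
import random

def random_packet(rng):
    return {"length": rng.randint(1, 1500), "kind": rng.choice(["data", "ctrl"])}

def satisfies_constraints(pkt):
    # Example constraint: control packets must be short.
    return pkt["kind"] != "ctrl" or pkt["length"] <= 64

def generate(n, seed=0):
    rng = random.Random(seed)
    coverage = {"short": 0, "long": 0}
    packets = []
    while len(packets) < n:
        pkt = random_packet(rng)
        if satisfies_constraints(pkt):          # rejection sampling
            packets.append(pkt)
            coverage["short" if pkt["length"] <= 64 else "long"] += 1
    return packets, coverage

_, cov = generate(1000)
print("functional coverage bins hit:", cov)
```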
This book is the first in a series on novel low-power design architectures, methods and design practices. It results from a large European project started in 1997, whose goal is to promote the further development and the faster and wider industrial use of advanced design methods for reducing the power consumption of electronic systems. Low-power design became crucial with the widespread use of portable information and communication terminals, where a small battery has to last for a long period. High-performance electronics, in addition, suffers from a permanent increase of the dissipated power per square millimetre of silicon, due to increasing clock rates, which causes cooling and reliability problems or otherwise limits performance. The European Union's Information Technologies Programme 'Esprit' did therefore launch a 'Pilot action for Low Power Design', which eventually grew to 19 R&D projects and one coordination project, with an overall budget of 14 million Euro. It is meanwhile known as the European Low Power Initiative for Electronic System Design (ESD-LPD) and will be completed by the end of 2001. It involves 30 major European companies and 20 well-known institutes. The R&D projects aim to develop or demonstrate new design methods for power reduction, while the coordination project takes care that the methods, experiences and results are properly documented and publicised.
Integrating formal property verification (FPV) into an existing design process raises several interesting questions. Have I written enough properties? Have I written a consistent set of properties? What should I do when the FPV tool runs into capacity issues? This book develops the answers to these questions and fits them into a roadmap for formal property verification - a roadmap that shows how to glue FPV technology into the traditional validation flow. A Roadmap for Formal Property Verification explores the key issues in this powerful technology through simple examples; you do not need any background in formal methods to read most parts of this book.
A genuinely useful text that gives an overview of the state of the art in system-level design trade-off exploration for concurrent tasks running on embedded heterogeneous multiprocessors. The targeted application domain covers complex embedded real-time multimedia and communication applications. The material is mainly based on research at IMEC and its international university network partners in this area over the last decade. In all, those in the digital signal processing industry will find the material here bang up to date.
The world we live in is pervaded with uncertainty and imprecision. Is it likely to rain this afternoon? Should I take an umbrella with me? Will I be able to find parking near the campus? Should I go by bus? Such simple questions are a common occurrence in our daily lives. Less simple examples: What is the probability that the price of oil will rise sharply in the near future? Should I buy Chevron stock? What are the chances that a bailout of GM, Ford and Chrysler will not succeed? What will be the consequences? Note that the examples in question involve both uncertainty and imprecision. In the real world, this is the norm rather than the exception. There is a deep-seated tradition in science of employing probability theory, and only probability theory, to deal with uncertainty and imprecision. The monopoly of probability theory came to an end when fuzzy logic made its debut. However, this is by no means a widely accepted view. The belief persists, especially within the probability community, that probability theory is all that is needed to deal with uncertainty. To quote a prominent Bayesian, Professor Dennis Lindley, "The only satisfactory description of uncertainty is probability."
Covering the development of field computation in the past forty years, this book is a concise, comprehensive and up-to-date introduction to methods for the analysis and synthesis of electric and magnetic fields. A broad view of the subject of field models in electricity and magnetism, ranging from basic theory to numerical applications, is offered. The approach throughout is to solve field problems directly from partial differential equations in terms of vector quantities.
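As a small, hedged illustration of solving a field problem directly from its partial differential equation (not an example from the book): Jacobi iteration for Laplace's equation on a square grid with fixed Dirichlet boundary potentials. Grid size, boundary values and iteration count are arbitrary assumptions.

```python
# Hedged sketch: Jacobi relaxation for Laplace's equation on a 2-D grid.
def solve_laplace(n=20, top_potential=1.0, iterations=2000):
    # Interior starts at 0; the top boundary is held at top_potential, others at 0.
    v = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v[0][j] = top_potential
    for _ in range(iterations):
        new = [row[:] for row in v]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # Each interior node relaxes toward the average of its neighbours.
                new[i][j] = 0.25 * (v[i - 1][j] + v[i + 1][j]
                                    + v[i][j - 1] + v[i][j + 1])
        v = new
    return v

phi = solve_laplace()
print(f"potential at the grid centre: {phi[10][10]:.3f}")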
Hardware Software Co-Design of a Multimedia SOC Platform is one of the first books of its kind to provide a comprehensive overview of the design and implementation of the hardware and software of an SoC platform for multimedia applications. Topics covered in this book range from system-level design methodology, multimedia algorithm implementation, a sub-word parallel, single-instruction-multiple-data (SIMD) processor design, and its virtual platform implementation, to the development of a SIMD parallel compiler as well as a real-time operating system (RTOS). Hardware Software Co-Design of a Multimedia SOC Platform is written for practitioner engineers and technical managers who want to gain first-hand knowledge about the hardware-software design process of an SoC platform. It offers both tutorial-like details to help readers become familiar with a diverse range of subjects, and in-depth analysis for advanced readers to pursue further.
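A tiny sketch of the sub-word parallel (SIMD) idea such a processor exploits, shown here as a generic SWAR trick rather than code from the book: four 8-bit lanes packed into one 32-bit word are added in a single pass, with the high bit of each lane masked so carries cannot cross lane borders. The operand values are arbitrary.

```python
# Hedged sketch: packed addition of four 8-bit lanes in one 32-bit word.
def packed_add_u8x4(a, b):
    high = 0x80808080                      # the MSB of each 8-bit lane
    low_sum = (a & ~high) + (b & ~high)    # add the low 7 bits of every lane
    return (low_sum ^ ((a ^ b) & high)) & 0xFFFFFFFF   # restore lane MSBs

a = 0x0102FF10
b = 0x01020220
print(hex(packed_add_u8x4(a, b)))   # 0x2040130 -> per-lane sums 0x02, 0x04, 0x01, 0x30
```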
This book presents design guidelines and implementation approaches for an enterprise safety management system integrated within enterprise-wide systems. It shows a new model-based safety management approach in which process design automation is integrated with enterprise business functions and components. It proposes a new systems engineering approach addressed to the new generation of the chemical industry. The book will help both undergraduate and professional readers to build basic knowledge about the issues and problems of designing a practical enterprise safety management system, while presenting in a clear way the system and information engineering practices used to design an enterprise-integrated solution.
Model Based Fuzzy Control uses a given conventional or fuzzy open loop model of the plant under control to derive the set of fuzzy rules for the fuzzy controller. Of central interest are the stability, performance, and robustness of the resulting closed loop system. The major objective of model based fuzzy control is to use the full range of linear and nonlinear design and analysis methods to design such fuzzy controllers with better stability, performance, and robustness properties than non-fuzzy controllers designed using the same techniques. This objective has already been achieved for fuzzy sliding mode controllers and fuzzy gain schedulers - the main topics of this book. The primary aim of the book is to serve as a guide for the practitioner and to provide introductory material for courses in control theory.
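A hedged sketch of the fuzzy gain scheduling idea, in the Takagi-Sugeno spirit rather than any specific controller from the book: two local proportional gains, designed for "low speed" and "high speed" operating points, are blended according to fuzzy membership in the scheduling variable. The plant, gains and membership shapes are all invented for illustration.

```python
# Hedged sketch: a fuzzy gain scheduler blending two local controllers.
def mu_low(speed):                    # triangular-style memberships over 0..100
    return max(0.0, min(1.0, (60 - speed) / 40))

def mu_high(speed):
    return max(0.0, min(1.0, (speed - 20) / 40))

def fuzzy_gain(speed, k_low=2.0, k_high=0.5):
    w_low, w_high = mu_low(speed), mu_high(speed)
    return (w_low * k_low + w_high * k_high) / (w_low + w_high)

def control(error, speed):
    return fuzzy_gain(speed) * error   # blended proportional action

for speed in (10, 40, 90):
    print(f"speed={speed:3d}  gain={fuzzy_gain(speed):.2f}")
```

Stability and robustness of the resulting closed loop, the book's central concern, of course require analysis well beyond this blending formula.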
Traditionally, the DDSS conferences aim to be a platform for both starting and experienced researchers who focus on the development and application of computer support in urban planning and architectural design. This volume contains 31 peer reviewed papers from this year's conference. This book will bring researchers together and is a valuable resource for their continuous joint effort to improve the design and planning of our environment.
Motivated learning is an emerging research field in artificial intelligence and cognitive modelling. Computational models of motivation extend reinforcement learning to adaptive, multitask learning in complex, dynamic environments - the goal being to understand how machines can develop new skills and achieve goals that were not predefined by human engineers. In particular, this book describes how motivated reinforcement learning agents can be used in computer games for the design of non-player characters that can adapt their behaviour in response to unexpected changes in their environment. This book covers the design, application and evaluation of computational models of motivation in reinforcement learning. The authors start with overviews of motivation and reinforcement learning, then describe models for motivated reinforcement learning. The performance of these models is demonstrated by applications in simulated game scenarios and a live, open-ended virtual world. Researchers in artificial intelligence, machine learning and artificial life will benefit from this book, as will practitioners working on complex, dynamic systems - in particular multiuser, online games.
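To show how a computational model of motivation can extend reinforcement learning, here is a compact, hedged sketch (not the authors' model): a tabular Q-learning agent receives, on top of a sparse external reward, an intrinsic novelty bonus that decays as a state is revisited. The toy one-dimensional world and all parameters are invented.

```python
# Hedged sketch: Q-learning with a count-based intrinsic (novelty) reward.
import random
from collections import defaultdict

def run(episodes=200, n_states=8, alpha=0.5, gamma=0.9, epsilon=0.1):
    q = defaultdict(float)                 # q[(state, action)]
    visits = defaultdict(int)
    rng = random.Random(0)
    for _ in range(episodes):
        s = 0
        for _ in range(20):
            if rng.random() < epsilon:
                a = rng.choice((-1, 1))
            else:
                a = max((-1, 1), key=lambda act: q[(s, act)])
            s_next = min(max(s + a, 0), n_states - 1)
            visits[s_next] += 1
            intrinsic = 1.0 / visits[s_next]            # novelty-driven motivation
            extrinsic = 1.0 if s_next == n_states - 1 else 0.0
            r = extrinsic + intrinsic
            best_next = max(q[(s_next, -1)], q[(s_next, 1)])
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s_next
    return q

q = run()
print("learned preference for moving right at state 0:", q[(0, 1)] > q[(0, -1)])
```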
This book offers a deep understanding of the concepts and practices behind the composition of heterogeneous components. After analysing the existing computation and execution models used for the specification and validation of different sub-systems, the book introduces a systematic approach to building an execution model for systems composed of heterogeneous components. Mixed continuous/discrete and hardware/software systems are used to illustrate these concepts. Reading this book gives a clear vision of the theory and practice of specifying and validating complex modern systems, and numerous examples give designers highly applicable solutions.
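A minimal sketch of composing heterogeneous execution models, in the spirit of the mixed continuous/discrete examples the book uses (the plant equation, rates and gains below are invented): a continuous plant integrated with a fixed-step Euler solver is coupled to a discrete controller that samples and updates its command only every tenth step.

```python
# Hedged sketch: co-simulation of a continuous plant and a discrete controller.
def cosimulate(t_end=5.0, dt=0.01, control_period=10):
    x = 0.0            # continuous state (e.g. a temperature)
    u = 0.0            # discrete controller output, held between samples
    setpoint = 1.0
    for step in range(int(t_end / dt)):
        if step % control_period == 0:          # discrete-event part
            u = 2.0 * (setpoint - x)            # sampled proportional control
        x += dt * (-x + u)                      # continuous part: dx/dt = -x + u
    return x

print(f"state after co-simulation: {cosimulate():.3f}")
```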
Verification is the most time-consuming task in the integrated circuit design process. The increasing similarity between implementation verification and the ever-needed task of providing vectors for manufacturing fault testing is tempting many professionals to combine verification and testing efforts.