Embedded systems are informally defined as a collection of programmable parts surrounded by ASICs and other standard components that interact continuously with an environment through sensors and actuators. The programmable parts include micro-controllers and Digital Signal Processors (DSPs). Embedded systems are often used in life-critical situations, where reliability and safety are more important criteria than performance. Today, embedded systems are designed with an ad hoc approach that is heavily based on earlier experience with similar products and on manual design. Use of higher-level languages such as C helps structure the design somewhat, but with increasing complexity it is not sufficient. Formal verification and automatic synthesis of implementations are the surest ways to guarantee safety. Thus the POLIS system, a co-design environment for embedded systems, is based on a formal model of computation. POLIS was initiated in 1988 as a research project at the University of California at Berkeley and, over the years, grew into a full design methodology with a software system supporting it. Hardware-Software Co-Design of Embedded Systems: The POLIS Approach is intended to give a complete overview of the POLIS system, including its formal and algorithmic aspects. Hardware-Software Co-Design of Embedded Systems: The POLIS Approach will be of interest to embedded system designers (automotive electronics, consumer electronics and telecommunications), micro-controller designers, CAD developers and students.
The modern wireless communication industry has put great demands on circuit designers for smaller, cheaper transceivers in the gigahertz frequency range. One tool which has assisted designers in satisfying these requirements is the use of on-chip inductive elements (inductors and transformers) in silicon (Si) radio-frequency (RF) integrated circuits (ICs). These elements allow greatly improved levels of performance in Si monolithic low-noise amplifiers, power amplifiers, up-conversion and down-conversion mixers and local oscillators. Inductors can be used to improve the intermodulation distortion performance and noise figure of small-signal amplifiers and mixers. In addition, the gain of amplifier stages can be enhanced and the realization of low-cost on-chip local oscillators with good phase noise characteristics is made feasible. In order to reap these benefits, it is essential that the IC designer be able to predict and optimize the characteristics of on-chip inductive elements. Accurate knowledge of inductance values, quality factor (Q) and the influence of adjacent elements (on-chip proximity effects) and substrate losses is essential. In this book the analysis, modeling and application of on-chip inductive elements is considered. Using analyses based on Maxwell's equations, an accurate and efficient technique is developed to model these elements over a wide frequency range. Energy loss to the conductive substrate is modeled through several mechanisms, including electrically induced displacement and conduction currents, and magnetically induced eddy currents. These techniques have been compiled in a user-friendly software tool, ASITIC (Analysis and Simulation of Inductors and Transformers for Integrated Circuits).
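As a brief aside on the quality factor the blurb above invokes (a standard circuit-theory definition, not taken from the book itself): an inductor of inductance L with effective series resistance R_s has, at angular frequency ω,

```latex
Q = \frac{\omega L}{R_s}
```

Substrate and proximity losses appear as additions to the effective R_s, which is why accurate loss modeling translates directly into accurate Q prediction.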
Logic Synthesis and Verification Algorithms is a textbook designed for courses on VLSI Logic Synthesis and Verification, Design Automation, CAD, and advanced-level discrete mathematics. It also serves as a basic reference work in design automation for both professionals and students. Logic Synthesis and Verification Algorithms is about the theoretical underpinnings of VLSI (Very Large Scale Integrated Circuits). It combines and integrates modern developments in logic synthesis and formal verification with the more traditional matter of Switching and Finite Automata Theory. The book also provides background material on Boolean algebra and discrete mathematics. A unique feature of this text is the large collection of solved problems. Throughout the text, the algorithms covered are the subject of one or more problems based on the use of available synthesis programs.
Physicians, lawyers, engineers, architects, financial analysts, and other professionals articulate an increasing need for support by intelligent workstations for decision making, analysis, communication, and other activities. "Intelligent Workstations for Professionals" is the collection of papers presented by international scientists at a symposium and workshop in March 1992. Requirements from potential users, studies of their behavior, as well as approaches to and aspects of technical realizations of "intelligent" functions are introduced. Eight contributions from members of the Center for Information and Telecommunication Technology (CITT) of Northwestern University, the University of Wisconsin-Whitewater, and the Children's Memorial Hospital deal with the latest findings of the UNIS (Users' Needs for Intelligent Systems) project, which is designed to identify needs and wishes from professionals for intelligent support systems and the potential barriers to adoption and use of such systems. The remaining papers concentrate on new approaches and techniques that enhance the "intelligence" of future workstations. They tackle issues like architectural trends in workstation design, the combination of workstations with HDTV and speech processing, automatic reading and understanding of documents, the automated development of software, and the processing of inexact knowledge. These papers were contributed by members of the DFKI GmbH (German Research Institute for Artificial Intelligence), GMD mbH (German Society for Mathematics and Data Processing), Siemens Gammasonics Inc., Siemens Nixdorf Informationssysteme AG, and Siemens AG.
Design and Analysis of Distributed Embedded Systems is organized similarly to the conference. Chapters 1 and 2 deal with specification methods and their analysis, while Chapter 6 concentrates on timing and performance analysis. Chapter 3 describes approaches to system verification at different levels of abstraction. Chapter 4 deals with fault tolerance and detection. Middleware and software reuse aspects are treated in Chapter 5. Chapters 7 and 8 concentrate on distribution-related topics such as partitioning, scheduling and communication. The book closes with a chapter on design methods and frameworks.
Designing is one of the most significant of human acts. Surprisingly, given that designing has been occurring for many millennia, our understanding of the processes of designing is remarkably limited. Recently, design methods have been formalised not as human-centred processes but as processes capable of computer implementation, with the goal of augmenting human designers. This volume contains contributions which cover design methods based on evolutionary systems, generative processes, evaluation methods and analysis methods. It presents the state of the art in formal design methods for computer aided design.
The purpose of this book is to present computationally efficient algorithms for calculating the dynamics of robot mechanisms represented as systems of rigid bodies. The efficiency is achieved by the use of recursive formulations of the equations of motion, i.e. formulations in which the equations of motion are expressed implicitly in terms of recurrence relations between the quantities describing the system. The use of recursive formulations in dynamics is fairly new, so the principles of their operation and reasons for their efficiency are explained. Three main algorithms are described: the recursive Newton-Euler formulation for inverse dynamics (the calculation of the forces given the accelerations), and the composite-rigid-body and articulated-body methods for forward dynamics (the calculation of the accelerations given the forces). These algorithms are initially described in terms of an unbranched, open-loop kinematic chain -- a typical serial robot mechanism. This is done to keep the descriptions of the algorithms simple, and is in line with descriptions appearing in the literature. Once the basic algorithms have been introduced, the restrictions on the mechanism are lifted and the algorithms are extended to cope with kinematic trees and loops, and general constraints at the joints. The problem of simulating the effect of contact between a robot and its environment is also considered. Some consideration is given to the details and practical problems of implementing these algorithms on a computer.
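The recursive structure the blurb describes, an outward pass propagating accelerations and an inward pass accumulating forces, can be sketched in its simplest possible setting: a serial chain of prismatic joints sliding along a single axis, with point-mass links and no gravity. This is an illustrative toy, not the book's full spatial algorithm; all names are made up for the sketch.

```python
def inverse_dynamics_1d(masses, qdd):
    """Toy recursive Newton-Euler: joint forces from joint accelerations.

    masses: mass of each link, base to tip.
    qdd: acceleration of each prismatic joint along the common axis.
    Returns the force each joint must exert.
    """
    n = len(masses)

    # Outward recursion: each link's acceleration is its parent's
    # acceleration plus its own joint acceleration, a_i = a_{i-1} + qdd_i.
    accels = []
    a = 0.0
    for i in range(n):
        a += qdd[i]
        accels.append(a)

    # Inward recursion: each joint carries the inertial force of every
    # link outboard of it, f_i = m_i * a_i + f_{i+1}.
    forces = [0.0] * n
    f = 0.0
    for i in reversed(range(n)):
        f += masses[i] * accels[i]
        forces[i] = f
    return forces


# Accelerate only the base joint of a three-link chain: every link then
# accelerates at 1, and the base joint carries the whole chain's inertia.
print(inverse_dynamics_1d([1.0, 2.0, 3.0], [1.0, 0.0, 0.0]))  # [6.0, 5.0, 3.0]
```

Both passes run in O(n), which is the point of recursive formulations: the equations of motion are never assembled explicitly, only evaluated through the recurrences.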
Collaborative virtual environments (CVEs) are multi-user virtual realities which actively support communication and co-operation. This book offers a comprehensive reference volume on the state of the art in the area of design studies in CVEs. It is an excellent mix of contributions from over 25 leading researchers and experts in multiple disciplines from academia and industry, providing up-to-date insight into the current research topics in this field as well as the latest technological advancements and the best working examples. Many of these results and ideas are also applicable to other areas, such as CVEs for design education. Overall, this book serves as an excellent reference for postgraduate students, researchers and practitioners who need a comprehensive approach to study design behaviours in CVEs. It is also a useful and informative source of materials for those interested in learning more on using and developing CVEs to support design and design collaboration.
This state-of-the-art book explores the concept of knowledge-intensive CAD systems. The topics covered range from ontology to knowledge representation, making it essential reading for researchers, engineers, and technical managers involved in the development of advanced applications for knowledge management, engineering design, and manufacturing.
INTRODUCTION TO COMPUTER-AIDED DESIGN OF USER INTERFACES. Jean Vanderdonckt (1) and Angel Puerta (2,3). (1) Institut d'Administration et de Gestion, Universite catholique de Louvain, Place des Doyens 1, B-1348 Louvain-la-Neuve (Belgium), vanderdonckt@gant.ucl.ac.be, vanderdoncktj@acm.org, Web: http://www.arpuerta.com. (2) Knowledge Systems Laboratory, Stanford University, MSOB x215, Stanford, CA 94305-5479, USA, puerta@camis.stanford.edu. (3) RedWhale Corp., 277 Town & Country Village, Palo Alto, CA 94303, USA, puerta@redwhale.com, Web: http://www.redwhale.com. Computer-Aided Design of User Interfaces (CADUI) is hereby referred to as the particular area of Human-Computer Interaction (HCI) intended to provide software support for any activity involved in the development life cycle of an interactive application. Such activities namely include task analysis, contextual inquiry [1], requirements definition, user-centred design, application modelling, conceptual design, prototyping, programming, installation, test, evaluation, and maintenance. Although only very recently addressed (e.g., [3]), the activity of re-designing an existing user interface (UI) for an interactive application and the activity of re-engineering a UI to rebuild its underlying models are also considered in CADUI. A fundamental aim of CADUI is not only to provide some software support to the above activities, but also to incorporate strong and solid methodological aspects into the development, thus fostering abstraction and reflection and leaving ad hoc development aside [5,7]. Incorporating such methodological aspects inevitably covers three related, sometimes intertwined, facets: models, method and tools.
Cognitive Informatics (CI) is the science of cognitive information processing and its applications in cognitive computing. CI is a transdisciplinary enquiry of computer science, information science, cognitive science, and intelligence science that investigates the internal information processing mechanisms and processes of the brain. Advances and engineering applications of CI have led to the emergence of cognitive computing and the development of Cognitive Computers (CCs) that reason and learn. As initiated by Yingxu Wang and his colleagues, CC has emerged and developed based on transdisciplinary research in CI, abstract intelligence (aI), and denotational mathematics, following the inauguration in 2002 of the IEEE International Conference on Cognitive Informatics series, held at the University of Calgary, Stanford University, Tsinghua University, and elsewhere. This volume in LNCS (subseries of Computational Intelligence), LNCI 323, edited by Y. Wang, D. Zhang, and W. Kinsner, presents the latest developments in cognitive informatics and cognitive computing. The book focuses on the explanation of cognitive models of the brain, the layered reference model of the brain, the fundamental mechanisms of abstract intelligence, and the implementation of computational intelligence by autonomous inference and learning engines based on CCs.
Many different kinds of FPGAs exist, with different programming technologies, different architectures and different software. Field-Programmable Gate Array Technology describes the major FPGA architectures available today, covering the three programming technologies that are in use and the major architectures built on those programming technologies. The reader is introduced to concepts relevant to the entire field of FPGAs using popular devices as examples. Field-Programmable Gate Array Technology includes discussions of FPGA integrated circuit manufacturing, circuit design and logic design. It describes the way logic and interconnect are implemented in various kinds of FPGAs. It covers particular problems with design for FPGAs and future possibilities for new architectures and software. This book compares CAD for FPGAs with CAD for traditional gate arrays. It describes algorithms for placement, routing and optimization of FPGAs. Field-Programmable Gate Array Technology describes all aspects of FPGA design and development. For this reason, it covers a significant amount of material. Each section is clearly explained to readers who are assumed to have general technical expertise in digital design and design tools. Potential developers of FPGAs will benefit primarily from the FPGA architecture and software discussion. Electronics systems designers and ASIC users will find a background to different types of FPGAs and applications of their use.
Direct Engineering (DE) is the consolidation of the product development cycle into a single, unified process. The design process in most industries is an evolutionary one (i.e., incremental changes to some existing design). DE is a manufacturing process that seeks to improve the design process by providing complete archival documentation of existing designs. It uses three-dimensional geometric models with integrated manufacturing information throughout the design process. DE reduces the design cycle, and the variety and number of engineering changes. This process decreases the design cycle time, increases productivity, and provides a higher quality product. The required technologies and methodologies that will support the development of the DE environment are: (1) product representation using feature-based modeling; (2) knowledge-based applications that will support the entire product development cycle; (3) an engineering environment implemented around distributed computing and object-oriented systems; (4) direct manufacturing techniques using rapid prototyping. Direct Engineering: Toward Intelligent Manufacturing addresses the following recent topics related to the development, implementation, and integration of the DE environment: (1) the current scope of the research in intelligent manufacturing; (2) the results of the technologies and tools developed for integrated product and process designs; and (3) examination of the methodologies and algorithms used for the implementation of direct engineering.
Complete shop floor automation - a "lights out factory," where workers initially set up all machines, turn off the lights, lock the door, and the machines churn out the parts - remains an unfulfilled dream. Yet when we look at the immensity of the process of automation and integration even for the most simply conceived part factory, we can recognize that automation has been applied, and is being applied, where it makes sense from a cost/benefit standpoint. It is our nature to be dissatisfied with near-term progress, but when we realize how short a time the tools to do that automation have been available, the progress is clearly noteworthy - considering the multitude of factors and the environment we have to deal with. Most of the automation problems we confront in today's environment are multidisciplinary in nature. They require not just knowledge and experience in various distinct fields but good cooperation among organizations from different disciplines to adequately comprehend and solve such problems. In Volume III we have many examples that reflect the current state-of-the-art techniques of robotics and plant automation. The papers for Volume III have been arranged in a logical order of automation planning, automated assembly, robot programming and simulation, control, motion coordination, communication and networking to factories of the future.
This book presents a new set of embedded system design techniques called multidimensional data flow, which combine the various benefits offered by existing methodologies such as block-based system design, high-level simulation, system analysis and polyhedral optimization. It describes a novel architecture for efficient and flexible high-speed communication in hardware that can be used both in manual and automatic system design and that offers various design alternatives, balancing achievable throughput with required hardware size. This book demonstrates multidimensional data flow by showing its potential for modeling, analysis, and synthesis of complex image processing applications. These applications are presented in terms of their fundamental properties and resulting design constraints. Coverage includes a discussion of how far these constraints can be better met by multidimensional data flow than by alternative approaches. Based on these results, the book explains the principles of fine-grained system level analysis and high-speed communication synthesis. Additionally, an extensive review of related techniques is given in order to show their relation to multidimensional data flow.
The book provides a comprehensive description and implementation methodology for the Philips/NXP Aethereal/aelite Network-on-Chip (NoC). The presentation offers a systems perspective, starting from the system requirements and deriving and describing the resulting hardware architectures, embedded software, and accompanying design flow. Readers get an in-depth view of the interconnect requirements, centered not only on performance and scalability but also on multi-faceted, application-driven requirements, in particular composability and predictability. The book shows how these qualitative requirements are implemented in a state-of-the-art on-chip interconnect, and presents the realistic, quantitative costs.
Embedded systems are becoming one of the major driving forces in computer science. Furthermore, it is the impact of embedded information technology that dictates the pace in most engineering domains. Nearly all technical products above a certain level of complexity are not only controlled but increasingly even dominated by their embedded computer systems. Traditionally, such embedded control systems have been implemented in a monolithic, centralized way. Recently, distributed solutions are gaining increasing importance. In this approach, the control task is carried out by a number of controllers distributed over the entire system and connected by some interconnect network, like fieldbuses. Such a distributed embedded system may consist of a few controllers up to several hundred, as in today's top-range automobiles. Distribution and parallelism in embedded systems design increase the engineering challenges and require new development methods and tools. This book is the result of the International Workshop on Distributed and Parallel Embedded Systems (DIPES'98), organized by the International Federation for Information Processing (IFIP) Working Groups 10.3 (Concurrent Systems) and 10.5 (Design and Engineering of Electronic Systems). The workshop took place in October 1998 in Schloss Eringerfeld, near Paderborn, Germany, and the resulting book reflects the most recent points of view of experts from Brazil, Finland, France, Germany, Italy, Portugal, and the USA. The book is organized in six chapters: `Formalisms for Embedded System Design': IP-based system design and various approaches to multi-language formalisms. `Synthesis from Synchronous/Asynchronous Specification': Synthesis techniques based on Message Sequence Charts (MSC), StateCharts, and Predicate/Transition Nets. `Partitioning and Load-Balancing': Application in simulation models and target systems. 
`Verification and Validation': Formal techniques for precise verification and more pragmatic approaches to validation. `Design Environments' for distributed embedded systems and their impact on the industrial state of the art. `Object Oriented Approaches': Impact of OO-techniques on distributed embedded systems. This volume will be essential reading for computer science researchers and application developers.
TOOLS Eastern Europe 2002 was the third annual conference on the technology of object-oriented languages and systems. It was held in Eastern Europe, more specifically in Sofia, Bulgaria, from March 13 to 15. In my capacity of program chairman, I could count on the support from the Programming Technology Lab of the Vrije Universiteit Brussel to set up the technical program for this conference. We managed to assemble a first class international program committee composed of the following researchers: * Mehmet Aksit (Technische Hogeschool Twente, Netherlands) * Jan Bosch (Universiteit Groningen, Netherlands) * Gilad Bracha (Sun Microsystems, USA) * Shigeru Chiba (Tokyo Institute of Technology, Japan) * Pierre Cointe (Ecole des Mines de Nantes, France) * Serge Demeyer (Universitaire Instelling Antwerpen, Belgium) * Pavel Hruby (Navision, Denmark) * Mehdi Jazayeri (Technische Universität Wien, Austria) * Eric Jul (University of Copenhagen, Denmark) * Gerti Kappel (University of Linz, Austria) * Boris Magnusson (University of Lund, Sweden) * Daniela Mehandjiiska-Stavreva (Bond University, Australia) * Tom Mens (Vrije Universiteit Brussel, Belgium) * Christine Mingins (Monash University, Australia) * Ana Moreira (Universidade Nova de Lisboa, Portugal) * Oscar Nierstrasz (Universität Bern, Switzerland) * Walter Olthoff (DFKI, Germany) * Igor Pottosin (A. P. Ershov Institute of Informatics Systems, Russia) * Atanas Radenski (Winston-Salem State University, USA) * Markku Sakkinen (University of Jyväskylä, Finland) * Bran Selic (Rational, Canada) * Andrey Terehov (St.
System-on-Chip for Real-Time Applications will be of interest to engineers, both in industry and academia, working in the area of SoC VLSI design and application. It will also be useful to graduate and undergraduate students in electrical and computer engineering and computer science. A selected set of papers from the 2nd International Workshop on Real-Time Applications were used to form the basis of this book. It is organized into the following chapters: -Introduction; -Design Reuse; -Modeling; -Architecture; -Design Techniques; -Memory; -Circuits; -Low Power; -Interconnect and Technology; -MEMS. System-on-Chip for Real-Time Applications contains many signal processing applications and will be of particular interest to those working in that community.
This handbook provides design considerations and rules-of-thumb to ensure the functionality you want will work. It brings together all the information needed by systems designers to develop applications that include configurability, from the simplest implementations to the most complicated.
Memory Issues in Embedded Systems-on-Chip: Optimizations and Explorations is designed for different groups in the embedded systems-on-chip arena. First, it is designed for researchers and graduate students who wish to understand the research issues involved in memory system optimization and exploration for embedded systems-on-chip. Second, it is intended for designers of embedded systems who are migrating from a traditional micro-controller-centered, board-based design methodology to newer design methodologies using IP blocks for processor-core-based embedded systems-on-chip. Also, since Memory Issues in Embedded Systems-on-Chip: Optimizations and Explorations illustrates a methodology for optimizing and exploring the memory configuration of embedded systems-on-chip, it is intended for managers and system designers who may be interested in the emerging capabilities of embedded systems-on-chip design methodologies for memory-intensive applications.
Verification is job one in today's modern design process. Statistics tell us that the verification process takes up a majority of the overall work. Chips that come back dead on arrival scream that verification is at fault for not finding the mistakes. How do we ensure success? After an accomplishment, have you ever had someone ask you, "Are you good or are you just lucky?" Many design projects depend on blind luck in hopes that the chip will work. Others adamantly rely on their own abilities to bring the chip to success. In either case, how can we tell the difference between being good and being lucky? There must be a better way not to fail. Failure. No one likes to fail. In his book "The Logic of Failure", Dietrich Dörner argues that failure does not just happen. A series of wayward steps leads to disaster. Often these wayward steps are not really logical, decisive steps, but more like default omissions. Anti-planning, if you will; an ad-hoc approach to doing something. To not plan, then, is to fail.
The Verilog language is a hardware description language which provides a means of specifying a digital system at a wide range of levels of abstraction. The language supports the early conceptual stages of design with its behavioral level of abstraction, and the later implementation stages with its structural level of abstraction. The language provides hierarchical constructs, allowing the designer to control the complexity of a description. Verilog was originally designed in the winter of 1983/84 as a proprietary verification/simulation product. Since then, several other proprietary analysis tools have been developed around the language, including a fault simulator and a timing analyzer; the language being instrumental in providing consistency across these tools. Now, the language is openly available for any tool to read and write. This book introduces the language. It is sometimes difficult to separate the language from the simulator tool because the dynamic aspects of the language are defined by the way the simulator works. Where possible, we have stayed away from simulator-specific details and concentrated on design specification, but have included enough information to be able to have working executable models. The book takes a tutorial approach to presenting the language.
Design is an important research topic in engineering and architecture, since design is not only a means of change but also one of the keystones of economic competitiveness and the fundamental precursor to manufacturing. However, our understanding of design as a process and our ability to model it are still very limited. The development of computational models founded on the artificial intelligence paradigm has provided an impetus for much of current design research -- both computational and cognitive. Notwithstanding their immaturity, noticeable advances have been made both in extending our understanding of design and in developing tools based on that understanding. The papers in this volume are from the Third International Conference on Artificial Intelligence in Design held in August 1994 in Lausanne, Switzerland. They represent the cutting edge of research and development in this field. They are of particular interest to researchers, developers and users of computer systems in design. This volume demonstrates both the breadth and depth of artificial intelligence in design and points the way forward for our understanding of design as a process and for the development of computer-based tools to aid designers.
The development of computational models of design founded on the artificial intelligence paradigm has provided an impetus for much of current design research. As artificial intelligence has matured and developed new approaches, the impact of these new approaches on design research has been felt. This can be seen in the way concepts from cognitive science have found their way into artificial intelligence and hence into design research, and also in the way in which agent-based systems are being incorporated into design systems. In design research there is an increasing blurring between notions drawn from artificial intelligence and those drawn from cognitive science. Whereas a number of years ago the focus was largely on applying artificial intelligence to designing as an activity, thus treating designing as a form of problem solving, today we are seeing a much wider variety of conceptions of the role of artificial intelligence in helping to model and comprehend designing as a process. Thus, we see papers in this volume which have as their focus the development or implementation of frameworks for artificial intelligence in design, attempting to determine a unique locus for these ideas. We see papers which attempt to find foundations for the development of tools based on the artificial intelligence paradigm; often the foundations come from cognitive studies of human designers.