This book contains selected contributions from the 6th CIRP International Seminar on Computer-Aided Tolerancing, which was held on 22-24 March, 1999, at the University of Twente, Enschede, The Netherlands. This volume presents the theory and application of consistent tolerancing. Until recently CADCAM systems did not even address the issue of tolerances and focused purely on nominal geometry. Therefore, CAD data was only of limited use for the downstream processes. The latest generation of CADCAM systems incorporates functionality for tolerance specification. However, the lack of consistency in existing tolerancing standards and everyday tolerancing practice still lead to ill-defined products, excessive manufacturing costs and unexpected failures. Research and improvement of education in tolerancing are hot items today. Global Consistency of Tolerances gives an excellent overview of the recent developments in the field of Computer-Aided Tolerancing, including such topics as tolerance specification; tolerance analysis; tolerance synthesis; tolerance representation; geometric product specification; functional product analysis; statistical tolerancing; education of tolerancing; computational metrology; tolerancing standards; and industrial applications and CAT systems. This book is well suited to users of new generation CADCAM systems who want to use the available tolerancing possibilities properly. It can also be used as a starting point for research activities.
A number of optimization problems of the mechanics of space flight and the motion of walking robots and manipulators, and of quantum physics, economics and biology, have an irregular structure: classical variational procedures do not formally make it possible to find optimal controls that, as we explain, have an impulse character. This and other well-known facts lead to the necessity of constructing dynamical models using the concept of a generalized function (Schwartz distribution). The problem of the systematization of such models is very important. In particular, the problem of the construction of the general form of linear and nonlinear operator equations in distributions is timely. Another problem is related to the proper determination of solutions of equations that have nonlinear operations over generalized functions in their description. It is well known that "the value of a distribution at a point" has no meaning. As a result, the problem of constructing a concept of stability for generalized processes arises. Finally, optimization problems for dynamic systems in distributions require optimality conditions to be found. This book contains results that we have obtained in the above-mentioned directions. The aim of the book is to provide, for electrical and mechanical engineers or mathematicians working in applications, a general and systematic treatment of dynamic systems based on up-to-date mathematical methods, and to demonstrate the power of these methods in solving the dynamics of systems and applied control problems.
Combinatorial optimisation is a ubiquitous discipline whose usefulness spans vast applications domains. The intrinsic complexity of most combinatorial optimisation problems makes classical methods unaffordable in many cases. To acquire practical solutions to these problems requires the use of metaheuristic approaches that trade completeness for pragmatic effectiveness. Such approaches are able to provide optimal or quasi-optimal solutions to a plethora of difficult combinatorial optimisation problems. The application of metaheuristics to combinatorial optimisation is an active field in which new theoretical developments, new algorithmic models, and new application areas are continuously emerging. This volume presents recent advances in the area of metaheuristic combinatorial optimisation, with a special focus on evolutionary computation methods. Moreover, it addresses local search methods and hybrid approaches. In this sense, the book includes cutting-edge theoretical, methodological, algorithmic and applied developments in the field, from respected experts and with a sound perspective.
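As a concrete illustration of the completeness-for-effectiveness trade described above, here is a minimal sketch of one classic metaheuristic, simulated annealing. The toy problem and every parameter below are illustrative assumptions, not taken from the book: order a list of numbers so that the total gap between neighbours is small (the optimum is simply the sorted order, whose cost is max minus min).

```python
import math
import random

def cost(perm, values):
    """Total absolute gap between adjacent elements in the given ordering."""
    return sum(abs(values[perm[i]] - values[perm[i + 1]])
               for i in range(len(perm) - 1))

def anneal(values, steps=20000, t0=50.0, seed=0):
    rng = random.Random(seed)
    perm = list(range(len(values)))
    rng.shuffle(perm)
    cur = cost(perm, values)
    best, best_cost = perm[:], cur
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9           # linear cooling schedule
        i, j = rng.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]       # propose a random swap
        new = cost(perm, values)
        # Accept improvements always; accept worsenings with Boltzmann probability.
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new
            if cur < best_cost:
                best, best_cost = perm[:], cur
        else:
            perm[i], perm[j] = perm[j], perm[i]   # undo the rejected move
    return best, best_cost

rng = random.Random(42)
values = [rng.uniform(0, 100) for _ in range(30)]
order, c = anneal(values)
```

The accepted-worsening step is what distinguishes this from plain hill climbing: it lets the search escape local optima at the cost of any completeness guarantee, which is exactly the trade the blurb describes.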
Functional Design Errors in Digital Circuits Diagnosis covers a wide spectrum of innovative methods to automate the debugging process throughout the design flow: from Register-Transfer Level (RTL) all the way to the silicon die. In particular, this book describes: (1) techniques for bug trace minimization that simplify debugging; (2) an RTL error diagnosis method that identifies the root cause of errors directly; (3) a counterexample-guided error-repair framework to automatically fix errors in gate-level and RTL designs; (4) a symmetry-based rewiring technology for fixing electrical errors; (5) an incremental verification system for physical synthesis; and (6) an integrated framework for post-silicon debugging and layout repair. The solutions provided in this book can greatly reduce debugging effort, enhance design quality, and ultimately enable the design and manufacture of more reliable electronic devices.
Many real systems are composed of multi-state components with different performance levels and several failure modes, all of which affect the whole system's performance. Most books on reliability theory cover binary models that allow a system only to function perfectly or fail completely. "The Universal Generating Function in Reliability Analysis and Optimization" is the first book that gives a comprehensive description of the universal generating function technique and its applications in binary and multi-state system reliability analysis. This monograph will be of value to anyone interested in system reliability, performance analysis and optimization in industrial, electrical and nuclear engineering.
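The universal generating function itself is compact enough to sketch. In this hypothetical example (the components and all numbers are invented for illustration, not taken from the book), each multi-state component's random performance is represented as a distribution over performance levels, and a composition operator combines two such distributions according to the system structure (sum of capacities for parallel elements, minimum for series elements):

```python
# Each UGF is the polynomial sum of p_i * z^{g_i}; here it is represented
# simply as a dict mapping performance level g_i -> probability p_i.

def compose(u, v, op):
    """Combine two UGFs using a structure function op (min for series, etc.)."""
    out = {}
    for g1, p1 in u.items():
        for g2, p2 in v.items():
            g = op(g1, g2)
            out[g] = out.get(g, 0.0) + p1 * p2
    return out

# Two multi-state pumps (capacity: probability) -- illustrative values.
pump_a = {0: 0.1, 50: 0.3, 100: 0.6}   # three-state component
pump_b = {0: 0.05, 100: 0.95}          # binary component

# Parallel pumps: capacities add. A series valve then limits flow (min).
parallel = compose(pump_a, pump_b, lambda a, b: a + b)
system = compose(parallel, {0: 0.02, 120: 0.98}, min)

# Probability the system delivers at least 100 units of demand.
availability = sum(p for g, p in system.items() if g >= 100)
```

One pass of `compose` per structural element yields the full performance distribution of the system, from which measures such as availability against a demand level drop out directly.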
The Maintenance Management Framework describes and reviews the concept, process and framework of modern maintenance management of complex systems; concentrating specifically on modern modelling tools (deterministic and empirical) for maintenance planning and scheduling. It will be bought by engineers and professionals involved in maintenance management, maintenance engineering, operations management, quality, etc. as well as graduate students and researchers in this field.
Many new topologies and circuit design techniques have emerged recently to improve the performance of active inductors, but a comprehensive treatment of the theory, topology, characteristics, and design constraints of CMOS active inductors and transformers, together with a detailed examination of their emerging applications in high-speed analog signal processing and data communications over wire and wireless channels, has not been available. This book is an attempt to provide an in-depth examination and a systematic presentation of the operation principles and implementation details of CMOS active inductors and transformers, together with a detailed examination of those emerging applications. The content of the book is drawn from recently published research papers and is not otherwise available in a single, cohesive volume. Equal emphasis is given to the theory of CMOS active inductors and transformers and to their emerging applications. Major subjects covered in the book include: inductive characteristics in high-speed analog signal processing and data communications; spiral inductors and transformers - modeling and limitations; a historical perspective of device synthesis; the topology, characterization, and implementation of CMOS active inductors and transformers; and the application of CMOS active inductors and transformers in high-speed analog and digital signal processing and data communications.
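For readers new to the topic, the standard gyrator-C view of an active inductor (a widely known result, stated here as general background rather than as this book's method; the parameter values are illustrative assumptions) reduces to one formula: two transconductors connected in a loop around a capacitor synthesize an inductance L = C / (gm1 * gm2), with no physical spiral required.

```python
# Gyrator-C active inductor: back-to-back transconductors gm1, gm2 loaded by
# capacitor C present an inductive input impedance L = C / (gm1 * gm2).
# All component values below are illustrative assumptions.

gm1 = 5e-3     # forward transconductance, siemens (assumed)
gm2 = 5e-3     # feedback transconductance, siemens (assumed)
C   = 100e-15  # load/parasitic capacitance, farads (assumed)

L_eq = C / (gm1 * gm2)
print(f"Equivalent inductance: {L_eq * 1e9:.1f} nH")   # about 4.0 nH
```

The attraction over a spiral is area and tunability (L tracks the bias-dependent transconductances); the costs, which the book examines, are noise, linearity and power.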
Evolutionary algorithms (EAs), as well as other bio-inspired heuristics, are widely used to solve numerical optimization problems. However, in their original versions, they are limited to unconstrained search spaces, i.e. they do not include a mechanism to incorporate feasibility information into the fitness function. On the other hand, real-world problems usually have constraints in their models. Therefore, a considerable amount of research has been dedicated to designing and implementing constraint-handling techniques. The use of (exterior) penalty functions is one of the most popular methods to deal with constrained search spaces when using EAs. However, other alternative methods have been proposed, such as special encodings and operators, decoders, and the use of multiobjective concepts, among others. An efficient and adequate constraint-handling technique is a key element in the design of competitive evolutionary algorithms to solve complex optimization problems. In this way, this subject deserves special research efforts. After a successful special session on constraint-handling techniques used in evolutionary algorithms within the Congress on Evolutionary Computation (CEC) in 2007, and motivated by the kind invitation made by Dr. Janusz Kacprzyk, I decided to edit a book with the aim of putting together recent studies on constrained numerical optimization using evolutionary algorithms and other bio-inspired approaches. The intended audience for this book comprises graduate students, practitioners and researchers interested in alternative techniques to solve numerical optimization problems in the presence of constraints.
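The exterior-penalty idea mentioned above can be sketched in a few lines. The test problem, penalty factor and evolution-strategy parameters here are all illustrative assumptions, not taken from the book: infeasible candidates simply receive a fitness surcharge proportional to the square of their constraint violation, so the unconstrained search machinery needs no other change.

```python
import random

# Exterior penalty inside a simple (mu+lambda) evolution strategy.
# Illustrative problem: minimize x^2 + y^2 subject to x + y >= 1
# (the constrained optimum is x = y = 0.5).

def constraint_violation(x, y):
    return max(0.0, 1.0 - (x + y))      # amount by which x + y >= 1 fails

def penalized_fitness(ind, r=1000.0):
    x, y = ind
    return x**2 + y**2 + r * constraint_violation(x, y)**2

def evolve(generations=200, pop_size=30, sigma=0.1, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(-2, 2), rng.uniform(-2, 2)) for _ in range(pop_size)]
    for _ in range(generations):
        # Gaussian mutation produces one child per parent ...
        children = [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma))
                    for x, y in pop]
        # ... and (mu+lambda) selection keeps the best penalized individuals.
        pop = sorted(pop + children, key=penalized_fitness)[:pop_size]
    return pop[0]

best = evolve()
```

The known weakness the book's alternatives address is visible even here: the result depends on the penalty factor r, which must be tuned so that infeasible shortcuts are neither attractive nor so costly that the boundary region becomes unreachable.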
The authors have consolidated their research work in this volume, titled Soft Computing for Data Mining Applications. The monograph gives an insight into research in the fields of data mining in combination with soft computing methodologies. These days, data continues to grow exponentially. Much of the data is implicitly or explicitly imprecise. Database discovery seeks to discover noteworthy, unrecognized associations between data items in an existing database. The potential of discovery comes from the realization that alternate contexts may reveal additional valuable information. The rate at which data is stored is growing at a phenomenal rate. As a result, traditional ad hoc mixtures of statistical techniques and data management tools are no longer adequate for analyzing this vast collection of data. Several domains where large volumes of data are stored in centralized or distributed databases include applications in electronic commerce, bioinformatics, computer security, Web intelligence, intelligent learning database systems, finance, marketing, healthcare, telecommunications, and other fields. Efficient tools and algorithms for knowledge discovery in large data sets have been devised in recent years. These methods exploit the capability of computers to search huge amounts of data in a fast and effective manner. However, the data to be analyzed is imprecise and afflicted with uncertainty. In the case of heterogeneous data sources such as text and video, the data might moreover be ambiguous and partly conflicting. Besides, patterns and relationships of interest are usually approximate. Thus, to make the information mining process more robust, it requires tolerance toward imprecision, uncertainty and exceptions.
In recent years, the issue of linkage in GEAs has garnered greater attention and recognition from researchers. Conventional approaches that rely much on ad hoc tweaking of parameters to control the search by balancing the level of exploitation and exploration are grossly inadequate. As shown in the work reported here, such parameters tweaking based approaches have their limits; they can be easily fooled by cases of triviality or peculiarity of the class of problems that the algorithms are designed to handle. Furthermore, these approaches are usually blind to the interactions between the decision variables, thereby disrupting the partial solutions that are being built up along the way.
This monograph is devoted to theoretical and experimental study of inhibitory decision and association rules. Inhibitory rules contain on the right-hand side a relation of the kind "attribute ≠ value". The use of inhibitory rules instead of deterministic (standard) ones allows us to describe more completely the information encoded in decision or information systems and to design classifiers of high quality. The most important feature of this monograph is that it includes an advanced mathematical analysis of problems on inhibitory rules. We consider algorithms for the construction of inhibitory rules, bounds on the minimal complexity of inhibitory rules, and algorithms for the construction of the set of all minimal inhibitory rules. We also discuss results of experiments with standard and lazy classifiers based on inhibitory rules. These results show that inhibitory decision and association rules can be used in data mining and knowledge discovery, both for knowledge representation and for prediction. Inhibitory rules can also be used in the analysis and design of concurrent systems. The results obtained in the monograph can be useful for researchers in such areas as machine learning, data mining and knowledge discovery, especially for those who are working in rough set theory, test theory, and logical analysis of data (LAD). The monograph can be used in the creation of courses for graduate students and for Ph.D. studies. The authors of this book extend an expression of gratitude to Professor Janusz Kacprzyk, to Dr. Thomas Ditzinger and to the Studies in Computational Intelligence staff at Springer for their support in making this book possible.
Evolutionary algorithms are sophisticated search methods that have been found to be very efficient and effective in solving complex real-world multi-objective problems where conventional optimization tools fail to work well. Despite the tremendous amount of work done in the development of these algorithms in the past decade, many researchers assume that the optimization problems are deterministic and uncertainties are rarely examined. The primary motivation of this book is to provide a comprehensive introduction on the design and application of evolutionary algorithms for multi-objective optimization in the presence of uncertainties. In this book, we hope to expose the readers to a range of optimization issues and concepts, and to encourage a greater degree of appreciation of evolutionary computation techniques and the exploration of new ideas that can better handle uncertainties. "Evolutionary Multi-Objective Optimization in Uncertain Environments: Issues and Algorithms" is intended for a wide readership and will be a valuable reference for engineers, researchers, senior undergraduates and graduate students who are interested in the areas of evolutionary multi-objective optimization and uncertainties.
We describe in this book new methods and applications of hybrid intelligent systems using soft computing techniques. Soft Computing (SC) consists of several intelligent computing paradigms, including fuzzy logic, neural networks, and evolutionary algorithms, which can be used to produce powerful hybrid intelligent systems. The book is organized in five main parts, each containing a group of papers around a similar subject. The first part consists of papers with the main theme of intelligent control, which are basically papers that use hybrid systems to solve particular problems of control. The second part contains papers with the main theme of pattern recognition, which are basically papers using soft computing techniques for achieving pattern recognition in different applications. The third part contains papers with the themes of intelligent agents and social systems, which are papers that apply the ideas of agents and social behavior to solve real-world problems. The fourth part contains papers that deal with the hardware implementation of intelligent systems for solving particular problems. The fifth part contains papers that deal with modeling, simulation and optimization for real-world applications.
Condition modelling and control is a technique used to enable decision-making in manufacturing processes, and is of interest to researchers and practising engineers. Condition Monitoring and Control for Intelligent Manufacturing will be bought by researchers and graduate students in manufacturing and control engineering, as well as practising engineers in industries such as automotive and packaging manufacturing.
Introduction to Hardware-Software Co-Design presents a number of issues of fundamental importance for the design of integrated hardware software products such as embedded, communication, and multimedia systems. This book is a comprehensive introduction to the fundamentals of hardware/software co-design. Co-design is still a new field but one which has substantially matured over the past few years. This book, written by leading international experts, covers all the major topics including: fundamental issues in co-design; hardware/software co-synthesis algorithms; prototyping and emulation; target architectures; compiler techniques; specification and verification; system-level specification. Special chapters describe in detail several leading-edge co-design systems including Cosyma, LYCOS, and Cosmos. Introduction to Hardware-Software Co-Design contains sufficient material for use by teachers and students in an advanced course of hardware/software co-design. It also contains extensive explanation of the fundamental concepts of the subject and the necessary background to bring practitioners up-to-date on this increasingly important topic.
This book contains the extended and revised editions of all the talks of the ninth AACD Workshop, held in Hotel Bachmair, April 11-13, 2000, in Rottach-Egern, Germany. The local organization was managed by Rudolf Koch of Infineon Technologies AG, Munich, Germany. The program consisted of six tutorials per day over three days. Experts in the field presented these tutorials, communicating state-of-the-art information. At the end of the workshop, the audience selects program topics for the following workshop. The program committee, consisting of Johan Huijsing of Delft University of Technology, Willy Sansen of Katholieke Universiteit Leuven and Rudy van de Plassche of Broadcom Netherlands BV, Bunnik, elaborates the selected topics into a three-day program and selects experts in the field for presentation. Each AACD Workshop has given rise to the publication of a book by Kluwer entitled "Analog Circuit Design." The series of nine books in a row provides valuable information and good overviews of all analog circuit techniques concerning design, CAD, simulation and device modeling. These books can be seen as a reference for those people involved in analog and mixed-signal design. The aim of the workshop is to brainstorm on new and valuable design ideas in the area of analog circuit design. It is the hope of the program committee that this ninth book continues the tradition of emerging contributions to the design of analog and mixed-signal systems in Europe and the rest of the world.
As chip size and complexity continue to grow exponentially, the challenges of functional verification are becoming a critical issue in the electronics industry. It is now commonly heard that logical errors missed during functional verification are the most common cause of chip re-spins, and that the costs associated with functional verification now outweigh the costs of chip design. To cope with these challenges, engineers are increasingly relying on new design and verification methodologies and languages. Transaction-based design and verification, constrained random stimulus generation, functional coverage analysis, and assertion-based verification are all techniques that advanced design and verification teams routinely use today. Engineers are also increasingly turning to design and verification models based on C/C++ and SystemC in order to build more abstract, higher-performance hardware and software models and to escape the limitations of RTL HDLs. This new book, Advanced Verification Techniques, provides specific guidance for these advanced verification techniques. The book includes realistic examples and shows how SystemC and SCV can be applied to a variety of advanced design and verification tasks.
Sigma delta modulation has become a very useful and widely applied technique for high performance Analog-to-Digital (A/D) conversion of narrow band signals. Through the use of oversampling and negative feedback, the quantization errors of a coarse quantizer are suppressed in a narrow signal band in the output of the modulator. Bandpass sigma delta modulation is well suited for A/D conversion of narrow band signals modulated on a carrier, as occurs in communication systems such as AM/FM receivers and mobile phones. Due to the nonlinearity of the quantizer in the feedback loop, a sigma delta modulator may exhibit input signal dependent stability properties. The same combination of the nonlinearity and the feedback loop complicates the stability analysis. In Bandpass Sigma Delta Modulators, the describing function method is used to analyze the stability of the sigma delta modulator. The linear gain model commonly used for the quantizer fails to predict small signal stability properties and idle patterns accurately. In Bandpass Sigma Delta Modulators an improved model for the quantizer is introduced, extending the linear gain model with a phase shift. Analysis shows that the phase shift of a sampled quantizer is in fact a phase uncertainty. Stability analysis of sigma delta modulators using the extended model allows accurate prediction of idle patterns and calculation of small-signal stability boundaries for loop filter parameters. A simplified rule of thumb is derived and applied to bandpass sigma delta modulators. The stability properties have a considerable impact on the design of single-loop, one-bit, high-order continuous-time bandpass sigma delta modulators. The continuous-time bandpass loop filter structure should have sufficient degrees of freedom to implement the desired (small-signal stable) sigma delta modulator behavior. 
Bandpass Sigma Delta Modulators will be of interest to practicing engineers and researchers in the areas of mixed-signal and analog integrated circuit design.
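As a rough illustration of the oversampling-and-feedback principle described above, the following sketch simulates a first-order, discrete-time, lowpass sigma-delta modulator. This is a deliberate simplification, not the book's subject matter (which is high-order, continuous-time, bandpass designs), and all signal parameters are illustrative assumptions.

```python
import math

def sigma_delta(x):
    """Integrator followed by a 1-bit quantizer, inside a feedback loop."""
    integ, out = 0.0, []
    for sample in x:
        integ += sample - (out[-1] if out else 0.0)  # subtract fed-back output
        out.append(1.0 if integ >= 0 else -1.0)      # coarse 1-bit quantizer
    return out

# A heavily oversampled sinusoid: 4 cycles across 4096 samples.
n = 4096
x = [0.5 * math.sin(2 * math.pi * 4 * i / n) for i in range(n)]
bits = sigma_delta(x)

# Crude decimation filter: a 64-tap moving average of the +/-1 stream
# recovers the input, because the loop pushed the quantization error
# out of the (low-frequency) signal band.
window = 64
recon = [sum(bits[i:i + window]) / window for i in range(n - window)]
err = max(abs(r - xi) for r, xi in zip(recon, x[window // 2:]))
```

Even though each output sample carries only one bit, the locally averaged stream tracks the input closely; it is this noise-shaping behaviour, and the stability limits it runs into in the nonlinear feedback loop, that the book analyses for the bandpass case.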
This book is the first in a series on novel low-power design architectures, methods and design practices. It results from a large European project started in 1997, whose goal is to promote the further development and the faster and wider industrial use of advanced design methods for reducing the power consumption of electronic systems. Low-power design became crucial with the wide spread of portable information and communication terminals, where a small battery has to last for a long period. High-performance electronics, in addition, suffers from a permanent increase of the dissipated power per square millimetre of silicon, due to increasing clock rates, which causes cooling and reliability problems or otherwise limits the performance. The European Union's Information Technologies Programme 'Esprit' did therefore launch a 'Pilot action for Low Power Design', which eventually grew to 19 R&D projects and one coordination project, with an overall budget of 14 million Euro. It is meanwhile known as the European Low Power Initiative for Electronic System Design (ESD-LPD) and will be completed by the end of 2001. It involves 30 major European companies and 20 well-known institutes. The R&D projects aim to develop or demonstrate new design methods for power reduction, while the coordination project takes care that the methods, experiences and results are properly documented and publicised.
Verification is the most time-consuming task in the integrated circuit design process. The increasing similarity between implementation verification and the ever-needed task of providing vectors for manufacturing fault testing is tempting many professionals to combine verification and testing efforts.
Hardware Design and Petri Nets presents a summary of the state of the art in the applications of Petri nets to designing digital systems and circuits. The area of hardware design has traditionally been a fertile field for research in concurrency and Petri nets. Many new ideas about modelling and analysis of concurrent systems, and Petri nets in particular, originated in theory of asynchronous digital circuits. Similarly, the theory and practice of digital circuit design have always recognized Petri nets as a powerful and easy-to-understand modelling tool. The ever-growing demand in the electronic industry for design automation to build various types of computer-based systems creates many opportunities for Petri nets to establish their role of a formal backbone in future tools for constructing systems that are increasingly becoming distributed, concurrent and asynchronous. Petri nets have already proved very effective in supporting algorithms for solving key problems in synthesis of hardware control circuits. However, since the front end to any realistic design flow in the future is likely to rely on more pragmatic Hardware Description Languages (HDLs), such as VHDL and Verilog, it is crucial that Petri nets are well interfaced to such languages. Hardware Design and Petri Nets is divided into five parts, which cover aspects of behavioral modelling, analysis and verification, synthesis from Petri nets and STGs, design environments based on high-level Petri nets and HDLs, and finally performance analysis using Petri nets. Hardware Design and Petri Nets serves as an excellent reference source and may be used as a text for advanced courses on the subject.
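As a taste of the modelling style the book builds on, here is a minimal sketch of the Petri-net firing rule (the handshake example and all naming are illustrative assumptions, not drawn from the book): a transition is enabled when every input place holds a token, and firing consumes those tokens and marks the output places.

```python
# A transition is a pair (input places, output places); a marking maps
# place names to token counts.

def enabled(marking, transition):
    pre, _ = transition
    return all(marking.get(p, 0) >= 1 for p in pre)

def fire(marking, transition):
    pre, post = transition
    m = dict(marking)
    for p in pre:
        m[p] -= 1          # consume one token from each input place
    for p in post:
        m[p] = m.get(p, 0) + 1   # produce one token in each output place
    return m

# A tiny asynchronous handshake modelled as two transitions.
req = (["idle"], ["waiting"])          # request: idle -> waiting
ack = (["waiting"], ["idle", "done"])  # acknowledge: waiting -> idle + done

m0 = {"idle": 1}
m1 = fire(m0, req)
m2 = fire(m1, ack)
```

Because `req` is disabled in `m1`, the model enforces the alternation of request and acknowledge without any notion of a global clock, which is exactly why Petri nets suit asynchronous circuit design.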
A major advantage of a direct digital synthesizer is that its output frequency, phase and amplitude can be precisely and rapidly manipulated under digital processor control. This book explores possible applications of direct digital synthesizers in radio communication systems.
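The phase-accumulator core behind that digital agility can be sketched briefly (the clock rate, accumulator width and tuning word below are illustrative assumptions): the output frequency is set purely by a digital tuning word, so it can be changed instantly and with continuous phase.

```python
import math

N = 32                      # phase accumulator width in bits (assumed)
f_clk = 100e6               # reference clock, Hz (assumed)
f_out = 1e6                 # desired output frequency, Hz (assumed)
ftw = round(f_out * 2**N / f_clk)   # frequency tuning word: f_out = ftw*f_clk/2^N

def dds_samples(num, phase_offset=0):
    acc = phase_offset
    for _ in range(num):
        # In hardware the top accumulator bits index a sine lookup table;
        # here the sine is computed directly from the full phase.
        yield math.sin(2 * math.pi * acc / 2**N)
        acc = (acc + ftw) & (2**N - 1)   # modulo wrap keeps phase continuous

samples = list(dds_samples(200))
```

Retuning means writing a new `ftw`; adding to `phase_offset` shifts phase; scaling the lookup output sets amplitude. All three are plain register operations, which is the precision and speed advantage the blurb refers to.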
With the advent of portable and autonomous computing systems, power consumption has emerged as a focal point in many research projects, commercial systems and DoD platforms. One current research initiative which drew much attention to this area is the Power Aware Computing and Communications (PAC/C) program sponsored by DARPA. Many of the chapters in this book include results from work that has been supported by the PAC/C program. The performance of computer systems has been tremendously improving while the size and weight of such systems has been constantly shrinking. The capacities of batteries relative to their sizes and weights have also been improving, but at a rate which is much slower than the rate of improvement in computer performance and the rate of shrinking in computer sizes. The relation between the power consumption of a computer system and its performance and size is a complex one, very much dependent on the specific system and the technology used to build that system. We do not need a complex argument, however, to be convinced that energy and power, which is the rate of energy consumption, are becoming critical components in computer systems in general, and portable and autonomous systems in particular. Most of the early research on power consumption in computer systems addressed the issue of minimizing power in a given platform, which usually translates into minimizing energy consumption, and thus longer battery life.
In high-speed communications and signal processing applications, random electrical noise that emanates from devices has a direct impact on critical high-level specifications, for instance, system bit error rate or signal-to-noise ratio. Hence, predicting noise in RF systems at the design stage is extremely important. Additionally, with the growing complexity of modern RF systems, a flat transistor-level noise analysis for the entire system is becoming increasingly difficult. Hence accurate modelling at the component level and behavioural-level simulation techniques are also becoming increasingly important. In this book, we concentrate on developing noise simulation techniques for RF circuits.
Mixed Reality is moving out of the research labs into our daily lives. It plays an increasing role in architecture, design and construction. The combination of digital content with reality creates an exciting synergy that sets out to enhance engagement within architectural design and construction. State-of-the-art research projects on theories and applications within Mixed Reality are presented by leading researchers, covering topics in architecture, design collaboration, construction and education. They discuss current projects and offer insight into the next wave of Mixed Reality possibilities.