This proceedings volume gathers a selection of outstanding research papers presented at the third Conference on Isogeometric Analysis and Applications, held in Delft, The Netherlands, in April 2018. This conference series, previously held in Linz, Austria, in 2012 and Annweiler am Trifels, Germany, in 2014, has created an international forum for interaction between scientists and practitioners working in this rapidly developing field. Isogeometric analysis is a groundbreaking computational approach that aims to bridge the gap between numerical analysis and computational geometry modeling by integrating the finite element method and related numerical simulation techniques into the computer-aided design workflow, and vice versa. The methodology has matured over the last decade, both in its theoretical understanding and mathematical foundations and in the robustness and efficiency of its practical implementations. This development has enabled scientists and practitioners to tackle challenging new applications at the frontiers of research in science and engineering, and has attracted early industrial adopters for this novel computer-aided design and engineering technology. The IGAA 2018 conference brought together experts on isogeometric analysis theory and applications to share their insights into challenging industrial applications and to discuss the latest developments, as well as the directions of future research and development required to make isogeometric analysis an established mainstream technology.
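At the core of the approach is the use of a single spline basis for both the CAD geometry and the analysis. As a hedged illustration (not taken from the book), the minimal Python sketch below evaluates B-spline basis functions with the Cox-de Boor recursion; the knot vector and degree are illustrative assumptions.

```python
# Minimal sketch: Cox-de Boor recursion for B-spline basis functions,
# the shared geometry/analysis basis at the heart of isogeometric analysis.
# The knot vector and degree below are illustrative assumptions.

def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left, right = 0.0, 0.0
    denom_l = knots[i + p] - knots[i]
    if denom_l > 0.0:
        left = (u - knots[i]) / denom_l * bspline_basis(i, p - 1, u, knots)
    denom_r = knots[i + p + 1] - knots[i + 1]
    if denom_r > 0.0:
        right = (knots[i + p + 1] - u) / denom_r * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

# Quadratic basis on an open knot vector; the functions sum to one
# (partition of unity), a property the analysis relies on.
knots = [0, 0, 0, 1, 2, 3, 3, 3]
u = 1.5
values = [bspline_basis(i, 2, u, knots) for i in range(len(knots) - 3)]
print(values, sum(values))  # sum ~ 1.0
```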
This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective in enhancing the reliability of various circuit units. The authors provide readers with techniques for state-of-the-art and future technologies, ranging from technology modeling and fault detection and analysis to circuit hardening and reliability management.
This book focuses on the development of 3D design and implementation methodologies for Tree-based FPGA architecture. It also stresses the need for new and augmented 3D CAD tools to support design for 3D and the manufacture of high-performance 3D integrated circuits and reconfigurable FPGA-based systems. The book was written as a text covering the foundations of 3D integrated system design and FPGA architecture design, for use in an elective or core graduate-level course in Electrical Engineering, Computer Engineering and doctoral research programs. No previous background in 3D integration is required, although a fundamental understanding of 2D CMOS VLSI design is. It is assumed that the reader has taken the core curriculum in Electrical Engineering or Computer Engineering, with courses like CMOS VLSI design, Digital System Design and Microelectronic Circuits being the most important. The book is accessible for self-study by senior students and professionals alike.
Digital Video and HD: Algorithms and Interfaces provides a one-stop shop for the theory and engineering of digital video systems. Equally accessible to video engineers and those working in computer graphics, Charles Poynton's revision to his classic text covers emergent compression systems, including H.264 and VP8/WebM, and augments detailed information on JPEG, DVC, and MPEG-2 systems. This edition also introduces the technical aspects of file-based workflows and outlines the emerging domain of metadata, placing it in the context of digital video processing.
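One of the foundational relations treated in this area is the formation of luma as a weighted sum of gamma-corrected (primed) RGB components. As a brief illustration, the Python sketch below computes Rec. 709 luma; the inputs are assumed to be normalized to [0, 1].

```python
# Illustrative sketch: Rec. 709 luma from gamma-corrected (primed) RGB.
# Components are assumed normalized to [0, 1].

def rec709_luma(r_prime, g_prime, b_prime):
    """Luma (Y') as the Rec. 709 weighted sum of nonlinear RGB."""
    return 0.2126 * r_prime + 0.7152 * g_prime + 0.0722 * b_prime

print(rec709_luma(1.0, 1.0, 1.0))  # reference white -> 1.0
print(rec709_luma(0.0, 1.0, 0.0))  # green carries most of the luma: 0.7152
```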
Embedded software is ubiquitous today. There are millions of lines of embedded code in smart phones, and even more in systems responsible for automotive control, avionics control, weapons control and space missions. Some of these are safety-critical systems whose correctness, timely response, and reliability are of paramount importance. These requirements pose new challenges to system designers and necessitate that a proper design science, based on "constructive correctness", be developed. Correct-by-construction design and synthesis of embedded software is done in such a way that post-development verification is minimized and correct operation of embedded systems is maximized. This book presents the state of the art in the design of safety-critical, embedded software. It introduces readers to three major approaches to specification-driven embedded software synthesis/construction: approaches based on synchronous programming, approaches based on models of computation, and an approach based on concurrent programming with a co-design focused language. It is an invaluable reference for practitioners and researchers concerned with improving the product development life-cycle.
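In the synchronous programming style mentioned above, a program advances in discrete logical ticks, and within each tick the outputs are a pure function of the inputs and the previous state. The toy "alarm latch" below is a hypothetical Python sketch of that execution model, not an example from the book (real systems use synchronous languages such as Esterel or Lustre).

```python
# Minimal sketch of the synchronous-reactive execution model: the
# controller advances in discrete logical ticks, and within a tick the
# outputs are a pure function of the inputs and the previous state.
# This toy "alarm latch" is a hypothetical example.

def tick(state, sensor_over_limit, reset):
    """One synchronous step: returns (new_state, alarm_output)."""
    if reset:
        new_state = False
    else:
        new_state = state or sensor_over_limit  # latch the fault
    return new_state, new_state

state = False
for inputs in [(False, False), (True, False), (False, False), (False, True)]:
    state, alarm = tick(state, *inputs)
    print(alarm)  # False, True, True, False
```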
The primary objective of this book is to provide an easy approach to the basic principles of Engineering Drawing, which is one of the core subjects for undergraduate students in all branches of engineering. Further, it offers comprehensive coverage of the topics required for a first course in this subject, based on the author's years of experience in teaching it. Emphasis is placed on the precise and logical presentation of the concepts and principles that are essential to understanding the subject. The methods presented help students to grasp the fundamentals more easily. In addition, the book highlights essential problem-solving strategies and features both solved examples and multiple-choice questions to test readers' comprehension.
This book gives an introduction to the finite element method as a general computational method for solving partial differential equations approximately. Our approach is mathematical in nature, with a strong focus on the underlying mathematical principles, such as approximation properties of piecewise polynomial spaces and variational formulations of partial differential equations, but with a minimum level of advanced mathematical machinery from functional analysis and partial differential equations. In principle, the material should be accessible to students with only knowledge of calculus of several variables, basic partial differential equations, and linear algebra, as the necessary concepts from more advanced analysis are introduced when needed. Throughout the text we emphasize implementation of the involved algorithms, and have therefore mixed mathematical theory with concrete computer code using the numerical software MATLAB and its PDE Toolbox. We have also had the ambition to cover some of the most important applications of finite elements and the basic finite element methods developed for those applications, including diffusion and transport phenomena, solid and fluid mechanics, and also electromagnetics.
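To give a flavor of the implementation emphasis described above, here is a minimal sketch (in Python rather than the book's MATLAB) of piecewise-linear finite elements for the model problem -u'' = 1 on (0, 1) with homogeneous Dirichlet boundary conditions; the exact solution u(x) = x(1 - x)/2 is reproduced at the nodes by this discretization.

```python
import numpy as np

# Minimal sketch: piecewise-linear finite elements for -u'' = 1 on (0, 1)
# with u(0) = u(1) = 0. Exact solution: u(x) = x(1 - x)/2.

n = 10                          # number of elements
x = np.linspace(0.0, 1.0, n + 1)
h = np.diff(x)

A = np.zeros((n + 1, n + 1))    # global stiffness matrix
b = np.zeros(n + 1)             # global load vector
for e in range(n):              # assemble element contributions
    i, j = e, e + 1
    k = 1.0 / h[e] * np.array([[1.0, -1.0], [-1.0, 1.0]])
    A[np.ix_([i, j], [i, j])] += k
    b[[i, j]] += h[e] / 2.0     # element load for f = 1

A[0, :], A[-1, :] = 0.0, 0.0    # impose u(0) = u(1) = 0
A[0, 0] = A[-1, -1] = 1.0
b[0] = b[-1] = 0.0

u = np.linalg.solve(A, b)
# For this 1D model problem the P1 solution is exact at the nodes,
# so the printed error is at machine precision.
print(np.max(np.abs(u - x * (1.0 - x) / 2.0)))
```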
Here's the only book to comprehensively address integrated optics from both the theoretical and practical modeling standpoints, revealing crucial design methods that decrease your overall device modeling effort.
Computer technology has revolutionized many aspects of building design, such as drafting, management, construction - even building with robots. This revolution has expanded into the field of design creativity. This book presents an up-to-date, comprehensive picture of research advances in the fast-growing field of informatics applied to the conceptual stages in the generation of artifacts - in particular, buildings. It addresses the question of how far, and in what ways, creative design can be intelligently automated. Among the topics covered are: the use of precedents; the relations between case-based, rule-based, and principle-based architectural design reasoning; product typology; artifact thesauruses; the inputting and retrieval of architectural knowledge; the visual representation and understanding of existing or projected built forms; empirical and analytical models of the design process and the design product; desktop design toolkits; grammars of shape and of function; multiple-perspective building data structures; design as a multi-agent collaborative process; the integration of heterogeneous engineering information; and foundations for a systematic approach to the development of knowledge-based design systems. The papers provide a link between basic and practical issues: fundamental questions in the theory of artifact design, artificial intelligence, and the cognitive science of imagination and reasoning; problems in the computerization of building data and design facilities; and the practical tasks of building conception, construction and evaluation. The automation of creative design is itself considered as an engineering design problem. The implications of current and future work for architectural education and research in architectural history, as well as for computer-integrated construction and the management of engineering projects, are also considered.
The book provides a comprehensive description and implementation methodology for the Philips/NXP Aethereal/aelite Network-on-Chip (NoC). The presentation offers a systems perspective, starting from the system requirements and deriving and describing the resulting hardware architectures, embedded software, and accompanying design flow. Readers get an in-depth view of the interconnect requirements, centered not only on performance and scalability but also on the multi-faceted, application-driven requirements, in particular composability and predictability. The book shows how these qualitative requirements are implemented in a state-of-the-art on-chip interconnect, and presents the realistic, quantitative costs.
This monograph presents the latest developments and applications of computational tools related to the biosciences and medical engineering. Computational tools such as finite element methods, computer-aided design and optimization, as well as visualization techniques such as computed axial tomography, open completely new research fields through a closer joining of the engineering and bio/medical areas. Nevertheless, hurdles remain, since the two directions rest on quite different educational traditions, and even the "language" often differs from discipline to discipline. This monograph reports the results of different multi-disciplinary research projects, for example from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational methods often necessitates mathematical and experimental methods.
This book is about formal verification, that is, the use of mathematical reasoning to ensure correct execution of computing systems. With the increasing use of computing systems in safety-critical and security-critical applications, it is becoming increasingly important for our well-being to ensure that those systems execute correctly. Over the last decade, formal verification has made significant headway in the analysis of industrial systems, particularly in the realm of verification of hardware. A key advantage of formal verification is that it provides a mathematical guarantee of their correctness (up to the accuracy of formal models and correctness of reasoning tools). In the process, the analysis can expose subtle design errors. Formal verification is particularly effective in finding corner-case bugs that are difficult to detect through traditional simulation and testing. Nevertheless, and in spite of its promise, the application of formal verification has so far been limited in industrial design validation tool flows. The difficulties in its large-scale adoption include the following: (1) deductive verification using theorem provers often involves excessive and prohibitive manual effort, and (2) automated decision procedures (e.g., model checking) can quickly hit the bounds of available time and memory. This book presents recent advances in formal verification techniques and discusses the applicability of the techniques in ensuring the reliability of large-scale systems. We deal with the verification of a range of computing systems, from sequential programs to concurrent protocols and pipelined machines.
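As a hedged illustration of the automated decision procedures mentioned above, the Python sketch below performs explicit-state reachability checking, the simplest flavor of model checking, on a toy transition system; the system and the safety property are hypothetical examples, not taken from the book.

```python
from collections import deque

# Illustrative sketch of explicit-state model checking: breadth-first
# reachability over a toy transition system, checking that a "bad" state
# is never reached. The mod-4 counter below (which must never show 3)
# is a hypothetical example.

def successors(state):
    """Transition relation: increment modulo 4, or reset to zero."""
    return {(state + 1) % 4, 0}

def check_safety(initial, is_bad):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if is_bad(s):
            return False  # counterexample found
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True  # bad state is unreachable

print(check_safety(0, lambda s: s == 3))  # False: 0 -> 1 -> 2 -> 3
```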
This book contains selected papers of the 11th OpenFOAM® Workshop that was held in Guimaraes, Portugal, June 26-30, 2016. The 11th OpenFOAM® Workshop had more than 140 technical/scientific presentations and 30 courses, and was attended by circa 300 individuals, representing 180 institutions and 30 countries from all continents. The OpenFOAM® Workshop provided a forum for researchers, industrial users, software developers, consultants and academics working with OpenFOAM® technology. The central part of the Workshop was the two-day conference, where presentations and posters on industrial applications and academic research were shown. OpenFOAM® (Open Source Field Operation and Manipulation) is a free, open-source computational toolbox that has a large user base across most areas of engineering and science, in both commercial and academic organizations. As a technology, OpenFOAM® provides an extensive range of features to solve anything from complex fluid flows involving chemical reactions, turbulence and heat transfer, to solid dynamics and electromagnetics, among several others. Additionally, the OpenFOAM® technology offers complete freedom to customize and extend its functionality.
Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level have become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, and statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits. Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and clock networks, and analog/mixed-signal circuits; Helps chip designers understand the potential and limitations of their design tools, improving their design productivity; Presents analysis of each algorithm with practical applications in the context of real circuit design; Includes numerical examples for the quantitative analysis and evaluation of algorithms presented.
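A useful mental model for this material is the Monte Carlo baseline that the book's statistical methods aim to outperform in efficiency. The hedged Python sketch below samples spatially correlated parameter variations for two nearby gates and estimates path-delay statistics; all numbers are illustrative assumptions, not data from the book.

```python
import numpy as np

# Hedged sketch of a Monte Carlo baseline for variation-aware timing:
# sample spatially correlated delay variations for two nearby gates
# and estimate path-delay statistics. All numbers are illustrative.

rng = np.random.default_rng(0)
mean_delay = np.array([10.0, 12.0])      # nominal gate delays (ps)
sigma = np.array([1.0, 1.5])             # per-gate standard deviations
rho = 0.8                                # spatial correlation of nearby gates
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])

samples = rng.multivariate_normal(mean_delay, cov, size=100_000)
path_delay = samples.sum(axis=1)         # two gates in series

print(path_delay.mean())                 # ~22 ps
print(path_delay.std())                  # > sqrt(1 + 2.25) because rho > 0
print(np.quantile(path_delay, 0.997))    # a "3-sigma" style timing corner
```

Ignoring the positive spatial correlation here would underestimate the delay spread, which is exactly the kind of error the correlation-aware methods in the book are designed to avoid.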
This book contains the extended papers presented at the 2nd Workshop on Supervised and Unsupervised Ensemble Methods and their Applications (SUEMA), held on 21-22 July 2008 in Patras, Greece, in conjunction with the 18th European Conference on Artificial Intelligence (ECAI 2008). This workshop was a successor of the smaller event held in 2007 in conjunction with the 3rd Iberian Conference on Pattern Recognition and Image Analysis, Girona, Spain. The success of that event, as well as the publication of the workshop papers in the edited book "Supervised and Unsupervised Ensemble Methods and their Applications", published by Springer-Verlag in the Studies in Computational Intelligence series, volume 126, encouraged us to continue a good tradition. The scope of both SUEMA workshops (hence, of this book as well) is the application of theoretical ideas in the field of ensembles of classification and clustering algorithms to real-life problems in science and industry. Ensembles, which combine the class or cluster membership predictions of a number of algorithms to produce a single outcome value, have already proved to be a viable alternative to a single best algorithm in various practical tasks under different scenarios, from bioinformatics to biometrics, from medicine to network security. The ensemble approach is motivated by the famous "no free lunch" theorem, stating that there is no absolutely best algorithm to solve all problems. Although ensembles cannot be considered an absolute remedy for the deficiencies of a single algorithm, it is widely believed that ensembles provide a better answer to the "no free lunch" theorem than a single best algorithm. Statistical, algorithmic, representational, computational and practical reasons can explain the success of ensemble methods.
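The core combining rule described above is easy to state concretely. The following minimal Python sketch combines the class predictions of several deliberately simple, hypothetical classifiers by majority vote, so that uncorrelated individual errors tend to cancel out.

```python
from collections import Counter

# Minimal sketch of the core ensemble idea: combine per-classifier
# class predictions by majority vote. The three "classifiers" below
# are hypothetical, hard-coded prediction lists.

def majority_vote(predictions):
    """predictions: list of per-classifier labels for one sample."""
    return Counter(predictions).most_common(1)[0][0]

# Three weak classifiers, five samples; each classifier errs once,
# but never on the same sample, so the ensemble is perfect here.
clf_a = ["spam", "ham", "spam", "spam", "ham"]
clf_b = ["spam", "ham", "ham",  "spam", "ham"]
clf_c = ["ham",  "ham", "spam", "spam", "ham"]

for votes in zip(clf_a, clf_b, clf_c):
    print(majority_vote(list(votes)))
# spam, ham, spam, spam, ham
```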
Old age is currently the greatest risk factor for developing dementia. Since older people make up a larger portion of the population than ever before, the resulting increase in the incidence of dementia presents a major challenge for society. Dementia is complex and multifaceted and impacts not only the person with the diagnosis but also those caring for them and society as a whole. Human-Computer Interaction (HCI) design and development are pivotal in enabling people with dementia to live well and be supported in the communities around them. HCI is increasingly addressing the need for inclusivity and accessibility in the design and development of new technologies, interfaces, systems, services, and tools. Using interdisciplinary approaches, HCI engages with the complexities and 'messiness' of real-world design spaces to provide novel perspectives and new ways of addressing the challenge of dementia and multi-stakeholder needs. HCI and Design in the Context of Dementia brings together the work of international experts, designers and researchers working across disciplines. It provides methodologies, methods and frameworks, approaches to participatory engagement, and case studies showing how technology can impact the lives of people living with dementia and those around them. It includes examples of how to conduct dementia research and design in context in the field of HCI, ethically and effectively, and shows how these issues transcend the design space of dementia to inform HCI design and technology development more broadly. The book is valuable for, and aimed at, designers, researchers, scholars and caregivers who work with vulnerable groups like people with dementia, and those directly impacted.
To satisfy the higher requirements of digitally converged embedded systems, this book describes heterogeneous multicore technology that uses various kinds of low-power embedded processor cores on a single chip. With this technology, heterogeneous parallelism can be implemented on an SoC, and greater flexibility and superior performance per watt can then be achieved. The book defines the heterogeneous multicore architecture and explains in detail several embedded processor cores, including CPU cores and special-purpose processor cores that achieve a high degree of arithmetic-level parallelism. The authors developed three multicore chips (called RP-1, RP-2, and RP-X) according to the defined architecture with the introduced processor cores. The chip implementations, software environments, and applications running on the chips are also explained in the book. Provides readers an overview and practical discussion of heterogeneous multicore technologies from both a hardware and software point of view; Discusses a new, high-performance and energy-efficient approach to designing SoCs for digitally converged, embedded systems; Covers hardware issues such as architecture and chip implementation, as well as software issues such as compilers, operating systems, and application programs; Describes three chips developed according to the defined heterogeneous multicore architecture, including chip implementations, software environments, and working applications.
This book gives Abaqus users who make use of finite-element models in academic or practitioner-based research the in-depth program knowledge that allows them to debug a structural analysis model. The book provides many methods and guidelines for different analysis types and modes that will help readers to solve problems that can arise with Abaqus if a structural model fails to converge to a solution. The use of Abaqus affords a general checklist approach to debugging analysis models, which can also be applied to structural analysis. The author uses step-by-step methods and detailed explanations of special features in order to identify the solutions to a variety of problems with finite-element models. The book promotes:
* a diagnostic mode of thinking concerning error messages;
* better material definition and the writing of user material subroutines;
* work with the Abaqus mesher and best practice in doing so;
* the writing of user element subroutines and contact features with convergence issues; and
* consideration of hardware and software issues and a Windows HPC cluster solution.
The methods and information provided facilitate job diagnostics and help to obtain converged solutions for finite-element models regarding structural component assemblies in static or dynamic analysis. The troubleshooting advice ensures that these solutions are both high-quality and cost-effective according to practical experience. The book offers an in-depth guide for students learning about Abaqus, as each problem and solution is complemented by examples and straightforward explanations. It is also useful for academics and structural engineers wishing to debug Abaqus models on the basis of error and warning messages that arise during finite-element modelling processing.
More and more information, not only audio and video but also a range of other information types, is generated, processed and used by machines today, even though the end user may be a human. The result over the past 15 years has been a substantial increase in the types of information and a change in the ways humans generate, classify, store, search, access and consume information. Conversion of information to digital form is a prerequisite for this enhanced machine role, but it must be done with requirements such as compactness, fidelity and interpretability in mind. This book presents new ways of dealing with digital information and new types of digital information underpinning the evolution of society and business.
Artificial Intelligence (AI) is penetrating all sciences as a multidisciplinary approach. However, bringing AI theory, including computer vision and computer audition, into the urban intellectual space has always been difficult for architects and urban planners. This book addresses this challenge through a conceptual framework that merges computer vision and audition into urban studies, based on a series of workshops called Remorph, conducted by the Tehran Urban Innovation Center (TUIC).
The information and communication technology (ICT) industry is said to account for 2% of worldwide carbon emissions - a fraction that continues to grow with the relentless push for more and more sophisticated computing equipment, communications infrastructure, and mobile devices. While computers evolved in the direction of higher and higher performance for most of the latter half of the 20th century, the late 1990s and early 2000s saw a new emerging fundamental concern that has begun to shape our day-to-day thinking in system design - power dissipation. As we elaborate in Chapter 1, a variety of factors colluded to raise power efficiency as a first-class design concern in the designer's mind, with profound consequences all over the field: semiconductor process design, circuit design, design automation tools, system and application software, all the way to large data centers. Power-efficient System Design originated from a desire to capture and highlight the exciting developments in the rapidly evolving field of power and energy optimization in electronic and computer-based systems. Tremendous progress has been made in the last two decades, and the topic continues to be a fascinating research area. To develop a clearer focus, we have concentrated on the relatively higher level of design abstraction that is loosely called the system level. In addition to the extensive coverage of traditional power reduction targets such as the CPU and memory, the book is distinguished by detailed coverage of relatively modern power optimization ideas focusing on components such as compilers, operating systems, servers, data centers, and graphics processors.
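Much of the system-level reasoning sketched above starts from the standard CMOS dynamic-power relation P ≈ αCV²f. As a back-of-envelope illustration (all component values are assumptions, not from the book), the Python sketch below compares running a fixed workload at full speed against voltage/frequency scaling to just meet the deadline.

```python
# Back-of-envelope sketch of a system-level power trade-off using the
# standard CMOS dynamic-power relation P ~ alpha * C * V^2 * f.
# All component values are illustrative assumptions.

def dynamic_power(alpha, c_eff, vdd, freq):
    """Switching power in watts (alpha: activity factor, c_eff in farads)."""
    return alpha * c_eff * vdd**2 * freq

work_cycles = 1e9                      # cycles needed for the task
alpha, c_eff = 0.15, 1e-9

# Option A: full speed (1 GHz at 1.0 V), then idle.
p_fast = dynamic_power(alpha, c_eff, 1.0, 1e9)
e_fast = p_fast * (work_cycles / 1e9)  # energy = power * time

# Option B: DVFS to half speed (500 MHz at 0.7 V), just meeting the deadline.
p_slow = dynamic_power(alpha, c_eff, 0.7, 5e8)
e_slow = p_slow * (work_cycles / 5e8)

print(e_fast, e_slow)  # scaling wins: energy per cycle scales with V^2
```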
The papers in this volume represent research and development in the field of artificial intelligence. The volume demonstrates both the breadth and depth of artificial intelligence in design, and points the way forward for our understanding of design as a process and for the development of advanced computer-based tools to aid designers. The papers describe advances in both theory and applications. This volume should be of particular interest to researchers, developers and users of advanced computer systems in design.
Computer-Aided Innovation (CAI) is emerging as a strategic domain of research and application to support enterprises throughout the overall innovation process. The 5.4 Working Group of IFIP aims at defining the scientific foundation of Computer-Aided Innovation systems and at identifying the state of the art and trends of CAI tools and methods. These proceedings derive from the second Topical Session on Computer-Aided Innovation organized within the 20th World Computer Congress of IFIP. The goal of the Topical Session is to provide a survey of existing technologies and research activities in the field and to identify opportunities for integrating CAI with other PLM systems. Reflecting the heterogeneous needs of innovation-related activities, the papers published in this volume are characterized by multidisciplinary content and complementary perspectives and scopes. Such a richness of topics and disciplines will certainly contribute to the promotion of fruitful new collaborations and synergies within the IFIP community. Gaetano Cascini, Florence, April 30th, 2008. The IFIP Topical Session on Computer-Aided Innovation (CAI) is a co-located conference organized under the auspices of the IFIP World Computer Congress (WCC) 2008 in Milano, Italy. Gaetano Cascini, CAI Program Committee Chair, [email protected]
There have been substantial developments in meshfree methods, particle methods, and generalized finite element methods since the mid-1990s. The growing interest in these methods is in part due to the fact that they offer extremely flexible numerical tools and can be interpreted in a number of ways. For instance, meshfree methods can be viewed as a natural extension of classical finite element and finite difference methods to scattered node configurations with no fixed connectivity. Furthermore, meshfree methods have a number of advantageous features that are especially attractive when dealing with multiscale phenomena: a priori knowledge about the solution's particular local behavior can easily be introduced into the meshfree approximation space, and coarse-scale approximations can be seamlessly refined by adding fine-scale information. However, the implementation of meshfree methods and their parallelization also requires special attention, for instance with respect to numerical integration.
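As a hedged illustration of the scattered-node setting described above (not an example from the book), the following Python sketch builds a meshfree radial basis function interpolant on randomly scattered nodes with no mesh connectivity; the node set, test function and shape parameter are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a meshfree approximation: radial basis function (RBF)
# interpolation on scattered nodes, with no mesh connectivity. The node
# set, test function and shape parameter are illustrative assumptions.

rng = np.random.default_rng(1)
nodes = rng.uniform(0.0, 1.0, size=(20, 2))        # scattered nodes
f = lambda p: np.sin(2 * np.pi * p[:, 0]) * p[:, 1]

eps = 3.0                                          # shape parameter
def kernel(r):                                     # Gaussian RBF
    return np.exp(-(eps * r) ** 2)

# Solve for weights so the interpolant matches f at every node.
dists = np.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=-1)
weights = np.linalg.solve(kernel(dists), f(nodes))

def interpolate(points):
    d = np.linalg.norm(points[:, None, :] - nodes[None, :, :], axis=-1)
    return kernel(d) @ weights

test = np.array([[0.3, 0.5]])
print(interpolate(test), f(test))  # interpolant vs. exact value
```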
Technology computer-aided design, or TCAD, is critical to today's semiconductor technology, and anybody working in this industry needs to know something about TCAD. This book is about how to use computer software to virtually manufacture and test semiconductor devices in 3D. It brings to life the topic of semiconductor device physics with a hands-on, tutorial approach that de-emphasizes abstract physics and equations and emphasizes real practice and extensive illustrations. Coverage includes a comprehensive library of devices representing state-of-the-art technology, such as SuperJunction LDMOS and GaN LED devices.