This book gives an introduction to the finite element method as a general computational method for solving partial differential equations approximately. Our approach is mathematical in nature with a strong focus on the underlying mathematical principles, such as approximation properties of piecewise polynomial spaces, and variational formulations of partial differential equations, but with a minimum level of advanced mathematical machinery from functional analysis and partial differential equations. In principle, the material should be accessible to students with only knowledge of calculus of several variables, basic partial differential equations, and linear algebra, as the necessary concepts from more advanced analysis are introduced when needed. Throughout the text we emphasize implementation of the involved algorithms, and have therefore mixed mathematical theory with concrete computer code using the numerical software MATLAB and its PDE-Toolbox. We have also had the ambition to cover some of the most important applications of finite elements and the basic finite element methods developed for those applications, including diffusion and transport phenomena, solid and fluid mechanics, and also electromagnetics.
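To give a flavour of the variational formulations referred to above (an illustrative sketch, not material quoted from the book), the Poisson problem $-\Delta u = f$ on a domain $\Omega$ with $u = 0$ on the boundary is recast in weak form and restricted to a piecewise polynomial subspace $V_h$:

Find $u_h \in V_h$ such that
\[
\int_\Omega \nabla u_h \cdot \nabla v_h \, dx = \int_\Omega f \, v_h \, dx \qquad \text{for all } v_h \in V_h ,
\]
where $V_h \subset H^1_0(\Omega)$ is spanned by continuous piecewise polynomial basis functions on a mesh of $\Omega$. Solving this finite-dimensional problem amounts to assembling and solving a sparse linear system $A\,\xi = b$ for the coefficients of $u_h$.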
Here's the only book to comprehensively address integrated optics from both the theoretical and practical modeling standpoints -- it reveals crucial design methods that decrease your overall device modeling effort.
The primary objective of this book is to provide an easy approach to the basic principles of Engineering Drawing, which is one of the core subjects for undergraduate students in all branches of engineering. Further, it offers comprehensive coverage of topics required for a first course in this subject, based on the author's years of experience in teaching this subject. Emphasis is placed on the precise and logical presentation of the concepts and principles that are essential to understanding the subject. The methods presented help students to grasp the fundamentals more easily. In addition, the book highlights essential problem-solving strategies and features both solved examples and multiple-choice questions to test their comprehension.
Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level has become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, and statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits. Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and clock networks, and analog/mixed-signal circuits; Helps chip designers understand the potential and limitations of their design tools, improving their design productivity; Presents analysis of each algorithm with practical applications in the context of real circuit design; Includes numerical examples for the quantitative analysis and evaluation of algorithms presented.
This book contains selected papers of the 11th OpenFOAM (R) Workshop that was held in Guimaraes, Portugal, June 26 - 30, 2016. The 11th OpenFOAM (R) Workshop had more than 140 technical/scientific presentations and 30 courses, and was attended by circa 300 individuals, representing 180 institutions and 30 countries, from all continents. The OpenFOAM (R) Workshop provided a forum for researchers, industrial users, software developers, consultants and academics working with OpenFOAM (R) technology. The central part of the Workshop was the two-day conference, where presentations and posters on industrial applications and academic research were shown. OpenFOAM (R) (Open Source Field Operation and Manipulation) is a free, open source computational toolbox that has a large user base across most areas of engineering and science, from both commercial and academic organizations. As a technology, OpenFOAM (R) provides an extensive range of features to solve anything from complex fluid flows involving chemical reactions, turbulence and heat transfer, to solid dynamics and electromagnetics, among several others. Additionally, the OpenFOAM technology offers complete freedom to customize and extend its functionalities.
The book provides a comprehensive description and implementation methodology for the Philips/NXP Aethereal/aelite Network-on-Chip (NoC). The presentation offers a systems perspective, starting from the system requirements and deriving and describing the resulting hardware architectures, embedded software, and accompanying design flow. Readers get an in-depth view of the interconnect requirements, centered not only on performance and scalability but also on the multi-faceted, application-driven requirements, in particular composability and predictability. The book shows how these qualitative requirements are implemented in a state-of-the-art on-chip interconnect, and presents the realistic, quantitative costs.
This book contains the extended papers presented at the 2nd Workshop on Supervised and Unsupervised Ensemble Methods and their Applications (SUEMA), held on 21-22 July 2008 in Patras, Greece, in conjunction with the 18th European Conference on Artificial Intelligence (ECAI 2008). This workshop was a successor of the smaller event held in 2007 in conjunction with the 3rd Iberian Conference on Pattern Recognition and Image Analysis, Girona, Spain. The success of that event, as well as the publication of the workshop papers in the edited book "Supervised and Unsupervised Ensemble Methods and their Applications", published by Springer-Verlag in the Studies in Computational Intelligence Series in volume 126, encouraged us to continue a good tradition. The scope of both SUEMA workshops (hence, the book as well) is the application of theoretical ideas in the field of ensembles of classification and clustering algorithms to real-life problems in science and industry. Ensembles, which represent a number of algorithms whose class or cluster membership predictions are combined together to produce a single outcome value, have already proved to be a viable alternative to a single best algorithm in various practical tasks under different scenarios, from bioinformatics to biometrics, from medicine to network security. The ensemble approach is brought to life by the famous "no free lunch" theorem, stating that there is no absolutely best algorithm to solve all problems. Although ensembles cannot be considered an absolute remedy for the deficiency of a single algorithm, it is widely believed that ensembles provide a better answer to the "no free lunch" theorem than a single best algorithm. Statistical, algorithmic, representational, computational and practical reasons can explain the success of ensemble methods.
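As a minimal illustration of the combining step described above (a generic Python sketch, not code from the book), a hard majority vote over the label predictions of several classifiers can be written as:

import numpy as np

def majority_vote(predictions):
    # predictions: array of shape (n_models, n_samples) holding integer class labels.
    predictions = np.asarray(predictions)
    n_classes = predictions.max() + 1
    # Count the votes each class receives for every sample, then pick the winner.
    votes = np.apply_along_axis(
        lambda column: np.bincount(column, minlength=n_classes), 0, predictions)
    return votes.argmax(axis=0)

# Three models predicting labels for three samples:
print(majority_vote([[0, 1, 1], [0, 1, 0], [1, 1, 0]]))  # -> [0 1 0]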
This book is about formal verification, that is, the use of mathematical reasoning to ensure correct execution of computing systems. With the increasing use of computing systems in safety-critical and security-critical applications, it is becoming increasingly important for our well-being to ensure that those systems execute correctly. Over the last decade, formal verification has made significant headway in the analysis of industrial systems, particularly in the realm of verification of hardware. A key advantage of formal verification is that it provides a mathematical guarantee of their correctness (up to the accuracy of formal models and correctness of reasoning tools). In the process, the analysis can expose subtle design errors. Formal verification is particularly effective in finding corner-case bugs that are difficult to detect through traditional simulation and testing. Nevertheless, and in spite of its promise, the application of formal verification has so far been limited in an industrial design validation tool flow. The difficulties in its large-scale adoption include the following: (1) deductive verification using theorem provers often involves excessive and prohibitive manual effort, and (2) automated decision procedures (e.g., model checking) can quickly hit the bounds of available time and memory. This book presents recent advances in formal verification techniques and discusses the applicability of the techniques in ensuring the reliability of large-scale systems. We deal with the verification of a range of computing systems, from sequential programs to concurrent protocols and pipelined machines.
This monograph presents the latest developments and applications of computational tools related to the biosciences and medical engineering. Computational tools such as finite element methods, computer-aided design and optimization, as well as visualization techniques such as computed axial tomography, open completely new research fields with a closer joining of the engineering and bio/medical areas. Nevertheless, there are still hurdles, since the two directions are rooted in quite different educational backgrounds; often even the "language" differs from discipline to discipline. This monograph reports the results of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational methods often necessitates mathematical and experimental methods.
To satisfy the higher requirements of digitally converged embedded systems, this book describes heterogeneous multicore technology that uses various kinds of low-power embedded processor cores on a single chip. With this technology, heterogeneous parallelism can be implemented on an SoC, and greater flexibility and superior performance per watt can then be achieved. This book defines the heterogeneous multicore architecture and explains in detail several embedded processor cores, including CPU cores and special-purpose processor cores that achieve a high degree of arithmetic-level parallelism. The authors developed three multicore chips (called RP-1, RP-2, and RP-X) according to the defined architecture with the introduced processor cores. The chip implementations, software environments, and applications running on the chips are also explained in the book. Provides readers with an overview and practical discussion of heterogeneous multicore technologies from both a hardware and software point of view; Discusses a new, high-performance and energy efficient approach to designing SoCs for digitally converged, embedded systems; Covers hardware issues such as architecture and chip implementation, as well as software issues such as compilers, operating systems, and application programs; Describes three chips developed according to the defined heterogeneous multicore architecture, including chip implementations, software environments, and working applications.
More and more information, not only audio and video but also a range of other information types, is generated, processed and used by machines today, even though the end user may be a human. The result over the past 15 years has been a substantial increase in the types of information and a change in the way humans generate, classify, store, search, access and consume information. Conversion of information to digital form is a prerequisite for this enhanced machine role, but must be done with requirements such as compactness, fidelity and interpretability in mind. This book presents new ways of dealing with digital information and new types of digital information underpinning the evolution of society and business.
Computer-Aided Innovation (CAI) is emerging as a strategic domain of research and application to support enterprises throughout the overall innovation process. The 5.4 Working Group of IFIP aims at defining the scientific foundation of Computer-Aided Innovation systems and at identifying the state of the art and trends of CAI tools and methods. These Proceedings derive from the second Topical Session on Computer-Aided Innovation organized within the 20th World Computer Congress of IFIP. The goal of the Topical Session is to provide a survey of existing technologies and research activities in the field and to identify opportunities for integration of CAI with other PLM systems. According to the heterogeneous needs of innovation-related activities, the papers published in this volume are characterized by multidisciplinary contents and complementary perspectives and scopes. Such a richness of topics and disciplines will certainly contribute to the promotion of fruitful new collaborations and synergies within the IFIP community. Gaetano Cascini, Florence, April 30th, 2008. CAI Topical Session Organization: the IFIP Topical Session on Computer-Aided Innovation (CAI) is a co-located conference organized under the auspices of the IFIP World Computer Congress (WCC) 2008 in Milano, Italy. Gaetano Cascini, CAI Program Committee Chair, [email protected]
Providing a step-by-step guide for the implementation of virtual manufacturing using Creo Parametric software (formerly known as Pro-Engineer), this book creates an engaging and interactive learning experience for manufacturing engineering students. Featuring graphic illustrations of simulation processes and operations, and written in accessible English to promote user-friendliness, the book covers key topics in the field including: the engraving machining process, face milling, profile milling, surface milling, volume rough milling, expert machining, electric discharge machining (EDM), and area turning using the lathe machining process. Maximising reader insights into how to simulate material removal processes, and how to generate cutter location data and G-code data, this valuable resource equips undergraduate, postgraduate, BTech and HND students in the fields of manufacturing engineering, computer aided design (CAD) and computer aided engineering (CAE) with transferable skills and knowledge. This book is also intended for technicians, technologists and engineers new to Creo Parametric software.
The information and communication technology (ICT) industry is said to account for 2% of the worldwide carbon emissions - a fraction that continues to grow with the relentless push for more and more sophisticated computing equipment, communications infrastructure, and mobile devices. While computers evolved in the direction of higher and higher performance for most of the latter half of the 20th century, the late 1990s and early 2000s saw a new emerging fundamental concern that has begun to shape our day-to-day thinking in system design - power dissipation. As we elaborate in Chapter 1, a variety of factors colluded to raise power-efficiency as a first-class design concern in the designer's mind, with profound consequences all over the field: semiconductor process design, circuit design, design automation tools, system and application software, all the way to large data centers. Power-efficient System Design originated from a desire to capture and highlight the exciting developments in the rapidly evolving field of power and energy optimization in electronic and computer-based systems. Tremendous progress has been made in the last two decades, and the topic continues to be a fascinating research area. To develop a clearer focus, we have concentrated on the relatively higher level of design abstraction that is loosely called the system level. In addition to the extensive coverage of traditional power reduction targets such as CPU and memory, the book is distinguished by detailed coverage of relatively modern power optimization ideas focussing on components such as compilers, operating systems, servers, data centers, and graphics processors.
Artificial Intelligence (AI) is penetrating all sciences as a multidisciplinary approach. However, bringing AI theory, including computer vision and computer audition, into the urban intellectual space has always been difficult for architects and urban planners. This book overcomes this challenge through a conceptual framework that merges computer vision and audition with urban studies, based on a series of workshops called Remorph, conducted by the Tehran Urban Innovation Center (TUIC).
By virtue of their special algebraic structures, Pythagorean-hodograph (PH) curves offer unique advantages for computer-aided design and manufacturing, robotics, motion control, path planning, computer graphics, animation, and related fields. This book offers a comprehensive and self-contained treatment of the mathematical theory of PH curves, including algorithms for their construction and examples of their practical applications. Special features include an emphasis on the interplay of ideas from algebra and geometry and their historical origins, detailed algorithm descriptions, and many figures and worked examples. The book may appeal, in whole or in part, to mathematicians, computer scientists, and engineers.
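The "special algebraic structure" mentioned above can be stated briefly (a standard characterisation, included here for orientation rather than quoted from the book): a planar polynomial curve $r(t) = (x(t), y(t))$ is a Pythagorean-hodograph curve when its hodograph satisfies
\[
x'(t)^2 + y'(t)^2 = \sigma(t)^2
\]
for some polynomial $\sigma(t)$, which is guaranteed by choosing polynomials $u(t)$, $v(t)$ and setting
\[
x'(t) = u(t)^2 - v(t)^2, \qquad y'(t) = 2\,u(t)\,v(t), \qquad \sigma(t) = u(t)^2 + v(t)^2 .
\]
As a consequence, the arc length $\int_0^t \sigma(\tau)\,d\tau$ is itself a polynomial in $t$, which is what makes PH curves attractive for path planning and motion control.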
Technology computer-aided design, or TCAD, is critical to today's semiconductor technology, and anybody working in this industry needs to know something about TCAD. This book is about how to use computer software to virtually manufacture and test semiconductor devices in 3D. It brings to life the topic of semiconductor device physics, with a hands-on, tutorial approach that de-emphasizes abstract physics and equations and emphasizes real practice and extensive illustrations. Coverage includes a comprehensive library of devices, representing state-of-the-art technology, such as SuperJunction LDMOS, GaN LED devices, etc.
The authors have consolidated their research work in this volume titled Soft Computing for Data Mining Applications. The monograph gives an insight into research in the fields of Data Mining in combination with Soft Computing methodologies. These days, data continues to grow exponentially. Much of the data is implicitly or explicitly imprecise. Database discovery seeks to discover noteworthy, unrecognized associations between the data items in the existing database. The potential of discovery comes from the realization that alternate contexts may reveal additional valuable information. The rate at which data is stored is growing at a phenomenal rate. As a result, traditional ad hoc mixtures of statistical techniques and data management tools are no longer adequate for analyzing this vast collection of data. Several domains where large volumes of data are stored in centralized or distributed databases include applications in electronic commerce, bioinformatics, computer security, Web intelligence, intelligent learning database systems, finance, marketing, healthcare, telecommunications, and other fields. Efficient tools and algorithms for knowledge discovery in large data sets have been devised during recent years. These methods exploit the capability of computers to search huge amounts of data in a fast and effective manner. However, the data to be analyzed is imprecise and afflicted with uncertainty. In the case of heterogeneous data sources such as text and video, the data might moreover be ambiguous and partly conflicting. Besides, patterns and relationships of interest are usually approximate. Thus, in order to make the information mining process more robust, it requires tolerance toward imprecision, uncertainty and exceptions.
Cognitive Informatics (CI) is the science of cognitive information processing and its applications in cognitive computing. CI is a transdisciplinary enquiry of computer science, information science, cognitive science, and intelligence science that investigates the internal information processing mechanisms and processes of the brain. Advances and engineering applications of CI have led to the emergence of cognitive computing and the development of Cognitive Computers (CCs) that reason and learn. As initiated by Yingxu Wang and his colleagues, CC has emerged and developed based on the transdisciplinary research in CI, abstract intelligence (aI), and denotational mathematics after the inauguration of the series of IEEE International Conference on Cognitive Informatics since 2002 at Univ. of Calgary, Stanford Univ., and Tsinghua Univ., etc. This volume in LNCS (subseries of Computational Intelligence), LNCI 323, edited by Y. Wang, D. Zhang, and W. Kinsner, presents the latest development in cognitive informatics and cognitive computing. The book focuses on the explanation of cognitive models of the brain, the layered reference model of the brain, the fundamental mechanisms of abstract intelligence, and the implementation of computational intelligence by autonomous inference and learning engines based on CCs.
As diverse as the constituent groups of tomorrow's society may be, they will share the common requirement that their lives should become safer and healthier, offering higher levels of effectiveness, communication and personal freedom. The key element common to all potential solutions fulfilling these requirements is wearable embedded systems, with longer periods of autonomy, offering wider functionality, more communication possibilities and increased computational power. As electronic and information systems on the human body, their role is to collect relevant physiological information and to interface between humans and local and/or global information systems. Within this context, there is an increasing need for applications in diverse fields, from health to rescue to sport and even remote activities in space, to have real-time access to vital signs and other behavioral parameters for personalized healthcare, rescue operation planning, etc. This book's coverage will span all scientific and technological areas that define wearable monitoring systems, including sensors, signal processing, energy, system integration, communications, and user interfaces. Six case studies will be used to illustrate the principles and practices introduced.
As Moore's law continues to unfold, two important trends have recently emerged. First, the growth of chip capacity is translated into a corresponding increase in the number of cores. Second, the parallelization of computation and 3D integration technologies lead to distributed memory architectures. This book describes recent research that addresses urgent challenges in many-core architectures and application mapping. It addresses the architectural design of many-core chips, memory and data management, power management, and design and programming methodologies. It also describes how new techniques have been applied in various industrial case studies.
This book reviews the algorithms for processing geometric data, with a practical focus on important techniques not covered by traditional courses on computer vision and computer graphics. Features: presents an overview of the underlying mathematical theory, covering vector spaces, metric space, affine spaces, differential geometry, and finite difference methods for derivatives and differential equations; reviews geometry representations, including polygonal meshes, splines, and subdivision surfaces; examines techniques for computing curvature from polygonal meshes; describes algorithms for mesh smoothing, mesh parametrization, and mesh optimization and simplification; discusses point location databases and convex hulls of point sets; investigates the reconstruction of triangle meshes from point clouds, including methods for registration of point clouds and surface reconstruction; provides additional material at a supplementary website; includes self-study exercises throughout the text.
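As a small, generic example of the mesh smoothing algorithms the book covers (an illustrative Python sketch under the assumption of a vertex array and a triangle index array, not code taken from the book):

import numpy as np

def laplacian_smooth(vertices, faces, iterations=10, lam=0.5):
    # Uniform Laplacian smoothing: repeatedly move each vertex a fraction `lam`
    # of the way toward the centroid of its mesh neighbours.
    # vertices: (n, 3) float array; faces: (m, 3) array of vertex indices.
    verts = np.asarray(vertices, dtype=float).copy()
    neighbours = [set() for _ in range(len(verts))]
    for a, b, c in np.asarray(faces):
        neighbours[a].update((b, c))
        neighbours[b].update((a, c))
        neighbours[c].update((a, b))
    for _ in range(iterations):
        targets = np.array([verts[list(nb)].mean(axis=0) if nb else verts[i]
                            for i, nb in enumerate(neighbours)])
        verts += lam * (targets - verts)
    return verts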
One of the leading causes of automobile accidents is the slow reaction of the driver while responding to a hazardous situation. State-of-the-art wireless electronics can automate several driving functions, leading to significant reduction in human error and improvement in vehicle safety. With continuous transistor scaling, silicon fabrication technology now has the potential to substantially reduce the cost of automotive radar sensors. This book bridges an existing gap between information available on dependable system/architecture design and circuit design. It provides the background of the field and detailed description of recent research and development of silicon-based radar sensors. System-level requirements and circuit topologies for radar transceivers are described in detail. Holistic approaches towards designing radar sensors are validated with several examples of highly-integrated radar ICs in silicon technologies. Circuit techniques to design millimeter-wave circuits in silicon technologies are discussed in depth.
This edited volume is targeted at presenting the latest state-of-the-art methodologies in "Hybrid Evolutionary Algorithms." The chapters deal with the theoretical and methodological aspects, as well as various applications to many real world problems from science, technology, business or commerce. Overall, the book has 14 chapters including an introductory chapter giving the fundamental definitions and some important research challenges. The contributions were selected on the basis of fundamental ideas/concepts rather than the thoroughness of techniques deployed.
This book presents a new set of embedded system design techniques called multidimensional data flow, which combine the various benefits offered by existing methodologies such as block-based system design, high-level simulation, system analysis and polyhedral optimization. It describes a novel architecture for efficient and flexible high-speed communication in hardware that can be used both in manual and automatic system design and that offers various design alternatives, balancing achievable throughput with required hardware size. This book demonstrates multidimensional data flow by showing its potential for modeling, analysis, and synthesis of complex image processing applications. These applications are presented in terms of their fundamental properties and resulting design constraints. Coverage includes a discussion of how much better the latter can be met by multidimensional data flow than by alternative approaches. Based on these results, the book explains the principles of fine-grained system-level analysis and high-speed communication synthesis. Additionally, an extensive review of related techniques is given in order to show their relation to multidimensional data flow.