This book contains the extended papers presented at the 2nd Workshop on Supervised and Unsupervised Ensemble Methods and their Applications (SUEMA), held on 21-22 July 2008 in Patras, Greece, in conjunction with the 18th European Conference on Artificial Intelligence (ECAI 2008). This workshop was a successor of the smaller event held in 2007 in conjunction with the 3rd Iberian Conference on Pattern Recognition and Image Analysis, Girona, Spain. The success of that event, as well as the publication of the workshop papers in the edited book "Supervised and Unsupervised Ensemble Methods and their Applications", published by Springer-Verlag in the Studies in Computational Intelligence Series as volume 126, encouraged us to continue a good tradition. The scope of both SUEMA workshops (and hence of this book as well) is the application of theoretical ideas in the field of ensembles of classification and clustering algorithms to real-life problems in science and industry. Ensembles, which combine the class or cluster membership predictions of a number of algorithms to produce a single outcome value, have already proved to be a viable alternative to a single best algorithm in various practical tasks under different scenarios, from bioinformatics to biometrics, from medicine to network security. The ensemble approach is called to life by the famous "no free lunch" theorem, stating that there is no absolutely best algorithm to solve all problems. Although ensembles cannot be considered an absolute remedy for the deficiencies of a single algorithm, it is widely believed that ensembles provide a better answer to the "no free lunch" theorem than a single best algorithm. Statistical, algorithmic, representational, computational and practical reasons can explain the success of ensemble methods.
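To make the combination idea described above concrete, the minimal sketch below performs plain majority voting over the class predictions of several base classifiers. The classifiers and labels are placeholder examples, not material from the book.

```python
from collections import Counter

def majority_vote(predictions_per_model):
    """Combine per-model class predictions into a single ensemble prediction.

    predictions_per_model: list of equal-length lists, one per base classifier,
    where element i is the class predicted for sample i.
    """
    n_samples = len(predictions_per_model[0])
    combined = []
    for i in range(n_samples):
        votes = [model_preds[i] for model_preds in predictions_per_model]
        # The most common vote wins; ties resolve by first-encountered label.
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Three hypothetical base classifiers that disagree on some samples.
preds_a = ["spam", "ham", "spam", "ham"]
preds_b = ["spam", "spam", "spam", "ham"]
preds_c = ["ham", "ham", "spam", "ham"]
print(majority_vote([preds_a, preds_b, preds_c]))  # ['spam', 'ham', 'spam', 'ham']
```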
Old age is currently the greatest risk factor for developing dementia. Since older people make up a larger portion of the population than ever before, the resulting increase in the incidence of dementia presents a major challenge for society. Dementia is complex and multifaceted and impacts not only the person with the diagnosis but also those caring for them and society as a whole. Human-Computer Interaction (HCI) design and development are pivotal in enabling people with dementia to live well and be supported in the communities around them. HCI is increasingly addressing the need for inclusivity and accessibility in the design and development of new technologies, interfaces, systems, services, and tools. Using interdisciplinary approaches, HCI engages with the complexities and 'messiness' of real-world design spaces to provide novel perspectives and new ways of addressing the challenge of dementia and multi-stakeholder needs. HCI and Design in the Context of Dementia brings together the work of international experts, designers and researchers working across disciplines. It provides methodologies, methods and frameworks, approaches to participatory engagement and case studies showing how technology can impact the lives of people living with dementia and those around them. It includes examples of how to conduct dementia research and design in-context in the field of HCI, ethically and effectively, and how these issues transcend the design space of dementia to inform HCI design and technology development more broadly. The book is valuable for and aimed at designers, researchers, scholars and caregivers who work with vulnerable groups like people with dementia, and those directly impacted.
To satisfy the higher requirements of digitally converged embedded systems, this book describes heterogeneous multicore technology that uses various kinds of low-power embedded processor cores on a single chip. With this technology, heterogeneous parallelism can be implemented on an SoC, and greater flexibility and superior performance per watt can then be achieved. This book defines the heterogeneous multicore architecture and explains in detail several embedded processor cores, including CPU cores and special-purpose processor cores that achieve a high degree of arithmetic-level parallelism. The authors developed three multicore chips (called RP-1, RP-2, and RP-X) according to the defined architecture with the introduced processor cores. The chip implementations, software environments, and applications running on the chips are also explained in the book. Provides readers with an overview and practical discussion of heterogeneous multicore technologies from both a hardware and software point of view; Discusses a new, high-performance and energy-efficient approach to designing SoCs for digitally converged, embedded systems; Covers hardware issues such as architecture and chip implementation, as well as software issues such as compilers, operating systems, and application programs; Describes three chips developed according to the defined heterogeneous multicore architecture, including chip implementations, software environments, and working applications.
This book gives Abaqus users who make use of finite-element models in academic or practitioner-based research the in-depth program knowledge that allows them to debug a structural analysis model. The book provides many methods and guidelines for different analysis types and modes that will help readers to solve problems that can arise with Abaqus if a structural model fails to converge to a solution. The use of Abaqus affords a general checklist approach to debugging analysis models, which can also be applied to structural analysis. The author uses step-by-step methods and detailed explanations of special features in order to identify the solutions to a variety of problems with finite-element models. The book promotes: * a diagnostic mode of thinking concerning error messages; * better material definition and the writing of user material subroutines; * work with the Abaqus mesher and best practice in doing so; * the writing of user element subroutines and contact features with convergence issues; and * consideration of hardware and software issues and a Windows HPC cluster solution. The methods and information provided facilitate job diagnostics and help to obtain converged solutions for finite-element models regarding structural component assemblies in static or dynamic analysis. The troubleshooting advice ensures that these solutions are both high-quality and cost-effective according to practical experience. The book offers an in-depth guide for students learning about Abaqus, as each problem and solution is complemented by examples and straightforward explanations. It is also useful for academics and structural engineers wishing to debug Abaqus models on the basis of error and warning messages that arise during finite-element modelling.
More and more information, audio and video but also a range of other information types, is generated, processed and used by machines today, even though the end user may be a human. The result over the past 15 years has been a substantial increase in the types of information and a change in the way humans generate, classify, store, search, access and consume information. Conversion of information to digital form is a prerequisite for this enhanced machine role, but must be done with requirements such as compactness, fidelity and interpretability in mind. This book presents new ways of dealing with digital information and new types of digital information underpinning the evolution of society and business.
Artificial Intelligence (AI) is penetrating all sciences as a multidisciplinary approach. However, applying AI theory, including computer vision and computer audition, to urban intellectual space has always been difficult for architects and urban planners. This book overcomes this challenge through a conceptual framework that merges computer vision and audition into urban studies, based on a series of workshops called Remorph, conducted by the Tehran Urban Innovation Center (TUIC).
There have been substantial developments in meshfree methods, particle methods, and generalized finite element methods since the mid 1990s. The growing interest in these methods is in part due to the fact that they offer extremely flexible numerical tools and can be interpreted in a number of ways. For instance, meshfree methods can be viewed as a natural extension of classical finite element and finite difference methods to scattered node configurations with no fixed connectivity. Furthermore, meshfree methods have a number of advantageous features that are especially attractive when dealing with multiscale phenomena: A-priori knowledge about the solution's particular local behavior can easily be introduced into the meshfree approximation space, and coarse scale approximations can be seamlessly refined by adding fine scale information. However, the implementation of meshfree methods and their parallelization also requires special attention, for instance with respect to numerical integration.
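As a deliberately simple illustration of working with scattered nodes that have no fixed connectivity, the sketch below uses Shepard (inverse-distance weighted) interpolation to approximate a field from scattered samples. It is a toy stand-in for the moving-least-squares and partition-of-unity constructions treated in the meshfree literature, with example data invented for the demonstration.

```python
import numpy as np

def shepard_interpolate(nodes, values, query, power=2.0, eps=1e-12):
    """Approximate a field at 'query' points from scattered nodes.

    nodes:  (n, d) array of scattered node coordinates (no connectivity needed)
    values: (n,)   array of field values at the nodes
    query:  (m, d) array of evaluation points
    """
    result = np.empty(len(query))
    for j, x in enumerate(query):
        dist = np.linalg.norm(nodes - x, axis=1)
        weights = 1.0 / (dist**power + eps)   # closer nodes dominate
        result[j] = np.dot(weights, values) / weights.sum()
    return result

# Scattered nodes sampling f(x, y) = x + y on the unit square.
rng = np.random.default_rng(0)
nodes = rng.random((50, 2))
values = nodes[:, 0] + nodes[:, 1]
print(shepard_interpolate(nodes, values, np.array([[0.5, 0.5]])))  # close to 1.0
```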
The information and communication technology (ICT) industry is said to account for 2% of worldwide carbon emissions - a fraction that continues to grow with the relentless push for more and more sophisticated computing equipment, communications infrastructure, and mobile devices. While computers evolved in the direction of higher and higher performance for most of the latter half of the 20th century, the late 1990s and early 2000s saw a new emerging fundamental concern that has begun to shape our day-to-day thinking in system design - power dissipation. As we elaborate in Chapter 1, a variety of factors colluded to raise power-efficiency to a first-class design concern in the designer's mind, with profound consequences all over the field: semiconductor process design, circuit design, design automation tools, system and application software, all the way to large data centers. Power-efficient System Design originated from a desire to capture and highlight the exciting developments in the rapidly evolving field of power and energy optimization in electronic and computer-based systems. Tremendous progress has been made in the last two decades, and the topic continues to be a fascinating research area. To develop a clearer focus, we have concentrated on the relatively higher level of design abstraction that is loosely called the system level. In addition to the extensive coverage of traditional power reduction targets such as CPU and memory, the book is distinguished by detailed coverage of relatively modern power optimization ideas focusing on components such as compilers, operating systems, servers, data centers, and graphics processors.
The papers in this volume represent research and development in the field of artificial intelligence. This volume demonstrates both the breadth and depth of artificial intelligence in design and points the way forward for our understanding of design as a process and for the development of advanced computer-based tools to aid designers. The papers describe advances in both theory and applications. This volume should be of particular interest to researchers, developers and users of advanced computer systems in design.
Computer-Aided Innovation (CAI) is emerging as a strategic domain of research and application to support enterprises throughout the overall innovation process. The 5.4 Working Group of IFIP aims at defining the scientific foundation of Computer-Aided Innovation systems and at identifying the state of the art and trends of CAI tools and methods. These proceedings derive from the second Topical Session on Computer-Aided Innovation organized within the 20th World Computer Congress of IFIP. The goal of the Topical Session is to provide a survey of existing technologies and research activities in the field and to identify opportunities for integration of CAI with other PLM systems. According to the heterogeneous needs of innovation-related activities, the papers published in this volume are characterized by multidisciplinary contents and complementary perspectives and scopes. Such a richness of topics and disciplines will certainly contribute to the promotion of fruitful new collaborations and synergies within the IFIP community. Gaetano Cascini, Florence, April 30th, 2008. CAI Topical Session Organization: The IFIP Topical Session on Computer-Aided Innovation (CAI) is a co-located conference organized under the auspices of the IFIP World Computer Congress (WCC) 2008 in Milano, Italy. Gaetano Cascini, CAI Program Committee Chair, [email protected]
Technology computer-aided design, or TCAD, is critical to today's semiconductor technology, and anybody working in this industry needs to know something about TCAD. This book is about how to use computer software to virtually manufacture and test semiconductor devices in 3D. It brings to life the topic of semiconductor device physics, with a hands-on, tutorial approach that de-emphasizes abstract physics and equations and emphasizes real practice and extensive illustrations. Coverage includes a comprehensive library of devices representing state-of-the-art technology, such as SuperJunction LDMOS, GaN LED devices, etc.
Providing a step-by-step guide for the implementation of virtual manufacturing using Creo Parametric software (formerly known as Pro-Engineer), this book creates an engaging and interactive learning experience for manufacturing engineering students. Featuring graphic illustrations of simulation processes and operations, and written in accessible English to promote user-friendliness, the book covers key topics in the field including: the engraving machining process, face milling, profile milling, surface milling, volume rough milling, expert machining, electric discharge machining (EDM), and area turning using the lathe machining process. Maximising reader insights into how to simulate material removal processes, and how to generate cutter location data and G-code data, this valuable resource equips undergraduate, postgraduate, BTech and HND students in the fields of manufacturing engineering, computer aided design (CAD) and computer aided engineering (CAE) with transferable skills and knowledge. This book is also intended for technicians, technologists and engineers new to Creo Parametric software.
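The kind of G-code output mentioned above can be pictured with the minimal sketch below, which emits a serpentine face-milling toolpath as generic G-code lines. It is purely illustrative: the function, parameters, and values are invented for the example and are not Creo Parametric output or its API.

```python
def face_milling_gcode(width, length, stepover, depth, feed=300.0):
    """Emit a simple back-and-forth (zig-zag) face-milling pass as G-code lines.

    width, length: size of the rectangular face in mm
    stepover:      distance between parallel passes in mm
    depth:         cutting depth in mm (positive number, cut below Z0)
    """
    lines = ["G21 ; millimetre units", "G90 ; absolute coordinates",
             "G0 Z5.0 ; safe height", f"G1 Z{-depth:.3f} F{feed:.0f} ; plunge"]
    y, direction = 0.0, 1
    while y <= width + 1e-9:
        x_start, x_end = (0.0, length) if direction > 0 else (length, 0.0)
        lines.append(f"G1 X{x_start:.3f} Y{y:.3f} F{feed:.0f} ; step over")
        lines.append(f"G1 X{x_end:.3f} Y{y:.3f} F{feed:.0f} ; cutting pass")
        y += stepover
        direction *= -1
    lines.append("G0 Z5.0 ; retract")
    return lines

print("\n".join(face_milling_gcode(width=40, length=100, stepover=10, depth=1.5)))
```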
The authors have consolidated their research work in this volume titled Soft Computing for Data Mining Applications. The monograph gives an insight into research in the fields of data mining in combination with soft computing methodologies. These days, data continues to grow exponentially. Much of the data is implicitly or explicitly imprecise. Database discovery seeks to discover noteworthy, unrecognized associations between the data items in an existing database. The potential of discovery comes from the realization that alternate contexts may reveal additional valuable information. The rate at which data is stored is growing at a phenomenal rate. As a result, traditional ad hoc mixtures of statistical techniques and data management tools are no longer adequate for analyzing this vast collection of data. Several domains where large volumes of data are stored in centralized or distributed databases include applications in electronic commerce, bioinformatics, computer security, Web intelligence, intelligent learning database systems, finance, marketing, healthcare, telecommunications, and other fields. Efficient tools and algorithms for knowledge discovery in large data sets have been devised during recent years. These methods exploit the capability of computers to search huge amounts of data in a fast and effective manner. However, the data to be analyzed is imprecise and afflicted with uncertainty. In the case of heterogeneous data sources such as text and video, the data might moreover be ambiguous and partly conflicting. Besides, patterns and relationships of interest are usually approximate. Thus, in order to make the information mining process more robust, it requires tolerance toward imprecision, uncertainty and exceptions.
One of the leading causes of automobile accidents is the slow reaction of the driver while responding to a hazardous situation. State-of-the-art wireless electronics can automate several driving functions, leading to significant reduction in human error and improvement in vehicle safety. With continuous transistor scaling, silicon fabrication technology now has the potential to substantially reduce the cost of automotive radar sensors. This book bridges an existing gap between information available on dependable system/architecture design and circuit design. It provides the background of the field and detailed description of recent research and development of silicon-based radar sensors. System-level requirements and circuit topologies for radar transceivers are described in detail. Holistic approaches towards designing radar sensors are validated with several examples of highly-integrated radar ICs in silicon technologies. Circuit techniques to design millimeter-wave circuits in silicon technologies are discussed in depth.
As diverse as the constituent groups of tomorrow's society may be, they will share the common requirement that their lives should become safer and healthier, offering higher levels of effectiveness, communication and personal freedom. The key element common to all potential solutions fulfilling these requirements is wearable embedded systems, with longer periods of autonomy, offering wider functionality, more communication possibilities and increased computational power. As electronic and information systems on the human body, their role is to collect relevant physiological information, and to interface between humans and local and/or global information systems. Within this context, there is an increasing need for applications in diverse fields, from health to rescue to sport and even remote activities in space, to have real-time access to vital signs and other behavioral parameters for personalized healthcare, rescue operation planning, etc. This book's coverage will span all scientific and technological areas that define wearable monitoring systems, including sensors, signal processing, energy, system integration, communications, and user interfaces. Six case studies will be used to illustrate the principles and practices introduced.
By virtue of their special algebraic structures, Pythagorean-hodograph (PH) curves offer unique advantages for computer-aided design and manufacturing, robotics, motion control, path planning, computer graphics, animation, and related fields. This book offers a comprehensive and self-contained treatment of the mathematical theory of PH curves, including algorithms for their construction and examples of their practical applications. Special features include an emphasis on the interplay of ideas from algebra and geometry and their historical origins, detailed algorithm descriptions, and many figures and worked examples. The book may appeal, in whole or in part, to mathematicians, computer scientists, and engineers.
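To make the "special algebraic structure" concrete: a planar polynomial curve r(t) = (x(t), y(t)) is a PH curve when x'(t)^2 + y'(t)^2 equals the square of a polynomial sigma(t), which is guaranteed if the hodograph is generated as the square of a complex polynomial w(t) = u(t) + i v(t). The sketch below builds such a hodograph and checks the Pythagorean condition; the particular coefficients are arbitrary example values, not taken from the book.

```python
import numpy as np
from numpy.polynomial import Polynomial as P

# Arbitrary example pre-image polynomial w(t) = u(t) + i v(t).
u = P([1.0, 2.0])   # u(t) = 1 + 2t
v = P([0.5, 1.0])   # v(t) = 0.5 + t

# Hodograph components x'(t), y'(t) obtained from w(t)^2.
x_d = u**2 - v**2
y_d = 2 * u * v
sigma = u**2 + v**2          # parametric speed |r'(t)| as a polynomial

# Pythagorean condition: x'(t)^2 + y'(t)^2 == sigma(t)^2 identically.
lhs = x_d**2 + y_d**2
rhs = sigma**2
print(np.allclose(lhs.coef, rhs.coef))   # True

# The curve follows by integrating the hodograph (a PH cubic here), and its
# arc length over [0, 1] is the exact polynomial integral of sigma(t).
x, y = x_d.integ(), y_d.integ()
arc_length_0_1 = sigma.integ()(1.0) - sigma.integ()(0.0)
print(arc_length_0_1)
```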
This edited volume is targeted at presenting the latest state-of-the-art methodologies in "Hybrid Evolutionary Algorithms." The chapters deal with the theoretical and methodological aspects, as well as various applications to many real world problems from science, technology, business or commerce. Overall, the book has 14 chapters including an introductory chapter giving the fundamental definitions and some important research challenges. The contributions were selected on the basis of fundamental ideas/concepts rather than the thoroughness of techniques deployed.
Cognitive Informatics (CI) is the science of cognitive information processing and its applications in cognitive computing. CI is a transdisciplinary enquiry of computer science, information science, cognitive science, and intelligence science that investigates the internal information processing mechanisms and processes of the brain. Advances and engineering applications of CI have led to the emergence of cognitive computing and the development of Cognitive Computers (CCs) that reason and learn. As initiated by Yingxu Wang and his colleagues, CC has emerged and developed based on the transdisciplinary research in CI, abstract intelligence (aI), and denotational mathematics after the inauguration of the series of IEEE International Conferences on Cognitive Informatics since 2002 at Univ. of Calgary, Stanford Univ., and Tsinghua Univ., etc. This volume in LNCS (subseries of Computational Intelligence), LNCI 323, edited by Y. Wang, D. Zhang, and W. Kinsner, presents the latest developments in cognitive informatics and cognitive computing. The book focuses on the explanation of cognitive models of the brain, the layered reference model of the brain, the fundamental mechanisms of abstract intelligence, and the implementation of computational intelligence by autonomous inference and learning engines based on CCs.
As Moore's law continues to unfold, two important trends have recently emerged. First, the growth of chip capacity is translated into a corresponding increase in the number of cores. Second, the parallelization of computation and 3D integration technologies lead to distributed memory architectures. This book describes recent research that addresses urgent challenges in many-core architectures and application mapping. It addresses the architectural design of many-core chips, memory and data management, power management, and design and programming methodologies. It also describes how new techniques have been applied in various industrial case studies.
This book reviews the algorithms for processing geometric data, with a practical focus on important techniques not covered by traditional courses on computer vision and computer graphics. Features: presents an overview of the underlying mathematical theory, covering vector spaces, metric space, affine spaces, differential geometry, and finite difference methods for derivatives and differential equations; reviews geometry representations, including polygonal meshes, splines, and subdivision surfaces; examines techniques for computing curvature from polygonal meshes; describes algorithms for mesh smoothing, mesh parametrization, and mesh optimization and simplification; discusses point location databases and convex hulls of point sets; investigates the reconstruction of triangle meshes from point clouds, including methods for registration of point clouds and surface reconstruction; provides additional material at a supplementary website; includes self-study exercises throughout the text.
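As a taste of the mesh-processing algorithms listed above, the sketch below applies basic (umbrella-operator) Laplacian smoothing to a triangle mesh stored as a vertex array and a face list. It is a generic textbook formulation under assumed data structures, not code from the book.

```python
import numpy as np

def laplacian_smooth(vertices, faces, iterations=10, lam=0.5):
    """Basic Laplacian (umbrella) smoothing of a triangle mesh.

    vertices: (n, 3) array of vertex positions
    faces:    (m, 3) array of vertex indices per triangle
    lam:      step size; each vertex moves a fraction lam toward
              the average of its one-ring neighbours.
    """
    n = len(vertices)
    # Build one-ring neighbourhoods from the face list.
    neighbours = [set() for _ in range(n)]
    for a, b, c in faces:
        neighbours[a].update((b, c))
        neighbours[b].update((a, c))
        neighbours[c].update((a, b))

    v = vertices.astype(float).copy()
    for _ in range(iterations):
        centroids = np.array([v[list(nb)].mean(axis=0) if nb else v[i]
                              for i, nb in enumerate(neighbours)])
        v += lam * (centroids - v)   # move toward neighbour average
    return v

# Tiny example: a two-triangle patch with one vertex lifted out of plane.
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0.3], [0, 1, 0]], dtype=float)
faces = np.array([[0, 1, 2], [0, 2, 3]])
print(laplacian_smooth(verts, faces, iterations=3))
```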
This book presents a new set of embedded system design techniques called multidimensional data flow, which combine the various benefits offered by existing methodologies such as block-based system design, high-level simulation, system analysis and polyhedral optimization. It describes a novel architecture for efficient and flexible high-speed communication in hardware that can be used both in manual and automatic system design and that offers various design alternatives, balancing achievable throughput with required hardware size. This book demonstrates multidimensional data flow by showing its potential for modeling, analysis, and synthesis of complex image processing applications. These applications are presented in terms of their fundamental properties and resulting design constraints. Coverage includes a discussion of how far the latter can be met better by multidimensional data flow than alternative approaches. Based on these results, the book explains the principles of fine-grained system level analysis and high-speed communication synthesis. Additionally, an extensive review of related techniques is given in order to show their relation to multidimensional data flow.
Human lives are getting increasingly entangled with technology, especially computing and electronics. At each step we take, especially in the developing world, we are dependent on various gadgets such as cell phones, handheld PDAs, netbooks, medical prosthetic devices, and medical measurement devices (e.g., blood pressure monitors, glucometers). Two important design constraints for such consumer electronics are their form factor and battery life. This translates to the requirements of reduced die area and reduced power consumption for the semiconductor chips that go inside these gadgets. Performance is also important, as increasingly sophisticated applications run on these devices, and many of them require fast response times. The form factor of such electronic goods depends not only on the overall area of the chips inside them but also on the packaging, which depends on thermal characteristics. Thermal characteristics in turn depend on the peak power signature of the chips. As a result, while overall energy usage reduction increases battery life, peak power reduction influences the form factor. One more important aspect of this electronic equipment is that every 6 months or so a newer feature needs to be added to keep ahead of the market competition, and hence new designs have to be completed with these new features, better form factor, battery life, and performance every few months. This extreme pressure on time to market is another force that drives innovation in the design automation of semiconductor chips.
This book describes the current state of the art for simulating paint shop applications, their advantages and limitations, as well as corresponding high-performance computing (HPC) methods utilized in this domain. The authors provide a comprehensive introduction to fluid simulations, corresponding optimization methods from the HPC domain, as well as industrial paint shop applications. They showcase how the complexity of these applications bring corresponding fluid simulation methods to their limits and how these shortcomings can be overcome by employing HPC methods. To that end, this book covers various optimization techniques for three individual fluid simulation techniques, namely grid-based methods, volumetric decomposition methods, and particle-based methods.
Through a series of step-by-step tutorials and numerous hands-on exercises, this book aims to equip the reader with both a good understanding of the importance of space in the abstract world of engineers and the ability to create a model of a product in virtual space - a skill essential for any designer or engineer who needs to present ideas concerning a particular product within a professional environment. The exercises progress logically from the simple to the more complex; while SolidWorks or NX is the software used, the underlying philosophy is applicable to all modeling software. In each case, the explanation covers the entire procedure from the basic idea and production capabilities through to the real model; the conversion from 3D model to 2D manufacturing drawing is also clearly explained. Topics covered include modeling of prismatic, axisymmetric, symmetric and sophisticated shapes; digitization of physical models using modeling software; creation of a CAD model starting from a physical model; free-form surface modeling; modeling of product assemblies following bottom-up and top-down principles; and the presentation of a product in accordance with the rules of technical documentation. This book, which includes more than 500 figures, will be ideal for students wishing to gain a sound grasp of space modeling techniques. Academics and professionals will find it to be an excellent teaching and research aid, and an easy-to-use guide.
Mixed Reality is moving out of the research labs into our daily lives. It plays an increasing role in architecture, design and construction. The combination of digital content with reality creates an exciting synergy that sets out to enhance engagement within architectural design and construction. State-of-the-art research projects on theories and applications within Mixed Reality are presented by leading researchers covering topics in architecture, design collaboration, construction and education. They discuss current projects and offer insight into the next wave of Mixed Reality possibilities.