This monograph presents the latest developments and applications of computational tools related to the biosciences and medical engineering. Computational tools such as finite element methods, computer-aided design and optimization, as well as visualization techniques such as computed axial tomography, open up completely new research fields by bringing the engineering and bio/medical areas closer together. Nevertheless, hurdles remain, since the two directions rest on quite different educational traditions; often even the "language" differs from discipline to discipline. This monograph reports the results of different multi-disciplinary research projects, for example from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational methods often needs to be complemented by mathematical and experimental methods.
Unifying two decades of research, this book is the first to establish a comprehensive foundation for the systematic analysis and design of linear systems with general state and input constraints. For such systems, which can serve as models for most nonlinear systems, it addresses the issues of stability, controller design, additional constraints, and satisfactory performance. The book is an excellent reference for practicing engineers, graduate students, and researchers in control systems theory and design. It may also serve as an advanced graduate text for a course or a seminar in nonlinear control systems theory and design in applied mathematics or engineering departments. Minimal prerequisites include a first graduate course in state-space methods as well as a first course in control systems design.
Proceedings of China Modern Logistics Engineering covers nearly all areas of logistics engineering technology, focusing on the latest findings in the following theoretical areas: logistics systems and management research; green logistics and emergency logistics; enterprise logistics; material handling; warehousing technology research; supply chain management; logistics equipment; logistics packaging technology; third-party logistics, etc. The book will help readers to grasp the relevant theory as well as research and development trends, while also offering guidance for their work and related studies. It is intended for researchers, scholars and graduate students in logistics management, logistics engineering, transportation, business administration, e-commerce and industrial engineering.
Consolidating existing knowledge in Design Science, this book proposes a new research method to aid the exploration of design and problem solving within business, science and technology. It seeks to overcome the dichotomy that exists in the field between theory and practice, enabling researchers to find solutions to problems rather than focusing on the explanation and exploration of the problems themselves. Currently, researchers concentrate on describing, exploring, explaining and predicting phenomena, and little attention is devoted to prescribing solutions. Herbert Simon proposed the need to develop a Science of the Artificial (Design Science), arguing that our reality is much more artificial than natural. However, research conducted on Design Science premises has so far been scattered and erratic across different fields, such as management, information systems and engineering. This book aims to address this issue by bringing these fields together and emphasising the need for solutions. It provides a valuable resource for students and researchers of research methods, information systems, management and management science, and production and operations management.
This book is about formal verification, that is, the use of mathematical reasoning to ensure correct execution of computing systems. With the increasing use of computing systems in safety-critical and security-critical applications, it is becoming increasingly important for our well-being to ensure that those systems execute correctly. Over the last decade, formal verification has made significant headway in the analysis of industrial systems, particularly in the realm of verification of hardware. A key advantage of formal verification is that it provides a mathematical guarantee of their correctness (up to the accuracy of formal models and correctness of reasoning tools). In the process, the analysis can expose subtle design errors. Formal verification is particularly effective in finding corner-case bugs that are difficult to detect through traditional simulation and testing. Nevertheless, and in spite of its promise, the application of formal verification has so far been limited in an industrial design validation tool flow. The difficulties in its large-scale adoption include the following: (1) deductive verification using theorem provers often involves excessive and prohibitive manual effort, and (2) automated decision procedures (e.g., model checking) can quickly hit the bounds of available time and memory. This book presents recent advances in formal verification techniques and discusses the applicability of the techniques in ensuring the reliability of large-scale systems. We deal with the verification of a range of computing systems, from sequential programs to concurrent protocols and pipelined machines.
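As a rough illustrative aside, not taken from the book itself: the core of an explicit-state model checker is a reachability search over a system's state space, and the time and memory bounds mentioned above come from that space growing exponentially. The following minimal Python sketch checks an invariant over a toy counter system; the `successors` function, the invariant and the bound are all invented for this example.

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state reachability check: explore every state reachable
    from `initial` and return a counterexample trace to the first state
    violating `invariant`, or None if the invariant holds."""
    frontier = deque([initial])
    parent = {initial: None}  # visited set, plus links for trace rebuilding
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            trace = []
            while state is not None:  # walk parent links back to the start
                trace.append(state)
                state = parent[state]
            return list(reversed(trace))
        for nxt in successors(state):
            if nxt not in parent:
                parent[nxt] = state
                frontier.append(nxt)
    return None

# Toy system: a counter that may increment up to 10 but must stay below 5.
print(check_invariant(
    initial=0,
    successors=lambda s: [s + 1] if s < 10 else [],
    invariant=lambda s: s < 5,
))  # -> [0, 1, 2, 3, 4, 5]
```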
Since its emergence as an important research area in the early 1980s, the topic of wavelets has undergone tremendous development on both theoretical and applied fronts. Myriad research and survey papers and monographs have been published on the subject, documenting different areas of applications such as sound and image processing, denoising, data compression, tomography, and medical imaging. The study of wavelets remains a very active field of research, and many of its central techniques and ideas have evolved into new and promising research areas. This volume, a collection of invited contributions developed from talks at an international conference on wavelets, is divided into three parts: Part I is devoted to the mathematical theory of wavelets and features several papers on wavelet sets and the construction of wavelet bases in different settings. Part II looks at the use of multiscale harmonic analysis for understanding the geometry of large data sets and extracting information from them. Part III focuses on applications of wavelet theory to the study of several real-world problems. Overall, the book is an excellent reference for graduate students, researchers, and practitioners in theoretical and applied mathematics, or in engineering.
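For readers meeting the subject for the first time, a minimal sketch may help fix ideas; it is not drawn from the volume itself. One level of the Haar transform, the simplest wavelet, splits a signal into local averages and local differences; zeroing small "detail" coefficients before inverting is the basic intuition behind wavelet denoising and compression. The sample signal below is an invented example.

```python
import numpy as np

def haar_step(signal):
    """One level of the orthonormal Haar wavelet transform: split a
    length-2n signal into n averages (approximation) and n differences
    (detail)."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)  # low-pass half
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass half
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar step, recovering the original samples exactly."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_step(x)
print(np.allclose(haar_inverse(a, d), x))  # True: the transform is lossless
```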
Embedded systems applications that are either mission- or safety-critical usually entail low to medium production volumes, require the rapid development of specific, typically computing-intensive tasks, and are cost-bounded. The adoption of re-configurable FPGAs in such application domains is constrained by the availability of suitable techniques to guarantee the dependability requirements entailed by critical applications. This book describes the challenges faced by designers when implementing a mission- or safety-critical application using re-configurable FPGAs, and it details various techniques to overcome these challenges. In addition to an overview of the key concepts of re-configurable FPGAs, it provides a theoretical description of the failure modes that can cause incorrect operation of re-configurable FPGA-based electronic systems. It also outlines analysis techniques that can be used to forecast such failures and covers the theory behind solutions to mitigate fault effects. This book also reviews current technologies available for building re-configurable FPGAs, specifically SRAM-based and Flash-based technology. For each technology introduced, the theoretical concepts presented are applied to real cases. Design techniques and tools are presented to develop critical applications using commercial, off-the-shelf devices, such as Xilinx Virtex FPGAs and Actel ProASIC FPGAs. Alternative techniques based on radiation-hardened FPGAs, such as the Xilinx SIRF and Atmel ATF280, are also presented. This publication is an invaluable reference for anyone interested in understanding the technologies of re-configurable FPGAs, as well as for designers developing critical applications based on these technologies.
This book contains selected papers of the 11th OpenFOAM (R) Workshop that was held in Guimaraes, Portugal, June 26-30, 2016. The 11th OpenFOAM (R) Workshop had more than 140 technical/scientific presentations and 30 courses, and was attended by circa 300 individuals, representing 180 institutions and 30 countries, from all continents. The OpenFOAM (R) Workshop provided a forum for researchers, industrial users, software developers, consultants and academics working with OpenFOAM (R) technology. The central part of the Workshop was the two-day conference, where presentations and posters on industrial applications and academic research were shown. OpenFOAM (R) (Open Source Field Operation and Manipulation) is a free, open-source computational toolbox that has a large user base across most areas of engineering and science, in both commercial and academic organizations. As a technology, OpenFOAM (R) provides an extensive range of features to solve anything from complex fluid flows involving chemical reactions, turbulence and heat transfer, to solid dynamics and electromagnetics, among several others. Additionally, the OpenFOAM technology offers complete freedom to customize and extend its functionalities.
Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level has become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, and statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits. Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and clock networks, and analog/mixed-signal circuits; Helps chip designers understand the potential and limitations of their design tools, improving their design productivity; Presents analysis of each algorithm with practical applications in the context of real circuit design; Includes numerical examples for the quantitative analysis and evaluation of algorithms presented.
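As a hedged illustration of what "statistical analysis considering spatial correlations" can mean in practice (a generic Monte Carlo sketch, not an algorithm from this book): correlated parameter variations can be drawn through a Cholesky factor of the correlation matrix and pushed through a leakage model. The three-gate example, the correlation matrix and the sensitivity coefficient `alpha` below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-gate example: nominal leakage (arbitrary units) and an
# assumed spatial correlation matrix for the parameter variation.
nominal_leakage = np.array([1.0, 1.2, 0.9])
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.6],
                 [0.3, 0.6, 1.0]])
sigma = 0.05  # std. dev. of the normalized parameter variation (assumed)

# Correlated Gaussian samples via the Cholesky factor: cov(z @ L.T) = corr.
L = np.linalg.cholesky(corr)
z = rng.standard_normal((100_000, 3))
delta = sigma * z @ L.T

# Leakage rises roughly exponentially with the parameter shift; alpha is an
# assumed sensitivity coefficient for this sketch.
alpha = 8.0
leakage = (nominal_leakage * np.exp(alpha * delta)).sum(axis=1)

print(f"mean total leakage: {leakage.mean():.3f}")
print(f"95th percentile:    {np.percentile(leakage, 95):.3f}")
```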
This book provides a selection of contributions to the DIPSI workshop 2019 (Droplet Impact Phenomena & Spray Investigations) as well as recent progress of the International Research Training Group "DROPIT". The DIPSI workshop, now at its thirteenth edition, represents an important opportunity to share recent knowledge on droplets and sprays in a variety of research fields and industrial applications. The research training group "DROPIT" focuses on droplet interaction technologies in which microscopic effects strongly influence macroscopic behavior. This requires the inclusion of interface kinetics and/or a detailed analysis of surface microstructures. Normally, complicated technical processes obscure the underlying basic mechanisms, and therefore progress in overall process modelling can hardly be gained. DROPIT therefore focuses on these underlying basic processes, investigating different spatial and/or temporal scales of the problems and linking them through a multi-scale approach. In addition, multi-physics approaches are required to understand, for example, droplet-wall interactions in which porous structures are involved.
This book contains the extended papers presented at the 2nd Workshop on Supervised and Unsupervised Ensemble Methods and their Applications (SUEMA), held on 21-22 July 2008 in Patras, Greece, in conjunction with the 18th European Conference on Artificial Intelligence (ECAI 2008). This workshop was a successor of the smaller event held in 2007 in conjunction with the 3rd Iberian Conference on Pattern Recognition and Image Analysis, Girona, Spain. The success of that event, as well as the publication of the workshop papers in the edited book "Supervised and Unsupervised Ensemble Methods and their Applications", published by Springer-Verlag in the Studies in Computational Intelligence series as volume 126, encouraged us to continue a good tradition. The scope of both SUEMA workshops, and hence of this book as well, is the application of theoretical ideas in the field of ensembles of classification and clustering algorithms to real-life problems in science and industry. Ensembles, which combine the class or cluster membership predictions of a number of algorithms to produce a single outcome value, have already proved to be a viable alternative to a single best algorithm in various practical tasks under different scenarios, from bioinformatics to biometrics, from medicine to network security. The ensemble approach is motivated by the famous "no free lunch" theorem, stating that there is no absolutely best algorithm to solve all problems. Although ensembles cannot be considered an absolute remedy for the deficiencies of a single algorithm, it is widely believed that ensembles provide a better answer to the "no free lunch" theorem than a single best algorithm. Statistical, algorithmic, representational, computational and practical reasons can explain the success of ensemble methods.
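A minimal sketch of the combining step described above, assuming nothing beyond the plain-majority case: several classifiers each predict a label and the plurality vote becomes the ensemble's output. The toy classifiers and thresholds are invented for illustration and do not come from the book.

```python
from collections import Counter

def ensemble_predict(classifiers, x):
    """Combine member predictions by plurality vote, the simplest way to
    merge the outputs of a classification ensemble."""
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Three hypothetical weak classifiers for a one-dimensional toy problem.
members = [
    lambda x: "pos" if x > 0.4 else "neg",
    lambda x: "pos" if x > 0.5 else "neg",
    lambda x: "pos" if x > 0.6 else "neg",
]
print(ensemble_predict(members, 0.55))  # "pos": two of three members agree
```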
Old age is currently the greatest risk factor for developing dementia. Since older people make up a larger portion of the population than ever before, the resulting increase in the incidence of dementia presents a major challenge for society. Dementia is complex and multifaceted and impacts not only the person with the diagnosis but also those caring for them and society as a whole. Human-Computer Interaction (HCI) design and development are pivotal in enabling people with dementia to live well and be supported in the communities around them. HCI is increasingly addressing the need for inclusivity and accessibility in the design and development of new technologies, interfaces, systems, services, and tools. Using interdisciplinary approaches, HCI engages with the complexities and 'messiness' of real-world design spaces to provide novel perspectives and new ways of addressing the challenge of dementia and multi-stakeholder needs. HCI and Design in the Context of Dementia brings together the work of international experts, designers and researchers working across disciplines. It provides methodologies, methods and frameworks, approaches to participatory engagement, and case studies showing how technology can impact the lives of people living with dementia and those around them. It includes examples of how to conduct dementia research and design in context in the field of HCI, ethically and effectively, and shows how these issues transcend the design space of dementia to inform HCI design and technology development more broadly. The book is valuable for, and aimed at, designers, researchers, scholars and caregivers who work with vulnerable groups such as people with dementia, and those directly impacted.
This book contains the proceedings of the Additive Manufacturing in Product Development Conference. The content focuses on how to support real-world value chains by developing additively manufactured series products.
This book considers all aspects of performability engineering, providing a holistic view of the activities associated with a product throughout its entire life cycle, as well as the cost of minimizing environmental impact at each stage while maximizing performance. Building on the editor's previous Handbook of Performability Engineering, it explains how performability engineering provides a framework for considering both dependability and sustainability in the optimal design of products, systems and services, and explores the role of performability in energy and waste minimization, raw material selection, increased production volume, and many other areas of engineering and production. The book discusses a range of new ideas, concepts, disciplines, and applications in performability, including smart manufacturing and Industry 4.0; cyber-physical systems and artificial intelligence; digital transformation of railways; and asset management. Given its broad scope, it will appeal to researchers, academics, industrial practitioners and postgraduate students involved in manufacturing, engineering, and system and product development.
To satisfy the higher requirements of digitally converged embedded systems, this book describes heterogeneous multicore technology that uses various kinds of low-power embedded processor cores on a single chip. With this technology, heterogeneous parallelism can be implemented on an SoC, and greater flexibility and superior performance per watt can then be achieved. This book defines the heterogeneous multicore architecture and explains in detail several embedded processor cores, including CPU cores and special-purpose processor cores that achieve a high degree of arithmetic-level parallelism. The authors developed three multicore chips (called RP-1, RP-2, and RP-X) according to the defined architecture with the introduced processor cores. The chip implementations, software environments, and applications running on the chips are also explained in the book. Provides readers an overview and practical discussion of heterogeneous multicore technologies from both a hardware and software point of view; Discusses a new, high-performance and energy-efficient approach to designing SoCs for digitally converged, embedded systems; Covers hardware issues such as architecture and chip implementation, as well as software issues such as compilers, operating systems, and application programs; Describes three chips developed according to the defined heterogeneous multicore architecture, including chip implementations, software environments, and working applications.
This book gives Abaqus users who make use of finite-element models in academic or practitioner-based research the in-depth program knowledge that allows them to debug a structural analysis model. The book provides many methods and guidelines for different analysis types and modes that will help readers to solve problems that can arise with Abaqus if a structural model fails to converge to a solution. The use of Abaqus affords a general checklist approach to debugging analysis models, which can also be applied to structural analysis. The author uses step-by-step methods and detailed explanations of special features in order to identify the solutions to a variety of problems with finite-element models. The book promotes:
* a diagnostic mode of thinking concerning error messages;
* better material definition and the writing of user material subroutines;
* work with the Abaqus mesher and best practice in doing so;
* the writing of user element subroutines and contact features with convergence issues; and
* consideration of hardware and software issues and a Windows HPC cluster solution.
The methods and information provided facilitate job diagnostics and help to obtain converged solutions for finite-element models regarding structural component assemblies in static or dynamic analysis. The troubleshooting advice ensures that these solutions are both high-quality and cost-effective according to practical experience. The book offers an in-depth guide for students learning about Abaqus, as each problem and solution are complemented by examples and straightforward explanations. It is also useful for academics and structural engineers wishing to debug Abaqus models on the basis of error and warning messages that arise during finite-element modelling processing.
This book offers a clear, yet comprehensive guide to how to structure a design project, focusing in particular on the key questions designers, architects, policy makers and health professionals should consider when working towards inclusion through design. The book is based on a series of lessons held by the author and his colleague Avril Accolla, whose aim was to train technicians at all levels to be capable of catering for the needs of the elderly. It clearly draws the outline of their "Ask the Right Question" approach, whose purpose is to help convey the notions in question appropriately to people with such widely different backgrounds, curricula, interests and cultures. Using a minimalist approach, based mainly on the discussion of eye-catching real-life examples placed in logical order and a crystal clear, engaging style, this book is a must-have for designers, technicians, customers and health practitioners, as well as social scientists and policy makers who deal with inclusive design at different levels and anyone interested in topics related to technological evolution and social integration.
This book provides an introduction to the unique and fascinating properties of alloys and composites from novel commercialized thermosetting resins based on polybenzoxazines. Outstanding properties of polybenzoxazine alloys and composites, such as processability and thermal, mechanical, electrical and ballistic impact properties, make them attractive for various applications in electronic packaging encapsulation, lightweight ballistic armour composites, and bipolar plates in fuel cells.
More and more information, not only audio and video but also a range of other information types, is generated, processed and used by machines today, even though the end user may be a human. The result over the past 15 years has been a substantial increase in the types of information and a change in the ways humans generate, classify, store, search, access and consume information. Conversion of information to digital form is a prerequisite for this enhanced machine role, but must be done with requirements such as compactness, fidelity and interpretability in mind. This book presents new ways of dealing with digital information, and new types of digital information, underpinning the evolution of society and business.
During the past two decades, there has been a dramatic increase in interest in the study of ageing-related changes in cognitive abilities. In this volume researchers from a variety of theoretical perspectives discuss adult age differences in a wide range of cognitive skills. Of special interest is the extent to which ageing effects on performance are related to variations in the representation, organization and utilization of knowledge, broadly defined. Recent research and theory in the field of ageing has emphasized the need to examine such processes more closely in order to provide a more complete understanding of ageing effects on cognitive behaviour.
Artificial Intelligence (AI) is penetrating all sciences as a multidisciplinary approach. However, applying AI theory, including computer vision and computer audition, to the urban intellectual space is difficult for architects and urban planners. This book overcomes this challenge through a conceptual framework that merges computer vision and audition with urban studies, based on a series of workshops called Remorph, conducted by the Tehran Urban Innovation Center (TUIC).
The concept generation process seems like an intuitive act: difficult to capture and perform, although everyone is capable of it. It is not an analytical process but a synthetic one, and it has yet to be clarified. Furthermore, new research methods for investigating the concept generation process are necessary to establish its theory and methodology, a very difficult task since the process is driven by inner feelings deeply etched in the mind. Concept Generation for Design Creativity - A Systematized Theory and Methodology presents the concept generation process both theoretically and methodologically. Theoretically, the concept generation process is discussed by comparing metaphor, abduction, and General Design Theory from the notions of similarities and dissimilarities. Analogy, blending, and integration by thematic relation are explained methodologically. So far, these theories and methods have been discussed independently, and the relations among them have not been clarified. Two newly developed research methods to investigate the concept generation process are clearly explained: explanation-based protocol analysis and constructive simulation. By reading Concept Generation for Design Creativity - A Systematized Theory and Methodology, students, researchers and lecturers in design disciplines (including engineering design, industrial design, software design, CHI, design education, and cognitive science) can obtain a clear picture of advanced research findings and an outline of the theories and methods for concept generation. Furthermore, readers are expected to achieve the competence to generate new concepts.
The book provides readers with a snapshot of recent research and technological trends in the field of condition monitoring of machinery working under a broad range of operating conditions. Each chapter, accepted after a rigorous peer-review process, reports on an original piece of work presented and discussed at the 4th International Conference on Condition Monitoring of Machinery in Non-stationary Operations, CMMNO 2014, held on December 15-16, 2014, in Lyon, France. The contributions have been grouped into three sections according to the main subfield (signal processing, data mining or condition monitoring techniques) to which they relate. The book includes both theoretical developments and a number of industrial case studies, in areas including, but not limited to: noise and vibration; vibro-acoustic diagnosis; signal processing techniques; diagnostic data analysis; instantaneous speed identification; monitoring and diagnostic systems; and dynamic and fault modeling. The book not only provides a valuable resource for both academics and professionals in the field of condition monitoring, but also aims at facilitating communication and collaboration between the two groups.
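To make the signal-processing flavour of these contributions concrete, here is a small illustrative Python sketch (not taken from the proceedings): a synthetic vibration signal combines an assumed 50 Hz shaft line with a weaker 120 Hz fault signature in noise, and an FFT recovers both spectral lines. The sampling rate, frequencies and detection threshold are all invented for the example.

```python
import numpy as np

fs = 1000.0                          # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)        # one second of samples

# Synthetic vibration signal: 50 Hz shaft rotation plus a 120 Hz fault
# signature, buried in Gaussian noise.
signal = (np.sin(2 * np.pi * 50 * t)
          + 0.4 * np.sin(2 * np.pi * 120 * t)
          + 0.3 * np.random.default_rng(1).standard_normal(t.size))

# Amplitude spectrum via the real FFT; 1 Hz resolution for a 1 s record.
spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Report spectral lines above a crude noise floor.
print(freqs[spectrum > 0.1])  # expected: [ 50. 120.]
```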
After two successful conferences, held in Innsbruck in 2006 (Prof. Manfred Husty) and in Cassino in 2008 (Prof. Marco Ceccarelli), with the participation of many of the most important and well-known scientists from the European Mechanism Science community, a further conference was held in Cluj-Napoca, Romania, in 2010 (Prof. Doina Pisla) to discuss new developments in the field. This book presents the most recent research advances in Mechanism Science with different applications. Amongst the topics treated are theoretical kinematics, computational kinematics, mechanism design, mechanical transmissions, linkages and manipulators, mechanisms for biomechanics, micro-mechanisms, experimental mechanics, mechanics of robots, dynamics of multi-body systems, dynamics of machinery, control issues of mechanical systems, novel designs, and the history of mechanism science.