E-maintenance is the synthesis of two major trends in today's society: the growing importance of maintenance as a key technology and the rapid development of information and communication technology. E-maintenance gives the reader an overview of the possibilities offered by new and advanced information and communication technology to achieve efficient maintenance solutions in industry, energy production and transportation, thereby supporting sustainable development in society. Sixteen chapters cover a range of different technologies, such as: new micro sensors, on-line lubrication sensors, smart tags for condition monitoring, wireless communication and smart personal digital assistants. E-maintenance also discusses semantic data-structuring solutions; ontology structured communications; implementation of diagnostics and prognostics; and maintenance decision support by economic optimisation. It includes four industrial cases that are both described and analysed in detail, with an outline of a global application solution. E-maintenance is a useful tool for engineers and technicians who wish to develop e-maintenance in industrial sites. It is also a source of new and stimulating ideas for researchers looking to make the next step towards sustainable development.
Designing Inclusive Interactions contains the proceedings of the fifth Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT), incorporating the 8th Cambridge Workshop on Rehabilitation Robotics, held in Cambridge, England, in March 2010. It contains contributions from an international group of leading researchers in the fields of Universal Access and Assistive Technology. The conference focused on the following principal topics:
1. Designing assistive and rehabilitation technology for working and daily living environments
2. Measuring inclusion for the design of products for work and daily living
3. Inclusive interaction design and new technologies for inclusive design
4. Assembling new user data for inclusive design
5. The design of accessible and inclusive contexts: work and daily living environments
6. Business advantages and applications of inclusive design
7. Legislation, standards and government awareness of inclusive design
This book helps readers evaluate and specify the best Warehouse Management System (WMS) for their needs. The advice is based on practical knowledge, describing in detail fundamental processes and technologies needed for a basic understanding. New approaches in the structure and design of WMS are presented, along with discussion of the limitations of current systems. The book shows how to operate a simple WMS based on the open-source initiative myWMS.
It is widely acknowledged that the cost of validation and testing comprises a significant percentage of the overall development costs for electronic systems today, and is expected to escalate sharply in the future. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected up front, they can result in severe consequences, both financially and from a safety viewpoint. Indeed, several recent instances of high-profile functional errors (e.g., the Pentium FDIV bug) have resulted in increased attention paid to verifying the functional correctness of designs. Recent efforts have proposed augmenting the traditional RTL simulation-based validation methodology with formal techniques in an attempt to uncover hard-to-find corner cases, with the goal of trying to reach RTL functional verification closure. However, what is often not highlighted is the fact that in spite of the tremendous time and effort put into such efforts at the RTL and lower levels of abstraction, the complexity of contemporary embedded systems makes it difficult to guarantee functional correctness at the system level under all possible operational scenarios. The problem is exacerbated in current System-on-Chip (SOC) design methodologies that employ Intellectual Property (IP) blocks composed of processor cores, coprocessors, and memory subsystems. Functional verification becomes one of the major bottlenecks in the design of such systems.
The Core Test Wrapper Handbook: Rationale and Application of IEEE Std. 1500tm provides insight into the rules and recommendations of IEEE Std. 1500. This book focuses on practical design considerations inherent to the application of IEEE Std. 1500 by discussing design choices and other decisions relevant to this IEEE standard. The authors provide background information about some of the choices and decisions made throughout the design of IEEE Std. 1500.
This book is about formal verification, that is, the use of mathematical reasoning to ensure correct execution of computing systems. With the increasing use of computing systems in safety-critical and security-critical applications, it is becoming increasingly important for our well-being to ensure that those systems execute correctly. Over the last decade, formal verification has made significant headway in the analysis of industrial systems, particularly in the realm of verification of hardware. A key advantage of formal verification is that it provides a mathematical guarantee of their correctness (up to the accuracy of formal models and correctness of reasoning tools). In the process, the analysis can expose subtle design errors. Formal verification is particularly effective in finding corner-case bugs that are difficult to detect through traditional simulation and testing. Nevertheless, and in spite of its promise, the application of formal verification has so far been limited in an industrial design validation tool flow. The difficulties in its large-scale adoption include the following: (1) deductive verification using theorem provers often involves excessive and prohibitive manual effort, and (2) automated decision procedures (e.g., model checking) can quickly hit the bounds of available time and memory. This book presents recent advances in formal verification techniques and discusses the applicability of the techniques in ensuring the reliability of large-scale systems. We deal with the verification of a range of computing systems, from sequential programs to concurrent protocols and pipelined machines.
I am very pleased to play even a small part in the publication of this book on the SIGNAL language and its environment POLYCHRONY. I am sure it will be a significant milestone in the development of the SIGNAL language, of synchronous computing in general, and of the dataflow approach to computation. In dataflow, the computation takes place in a producer-consumer network of independent processing stations. Data travels in streams and is transformed as these streams pass through the processing stations (often called filters). Dataflow is an attractive model for many reasons, not least because it corresponds to the way production, transportation, and communication are typically organized in the real world (outside cyberspace). I myself stumbled into dataflow almost against my will. In the mid-1970s, Ed Ashcroft and I set out to design a "super" structured programming language that, we hoped, would radically simplify proving assertions about programs. In the end, we decided that it had to be declarative. However, we also were determined that iterative algorithms could be expressed directly, without circumlocutions such as the use of a tail-recursive function. The language that resulted, which we named LUCID, was much less traditional than we would have liked. LUCID statements are equations in a kind of executable temporal logic that specify the (time) sequences of variables involved in an iteration.
This book gathers the latest experience of experts, research teams and leading organizations involved in computer-aided design of user interfaces of interactive applications. This area investigates how it is desirable and possible to support, to facilitate and to speed up the development life cycle of any interactive system. In particular, it stresses how the design activity could be better understood for different types of advanced interactive systems.
This book describes the state-of-the-art in RF, analog, and mixed-signal circuit design for Software Defined Radio (SDR). It synthesizes for analog/RF circuit designers the most important general design approaches to take advantage of the most recent CMOS technology, which can integrate millions of transistors, as well as several real examples from the most recent research results.
As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a computation but also about the software flow that supports the design process. The goal of this book is to help designers become comfortable with these issues, and thus be able to exploit the vast opportunities possible with reconfigurable logic.
Embedded processors are the heart of embedded systems. Reconfigurable embedded processors comprise an extended instruction set that is implemented using a reconfigurable fabric (similar to a field-programmable gate array, FPGA). This book presents novel concepts, strategies, and implementations to increase the run-time adaptivity of reconfigurable embedded processors. Concepts and techniques are presented in an accessible, yet rigorous context. A complex, realistic H.264 video encoder application with a high demand for adaptivity is presented and used as an example for motivation throughout the book. A novel, run-time system is demonstrated to exploit the potential for adaptivity and particular approaches/algorithms are presented to implement it.
Kinetic energy harvesting converts movement or vibrations into electrical energy, enabling battery-free operation of wireless sensors and autonomous devices and facilitating their placement in locations where replacing a battery is not feasible or attractive. This book provides an introduction to the operating principles and design methods of modern kinetic energy harvesting systems and explains the implications of harvested power on autonomous electronic systems design. It describes power conditioning circuits that maximize available energy and electronic systems design strategies that minimize power consumption and enable operation. The principles discussed in the book are supported by real case studies, such as battery-less monitoring sensors at water waste processing plants, embedded battery-less sensors in automotive electronics, and sensor networks built with ultra-low-power wireless nodes suitable for battery-less applications.
This book covers the practical application of dependable electronic systems in real industry, such as space, train control and automotive control systems, and network servers/routers. The impact of intermittent errors caused by environmental radiation (neutrons and alpha particles) and EMI (Electro-Magnetic Interference) is introduced together with the most advanced countermeasures. Power integration is included as one of the most important bases of dependability in electronic systems. Fundamental technical background is provided, along with practical design examples. Readers will obtain an overall picture of dependability, from failure causes to countermeasures, for their relevant systems or products, and therefore will be able to select the best choice for maximum dependability.
Dead-reckoning (DR) aided with Doppler velocity measurement has been the most common method for underwater navigation for small vehicles. Unfortunately, DR requires frequent position recalibrations, and underwater vehicle navigation systems are limited to periodic position updates when they surface. Finally, standard Global Positioning System (GPS) receivers are unable to provide the rate or precision required when used on a small vessel. To overcome this, a low-cost, high-rate motion measurement system for an Unmanned Surface Vehicle (USV) with underwater and oceanographic purposes is proposed. The proposed onboard system for the USV consists of an Inertial Measurement Unit (IMU) with accelerometers and rate gyros, a GPS receiver, a flux-gate compass, a roll and tilt sensor, and an Acoustic Doppler Current Profiler (ADCP). Interfacing all the sensors proved rather challenging because of their different characteristics. The proposed data fusion technique integrates the sensors and develops an embeddable software package, using real-time data fusion methods, for a USV to aid in navigation and control as well as controlling the onboard ADCP. While ADCPs non-intrusively measure water flow, the vessel motion needs to be removed to analyze the data, and the system developed provides the motion measurements and processing to accomplish this task.
High definition video requires substantial compression in order to be transmitted or stored economically. Advances in video coding standards from MPEG-1, MPEG-2 and MPEG-4 to H.264/AVC have provided ever-increasing coding efficiency, at the expense of great computational complexity, which can only be delivered through massively parallel processing. This book presents VLSI architectural design and chip implementation for high definition H.264/AVC video encoding, using a state-of-the-art video application, with a complete VLSI prototype, via FPGA/ASIC. It will serve as an invaluable reference for anyone interested in VLSI design and high-level (EDA) synthesis for video.
Single-threaded software applications have ceased to see significant gains in performance on a general-purpose CPU, even with further scaling in very large scale integration (VLSI) technology. This is a significant problem for electronic design automation (EDA) applications, since the design complexity of VLSI integrated circuits (ICs) is continuously growing. In this research monograph, we evaluate custom ICs, field-programmable gate arrays (FPGAs), and graphics processors as platforms for accelerating EDA algorithms, instead of the general-purpose single-threaded CPU. We study applications which are used in key time-consuming steps of the VLSI design flow. Further, these applications also have different degrees of inherent parallelism in them. We study both control-dominated EDA applications and control plus data parallel EDA applications. We accelerate these applications on these different hardware platforms. We also present an automated approach for accelerating certain uniprocessor applications on a graphics processor. This monograph compares custom ICs, FPGAs, and graphics processing units (GPUs) as potential platforms to accelerate EDA algorithms. It also provides details of the programming model used for interfacing with the GPUs.
With the increasing complexity and dynamism in today's product design and manufacturing, more optimal, robust and practical approaches and systems are needed to support product design and manufacturing activities. Multi-objective Evolutionary Optimisation for Product Design and Manufacturing presents a focused collection of quality chapters on state-of-the-art research efforts in multi-objective evolutionary optimisation, as well as their practical applications to integrated product design and manufacturing. Multi-objective Evolutionary Optimisation for Product Design and Manufacturing consists of two major sections. The first presents a broad-based review of the key areas of research in multi-objective evolutionary optimisation. The second gives in-depth treatments of selected methodologies and systems in intelligent design and integrated manufacturing. Recent developments and innovations in multi-objective evolutionary optimisation make Multi-objective Evolutionary Optimisation for Product Design and Manufacturing a useful text for a broad readership, from academic researchers to practicing engineers.
This volume contains the refereed and revised papers of the Fourth International Conference on Design Computing and Cognition (DCC'10), held in Stuttgart, Germany. The material in this book represents the state-of-the-art research and developments in design computing and design cognition. The papers are grouped under the following nine headings, describing both advances in theory and application and demonstrating the depth and breadth of design computing and design cognition: Design Cognition; Framework Models in Design; Design Creativity; Lines, Planes, Shape and Space in Design; Decision-Making Processes in Design; Knowledge and Learning in Design; Using Design Cognition; Collaborative/Collective Design; and Design Generation. This book is of particular interest to researchers, developers and users of advanced computation in design across all disciplines and to those who need to gain better understanding of designing.
This book deals with energy delivery challenges of the power processing unit of modern computer microprocessors. It describes in detail the consequences of current trends in miniaturization and clock frequency increase, upon the power delivery unit, referred to as voltage regulator. This is an invaluable reference for anybody needing to understand the key performance limitations and opportunities for improvement, from both a circuit and systems perspective, of state-of-the-art power solutions for next generation CPUs.
This book is intended to give a general overview of reliability, faults, fault models, nanotechnology, nanodevices, fault-tolerant architectures and reliability evaluation techniques. Additionally, the book provides in-depth, state-of-the-art research results and methods for fault tolerance, as well as the methodology for designing fault-tolerant systems out of highly unreliable components.
The field of large-scale dimensional metrology (LSM) deals with objects that have linear dimensions ranging from tens to hundreds of meters. It has recently attracted a great deal of interest in many areas of production, including the automotive, railway, and shipbuilding sectors. Distributed Large-Scale Dimensional Metrology introduces a new paradigm in this field that reverses the classical metrological approach: measuring systems that are portable and can be easily moved around the location of the measured object, which is preferable to moving the object itself. Distributed Large-Scale Dimensional Metrology combines the concepts of distributed systems and large scale metrology at the application level. It focuses on the latest insights and challenges of this new generation of systems from the perspective of the designers and developers. The main topics are: coverage of measuring area, sensors calibration, on-line diagnostics, probe management, and analysis of metrological performance. The general descriptions of each topic are further enriched by specific examples concerning the use of commercially available systems or the development of new prototypes. This will be particularly useful for professional practitioners such as quality engineers, manufacturing and development engineers, and procurement specialists, but Distributed Large-Scale Dimensional Metrology also has a wealth of information for interested academics.
This book tackles head-on the challenges of digital design in the era of billion-transistor SoCs. It discusses fundamental design concepts in design and coding required to produce robust, functionally correct designs. It also provides specific techniques for measuring and minimizing complexity in RTL code. Finally, it discusses the tradeoff between RTL and high-level (C-based) design and how tools and languages must progress to address the needs of tomorrow's SoC designs.
This book presents new results on applications of geometric algebra. The time when researchers and engineers were starting to realize the potential of quaternions for applications in electrical, mechanical, and control engineering passed a long time ago. Since the publication of Space-Time Algebra by David Hestenes (1966) and Clifford Algebra to Geometric Calculus: A Unified Language for Mathematics and Physics by David Hestenes and Garret Sobczyk (1984), consistent progress in the applications of geometric algebra has taken place. Particularly due to the great developments in computer technology and the Internet, researchers have proposed new ideas and algorithms to tackle a variety of problems in the areas of computer science and engineering using the powerful language of geometric algebra. In this process, pioneer groups started the conference series entitled "Applications of Geometric Algebra in Computer Science and Engineering" (AGACSE) in order to promote research activity in the domain of the application of geometric algebra. The first conference, AGACSE'1999, organized by Eduardo Bayro-Corrochano and Garret Sobczyk, took place in Ixtapa-Zihuatanejo, Mexico, in July 1999. The contributions were published in Geometric Algebra with Applications in Science and Engineering, Birkhauser, 2001. The second conference, AGACSE'2001, was held in the Engineering Department of Cambridge University on 9-13 July 2001 and was organized by Leo Dorst, Chris Doran, and Joan Lasenby. The best conference contributions appeared as a book entitled Applications of Geometric Algebra in Computer Science and Engineering, Birkhauser, 2002. The third conference, AGACSE'2008, took place in August 2008 in Grimma, Leipzig, Germany.
The importance of research and education in design continues to grow. For example, government agencies are gradually increasing funding of design research, and increasing numbers of engineering schools are revising their curricula to emphasize design. This is because of an increasing realization that design is part of the wealth creation of a nation and needs to be better understood and taught. The continuing globalization of industry and trade has required nations to re-examine where their core contributions lie if not in production efficiency. Design is a precursor to manufacturing for physical objects and is the precursor to implementation for virtual objects. At the same time, the need for sustainable development is requiring design of new products and processes, and feeding a movement towards design innovations and inventions. There are now three sources for design research: design computing, design cognition and human-centered information technology. The foundations for much of design computing remain artificial intelligence with its focus on ways of representation and on processes that support simulation and generation. Artificial intelligence continues to provide an environmentally rich paradigm within which design research based on computational constructions can be carried out. Design cognition is founded on concepts from cognitive science, an even newer area than artificial intelligence. It provides tools and methods to study human designers in both laboratory and practice settings.