This book contains the papers from the IFIP Working Group 8.1 conference on Situational Method Engineering. Over the last decade, Method Engineering, defined as the engineering discipline for designing, constructing and adapting methods, including supportive tools, has emerged as a research and application area for the use of methods in systems development.
A Designer's Guide to VHDL Synthesis is intended both for design engineers who want to use VHDL-based logic synthesis to design ASICs and for managers who need to gain a practical understanding of the issues involved in using this technology. The emphasis is placed on practical applications of VHDL and synthesis based on actual experiences, rather than on a more theoretical approach to the language. VHDL and logic synthesis tools provide very powerful capabilities for ASIC design, but are also very complex and represent a radical departure from traditional design methods. This situation has made it difficult for both designers and management to get started with this technology, since a major learning effort and culture change is required. A Designer's Guide to VHDL Synthesis has been written to help design engineers and other professionals successfully make the transition to a design methodology based on VHDL and logic synthesis instead of the more traditional schematic-based approach. While there are a number of texts on the VHDL language and its use in simulation, little has been written from a designer's viewpoint on how to use VHDL and logic synthesis to design real ASIC systems. The material in this book is based on experience gained in successfully using these techniques for ASIC design and relies heavily on realistic examples to demonstrate the principles involved.
The advent of multimedia technology is creating a number of new problems in the fields of computer and communication systems. Perhaps the most important of these problems in communication, and certainly the most interesting, is that of designing networks to carry multimedia traffic, including digital audio and video, with acceptable quality. The main challenge in integrating the different services needed by the different types of traffic into the same network (an objective that is made worthwhile by its obvious economic advantages) is to satisfy the performance requirements of continuous media applications, as the quality of audio and video streams at the receiver can be guaranteed only if bounds on delay, delay jitter, bandwidth, and reliability are guaranteed by the network. Since such guarantees cannot be provided by traditional packet-switching technology, a number of researchers and research groups during the last several years have tried to meet the challenge by proposing new protocols, or modifications of old ones, to make packet-switching networks capable of delivering audio and video with good quality while carrying all sorts of other traffic. The focus of this book is on HeiTS (the Heidelberg Transport System) and its contributions to integrated services network design. The HeiTS architecture is based on using the Internet Stream Protocol Version 2 (ST-II) at the network layer. The Heidelberg researchers were the first to implement ST-II. The author documents this activity in the book and provides thorough coverage of the improvements made to the protocol. The book also includes coverage of HeiTP as used in error handling, error control, and congestion control, and the full specification of ST2+, a new version of ST-II. The ideas and techniques implemented by the Heidelberg group and their coverage in this volume apply to many other approaches to multimedia networking.
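The QoS idea described above, that stream quality is acceptable only if the network keeps delay and jitter within bounds, can be illustrated with a minimal sketch. The function name, bound values, and packet delays below are hypothetical illustrations, not part of HeiTS or ST-II:

```python
# Hedged sketch of the QoS-bound idea: check a stream's observed per-packet
# delays against a delay bound and a simple jitter bound (max - min delay).
# All names and numbers are illustrative, not from the HeiTS protocol suite.

def meets_bounds(delays_ms, max_delay_ms, max_jitter_ms):
    """True if every packet delay is within the delay bound and the spread
    of delays (a simple jitter measure) is within the jitter bound."""
    return (max(delays_ms) <= max_delay_ms
            and (max(delays_ms) - min(delays_ms)) <= max_jitter_ms)

# A stream whose delays stay near 40 ms satisfies a 50 ms / 5 ms bound pair.
ok = meets_bounds([40.0, 42.5, 41.0], max_delay_ms=50.0, max_jitter_ms=5.0)
```

A real integrated-services network enforces such bounds by reservation and scheduling rather than after-the-fact checking; the sketch only shows what "guaranteed bounds" means for a receiver.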
Targeted audience: * Specialists in numerical computations, especially in numerical optimization, who are interested in designing algorithms with automatic result verification, and who would therefore be interested in knowing how general their algorithms can in principle be. * Mathematicians and computer scientists who are interested in the theory of computing and computational complexity, especially the computational complexity of numerical computations. * Students in applied mathematics and computer science who are interested in the computational complexity of different numerical methods and in learning general techniques for estimating this computational complexity. The book is written with all explanations and definitions added, so that it can be used as a graduate-level textbook. What this book is about: Data processing. In many real-life situations, we are interested in the value of a physical quantity y that is difficult (or even impossible) to measure directly. For example, it is impossible to directly measure the amount of oil in an oil field or the distance to a star. Since we cannot measure such quantities directly, we measure them indirectly, by measuring some other quantities x_i and using the known relation between y and the x_i's to reconstruct y. The algorithm that transforms the results of measuring the x_i into an estimate for y is called data processing.
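The definition of data processing above can be sketched concretely: y is reconstructed from measured quantities x_i through a known relation. The oil-field relation, function name, and values below are hypothetical illustrations, not taken from the book:

```python
# Minimal sketch of "data processing" as defined above: a quantity y that
# cannot be measured directly (here, oil volume) is estimated from measured
# quantities x_i via a known relation. The relation and numbers are
# illustrative assumptions, not from the book.

def data_processing(area_m2: float, thickness_m: float, porosity: float) -> float:
    """Estimate y (oil volume, m^3) from the indirect measurements x_i."""
    return area_m2 * thickness_m * porosity

# Measured x_1, x_2, x_3 -> estimate of y
estimate = data_processing(area_m2=2.0e6, thickness_m=15.0, porosity=0.2)
```

Result verification, the book's subject, then asks how the measurement errors in the x_i propagate into guaranteed bounds on this estimate.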
This useful book addresses electrothermal problems in modern VLSI systems. It discusses electrothermal phenomena and the fundamental building blocks that electrothermal simulation requires. The authors present three important applications of VLSI electrothermal analysis: temperature-dependent electromigration diagnosis, cell-level thermal placement, and temperature-driven power and timing analysis.
The availability of effective global communication facilities in the last decade has changed the business goals of many manufacturing enterprises. They need to remain competitive by developing products and processes which are specific to individual requirements, completely packaged and manufactured globally. Networks of enterprises are formed to operate across time and space with world-wide distributed functions such as manufacturing, sales, customer support, engineering, quality assurance, supply chain management and so on. Research and technology development need to address architectures, methodologies, models and tools supporting intra- and inter-enterprise operation and management. Throughout the life cycle of products and enterprises there is the requirement to transform information sourced from globally distributed offices and partners into knowledge for decision and action. Building on the success of previous DIISM conferences (Tokyo 1993, Eindhoven 1996, Fort Worth 1998), the fourth International Conference on Design of Information Infrastructure Systems for Manufacturing (DIISM 2000) aims to: * Establish and manage the dynamics of virtual enterprises, define the information system requirements and develop solutions; * Develop and deploy information management in multi-cultural systems with universal applicability of the proposed architecture and solutions; * Develop enterprise integration architectures, methodologies and information infrastructure support for reconfigurable enterprises; * Explore information transformation into knowledge for decision and action by machines and skilful people. These objectives reflect changes in business processes due to advances in information and communication technologies (ICT) over the last couple of years.
This book brings together in one place important contributions and state-of-the-art research in the rapidly advancing area of analog VLSI neural networks. The book serves as an excellent reference, providing insights into some of the most important issues in analog VLSI neural networks research efforts.
Here is a comprehensive presentation of a methodology for the design and synthesis of an intelligent complex robotic system, connecting formal tools from discrete system theory, artificial intelligence, neural networks, and fuzzy logic. The necessary methods for solving real-time action planning, coordination and control problems are described. A notable chapter presents a new approach to intelligent robotic agent control acting in a real-world environment, based on a lifelong learning approach combining cognitive and reactive capabilities. Another key feature is the homogeneous description of all solutions and methods based on system theory formalism.
This guidebook on e-science presents real-world examples of practices and applications, demonstrating how a range of computational technologies and tools can be employed to build essential infrastructures supporting next-generation scientific research. Each chapter provides introductory material on core concepts and principles, as well as descriptions and discussions of relevant e-science methodologies, architectures, tools, systems, services and frameworks. Features: includes contributions from an international selection of preeminent e-science experts and practitioners; discusses use of mainstream grid computing and peer-to-peer grid technology for "open" research and resource sharing in scientific research; presents varied methods for data management in data-intensive research; investigates issues of e-infrastructure interoperability, security, trust and privacy for collaborative research; examines workflow technology for the automation of scientific processes; describes applications of e-science.
This book presents and discusses the most recent innovations, trends, results, experiences and concerns with regard to information systems. Individual chapters focus on IT for facility management, process management and applications, corporate information systems, and design and manufacturing automation. The book includes new findings on software engineering, the industrial internet, the engineering cloud and advanced BPM methods. It presents the latest research on intelligent information systems, computational intelligence methods in information systems and new trends in Business Process Management, making it a valuable resource for both researchers and practitioners looking to expand their information systems expertise.
The Dynamics program and handbook allow the reader to explore nonlinear dynamics and chaos through illustrated graphics. It is suitable for research and educational needs. This new edition allows the program to run 3 times faster on the processes that are time consuming. Other major changes include: 1. An add-your-own-equation facility, which makes a compiler unnecessary. PD and Lyapunov exponents and the Newton method for finding periodic orbits can all be carried out numerically without adding specific code for partial derivatives. 2. The program supports color PostScript. 3. A new menu system in which the user is prompted with options when a command is chosen, making the program much easier to learn and to remember in comparison to the current version. 4. Mouse support has been added. 5. The program can use the expanded memory available on modern PCs, so pictures have higher resolution. 6. Due to limited space, much of the source code will be available on the web, although some of it will remain on the disk, so that Unix users still have to purchase the book. This will allow minor upgrades for Unix users. There are also many minor changes, such as a zoom facility and a help facility.
Memory Issues in Embedded Systems-On-Chip: Optimizations and Explorations is designed for different groups in the embedded systems-on-chip arena. First, it is designed for researchers and graduate students who wish to understand the research issues involved in memory system optimization and exploration for embedded systems-on-chip. Second, it is intended for designers of embedded systems who are migrating from a traditional microcontroller-centered, board-based design methodology to newer design methodologies using IP blocks for processor-core-based embedded systems-on-chip. Also, since Memory Issues in Embedded Systems-on-Chip: Optimizations and Explorations illustrates a methodology for optimizing and exploring the memory configuration of embedded systems-on-chip, it is intended for managers and system designers who may be interested in the emerging capabilities of embedded systems-on-chip design methodologies for memory-intensive applications.
Integrated circuit technology is widely used for the full integration of electronic systems. In general, these systems are realized using digital techniques implemented in CMOS technology. The low power dissipation, high packing density, high noise immunity, ease of design and the relative ease of scaling are the driving forces of CMOS technology for digital applications. Parts of these systems cannot be implemented in the digital domain and will remain analog. In order to achieve complete system integration, these analog functions are preferably integrated in the same CMOS technology. An important class of analog circuits that need to be integrated in CMOS are analog filters. This book deals with very high frequency (VHF) filters, which are filters with cut-off frequencies ranging from the low megahertz range to several hundreds of megahertz. Until recently the maximal cut-off frequencies of CMOS filters were limited to the low megahertz range. By applying the techniques presented in this book the limit could be pushed into the true VHF domain, and integrated VHF filters become feasible. Applications of these VHF filters can be found in the fields of communication, instrumentation and control systems. Examples include pre- and post-filtering for high-speed AD and DA converters, signal reconstruction and signal decoding. The general design philosophy used in this book is to allow only the absolute minimum of signal carrying nodes throughout the whole filter. This strategy starts at the filter synthesis level and is extended to the level of electronic circuitry. The result is a filter realization in which all capacitors (including parasitics) have a desired function. The advantage of this technique is that high frequency parasitic effects (parasitic poles/zeros) are minimally present. The book is a reference for engineers in research or development, and is suitable for use as a text for advanced courses on the subject.
Operations Research and Cyber-Infrastructure is the companion volume to the Eleventh INFORMS Computing Society Conference (ICS 2009), held in Charleston, South Carolina, from January 11 to 13, 2009. It includes 24 high-quality refereed research papers. As always, the focus of interest for ICS is the interface between Operations Research and Computer Science, and the papers in this volume reflect that interest. This is naturally an evolving area as computational power increases rapidly while decreasing in cost even more quickly. The papers included here illustrate the wide range of topics at this interface. For convenience, they are grouped in broad categories and subcategories. There are three papers on modeling, reflecting the impact of recent development in computing on that area. Eight papers are on optimization (three on integer programming, two on heuristics, and three on general topics, of which two involve stochastic/probabilistic processes). Finally, there are thirteen papers on applications (three on the conference theme of cyber-infrastructure, four on routing, and six on other interesting topics). Several of the papers could be classified in more than one way, reflecting the interactions between these topic areas.
The ways in which humans communicate with one another are constantly evolving. Technology plays a large role in this evolution via new methods and avenues of social and business interaction. The Handbook of Research on Human Interaction and the Impact of Information Technologies is a primary reference source featuring the latest scholarly perspectives on technological breakthroughs in user operation and the processes of communication in the digital era. Including a number of topics such as health information technology, multimedia, and social media, this publication is ideally designed for professionals, technology developers, and researchers seeking current research on technology's role in communication.
This volume contains the invited and regular papers presented at TCS 2010, the 6th IFIP International Conference on Theoretical Computer Science, organised by IFIP Technical Committee 1 (Foundations of Computer Science) and IFIP WG 2.2 (Formal Descriptions of Programming Concepts) in association with SIGACT and EATCS. TCS 2010 was part of the World Computer Congress held in Brisbane, Australia, during September 20-23, 2010. TCS 2010 is composed of two main areas: (A) Algorithms, Complexity and Models of Computation, and (B) Logic, Semantics, Specification and Verification. The selection process led to the acceptance of 23 papers out of 39 submissions, each of which was reviewed by three Programme Committee members. The Programme Committee discussion was held electronically using EasyChair. The invited speakers at TCS 2010 were: Rob van Glabbeek (NICTA, Australia), Bart Jacobs (Nijmegen, The Netherlands), Catuscia Palamidessi (INRIA and LIX, Paris, France) and Sabina Rossi (Venice, Italy). James Harland (Australia) and Barry Jay (Australia) acted as TCS 2010 Chairs. We take this occasion to thank the members of the Programme Committees and the external reviewers for their professional and timely work; the conference Chairs for their support; the invited speakers for their scholarly contribution; and of course the authors for submitting their work to TCS 2010.
This book presents some of the most recent research results in the area of machine learning and robot perception. The chapters represent new ways of solving real-world problems. The book covers topics such as intelligent object detection, foveated vision systems, online learning paradigms, reinforcement learning for a mobile robot, object tracking and motion estimation, 3D model construction, computer vision system and user modelling using dialogue strategies. This book will appeal to researchers, senior undergraduate/postgraduate students, application engineers and scientists.
Automatic biometrics recognition techniques are becoming increasingly important in corporate and public security systems, and the range of available methods has grown with the field's rapid development. "Behavioral Biometrics for Human Identification: Intelligent Applications" discusses classic behavioral biometrics and collects the latest advances in techniques, theoretical approaches, and dynamic applications. A critical mass of research, this innovative collection serves as an important reference tool for researchers, practitioners, academicians, and technologists.
Adult students demand a wider variety of instructional strategies that encompass real-world, interactive, cooperative, and discovery learning experiences. "Designing Instruction for the Traditional, Adult, and Distance Learner: A New Engine for Technology-Based Teaching" explores how technology impacts the process of devising instructional plans as well as learning itself in adult students. Containing research from leading international experts, this publication proposes realistic and accurate archetypes to assist educators in incorporating state-of-the-art technologies into online instruction.
Since their introduction in 1984, Field-Programmable Gate Arrays (FPGAs) have become one of the most popular implementation media for digital circuits and have grown into a $2 billion per year industry. As process geometries have shrunk into the deep-submicron region, the logic capacity of FPGAs has greatly increased, making FPGAs a viable implementation alternative for larger and larger designs. To make the best use of these new deep-submicron processes, one must re-design one's FPGAs and Computer-Aided Design (CAD) tools. Architecture and CAD for Deep-Submicron FPGAs addresses several key issues in the design of high-performance FPGA architectures and CAD tools, with particular emphasis on issues that are important for FPGAs implemented in deep-submicron processes. Three factors combine to determine the performance of an FPGA: the quality of the CAD tools used to map circuits into the FPGA, the quality of the FPGA architecture, and the electrical (i.e. transistor-level) design of the FPGA. Architecture and CAD for Deep-Submicron FPGAs examines all three of these issues in concert. In order to investigate the quality of different FPGA architectures, one needs CAD tools capable of automatically implementing circuits in each FPGA architecture of interest. Once a circuit has been implemented in an FPGA architecture, one next needs accurate area and delay models to evaluate the quality (speed achieved, area required) of the circuit implementation in the FPGA architecture under test. This book therefore has three major foci: the development of a high-quality and highly flexible CAD infrastructure, the creation of accurate area and delay models for FPGAs, and the study of several important FPGA architectural issues. Architecture and CAD for Deep-Submicron FPGAs is an essential reference for researchers, professionals and students interested in FPGAs.
Formal Description Techniques and Protocol Specification, Testing and Verification addresses formal description techniques (FDTs) applicable to distributed systems and communication protocols. It aims to present the state of the art in theory, application, tools and industrialization of FDTs. Among the important features presented are: FDT-based system and protocol engineering; FDT application to distributed systems; protocol engineering; and practical experience and case studies. Formal Description Techniques and Protocol Specification, Testing and Verification comprises the proceedings of the Joint International Conference on Formal Description Techniques for Distributed Systems and Communication Protocols and Protocol Specification, Testing and Verification, sponsored by the International Federation for Information Processing, held in November 1998 in Paris, France. It is suitable as a secondary text for a graduate-level course on distributed systems or communications, and as a reference for researchers and practitioners in industry.
Intelligent Multimodal Information Presentation relates to the ability of a computer system to automatically produce interactive information presentations, taking into account specifics about the user, such as needs, interests and knowledge, and engaging in a collaborative interaction that helps the retrieval of relevant information and its understanding on the part of the user. The volume includes descriptions of some of the most representative recent works on Intelligent Information Presentation and a view of the challenges ahead. Audience: practitioners in the field of human-computer interfaces, as well as a larger audience interested in the issues related to the effectiveness of information presentation in different scenarios, including educational entertainment and electronic commerce.
You may like...
* Stream-Tube Method - A Complex-Fluid… by Jean-Robert Clermont, Amine Ammar (Hardcover, R4,965 / Discovery Miles 49 650)
* Numerical Ship Hydrodynamics - An… by Lars Larsson, Frederick Stern, … (Hardcover, R4,765 / Discovery Miles 47 650)
* ECCOMAS Multidisciplinary Jubilee… by Josef Eberhardsteiner, Christian Hellmich, … (Hardcover, R4,210 / Discovery Miles 42 100)