This book contains the papers from the IFIP Working Group 8.1 conference on Situational Method Engineering. Over the last decade, Method Engineering, defined as the engineering discipline for designing, constructing and adapting methods, including supportive tools, has emerged as a research and application area for the use of methods in systems development.
This handbook provides design considerations and rules of thumb to ensure that the functionality you want will work. It brings together all the information needed by systems designers to develop applications that include configurability, from the simplest implementations to the most complicated.
The design of digital (computer) systems requires several design phases: from the behavioural design, over the logical structural design, to the physical design, where the logical structure is implemented in the physical structure of the system (the chip). Due to the ever increasing demands on computer system performance, the physical design phase has become one of the most complex steps in the entire process. The major goal of this book is to develop a priori wire length estimation methods that can help the designer find a good layout of a circuit in fewer iterations of the physical design steps, and that are useful for comparing different physical architectures.

For modelling digital circuits, the interconnection complexity is of major importance. It can be described by the so-called Rent's rule and the Rent exponent. A Priori Wire Length Estimates for Digital Design provides the reader with more insight into this rule and clearly outlines when and where the rule can be used and when and where it fails. Also, for the first time, a comprehensive model for the partitioning behaviour of multi-terminal nets is developed. This leads to a new circuit parameter that describes the distribution of net degrees over the nets in the circuit. This multi-terminal net model is used throughout the book for the wire length estimates, but it also induces a method for generating synthetic benchmark circuits that has major advantages over existing benchmark generators.

In the domain of wire length estimation, the most important contributions of this work are (i) a new model for placement optimization in a physical (computer) architecture and (ii) the inclusion of the multi-terminal net model in the wire length estimates. The combination of the placement optimization model with Donath's model for hierarchical partitioning and placement results in more accurate wire length estimates. The multi-terminal net model allows accurate assessments of the impact of multi-terminal nets on wire length estimates. We distinguish between delay-related applications, for which the length of source-sink pairs is important, and routing-related applications, for which the entire (Steiner) length of the multi-terminal net has to be taken into account. The wire length models are further extended by taking into account the interconnections between internal components and the chip boundary. The application of the models to three-dimensional systems broadens the scope to more exotic architectures and to opto-electronic design techniques. We focus on anisotropic three-dimensional systems and propose a way to estimate wire lengths for opto-electronic systems. The wire length estimates can be used for predicting circuit characteristics, for improving placement and routing tools in Computer-Aided Design, and for evaluating new computer architectures. All new models are validated with experiments on benchmark circuits.
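For orientation, Rent's rule is conventionally written as follows (the notation here is the standard one, not necessarily the book's):

```latex
% Rent's rule: a subcircuit containing g gates has, on average,
% T external terminals, where t is the average number of terminals
% per gate and p (with 0 <= p <= 1) is the Rent exponent.
T = t \, g^{p}
```

The larger the Rent exponent p, the more external wiring a partition of the circuit requires, which is why p is the key parameter in a priori wire length estimation.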
This guidebook on e-science presents real-world examples of practices and applications, demonstrating how a range of computational technologies and tools can be employed to build essential infrastructures supporting next-generation scientific research. Each chapter provides introductory material on core concepts and principles, as well as descriptions and discussions of relevant e-science methodologies, architectures, tools, systems, services and frameworks. Features: includes contributions from an international selection of preeminent e-science experts and practitioners; discusses use of mainstream grid computing and peer-to-peer grid technology for "open" research and resource sharing in scientific research; presents varied methods for data management in data-intensive research; investigates issues of e-infrastructure interoperability, security, trust and privacy for collaborative research; examines workflow technology for the automation of scientific processes; describes applications of e-science.
Content-based multimedia retrieval is a challenging research field with many unsolved problems. This monograph details concepts and algorithms for robust and efficient information retrieval of two different types of multimedia data: waveform-based music data and human motion data. It first examines several approaches in music information retrieval, in particular general strategies as well as efficient algorithms. The book then introduces a general and unified framework for motion analysis, retrieval, and classification, highlighting the design of suitable features, the notion of similarity used to compare data streams, and data organization.
The availability of effective global communication facilities in the last decade has changed the business goals of many manufacturing enterprises. They need to remain competitive by developing products and processes which are specific to individual requirements, completely packaged and manufactured globally. Networks of enterprises are formed to operate across time and space with world-wide distributed functions such as manufacturing, sales, customer support, engineering, quality assurance, supply chain management and so on. Research and technology development need to address architectures, methodologies, models and tools supporting intra- and inter-enterprise operation and management. Throughout the life cycle of products and enterprises there is the requirement to transform information sourced from globally distributed offices and partners into knowledge for decision and action. Building on the success of previous DIISM conferences (Tokyo 1993, Eindhoven 1996, Fort Worth 1998), the fourth International Conference on Design of Information Infrastructure Systems for Manufacturing (DIISM 2000) aims to:
* Establish and manage the dynamics of virtual enterprises, define the information system requirements and develop solutions;
* Develop and deploy information management in multi-cultural systems with universal applicability of the proposed architecture and solutions;
* Develop enterprise integration architectures, methodologies and information infrastructure support for reconfigurable enterprises;
* Explore information transformation into knowledge for decision and action by machines and skilful people.
These objectives reflect changes in business processes due to advancements in information and communication technologies (ICT) over the last couple of years.
This book brings together in one place important contributions and state-of-the-art research in the rapidly advancing area of analog VLSI neural networks. The book serves as an excellent reference, providing insights into some of the most important issues in analog VLSI neural networks research efforts.
Here is a comprehensive presentation of methodology for the design and synthesis of an intelligent complex robotic system, connecting formal tools from discrete system theory, artificial intelligence, neural networks, and fuzzy logic. The necessary methods for solving real-time action planning, coordination and control problems are described. A notable chapter presents a new approach to intelligent robotic agent control acting in a real-world environment, based on a lifelong learning approach combining cognitive and reactive capabilities. Another key feature is the homogeneous description of all solutions and methods based on system theory formalism.
Targeted audience:
* Specialists in numerical computations, especially in numerical optimization, who are interested in designing algorithms with automatic result verification, and who would therefore be interested in knowing how general their algorithms can in principle be.
* Mathematicians and computer scientists who are interested in the theory of computing and computational complexity, especially the computational complexity of numerical computations.
* Students in applied mathematics and computer science who are interested in the computational complexity of different numerical methods and in learning general techniques for estimating this computational complexity.
The book is written with all explanations and definitions added, so that it can be used as a graduate-level textbook.
What this book is about: data processing. In many real-life situations, we are interested in the value of a physical quantity y that is difficult (or even impossible) to measure directly. For example, it is impossible to directly measure the amount of oil in an oil field or the distance to a star. Since we cannot measure such quantities directly, we measure them indirectly: we measure some other quantities x_i and use the known relation between y and the x_i's to reconstruct y. The algorithm that transforms the results of measuring the x_i into an estimate for y is called data processing.
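As a minimal sketch of this definition of data processing (a hypothetical example, not taken from the book): the distance to a star cannot be measured directly, but its parallax angle can, and the known relation d = 1/p (distance in parsecs, parallax in arcseconds) reconstructs the distance:

```python
def process(parallax_arcsec: float) -> float:
    """Data processing: transform the measured quantity x (parallax)
    into an estimate for the quantity of interest y (distance),
    using the known relation y = f(x) = 1/x."""
    return 1.0 / parallax_arcsec

# Proxima Centauri's parallax is roughly 0.768 arcseconds,
# giving an estimated distance of about 1.30 parsecs.
print(process(0.768))
```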
This book presents and discusses the most recent innovations, trends, results, experiences and concerns with regard to information systems. Individual chapters focus on IT for facility management, process management and applications, corporate information systems, and design and manufacturing automation. The book includes new findings on software engineering, the industrial internet, engineering cloud and advanced BPM methods. It presents the latest research on intelligent information systems, computational intelligence methods in information systems and new trends in business process management, making it a valuable resource for both researchers and practitioners looking to expand their information systems expertise.
This useful book addresses electrothermal problems in modern VLSI systems. It discusses electrothermal phenomena and the fundamental building blocks that electrothermal simulation requires. The authors present three important applications of VLSI electrothermal analysis: temperature-dependent electromigration diagnosis, cell-level thermal placement, and temperature-driven power and timing analysis.
This monograph develops a framework for modeling and solving utility maximization problems in nonconvex wireless systems. The first part develops a model for utility optimization in wireless systems. The model is general enough to encompass a wide array of system configurations and performance objectives. Based on the general model, a set of methods for solving utility maximization problems is developed in the second part of the book. The development is based on a careful examination of the properties that are required for the application of each method. This part focuses on problems whose initial formulation does not allow for a solution by standard methods and discusses alternative approaches. The last part presents two case studies to demonstrate the application of the proposed framework. In both cases, utility maximization in multi-antenna broadcast channels is investigated.
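In generic terms, such a problem takes the standard network utility maximization form shown below; this is offered only as orientation and is not the book's exact model:

```latex
% Choose the rate vector r from the feasible rate region R of the
% wireless system so as to maximize the sum of the users' utilities;
% in the settings the book addresses, R is typically nonconvex.
\max_{\mathbf{r} \in \mathcal{R}} \; \sum_{k=1}^{K} U_k(r_k)
```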
One of the fastest growing areas in computer science, granular computing covers theories, methodologies, techniques, and tools that make use of granules in complex problem solving and reasoning. Novel Developments in Granular Computing: Applications for Advanced Human Reasoning and Soft Computation analyzes developments and current trends in granular computing, reviewing the most influential research and predicting future trends. This book not only presents a comprehensive summary of existing practices but also enhances understanding of human reasoning.
The Dynamics program and handbook allow the reader to explore nonlinear dynamics and chaos through illustrated graphics. It is suitable for research and educational needs. This new edition allows the program to run three times faster on the processes that are time-consuming. Other major changes include:
1. An add-your-own-equation facility, which makes a compiler unnecessary. Partial derivatives, Lyapunov exponents and Newton's method for finding periodic orbits can all be carried out numerically without adding specific code for the partial derivatives.
2. Support for color PostScript.
3. A new menu system in which the user is prompted with options when a command is chosen, making the program much easier to learn and remember than the current version.
4. Mouse support.
5. The ability to use the expanded memory available on modern PCs, so pictures have higher resolution.
6. Due to limited space, much of the source code will be available on the web, although some of it will remain on the disk, so that Unix users still have to purchase the book. This will allow minor upgrades for Unix users.
There are also many minor changes, such as a zoom facility and a help facility.
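As a sketch of the kind of derivative-free numerical computation mentioned in point 1 (an illustration under our own assumptions, not code from the Dynamics package), the Lyapunov exponent of a one-dimensional map can be estimated by averaging log|f'(x)| along an orbit, with f' approximated by central differences:

```python
import math

def f(x, r=4.0):
    # Logistic map: a standard example of a chaotic one-dimensional system.
    return r * x * (1.0 - x)

def lyapunov(x0=0.3, n=100_000, h=1e-7):
    """Estimate the Lyapunov exponent of f by averaging log|f'(x)| along
    an orbit, approximating f' by central differences so that no
    hand-coded derivative is needed."""
    x, total = x0, 0.0
    for _ in range(n):
        deriv = (f(x + h) - f(x - h)) / (2.0 * h)
        total += math.log(abs(deriv))
        x = f(x)
    return total / n

# For the logistic map at r = 4 the exact value is ln 2, about 0.693.
print(lyapunov())
```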
The seventh book in the CHDL Series is composed of a selection of the best articles from the Forum on Specification and Design Languages (FDL'04). FDL is the European forum for learning about and exchanging views on new trends in the application of languages and models for the design of electronic and heterogeneous systems. The forum was structured around four workshops, all of which are represented in the book by outstanding articles: Analog and Mixed-Signal Systems, UML-based System Specification and Design, C/C++-Based System Design, and Languages for Formal Specification and Verification. The Analog and Mixed-Signal Systems contributions bring some answers to the difficult problem of co-simulating discrete and continuous models of computation. The UML-based System Specification and Design chapters bring insight into how to use Model Driven Engineering to design Systems-on-Chip. The C/C++-Based System Design articles mainly explore system-level design with SystemC. Overall, Advances in Design and Specification Languages for SoCs is an excellent opportunity to catch up with the latest research developments in the field of languages for electronic and heterogeneous system design.
Integrated circuit technology is widely used for the full integration of electronic systems. In general, these systems are realized using digital techniques implemented in CMOS technology. The low power dissipation, high packing density, high noise immunity, ease of design and the relative ease of scaling are the driving forces of CMOS technology for digital applications. Parts of these systems cannot be implemented in the digital domain and will remain analog. In order to achieve complete system integration, these analog functions are preferably integrated in the same CMOS technology. An important class of analog circuits that need to be integrated in CMOS are analog filters. This book deals with very high frequency (VHF) filters, which are filters with cut-off frequencies ranging from the low megahertz range to several hundreds of megahertz. Until recently the maximal cut-off frequencies of CMOS filters were limited to the low megahertz range. By applying the techniques presented in this book, the limit could be pushed into the true VHF domain, and integrated VHF filters become feasible. Applications of these VHF filters can be found in the field of communication, instrumentation and control systems: for example, pre- and post-filtering for high-speed AD and DA converters, signal reconstruction, signal decoding, etc. The general design philosophy used in this book is to allow only the absolute minimum of signal-carrying nodes throughout the whole filter. This strategy starts at the filter synthesis level and is extended to the level of electronic circuitry. The result is a filter realization in which all capacitors (including parasitics) have a desired function. The advantage of this technique is that high-frequency parasitic effects (parasitic poles/zeros) are minimally present. The book is a reference for engineers in research or development, and is suitable for use as a text for advanced courses on the subject.
The Turn analyzes research on information seeking and retrieval (IS&R) and proposes a new direction that integrates research in these two areas: the fields should turn off their separate and narrow paths and construct a new avenue of research. An essential direction for this avenue is context, as given in the subtitle Integration of Information Seeking and Retrieval in Context. Other essential themes in the book include: IS&R research models, frameworks and theories; search and work tasks and situations in context; interaction between humans and machines; information acquisition, relevance and information use; and research design and methodology based on a structured set of explicit variables, all set into the holistic cognitive approach. The present monograph invites the reader into a construction project: there is much research to do for a contextual understanding of IS&R. The Turn represents a wide-ranging perspective of IS&R by providing a novel, unique research framework, covering both individual and social aspects of information behavior, including the generation, searching, retrieval and use of information. Regarding traditional laboratory information retrieval research, the monograph proposes the extension of research toward actors, search and work tasks, IR interaction and utility of information. Regarding traditional information seeking research, it proposes the extension toward information access technology and work task contexts. The Turn is the first synthesis of research in the broad area of IS&R, ranging from systems-oriented laboratory IR research to social-science-oriented information seeking studies.
A book that took 10 years to make! A book about a bygone era of computing that never really rolled over and played dead; it was more like it dug a tunnel and went underground. Here is a modern collection of ancient writings about a computer thought of as extinct: the Commodore! Relive, or discover for the first time, what it was like to use and work with the best-selling single-board computer in history through the eyes of one who still admires its complex simplicity.
Executing Data Quality Projects, Second Edition presents a structured yet flexible approach for creating, improving, sustaining and managing the quality of data and information within any organization. Studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. Help is here! This book describes a proven Ten Step approach that combines a conceptual framework for understanding information quality with techniques, tools, and instructions for practically putting the approach to work, with the end result of high-quality trusted data and information, so critical to today's data-dependent organizations. The Ten Steps approach applies to all types of data and all types of organizations: for-profit in any industry, non-profit, government, education, healthcare, science, research, and medicine. This book includes numerous templates, detailed examples, and practical advice for executing every step. At the same time, readers are advised on how to select relevant steps and apply them in different ways to best address the many situations they will face. The layout allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, best practices, and warnings. The experience of actual clients and users of the Ten Steps provides real examples of outputs for the steps, plus highlighted sidebar case studies called Ten Steps in Action. This book uses projects as the vehicle for data quality work, using the word broadly to include: 1) focused data quality improvement projects, such as improving data used in supply chain management; 2) data quality activities in other projects, such as building new applications and migrating data from legacy systems, integrating data because of mergers and acquisitions, or untangling data due to organizational breakups; and 3) ad hoc use of data quality steps, techniques, or activities in the course of daily work. The Ten Steps approach can also be used to enrich an organization's standard SDLC (whether sequential or Agile), and it complements general improvement methodologies such as Six Sigma or Lean. No two data quality projects are the same, but the flexible nature of the Ten Steps means the methodology can be applied to all. The new Second Edition highlights topics such as artificial intelligence and machine learning, the Internet of Things, security and privacy, analytics, legal and regulatory requirements, data science, big data, data lakes, and cloud computing, among others, to show their dependence on data and information and why data quality is more relevant and critical now than ever before.
Data science has always been an effective way of extracting knowledge and insights from information in various forms. One industry that can utilize the benefits from the advances in data science is the healthcare field. The Handbook of Research on Data Science for Effective Healthcare Practice and Administration is a critical reference source that overviews the state of data analysis as it relates to current practices in the health sciences field. Covering innovative topics such as linear programming, simulation modeling, network theory, and predictive analytics, this publication is recommended for all healthcare professionals, graduate students, engineers, and researchers that are seeking to expand their knowledge of efficient techniques for information analysis in the healthcare professions.
Operations Research and Cyber-Infrastructure is the companion volume to the Eleventh INFORMS Computing Society Conference (ICS 2009), held in Charleston, South Carolina, from January 11 to 13, 2009. It includes 24 high-quality refereed research papers. As always, the focus of interest for ICS is the interface between Operations Research and Computer Science, and the papers in this volume reflect that interest. This is naturally an evolving area as computational power increases rapidly while decreasing in cost even more quickly. The papers included here illustrate the wide range of topics at this interface. For convenience, they are grouped in broad categories and subcategories. There are three papers on modeling, reflecting the impact of recent development in computing on that area. Eight papers are on optimization (three on integer programming, two on heuristics, and three on general topics, of which two involve stochastic/probabilistic processes). Finally, there are thirteen papers on applications (three on the conference theme of cyber-infrastructure, four on routing, and six on other interesting topics). Several of the papers could be classified in more than one way, reflecting the interactions between these topic areas.
Memory Issues in Embedded Systems-On-Chip: Optimizations and Explorations is designed for different groups in the embedded systems-on-chip arena. First, it is designed for researchers and graduate students who wish to understand the research issues involved in memory system optimization and exploration for embedded systems-on-chip. Second, it is intended for designers of embedded systems who are migrating from a traditional microcontroller-centered, board-based design methodology to newer design methodologies using IP blocks for processor-core-based embedded systems-on-chip. Also, since the book illustrates a methodology for optimizing and exploring the memory configuration of embedded systems-on-chip, it is intended for managers and system designers who may be interested in the emerging capabilities of embedded systems-on-chip design methodologies for memory-intensive applications.
This book presents some of the most recent research results in the area of machine learning and robot perception. The chapters represent new ways of solving real-world problems. The book covers topics such as intelligent object detection, foveated vision systems, online learning paradigms, reinforcement learning for a mobile robot, object tracking and motion estimation, 3D model construction, computer vision system and user modelling using dialogue strategies. This book will appeal to researchers, senior undergraduate/postgraduate students, application engineers and scientists.
Adult students demand a wider variety of instructional strategies that encompass real-world, interactive, cooperative, and discovery learning experiences. "Designing Instruction for the Traditional, Adult, and Distance Learner: A New Engine for Technology-Based Teaching" explores how technology impacts the process of devising instructional plans as well as learning itself in adult students. Containing research from leading international experts, this publication proposes realistic and accurate archetypes to assist educators in incorporating state-of-the-art technologies into online instruction.
You may like...
Office 2010 Made Simple
Guy Hart-Davis, Msl Made Simple Learning
Paperback
Illustrated Microsoft (R) Office 365 (R…
Lynn Wermers
Paperback
New Perspectives Collection, Microsoft…
Cengage
Paperback
The Shelly Cashman Series (R) Microsoft…
Steven Freund, Joy Starks
Paperback
Scaling Networks v6 Course Booklet
Cisco Networking Academy
Paperback