Parallel Programming at its best! Discover a book that tells you what you should do and how! Instead of jumping right into the instructions, this book first provides all the necessary concepts, making the learning process a whole lot easier. This way, you're sure not to get lost in confusion once you reach the more complex lessons in the later chapters. Graphs and flowcharts, as well as sample code, are provided for a more visual approach to learning. You will also learn the designs and forms of parallel programming, and what's more convenient than getting to know both sides! Want to know more? Buy today!
Models in system design follow the general tendency in electronics in terms of size, complexity and difficulty of maintenance. While a model should be a manageable representation of a system, this increasing complexity sometimes forces current CAD-tool designers and model writers to apply modeling techniques to the model itself. Model writers are interested in instrumenting their models, so as to extract critical information before the model is complete. CAD-tool designers use internal representations of the design at various stages. The complexity has also led CAD-tool developers to develop formal tools, theories and methods to improve the relevance, completeness and consistency of those internal representations. Information modeling involves the representation of objects, their properties and relationships.

Performance Modeling: When it comes to design choices and trade-offs, performance is generally the final key. However, performance estimates have to be extracted at a very early stage in the system design. Performance modeling concerns the set of tools and techniques that allow or help the designer to capture metrics relating to future architectures. It encompasses the whole system, including software modeling, and has a strong impact on all levels of design choices, from hardware/software partitioning to the final layout.

Information Modeling: Specification and formalism have in the past traditionally played little part in the design and development of EDA systems, their support environments, languages and processes. Instead, EDA system developers and users have seemed content to operate within environments that are often extremely complex and may be poorly tested and understood. This situation has now begun to change with the increasing use of techniques drawn from the domains of formal specification and database design. This section of this volume addresses aspects of the techniques being used.
In particular, it considers a specific formalism, called information modeling, which has gained increasing acceptance recently and is now a key part of many of the proposals in the EDA Standards Roadmap, which promises to be of significance to the EDA industry. In addition, the section looks at an example of a design system from the point of view of its underlying understanding of the design process rather than through a consideration of particular CAD algorithms. Meta-Modeling: Performance and Information Modeling contains papers describing the very latest techniques used in meta-modeling. It will be a valuable text for researchers, practitioners and students involved in Electronic Design Automation.
One of the grand challenges in the nano-scopic computing era is guaranteeing robustness. Robust computing system design is confronted with quantum physical, probabilistic, and even biological phenomena, and guaranteeing high reliability is much more difficult than ever before. Scaling devices down to the level of single-electron operation will bring forth new challenges due to probabilistic effects and uncertainty in guaranteeing 'zero-one' based computing. Minuscule devices imply billions of devices on a single chip, which may help mitigate the challenge of uncertainty by replication and redundancy. However, such device densities will create a design and validation nightmare with their sheer scale.
The VITAL specification addresses the issues of interoperability, backannotation and high-performance simulation for sign-off quality ASIC libraries in VHDL. VITAL provides modeling guidelines and a set of pre-defined packages (containing pre-defined routines for modeling functionality and timing) to facilitate the acceleration of designs which use cells from a VITAL library. The VITAL Level-1 guidelines constrain the modeling capabilities provided by VHDL in order to facilitate higher performance (Figure 1: VHDL and VITAL — trading the flexibility of full VHDL 1076 for constrained flexibility with higher performance and increased capacity). Even within the Level-1 guidelines, there are several ways in which a model can be written. In this chapter, we highlight the various modeling trade-offs and provide guidelines which can be used for developing efficient models. We will also discuss the techniques that can be used by tool developers to accelerate the simulation of VITAL-based designs. 2.2. OVERVIEW OF A VITAL LEVEL-1 ARCHITECTURE: The VITAL specification is versatile enough to support several modeling styles, e.g., distributed delay style, pin-to-pin delay style, etc. In general, a VITAL Level-1 model can have the structure illustrated in Figure 2.
The book is a collection of extended papers which have been selected for presentation during the SIMHYDRO 2012 conference held in Sophia Antipolis in September 2012. The papers present the state of the art in numerical simulation in domains such as (1) new trends in modelling for marine, river & urban hydraulics; (2) stakeholders & practitioners of simulation; (3) 3D CFD & applications. All papers have been peer reviewed by scientific committee members, who reported on quality, content and originality. The target audience for this book includes scientists, engineers and practitioners involved in the field of numerical modelling in the water sector: flood management, natural resources preservation, hydraulic machineries, and innovation in numerical methods, 3D developments and applications.
The development of any Software (Industrial) Intensive System, e.g. critical embedded software, requires both different notations and a strong development process. Different notations are mandatory because different aspects of the software system have to be tackled. A strong development process is mandatory as well, because without a strong organization we cannot guarantee that the system will meet its requirements. Unfortunately, much more is needed! The different notations that can be used must all possess at least one property: formality. The development process must also have important properties: an exhaustive coverage of the development phases, and a set of well-integrated support tools. In Computer Science it is now widely accepted that only formal notations can guarantee a perfectly defined meaning. This becomes a more and more important issue since software systems tend to be distributed, in large systems (for instance in safe public transportation systems) and in small ones (for instance the numerous processors in luxury cars). Distribution increases the complexity of embedded software while safety criteria get harder to meet. On the other hand, during the past decade Software Engineering techniques have improved a lot, and are now routinely used to conduct systematic and rigorous development of large software systems. UML has become the de facto standard notation for documenting Software Engineering projects. UML is supported by many CASE tools that offer graphical means for the UML notation.
More than anything else, this book is a tribute to Edsger W. Dijkstra, on the occasion of his sixtieth birthday, by just a few of those fortunate enough to be influenced by him and his work and to be called his friend or relation, his master, colleague, or pupil. This book contains fifty-four technical contributions in different areas of endeavor, although many of them deal with an area of particular concern to Dijkstra: programming. Each contribution is relatively short and could be digested in one sitting. Together, they form a nice cross section of the discipline of programming at the beginning of the nineties. While many know of Dijkstra's technical contributions, they may not be aware of his ultimate goal, the mastery of complexity in mathematics and computing science. He has forcefully argued that beauty and elegance are essential to this mastery. The title of this book, chosen to reflect his ultimate goal, comes from a sentence in an article of his on some beautiful arguments using mathematical induction: "... when we recognize the battle against chaos, mess, and unmastered complexity as one of computing science's major callings, we must admit that 'Beauty Is Our Business'."
Low-Power Digital VLSI Design: Circuits and Systems addresses both process technologies and device modeling. Power dissipation in CMOS circuits, several practical circuit examples, and low-power techniques are discussed. Low-voltage issues for digital CMOS and BiCMOS circuits are emphasized. The book also provides an extensive study of advanced CMOS subsystem design. A low-power design methodology is presented with various power minimization techniques at the circuit, logic, architecture and algorithm levels. Features:
- Low-voltage CMOS device modeling, technology files, design rules
- Switching activity concept, low-power guidelines to engineering practice
- Pass-transistor logic families
- Power dissipation of I/O circuits
- Multi- and low-VT CMOS logic, static power reduction circuit techniques
- State-of-the-art design of low-voltage BiCMOS and CMOS circuits
- Low-power techniques in CMOS SRAMs and DRAMs
- Low-power on-chip voltage down converter design
- Numerous advanced CMOS subsystems (e.g. adders, multipliers, data path, memories, regular structures, phase-locked loops) with several design options trading power, delay and area
- Low-power design methodology, power estimation techniques
- Power reduction techniques at the logic, architecture and algorithm levels
- More than 190 circuits explained at the transistor level
An exciting aspect of contemporary legal scholarship is a concern for law from a global perspective across all legal fields. The book draws upon examples from North America, Western Europe, Africa, Asia, Eastern Europe, and Latin America. It refers to the basic private law fields of torts, property, contracts, and family law. It also refers to the basic public law fields of constitutional law, administrative law, criminal law, and international law. It analyzes diverse legal policy problems from a perspective that is designed to produce solutions whereby conservatives, liberals, and other major viewpoints can all come out ahead of their best initial expectations simultaneously. Such solutions can be considered an important part of an innovative concept of justice that emphasizes being effective, efficient, and equitable simultaneously, rather than compromising on any of those justice components. Another exciting aspect of contemporary legal scholarship is a concern for the use of modern technology in the form of microcomputer software that can be helpful in law teaching, practice, and research. Computer-aided instruction can supplement the case method by using what-if analysis to make changes in the goals to be achieved, alternative decisions available for achieving them, the factual relations, and other inputs to see how the decisions might change with changes in those inputs. Computer-aided law practice can be helpful in counseling, negotiation, mediation, case analysis, legal policy evaluation, and advocacy. Computer-aided research can be helpful in testing deductive or statistical models to determine how well they can explain variance across the judicial process or other legal processes.
These proceedings derive from an international conference on the history of computing and education. This conference is the third in what is hoped to be a series of conferences within the International Federation for Information Processing (IFIP); hence, we describe it as the "Third IFIP Conference on the History of Computing and Education", or simply "History of Computing and Education 3" (HCE3). This volume consists of a collection of articles presented at the HCE3 conference held in association with the IFIP 2008 World Computer Congress in Milano, Italy. Articles cover a wide variety of computing perspectives and represent activities from six continents. The HCE3 conference is an event of the IFIP Working Group 9.7 on the History of Computing, a working group of IFIP's Technical Committee 9 (TC9) on the Relationship between Computers and Society. In addition, it is in cooperation with the IFIP Technical Committee 3 (TC3) on Education. The HCE3 conference brings to light a broad spectrum of issues. It illustrates topics in computing as they occurred in the "early days" of computing whose ramifications or overtones remain with us today. Indeed, many of the early challenges remain part of our educational tapestry; most likely, many will evolve into future challenges. Therefore, these proceedings provide additional value to the reader, as they reflect in part the future development of computing and education and stimulate new ideas and models in educational development.
This volume is a how-to guide to the use of computers in library-based adult literacy programs. Since the commitment to literacy training has become an integral part of libraries' efforts to offer equal access to information, Linda Main and Char Whitaker provide a comprehensive study of the efficacious role the computer can play in achieving this objective. The problems and successes associated with the introduction of computers into library literacy programs, as well as financial requirements, space, furniture, training, and the effect on other library operations are central to the study. The text also features a design for an ideal computerized literacy lab, an overview of compatible software, both existing and proposed, and a look at the rewards and challenges facing librarians, professional educators, and literacy program directors in the future. Appendixes provide country-wide information on libraries currently involved in automating literacy, main suppliers of literacy software, and consulting personnel.
Post COVID-19 pandemic, researchers have been evaluating the healthcare system for improvements that can be made. Understanding the operations of global healthcare systems is essential for the preventative measures to be taken before the next global health crisis. A key part of bettering healthcare is the implementation of information management and One Health. The Handbook of Research on Information Management and One Health evaluates the concepts in global health and the application of essential information management in healthcare organizational strategic contexts. This text promotes understanding of how health evaluation and information management are decisive for health planning, management, and implementation of the One Health concept. Covering topics like development partnerships, global health, and the nature of pandemics, this text is essential for health administrators, policymakers, government officials, public health officials, information systems experts, data scientists, analysts, health information science and global health scholars, researchers, practitioners, doctors, students, and academicians.
This book provides an overview of recent progress in computer simulations of nonperturbative phenomena in quantum field theory, particularly in the context of the lattice approach. It is a collection of extensive self-contained reviews of various subtopics, including algorithms, spectroscopy, finite temperature physics, Yukawa and chiral theories, bounds on the Higgs meson mass, the renormalization group, and weak decays of hadrons. Physicists with some knowledge of lattice gauge ideas will find this book a useful and interesting source of information on the recent developments in the field.
The emergence of the system-on-chip (SoC) era is creating many new challenges at all stages of the design process. Engineers are reconsidering how designs are specified, partitioned and verified. With systems and software engineers programming in C/C++ and their hardware counterparts working in hardware description languages such as VHDL and Verilog, problems arise from the use of different design languages, incompatible tools and fragmented tool flows. Momentum is building behind the SystemC language and modeling platform as the best solution for representing functionality, communication, and software and hardware implementations at various levels of abstraction. The reason is clear: increasing design complexity demands very fast executable specifications to validate system concepts, and only C/C++ delivers adequate levels of abstraction, hardware-software integration, and performance. System design today also demands a single common language and modeling foundation in order to make interoperable system-level design tools, services and intellectual property a reality. SystemC is entirely based on C/C++, and the complete source code for the SystemC reference simulator can be freely downloaded from www.systemc.org and executed on both PCs and workstations. System Design with SystemC provides a comprehensive introduction to the powerful modeling capabilities of the SystemC language, and also provides a large and valuable set of system-level modeling examples and techniques. Written by experts from Cadence Design Systems, Inc. and Synopsys, Inc. who were deeply involved in the definition and implementation of the SystemC language and reference simulator, this book will provide you with the key concepts you need to be successful with SystemC. It thoroughly covers the new system-level modeling capabilities available in SystemC 2.0 as well as the hardware modeling capabilities available in earlier versions of SystemC. System Design with SystemC will be of interest to designers in industry working on complex system designs, as well as students and researchers within academia. All of the examples and techniques described within this book can be used with freely available compilers and debuggers - no commercial software is needed. Instructions for obtaining the free source code for the examples contained within this book are included in the first chapter.
The intention of IMHO is to make readers think, presenting the "facts" that proponents and opponents of technology use to support their positions in a way that lets readers determine what these facts really mean. Ultimately, IMHO is a reminder that the future of human communication is in our hands, and that we are the active participants in the shaping of it.
This book provides an overview of recent progress in computer simulations of nonperturbative phenomena in quantum field theory, particularly in the context of the lattice approach. It is a collection of extensive self-contained reviews of various subtopics, including algorithms, spectroscopy, finite temperature physics, Yukawa and chiral theories, bounds on the Higgs meson mass, the renormalization group, and weak decays of hadrons.Physicists with some knowledge of lattice gauge ideas will find this book a useful and interesting source of information on the recent developments in the field.
This text concerns computer-based design and modelling, computational approaches, and instrumental methods for elucidating the molecular mechanisms of protein folding. Ligand-acceptor interactions are included in volumes 202 and 203, as are genetic and chemical methods for the production of functional molecules, including antibodies and antigens, enzymes, receptors, nucleic acids, polysaccharides and drugs.
The future of English linguistics as envisaged by the editors of Topics in English Linguistics lies in empirical studies which integrate work in English linguistics into general and theoretical linguistics on the one hand, and comparative linguistics on the other. The TiEL series features volumes that present interesting new data and analyses, and above all fresh approaches that contribute to the overall aim of the series, which is to further outstanding research in English linguistics.
This volume presents the proceedings of the IFIP TC2 WG 2.5 Conference on Grid-Based Problem Solving Environments: Implications for Development and Deployment of Numerical Software, held in Prescott, Arizona from July 17-21, 2006. Grid-Based Problem Solving Environments will be of particular interest to users of both grid-based and traditional problem solving environments, developers of both grid-based and traditional problem solving environments, developers of grid infrastructure, and developers of numerical software. Among other topics, Grid-Based Problem Solving Environments explores the following:
- accuracy contracts and software services
- standards for problem specification
- service models for the use of numerical software
- using the grid to link numerical and other services together
- experiences with web-based numerical services
- application-oriented numerical interfaces such as web portals
- software deployment issues including updates and bug fixes
- large data (including data security) and grid-based numerical software
- grid-based services as an alternative to deployment
- evaluation and comparison of both production and research software
Enterprise Modeling: Improving Global Industrial Competitiveness gives an overview of the current state-of-the-art in enterprise modeling and its application. Enterprise modeling is both a concept and a tool that is highly developed at the research level, but which still promises many new industrial applications. Enterprise models constitute a theoretical basis for the information system in an enterprise and are regarded by many as a substantial opportunity to improve global industrial competitiveness. Enterprise Modeling: Improving Global Industrial Competitiveness gives the reader an understanding of enterprise modeling as a concept and provides examples of its application by describing some of the currently available tools. It is organized in five parts: overview and international trends, the basis of enterprise modeling, application areas, implementation, and industrial experience with enterprise modeling. Enterprise Modeling: Improving Global Industrial Competitiveness is useful to developers of business information systems, users of technical information systems, engineers within operations management, and engineers and economists dealing with performance assessment and improvement. Enterprise Modeling: Improving Global Industrial Competitiveness is suitable as a secondary text for a graduate level course, and as a reference for researchers and practitioners in industry.
Wireless ad hoc sensor networks have recently become a very active research subject. Achieving efficient, fault-tolerant realizations of very large, highly dynamic, complex, unconventional networks is a real challenge for abstract modelling, algorithmic design and analysis, but a solid foundational and theoretical background seems to be lacking. This book presents high-quality contributions by leading experts worldwide on the key algorithmic and complexity-theoretic aspects of wireless sensor networks. The intended audience includes researchers and graduate students working on sensor networks, and the broader areas of wireless networking and distributed computing, as well as practitioners in the relevant application areas. The book can also serve as a text for advanced courses and seminars.
Reuse Techniques for VLSI Design is a reflection on the current state of the art in design reuse for microelectronic systems. To that end, it is the first book to garner the input of leading experts from both research and application areas. These experts document herein not only their more mature approaches, but also their latest research results. Firstly, it sets out the background and support from international organisations that enforce System-on-a-Chip (SoC) design by reuse-oriented methodologies. This overview is followed by a number of technical presentations covering different requirements of the reuse domain. These are presented from different points of view, i.e., IP provider, IP user, designer, isolated reuse, intra-company or inter-company reuse. More general systems or case studies, e.g., metrics, are followed by comprehensive reuse systems, e.g., reuse management systems, partly including business models. Since design reuse must not be restricted to digital components, mixed-signal and analog reuse approaches are also presented. In parallel to the digital domain, this area covers research in reuse database design. Design verification and legal aspects are two important topics that are closely related to the realization of design reuse. These hot topics are covered by presentations that finalize the survey of outstanding research, development and application of design reuse for SoC design. Reuse Techniques for VLSI Design is an invaluable reference for researchers and engineers involved in VLSI/ASIC design.
Design of Low-Voltage, Low-Power CMOS Operational Amplifier Cells describes the theory and design of the circuit elements that are required to realize a low-voltage, low-power operational amplifier. These elements include constant-gm rail-to-rail input stages, class-AB rail-to-rail output stages and frequency compensation methods. Several examples of each of these circuit elements are investigated. Furthermore, the book illustrates several silicon realizations, giving their measurement results. The text focuses on compact low-voltage, low-power operational amplifiers with good performance. Six simple high-performance class-AB amplifiers are realized using a very compact topology, making them particularly suitable for use as VLSI library cells. All of the designs can use a supply voltage as low as 3 V. One of the amplifier designs dissipates only 50 uW with a unity gain frequency of 1.5 MHz. A second set of amplifiers runs on a supply voltage slightly above 1 V. The amplifiers combine a low power consumption with a gain of 120 dB. In addition, the design of three fully differential operational amplifiers is addressed. Design of Low-Voltage, Low-Power CMOS Operational Amplifier Cells is intended for professional designers of analog circuits. It is also suitable for use as a text book for an advanced course in CMOS operational amplifier design.