Oracle Exadata Survival Guide is a hands-on guide for busy Oracle database administrators who are migrating their skill sets to Oracle's Exadata database appliance. The book covers the concepts behind Exadata and the available configurations for features such as smart scans, storage indexes, Smart Flash Cache, hybrid columnar compression, and more. You'll learn about performance metrics and execution plans, and how to optimize SQL running in Oracle's powerful new environment. The authors also cover migration from other servers. Oracle Exadata is fast becoming the standard for large installations such as those running data warehouse, business intelligence, and large-scale OLTP systems. Exadata is like no other platform, and is new ground even for experienced Oracle database administrators. The Oracle Exadata Survival Guide helps you navigate the ins and outs of this new platform, demystifying this amazing appliance and its exceptional performance. The book takes a highly practical approach, not diving too deeply into the details, but giving you just the right depth of information to quickly transfer your skills to Oracle's important new platform.
* Helps transfer your skills to the platform of the future
* Covers the important ground without going too deep
* Takes a practical and hands-on approach to everyday tasks
What you'll learn:
* Learn the components and basic architecture of an Exadata machine
* Reduce data transfer overhead by processing queries in the storage layer
* Examine and take action on Exadata-specific performance metrics
* Deploy Hybrid Columnar Compression to reduce storage and I/O needs
* Create worry-free migrations from existing databases into Exadata
* Understand and address issues specific to ERP migrations
Who this book is for: Oracle Exadata Survival Guide is for the busy enterprise Oracle DBA who has suddenly been thrust into the Exadata arena. Readers should have a sound grasp of traditional Oracle database administration, and be prepared to learn new aspects that are specific to the Exadata appliance.
Verification of real-time requirements in systems-on-chip becomes more complex as more applications are integrated. Predictable and composable systems can manage the increasing complexity using formal verification and simulation. This book explains the concepts of predictability and composability and shows how to apply them to the design and analysis of a memory controller, which is a key component in any real-time system.
This book constitutes the refereed proceedings of the 16th National Conference on Computer Engineering and Technology, NCCET 2012, held in Shanghai, China, in August 2012. The 27 papers presented were carefully reviewed and selected from 108 submissions. They are organized in topical sections named: microprocessor and implementation; design of integration circuit; I/O interconnect; and measurement, verification, and others.
With the semiconductor market growth, new Integrated Circuit designs are pushing the limit of the technology and in some cases require specific fine-tuning of certain process modules in manufacturing. Thus the communities of design and technology are increasingly intertwined. The issues that require close interactions and collaboration for trade-off and optimization across the design/device/process fields are addressed in this book. It contains a set of outstanding papers, keynotes, and tutorials presented during 3 days at the International Conference on Integrated Circuit Design and Technology (ICICDT) held in June 2008 in Minatec, Grenoble. The selected papers are spread over five chapters covering various aspects of emerging technologies and devices, advanced circuit design, reliability, variability issues and solutions, advanced memories, and analog and mixed signals. All these papers focus on design and technology interactions and comply with the scope of the conference.
Universal access and management of information has been one of the driving forces in the evolution of computer technology. Central computing gave the ability to perform large and complex computations and advanced information manipulation. Advances in networking connected computers together and led to distributed computing. Web technology and the Internet went even further to provide hyper-linked information access and global computing. However, restricting access stations to physical locations limits the boundary of the vision. The real global network can be achieved only via the ability to compute and access information from anywhere and anytime. This is the fundamental wish that motivates mobile computing. This evolution is the cumulative result of both hardware and software advances at various levels motivated by tangible application needs. Infrastructure research on communications and networking is essential for realizing wireless systems. Equally important is the design and implementation of data management applications for these systems, a task directly affected by the characteristics of the wireless medium and the resulting mobility of data resources and computation. Although a relatively new area, mobile data management has provoked a proliferation of research efforts motivated both by a great market potential and by many challenging research problems. The focus of Data Management for Mobile Computing is on the impact of mobile computing on data management beyond the networking level. The purpose is to provide a thorough and cohesive overview of recent advances in wireless and mobile data management. The book is written with a critical attitude. This volume probes the new issues introduced by wireless and mobile access to data and their conceptual and practical consequences.
Data Management for Mobile Computing provides a single source for researchers and practitioners who want to keep abreast of the latest innovations in the field. It can also serve as a textbook for an advanced course on mobile computing or as a companion text for a variety of courses including courses on distributed systems, database management, transaction management, operating or file systems, information retrieval or dissemination, and web computing.
Advanced Database Indexing begins by introducing basic material on storage media, including magnetic disks, RAID systems, and tertiary storage such as optical disks and tapes. Typical access methods (e.g. B+ trees, dynamic hash files and secondary key retrieval) are also introduced. The remainder of the book discusses recent advances in indexing and access methods for particular database applications. More specifically, issues such as external sorting, file structures for intervals, temporal access methods, spatial and spatio-temporal indexing, image and multimedia indexing, perfect external hashing methods, parallel access methods, concurrency issues in indexing and parallel external sorting are presented for the first time in a single book. Advanced Database Indexing is an excellent reference for database professionals and may be used as a text for advanced courses on the topic.
Desktop or DIY 3D printers are devices you can either buy preassembled as a kit, or build from a collection of parts to design and print physical objects including replacement household parts, custom toys, and even art, science, or engineering projects. Maybe you have one, or maybe you're thinking about buying or building one. Practical 3D Printers takes you beyond how to build a 3D printer, to calibrating, customizing, and creating amazing models, including 3D printed text, a warship model, a robot platform, windup toys, and arcade-inspired alien invaders. You'll learn about the different types of personal 3D printers and how they work, from the MakerBot to RepRap printers like the Huxley and Mendel, as well as the whiteAnt CNC featured in the Apress book Printing in Plastic. You'll discover how easy it is to find and design 3D models using web-based 3D modeling, and even how to create a 3D model from a 2D image. After learning the basics, this book will walk you through building multi-part models with a steampunk warship project, working with meshes to build your own action heroes, and creating an autonomous robot chassis. Finally, you'll find even more bonus projects to build, including wind-up walkers, faceted vases for the home, and a handful of useful upgrades to modify and improve your 3D printer.
This state-of-the-art survey features topics related to the impact of multicore, manycore, and coprocessor technologies in science and large-scale applications in an interdisciplinary environment. The papers included in this survey cover research in mathematical modeling, design of parallel algorithms, aspects of microprocessor architecture, parallel programming languages, hardware-aware computing, heterogeneous platforms, manycore technologies, performance tuning, and requirements for large-scale applications. The contributions presented in this volume are an outcome of an inspiring conference conceived and organized by the editors at the University of Applied Sciences (HfT) in Stuttgart, Germany, in September 2012. The 10 revised full papers selected from 21 submissions are presented together with twelve poster abstracts; they focus on the combination of new aspects of microprocessor technologies, parallel applications, numerical simulation, and software development, and thus clearly show the potential of emerging technologies in the area of multicore and manycore processors, which are paving the way towards personal supercomputing and very likely towards exascale computing.
Enterprise modeling serves to represent the most important components of organizations and their relations to one another. It is used in a wide range of strategic and operational tasks. In this book the authors explain, using the "cookbook" method, the fundamentals and uses of enterprise modeling; in particular, they present the different perspectives on an enterprise and the associated analysis techniques. Thanks to precise procedural descriptions, the concepts and methods can be applied directly.
'Now . . . in the Analytical Engine I had devised mechanical means equivalent to memory.' For the past twenty-five years or so, scientists and engineers have been endeavouring to realize in new technologies the claim made by Charles Babbage in his memoirs over a century ago. The modern computer industry depends to a very large extent on the success of their efforts. In this book we discuss the wide variety of techniques which have been used and are being developed to meet the range of requirements for digital storage systems in computers and other applications. The book has been written as a guide for the designer of any system employing digital techniques, firstly to guide him in his choice of store for differing applications and, secondly, to give him an appreciation of the problems which confront the engineer designing storage systems. Technology never stands still and developments in recent years have, of necessity, greatly increased the amount of material included in this second edition. The opportunity has also been taken to reorganize the contents and more emphasis has been given to those developments which have had, or which are likely to have, the greatest effect on computer development. Brief descriptions of obsolete or obsolescent systems have been retained, both as a warning to designers of the problems likely to be encountered in development and to demonstrate how changes in technology can give a new impetus to old designs.
These are the proceedings of a NATO Advanced Study Institute (ASI) held in Cetraro, Italy during 6-17 June 1983. The title of the ASI was Computer Architectures for Spatially Distributed Data, and it brought together some 60 participants from Europe and America. Presented here are 21 of the lectures that were delivered. The articles cover a wide spectrum of topics related to computer architectures specially oriented toward the fast processing of spatial data, and represent an excellent review of the state-of-the-art of this topic. For more than 20 years now researchers in pattern recognition, image processing, meteorology, remote sensing, and computer engineering have been looking toward new forms of computer architectures to speed the processing of data from two- and three-dimensional processes. The work can be said to have commenced with the landmark article by Steve Unger in 1958, and it received a strong forward push with the development of the ILLIAC III and IV computers at the University of Illinois during the 1960's. One clear obstacle faced by the computer designers in those days was the limitation of the state-of-the-art of hardware, when the only switching devices available to them were discrete transistors. As a result parallel processing was generally considered to be impractical, and relatively little progress was made.
A major technological trend for large database systems has been the introduction of ever-larger mass storage systems. This allows computing centers and business data processing installations to maintain on line their program libraries, less frequently used data files, transaction logs and backup copies under unified system control. Tapes, disks and drums are classical examples of mass storage media. The more recent IBM 3851 Mass Storage Facility, part of the IBM 3850 Mass Storage System, represents a new direction in mass storage development, namely, it is two-dimensional. With the maturity of magnetic bubble technology, more sophisticated, massive, multi-trillion-bit storage systems are not far in the future. While large in capacity, mass storage systems have in general relatively long access times. Since record access probabilities are usually not uniform, various algorithms have been devised to position the records to decrease the average access time. The first two chapters of this book are devoted mainly to such algorithmic studies in linear and two-dimensional mass storage systems. In the third chapter, we view the bubble memory as more than a storage medium. In fact, we discuss different structures where routine operations, such as data rearrangement, sorting, searching, etc., can be done in the memory itself, freeing the CPU for more complicated tasks. The problems discussed in this book are combinatorial in nature.
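The record-placement idea described above can be made concrete with a toy calculation (a hypothetical sketch, not taken from the book): on a linear storage device whose head returns to a fixed origin after each access, the expected seek distance of a placement is the probability-weighted sum of record positions, and by the rearrangement inequality it is minimized by placing records in decreasing order of access probability.

```python
# Hypothetical illustration: record placement on a linear store whose head
# returns to position 0 after each access. The expected seek distance is
# sum(p_i * position_i), minimized by a decreasing-probability placement.

def expected_seek(placement, prob):
    """Expected seek distance: the record in slot k costs k units to reach."""
    return sum(k * prob[rec] for k, rec in enumerate(placement))

# Assumed access probabilities for four records (illustrative values only).
prob = {"A": 0.5, "B": 0.3, "C": 0.15, "D": 0.05}

greedy = sorted(prob, key=prob.get, reverse=True)  # most probable first
naive = ["D", "C", "B", "A"]                       # worst possible ordering

print(round(expected_seek(greedy, prob), 2))  # 0.75
print(round(expected_seek(naive, prob), 2))   # 2.25
```

The same weighted-sum argument underlies the two-dimensional placement problems treated in the book, where the cost of a slot is its access time rather than a linear distance.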
Current issues and approaches in the reliability and safety analysis of dynamic process systems are the subject of this book. The authors of the chapters are experts from nuclear, chemical, mechanical, aerospace and defense system industries, and from institutions including universities, national laboratories, private consulting companies, and regulatory bodies. Both the conventional approaches and dynamic methodologies which explicitly account for the time element in system evolution in failure modeling are represented. The papers on conventional approaches concentrate on the modeling of dynamic effects and the need for improved methods. The dynamic methodologies covered include the DYLAM methodology, the theory of continuous event trees, several Markov model construction procedures, Monte Carlo simulation, and utilization of logic flowgraphs in conjunction with Petri nets. Special emphasis is placed on human factors such as procedures and training.
Technological progress in communications requires that advanced studies in circuit and software design be accompanied by recent results from technological research and physics, in order to push past current limitations. This book is a guide to many components used in mobile communications, with a particular focus on non-volatile memories. It follows the thread of the non-volatile memory within the wireless system: on the one hand it develops the foundations of the interdisciplinary issues needed for design analysis and testing of the system; on the other hand it deals with many of the problems that appear when the systems are realized in industrial production. These range from difficulties in the mobile system itself to the different types of non-volatile memories. The book explores memory cards, multichip technologies, and software-management algorithms as well as error handling. It also presents assurance techniques for the individual components and a guide to reading datasheets.
The architectural concept of a memory hierarchy has been immensely successful, making possible today's spectacular pace of technology evolution in both the volume of data and the speed of data access. Its success is difficult to understand, however, when examined within the traditional "memoryless" framework of performance analysis. The memoryless framework cannot properly reflect a memory hierarchy's ability to take advantage of patterns of data use that are transient. The Fractal Structure of Data Reference: Applications to the Memory Hierarchy both introduces, and justifies empirically, an alternative modeling framework in which arrivals are driven by a statistically self-similar underlying process, and are transient in nature. The substance of this book comes from the ability of the model to impose a mathematically tractable structure on important problems involving the operation and performance of a memory hierarchy. It describes events as they play out at a wide range of time scales, from the operation of file buffers and storage control cache, to a statistical view of entire disk storage applications. Striking insights are obtained about how memory hierarchies work, and how to exploit them to best advantage. The emphasis is on the practical application of such results. The Fractal Structure of Data Reference: Applications to the Memory Hierarchy will be of interest to professionals working in the area of applied computer performance and capacity planning, particularly those with a focus on disk storage. The book is also an excellent reference for those interested in database and data structure research.
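Why the memoryless view falls short can be seen in a small simulation (an illustrative sketch under assumed parameters, not the book's model): an LRU cache is driven by two reference streams with the same overall item frequencies, one memoryless (each reference drawn independently) and one with transient locality (references arriving in short bursts of repeats). Only the bursty stream lets the cache exploit patterns of use.

```python
# Illustrative sketch: LRU hit ratio under a memoryless (independent) stream
# versus a stream with transient locality. Item population, cache size, and
# burst length are assumed values chosen only to make the contrast visible.
import random
from collections import OrderedDict

def lru_hit_ratio(stream, capacity):
    cache, hits = OrderedDict(), 0
    for item in stream:
        if item in cache:
            hits += 1
            cache.move_to_end(item)        # mark as most recently used
        else:
            cache[item] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(stream)

random.seed(1)
items = range(1000)
iid_stream = [random.choice(items) for _ in range(30000)]  # memoryless
bursty_stream = []
while len(bursty_stream) < 30000:
    bursty_stream += [random.choice(items)] * 10           # transient bursts

print(round(lru_hit_ratio(iid_stream, 50), 2))     # low: no exploitable pattern
print(round(lru_hit_ratio(bursty_stream, 50), 2))  # high: locality pays off
```

A model that assumes independent arrivals predicts the first number for both streams; the book's self-similar, transient framework is aimed at capturing the second.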
Kevin Zhang Advancement of semiconductor technology has driven the rapid growth of very large scale integrated (VLSI) systems for increasingly broad applications, including high-end and mobile computing, consumer electronics such as 3D gaming, multi-function or smart phones, and various set-top players and ubiquitous sensor and medical devices. To meet the increasing demand for higher performance and lower power consumption in many different system applications, it is often required to have a large amount of on-die or embedded memory to support the need of data bandwidth in a system. The varieties of embedded memory in a given system have also become increasingly more complex, ranging from static to dynamic and volatile to nonvolatile. Among embedded memories, six-transistor (6T)-based static random access memory (SRAM) continues to play a pivotal role in nearly all VLSI systems due to its superior speed and full compatibility with logic process technology. But as technology scaling continues, SRAM design is facing severe challenges in maintaining sufficient cell stability margin under relentless area scaling. Meanwhile, rapid expansion in mobile applications, including new emerging applications in sensor and medical devices, requires far more aggressive voltage scaling to meet very stringent power constraints. Many innovative circuit topologies and techniques have been extensively explored in recent years to address these challenges.
Web caching and content delivery technologies provide the infrastructure on which systems are built for the scalable distribution of information. This proceedings of the eighth annual workshop captures a cross-section of the latest issues and techniques of interest to network architects and researchers in large-scale content delivery. Topics covered include the distribution of streaming multimedia, edge caching and computation, multicast, delivery of dynamic content, enterprise content delivery, streaming proxies and servers, content transcoding, replication and caching strategies, peer-to-peer content delivery, and Web prefetching.
This book is concerned with studying the co-design methodology in general, and how to determine the more suitable interface mechanism in a co-design system in particular. This is based on the characteristics of the application and those of the target architecture of the system. Guidelines are provided to support the designer's choice of the interface mechanism. Some new trends in co-design and system acceleration are also introduced.
This monograph is dedicated to SRAM (memory) design and test issues in nano-scaled technologies, adapting cell design and chip design considerations to growing process variations and the associated test issues. Its purpose is to provide process-aware solutions for SRAM design and test challenges.
This book proposes novel memory hierarchies and software optimization techniques for the optimal utilization of memory hierarchies. It presents a wide range of optimizations, progressively increasing in the complexity of analysis and of memory hierarchies. The final chapter covers optimization techniques for applications consisting of multiple processes found in most modern embedded devices.
This is the first comprehensive book on ferroelectric memories which contains chapters on device design, processing, testing, and device physics, as well as on breakdown, leakage currents, switching mechanisms, and fatigue. State-of-the-art device designs are included and illustrated among the book's many figures. More than 500 up-to-date references and 76 problems make it useful as a research reference for physicists, engineers and students.
This volume contains invited and contributed papers presented at the 12th edition of the International Summer School on Neural Networks "Eduardo R. Caianiello," co-organized by the RIKEN BSI (Japan) and the Department of Physics of the University of Salerno (Italy). The 12th edition of the school was directed by Maria Marinaro (University of Salerno), Silvia Scarpetta (University of Salerno) and Yoko Yamaguchi (RIKEN BSI Japan) and hosted in the Ettore Majorana Center in Erice in Italy. The contributions collected in this book are aimed at providing primarily high-level tutorial coverage of the fields related to neural dynamics, reporting recent experimental and theoretical results investigating the role of collective dynamics in hippocampal and parahippocampal regions and in the mammalian olfactory system. This book is devoted to graduate students and researchers with different scientific backgrounds (including physics, mathematics, biology, neuroscience, etc.) who wish to learn about brain science beyond the boundary of their fields. Each lecture aimed to include basic guidance in each field. Topics of lectures include the hippocampus and entorhinal cortex dynamics and mammalian olfactory system dynamics, memory and phase coding, mechanisms for spatial navigation and for episodic memory function, oscillations in neural assemblies, cortical up and down states, and related topics where frontier efforts in recent decades have been successfully linked to a remarkable evolution of the field. April 2008 M. Marinaro S. Scarpetta Y. Yamaguchi
Semiconductor Memories provides in-depth coverage in the areas of design for testing, fault tolerance, failure modes and mechanisms, and screening and qualification methods.
Welcome to the 2nd International Conference on Image and Video Retrieval, CIVR2003. The goal of CIVR is to illuminate the state of the art in visual information retrieval and to stimulate collaboration between researchers and practitioners. This year we received 110 submissions from 26 countries. Based upon the reviews of at least 3 members of the program committee, 43 papers were accepted for the research track of the conference. First, we would like to thank all of the members of the Program Committee and the additional referees listed below. Their reviews of the submissions played a pivotal role in the quality of the conference. Moreover, we are grateful to Nicu Sebe and Xiang Zhou for helping to organize the review process; Shih-Fu Chang and Alberto del Bimbo for setting up the practitioner track; and Erwin Bakker for editing the proceedings and designing the conference poster. Special thanks go to our keynote and plenary speakers, Nevenka Dimitrova from Philips Research, Ramesh Jain from Georgia Tech, Chris Porter from Getty Images, and Alan Smeaton from Dublin City University. Furthermore, we wish to acknowledge our sponsors, the Beckman Institute at the University of Illinois at Urbana-Champaign, Tsing Hua University, the Institution of Electrical Engineers (IEE), Philips Research, and the Leiden Institute of Advanced Computer Science at Leiden University. Finally, we would like to express our thanks to several people who performed important work related to the organization of the conference: Jennifer Quirk and Catherine Zech for the local organization at the Beckman Institute; Richard Harvey for his help with promotional activity and sponsorship for CIVR2003; and to the organizing committee of the first CIVR for setting up the international mission and structure of the conference.
The 2002 IFIP Workshop on Internet Technologies, Applications, and Societal Impact (WITASI 2002), held in Wroclaw, Poland, October 10-11, 2002, presents different research aspects of the Internet, both technical and societal. The workshop aims at getting together scientists and practitioners from different research areas to work together on Internet development and reflect on Internet consequences to the economy and society. The papers presented in these proceedings describe state-of-the-art research in such areas of Internet applications as languages, mobility, multimedia, quality of service, voice over IP, and wireless access. A total of 40 papers were submitted to WITASI 2002 out of which 18 papers were selected for presentation at the workshop and inclusion in the proceedings. The workshop also includes 4 invited papers. WITASI 2002 was sponsored by IFIP - the International Federation for Information Processing. It was organized by Working Group WG 6.4 on Internet Applications Engineering of the Technical Committee TC 6 on Communication Systems. Locally, WITASI 2002 was organized by the Institute of Control and Systems Engineering, Wroclaw University of Technology.