The size of technically producible integrated circuits increases continuously, but the ability to design and verify these circuits does not keep up. Today's design flow therefore has to be improved. Using a visionary approach, this book analyzes current design and verification methodology; a number of deficiencies are identified and solutions are suggested. Improvements in the methodology as well as in the underlying algorithms are proposed.
'Symbolic Boolean manipulation using binary decision diagrams (BDDs) has been successfully applied to a wide variety of tasks, particularly in very large scale integration (VLSI) computer-aided design (CAD). The concept of decision graphs as an abstract representation of Boolean functions dates back to the early work by Lee and Akers. In the last ten years, BDDs have found widespread use as a concrete data structure for symbolic Boolean manipulation. With BDDs, functions can be constructed, manipulated, and compared by simple and efficient graph algorithms. Since Boolean functions can represent not just digital circuit functions, but also such mathematical domains as sets and relations, a wide variety of CAD problems can be solved using BDDs. Binary Decision Diagrams and Applications for VLSI CAD provides valuable information both for those who are new to BDDs and for long-time aficionados.' - from the Foreword by Randal E. Bryant. 'Over the past ten years ... BDDs have attracted the attention of many researchers because of their suitability for representing Boolean functions. They are now widely used in many practical VLSI CAD systems. ... this book can serve as an introduction to BDD techniques and ... it presents several new ideas on BDDs and their applications. ... many computer scientists and engineers will be interested in this book, since Boolean function manipulation is a fundamental technique not only in digital system design but also in exploring various problems in computer science.' - from the Preface by Shin-ichi Minato.
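As a concrete taste of the data structure, here is a minimal Python sketch (names and API invented for illustration, not code from the book) of a reduced, ordered BDD built over a hash-consing table, so that equal functions share one graph and equivalence checking becomes a pointer comparison:

```python
# Minimal reduced, ordered BDD sketch with hash-consing.
# Terminals are the Python booleans; internal nodes are tuples
# (var, low, high) giving the Shannon expansion on variable `var`.

unique = {}  # one node per (var, low, high): equal functions become identical

def mk(var, low, high):
    """Create (or reuse) a node, skipping redundant tests (low == high)."""
    if low == high:
        return low
    key = (var, id(low), id(high))  # children are themselves hash-consed
    if key not in unique:
        unique[key] = (var, low, high)
    return unique[key]

def apply_op(op, f, g):
    """Combine two BDDs with a Boolean operator via Shannon expansion.
    (No memoization here; real packages cache subresults.)"""
    if isinstance(f, bool) and isinstance(g, bool):
        return op(f, g)
    v = min(x[0] for x in (f, g) if not isinstance(x, bool))
    f0, f1 = (f, f) if isinstance(f, bool) or f[0] != v else (f[1], f[2])
    g0, g1 = (g, g) if isinstance(g, bool) or g[0] != v else (g[1], g[2])
    return mk(v, apply_op(op, f0, g0), apply_op(op, f1, g1))

x1 = mk(1, False, True)                     # the function x1
x2 = mk(2, False, True)                     # the function x2
f = apply_op(lambda a, b: a and b, x1, x2)  # x1 AND x2
g = apply_op(lambda a, b: a and b, x2, x1)  # same function, other order
print(f is g)  # True: equivalent functions are literally the same node
```

Under a fixed variable order, this canonicity is what lets CAD tools test the equivalence of large circuit functions in constant time once the graphs are built.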
Whether you are an experienced security or system administrator or a newbie to the industry, you will learn how to use native, "out-of-the-box" operating system capabilities to secure your UNIX environment. There is no need for third-party software or freeware tools to get and stay secure! This book will help you ensure that your system is protected from unauthorized users, and shows you how to conduct intrusion traces to identify the intruders if an intrusion does occur. It provides practical information on using the native OS security capabilities without the need for a third-party security application. Also included are hundreds of security tips, tricks, ready-to-use scripts and configuration files that will be a valuable resource in your endeavor to secure your UNIX systems.
The consequences of recent floods and flash floods in many parts of the world have been devastating. One way to improve flood management practice is to invest in data collection and modelling activities which enable an understanding of the functioning of a system and the selection of optimal mitigation measures. A Digital Terrain Model (DTM) provides the most essential information for flood managers. Light Detection and Ranging (LiDAR) surveys, which enable the capture of spot heights at a spacing of 0.5m to 5m with a horizontal accuracy of 0.3m and a vertical accuracy of 0.15m, can be used to develop high-accuracy DTMs, but the raw data needs careful processing before it can be used for any application. This book presents the augmentation of an existing Progressive Morphological filtering algorithm for processing raw LiDAR data to support a 1D/2D urban flood modelling framework. The key characteristics of this improved algorithm are: (1) the ability to deal with different kinds of buildings; (2) the ability to detect elevated road/rail lines and represent them in accordance with reality; (3) the ability to deal with bridges and riverbanks; and (4) the ability to recover curbs and to use an appropriate Manning's roughness coefficient value to represent close-to-earth vegetation (e.g. grass and small bushes).
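For readers curious what morphological filtering of a DTM looks like in code, the following is a rough sketch of a generic progressive morphological filter on a gridded elevation model; the window sizes, thresholds, and SciPy-based implementation are assumptions for illustration, not the book's augmented algorithm:

```python
# A generic progressive morphological filter on a gridded LiDAR raster.
# Window sizes and elevation thresholds here are illustrative guesses.

import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(z, windows=(3, 5, 9), dh_max=(0.5, 1.0, 2.0)):
    """Return a boolean mask of ground cells in elevation grid z.

    Each pass opens the surface with a growing window; cells standing
    higher than the opened surface by more than the pass threshold are
    flagged as non-ground (buildings, vegetation).
    """
    ground = np.ones(z.shape, dtype=bool)
    surface = z.copy()
    for w, dh in zip(windows, dh_max):
        opened = grey_opening(surface, size=(w, w))
        ground &= (surface - opened) <= dh
        surface = opened  # next pass works on the smoothed surface
    return ground

# Example: flat 1 m terrain with a 2 m-high 4x4-cell "building" block.
z = np.ones((20, 20))
z[8:12, 8:12] += 2.0
mask = progressive_morphological_filter(z)  # building cells -> False
```

Each pass removes objects smaller than the current window, so structures of growing footprint are stripped while genuine terrain relief survives.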
Although the origins of parallel computing go back to the last century, it was only in the 1970s that parallel and vector computers became available to the scientific community. The first of these machines (the 64-processor Illiac IV and the vector computers built by Texas Instruments, Control Data Corporation, and then Cray Research Corporation) had a somewhat limited impact. They were few in number and available mostly to workers in a few government laboratories. By now, however, the trickle has become a flood. There are over 200 large-scale vector computers now installed, not only in government laboratories but also in universities and in an increasing diversity of industries. Moreover, the National Science Foundation's Supercomputing Centers have made large vector computers widely available to the academic community. In addition, smaller, very cost-effective vector computers are being manufactured by a number of companies. Parallelism in computers has also progressed rapidly. The largest supercomputers now consist of several vector processors working in parallel. Although the number of processors in such machines is still relatively small (up to 8), it is expected that an increasing number of processors will be added in the near future (to a total of 16 or 32). Moreover, there are a myriad of research projects to build machines with hundreds, thousands, or even more processors. Indeed, several companies are now selling parallel machines, some with as many as hundreds, or even tens of thousands, of processors.
Protocols that remain zero-knowledge when many instances are executed concurrently are called concurrent zero-knowledge, and this book is devoted to their study. The book presents constructions of concurrent zero-knowledge protocols, along with proofs of security. It also shows why "traditional" proof techniques (i.e., black-box simulation) are not suitable for establishing the concurrent zero-knowledge property of "message-efficient" protocols.
Educational initiatives attempt to introduce or promote a culture of quality within education by raising concerns related to student learning, providing services related to assessment, supporting the professional development of teachers, addressing curriculum and pedagogy, and influencing educational policy in the realm of technology.
Fuzzy Sets in the Management of Uncertainty presents an overview of current problems in business management, primarily for those situations involving decision making of an economic-financial nature. The monograph therefore discusses problems of planning, programming and control, and sheds light on the entire financial network in its three phases: raising funds, analysis and investment. Special attention is paid to production processes and the marketing of products and services. This monograph is a highly readable overview and introduction for scientists, professionals, graduate students, managers and consultants in the growing field of fuzzy logic applications in management.
The thesis work was in two major parts: development and testing of a new approach to detecting and
Over the past twenty years, the conventional knowledge management approach has evolved into a strategic management approach that has found applications and opportunities outside of business, in society at large, through education, urban development, governance, and healthcare, among others. Knowledge-Based Development for Cities and Societies: Integrated Multi-Level Approaches illuminates the concepts and challenges of knowledge management for both urban environments and entire regions, enhancing the expertise and knowledge of scholars, researchers, practitioners, managers and urban developers in the development of successful knowledge-based development policies, the creation of knowledge cities and prosperous knowledge societies. This reference creates a large knowledge base for scholars, managers and urban developers and increases awareness of the role of knowledge cities and knowledge societies in the knowledge era, as well as of the challenges and opportunities for future research.
Our understanding of nature often comes through nonuniform observations in space or time. In space, one normally observes the important features of an object, such as edges; the less important features are interpolated. History is a collection of important events that are nonuniformly spaced in time. Historians infer between events (interpolation), and politicians and stock market analysts forecast the future from past and present events (extrapolation). The 20 chapters of Nonuniform Sampling: Theory and Practice contain contributions by leading researchers in nonuniform and Shannon sampling, zero crossing, and interpolation theory. Its practical applications include NMR, seismology, speech and image coding, modulation and coding, optimal content, array processing, and digital filter design. It takes a tutorial approach for practising engineers and advanced students in science, engineering, and mathematics. It is also a useful reference for scientists and engineers working in the areas of medical imaging, geophysics, astronomy, biomedical engineering, computer graphics, digital filter design, speech and video processing, and phased array radar. A special feature of the package is a CD-ROM containing C-codes, Matlab and Mathcad programs for the algorithms presented.
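As a small illustration of recovering a signal from nonuniform samples, the sketch below fits a truncated Fourier basis by least squares; this is one standard approach, not a program from the book's CD-ROM, and the test signal, bandwidth K and sample times are invented:

```python
# Reconstruct a bandlimited signal from nonuniformly spaced samples
# by least-squares fitting to a truncated Fourier basis.

import numpy as np

rng = np.random.default_rng(0)

K = 3  # assumed bandwidth: harmonics up to K on the interval [0, 1)

def signal(t):
    # Test signal with harmonics 1 and 3, both within the bandwidth K.
    return np.sin(2 * np.pi * t) + 0.5 * np.cos(2 * np.pi * 3 * t)

# Nonuniform sample instants (e.g., irregular observation times).
t = np.sort(rng.uniform(0, 1, 40))
y = signal(t)

def fourier_matrix(t, K):
    """Design matrix of 1, cos(2*pi*k*t), sin(2*pi*k*t) for k = 1..K."""
    cols = [np.ones_like(t)]
    for k in range(1, K + 1):
        cols += [np.cos(2 * np.pi * k * t), np.sin(2 * np.pi * k * t)]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(fourier_matrix(t, K), y, rcond=None)

# Evaluate the reconstruction on a uniform grid (interpolation).
tu = np.linspace(0, 1, 200, endpoint=False)
yu = fourier_matrix(tu, K) @ coef
print(np.max(np.abs(yu - signal(tu))))  # tiny residual: signal recovered
```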
The growth of mobile technology has caused considerable changes in the way we interact with one another within both personal and business environments. Advancements in mobile computing and mobile multimedia resonate with engineers, strategists, developers, and managers while also determining the behavior and interaction of end users. Advancing the Next-Generation of Mobile Computing: Emerging Technologies offers historical perspectives on mobile computing, as well as new frameworks and methodologies for mobile networks, intelligent mobile applications, and mobile computing applications. This collection of research aims to inform researchers, designers, and users of mobile technology and promote awareness of new trends and tools in this growing field of study.
The objective of the NATO Advanced Research Workshop "Learning electricity and electronics with advanced educational technology" was to bring together researchers from different domains. Electricity education is a domain where a great deal of research has already been done. The first meeting on electricity teaching was organized in 1984 by R. Duit, W. Jung and C. von Rhoneck in Ludwigsburg (Germany). Since then, research has continued, and this workshop can be considered the successor of that first meeting. Our goal was not to organize a workshop grouping only people producing software in the field of electricity education, or more generally in the field of physics education, even if this software was based on artificial intelligence techniques. On the contrary, we wanted this workshop to bring together researchers involved in the connection between cognitive science and the learning of a well-defined domain such as electricity. So during the workshop, people doing research in physics education, cognitive psychology, and artificial intelligence had the opportunity to discuss and exchange ideas. These proceedings reflect the different points of view. The main idea is that designing a learning environment requires the confrontation of different approaches. The proceedings are organized in five parts which reflect these different aspects.
This book focuses on aspects related to the parallelization of evolutionary computation, such as parallel genetic operators, parallel fitness evaluation, distributed genetic algorithms, and parallel hardware implementations, as well as on their impact on several applications. It offers a wide spectrum of sample works from leading research on parallel implementations of efficient techniques at the heart of computational intelligence.
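The simplest of these ideas, master-slave parallel fitness evaluation, can be sketched in a few lines of Python; the toy one-max objective, population size and mutation rate below are illustrative assumptions, not an example from the book:

```python
# Master-slave parallel fitness evaluation for a toy genetic algorithm.
# The objective (count the 1-bits) stands in for an expensive fitness.

import random
from multiprocessing import Pool

def fitness(genome):
    """Expensive objective in practice; here, just count the 1-bits."""
    return sum(genome)

def evolve(pop_size=64, genome_len=32, generations=20, workers=4):
    rng = random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    with Pool(workers) as pool:
        for _ in range(generations):
            # Score the whole population in parallel (master-slave model).
            scores = pool.map(fitness, pop)
            ranked = [g for _, g in sorted(zip(scores, pop), reverse=True)]
            parents = ranked[: pop_size // 2]
            # Truncation selection plus bit-flip mutation (rate 2%).
            pop = [[bit ^ (rng.random() < 0.02) for bit in p]
                   for p in parents for _ in range(2)]
    return max(pop, key=fitness)

if __name__ == "__main__":
    print(fitness(evolve()))  # approaches genome_len as generations grow
```

Because each genome is scored independently, the evaluation step parallelizes without changing the algorithm's logic, which is why it is usually the first target for parallelization.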
This book describes the emerging practice of e-mail tutoring: one-to-one correspondence between college students and writing tutors conducted over electronic mail. It reviews the history of Composition Studies, paying special attention to the ways in which writing centers, and computers and composition, have previously been hailed within a narrative of functional literacy and quick-fix solutions. The author suggests a new methodology for tutoring, and a new mandate for the writing center: a strong connection between the rhythms of extended, asynchronous writing and dialogic literacy. The electronic writing center can become a site for informed resistance to functional literacy.
Analog Design Issues in Digital VLSI Circuits and Systems brings together in one place important contributions and up-to-date research results in this fast moving area. Analog Design Issues in Digital VLSI Circuits and Systems serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
Application-Driven Architecture Synthesis describes the state of the art of architectural synthesis for complex real-time processing. In order to deal with the stringent timing requirements and the intricacies of complex real-time signal and data processing, target architecture styles and target application domains have been adopted to make the synthesis approach feasible. These approaches are also heavily application-driven, as illustrated by the many realistic demonstrations used as examples in the book. The focus is on domains where application-specific solutions are attractive, such as significant parts of audio, telecom, instrumentation, speech, robotics, medical and automotive processing, image and video processing, TV, multi-media, radar, and sonar. Application-Driven Architecture Synthesis is of interest to both academics and senior design engineers and CAD managers in industry. It provides an excellent overview of what capabilities to expect from future practical design tools, and includes an extensive bibliography.
Mobile devices are the 'it' technology, and everyone wants to know how to apply them to their environments. This book brings together the best examples and insights for implementing mobile technology in libraries. Chapters cover a wide variety of the most important tools and procedures, from developing applications to marketing and augmented reality. Readers of this volume will gain complete and timely knowledge of library applications for handheld devices. The Handheld Librarian conferences have been a centrepiece of learning about how to apply mobile technologies to library services and collections, as well as a forum for sharing examples and lessons learned. The conferences have brought our profession forward into the trend and kept us up to date with ongoing advances. This volume brings together the best from that rich story and presents librarians with the basic information they need to successfully make the case for and implement programs leveraging mobile devices in their libraries. The authors of these diverse, practical and well-researched pieces come from all types of libraries and segments of the profession. This wide representation ensures that front-line librarians, library administrators, systems staff, and even library professors will find this volume perfectly geared to their needs. This book was published as a special issue of The Reference Librarian.
It has long been apparent to academic library administrators that current technical services operations within libraries need to be redirected and refocused in terms of both format priorities and human resources. A number of developments and directions have made this reorganization imperative, many of which have been accelerated by the current economic crisis. All of the chapters detail some aspect of technical services reorganization: downsizing and/or reallocation of human resources, retooling professional and support staff for higher-level duties and/or non-MARC metadata, "value-added" metadata opportunities, outsourcing of redundant activities, and shifting resources from analog to digital object organization and description. This book will assist both catalogers and library administrators with concrete examples of moving technical services operations and personnel from the analog to the digital environment. It was published as a special double issue of Cataloging & Classification Quarterly.
'Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé.' ('And I, ..., had I known how to come back, I would never have gone.') - Jules Verne. 'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' - Eric T. Bell. 'The series is divergent; therefore we may be able to do something with it.' - O. Heaviside. Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quotes above, one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'etre of this series.
This book takes a formal approach to teaching software engineering, using not only UML but also the Object Constraint Language (OCL) for the specification and analysis of designed models. Employing technical details typically missing from existing textbooks on software engineering, the author shows how precise specifications lead to static verification of software systems. In addition, data management is given the attention required to produce a successful software project. Key features: the use of constraints in all phases of software development; coverage of recent developments in software technologies; technical treatment of data management issues and software verification; illustrations throughout presenting the analysis, specification, implementation and verification of multiple applications; and end-of-chapter exercises and instructor presentation slides.
The evaluation of IT and its business value has recently been the subject of many academic and business discussions, as business managers, management consultants and researchers regularly question whether and how the contribution of IT to business performance can be evaluated effectively. Investments in IT are growing extensively, and business managers worry that the benefits of IT investments might not be as high as expected. This phenomenon is often called the IT investment paradox or the IT Black Hole: large sums are invested in IT that seem to be swallowed by a large black hole without rendering many returns. Information Systems Evaluation Management discusses these issues, among others, through its presentation of the most current research in the field of IS evaluation. It is an area of study that touches upon a variety of types of businesses and organizations, essentially all those that involve IT in their business practices.
Timing research in high-performance VLSI systems has advanced at a steady pace over the last few years. Tools, however, especially theoretical mechanisms, lag behind. Much of present timing research relies heavily on timing diagrams, which, although intuitive, are inadequate for the analysis of large designs with many parameters. Further, timing diagrams offer only approximations, not exact solutions, to many timing problems, and provide little insight in cases where the temporal properties of a design interact intricately with its logical functionality. Timed Boolean Functions presents a methodology for timing research which facilitates the analysis and design of circuits and systems in a unified temporal and logical domain. The goal of the book is to present the central idea of representing logical and timing information in a common structure, TBFs, and to present a canonical form suitable for efficient manipulation. This methodology is then applied to practical problems to provide intuition and insight into the subject, so that these general methods can be adapted to specific engineering problems and also to further the research necessary to enhance understanding of the field. Timed Boolean Functions is written for professionals involved in timing research and digital designers who want to enhance their understanding of the timing aspects of high-speed circuits. The prerequisites are a common background in logic design, computer algorithms, combinatorial optimization and a certain degree of mathematical sophistication.
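To convey the flavor of the idea, the sketch below treats each signal as a Boolean function of time and lets every gate shift its inputs by a propagation delay; the waveforms, delays and two-gate circuit are invented for illustration and do not reproduce the book's notation or canonical form:

```python
# Evaluating a timed Boolean function: logic values are functions of
# time, and each gate shifts its inputs by a propagation delay.

def delay(wave, d):
    """Shift a waveform in time: delay(w, d)(t) = w(t - d)."""
    return lambda t: wave(t - d)

def AND(a, b, d=2):
    """AND gate with propagation delay d: y(t) = a(t-d) & b(t-d)."""
    return lambda t: delay(a, d)(t) and delay(b, d)(t)

def XOR(a, b, d=3):
    """XOR gate with propagation delay d."""
    return lambda t: delay(a, d)(t) != delay(b, d)(t)

# Input waveforms: x rises at t=0, y rises at t=5.
x = lambda t: t >= 0
y = lambda t: t >= 5

# Two-gate circuit as a TBF: out(t) = (x(t-5) AND y(t-5)) XOR x(t-3).
out = XOR(AND(x, y), x)
print([int(out(t)) for t in range(0, 12)])
# -> [0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
```

The printed waveform contains a transient pulse from t=3 to t=9 caused purely by the unequal path delays; a delay-free logical view of the same netlist would predict a constant 0 once both inputs settle, which is exactly the kind of timing/logic interaction TBFs are meant to capture.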
Gay provides an authoritative overview of the major developments, people, and organizations that have shaped the design and use of computers. He also describes innovations in computer research and technology (including highlights from developments in supercomputers, supercomputer networks, and Internet 2), the latest trends in consumer products (new applications that influence the way individuals and businesses interface with their world), social issues related to computers (Microsoft's influence, privacy, encryption, universal access, and adaptive technologies), and information on computer careers and how to prepare for them.
The roots of the project which culminated in the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at the University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area- and performance-effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born, and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.
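For readers new to the area, the core move in two-level minimization can be shown in a few lines of Python: merging implicants that differ in a single literal (the classical Quine-McCluskey step). This toy sketch is not ESPRESSO, whose heuristics (expand, reduce, irredundant cover) are far more sophisticated, but it shows the kind of cube manipulation involved:

```python
# Cubes over {'0', '1', '-'}: '-' means the variable is absent.
# Two cubes that agree everywhere except one non-dash position cover
# exactly the union of their minterms, so merging them is lossless.

from itertools import combinations

def merge(a, b):
    """Combine two cubes if they differ in exactly one literal."""
    diff = [i for i, (p, q) in enumerate(zip(a, b)) if p != q]
    if len(diff) == 1 and a[diff[0]] != '-' and b[diff[0]] != '-':
        i = diff[0]
        return a[:i] + '-' + a[i + 1:]
    return None

def minimize(cubes):
    """Greedily merge adjacent cubes until no merge applies.
    (Greedy order may miss the true minimum cover in general.)"""
    cubes = set(cubes)
    changed = True
    while changed:
        changed = False
        for a, b in combinations(sorted(cubes), 2):
            m = merge(a, b)
            if m is not None:
                cubes -= {a, b}
                cubes.add(m)
                changed = True
                break
    return sorted(cubes)

# f = x'y'z + x'yz + xy'z + xyz collapses to the single literal z.
print(minimize(['001', '011', '101', '111']))  # -> ['--1']
```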