A presentation of the central and basic concepts, techniques, and tools of computer science, with the emphasis on presenting a problem-solving approach and on providing a survey of all of the most important topics covered in degree programmes. Scheme is used throughout as the programming language, and the author stresses a functional programming approach: creating simple functions so as to obtain the desired programming goal. Such simple functions are easily tested individually, which greatly helps in producing programs that work correctly first time. Throughout, the author provides aids to writing programs and makes liberal use of boxes with "Mistakes to Avoid." Programming examples include: * abstracting a problem; * creating pseudocode as an intermediate solution; * top-down and bottom-up design; * building procedural and data abstractions; * writing programs in modules which are easily testable. Numerous exercises help readers test their understanding of the material and develop ideas in greater depth, making this an ideal first course for all students coming to computer science for the first time.
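The "simple, individually testable functions" style this blurb describes can be sketched in a few lines. (The book itself uses Scheme; the Python below and its function names are purely illustrative, not taken from the book.)

```python
# Illustrative sketch of the style described above: each function does
# one job and can be tested on its own, so the composed program is far
# more likely to work correctly first time.

def square(x):
    """One small, easily tested building block."""
    return x * x

def sum_of_squares(xs):
    """Built from the block above; testable independently of its callers."""
    return sum(square(x) for x in xs)

def mean_square(xs):
    """Top-level goal, assembled from already-verified pieces."""
    return sum_of_squares(xs) / len(xs)
```

Because each layer is trivial to check in isolation (e.g. `square(3) == 9`, `sum_of_squares([1, 2, 3]) == 14`), a bug in the final result can be localized quickly.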
This is the first book to treat two areas of speech synthesis: natural language processing and the inherent problems it presents for speech synthesis; and digital signal processing, with an emphasis on the concatenative approach. The text guides the reader through the material in a step-by-step easy-to-follow way. The book will be of interest to researchers and students in phonetics and speech communication, in both academia and industry.
The book presents the state of the art in high performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general and specifically the future of high performance systems and heterogeneous architectures. The application contributions cover computational fluid dynamics, material science, medical applications and climate research. Innovative fields like coupled multi-physics or multi-scale simulations are presented. All papers were chosen from presentations given at the 14th Teraflop Workshop held in December 2011 at HLRS, University of Stuttgart, Germany and the Workshop on Sustained Simulation Performance at Tohoku University in March 2012.
With the ever-increasing speed of integrated circuits, violations of the performance specifications are becoming a major factor affecting the product quality level. The need for testing timing defects is further expected to grow with the current design trend of moving towards deep submicron devices. After a long period of prevailing belief that high stuck-at fault coverage is sufficient to guarantee high quality of shipped products, the industry is now forced to rethink other types of testing. Delay testing has been a topic of extensive research both in industry and in academia for more than a decade. As a result, several delay fault models and numerous testing methodologies have been proposed. Delay Fault Testing for VLSI Circuits presents a selection of existing delay testing research results. It combines introductory material with state-of-the-art techniques that address some of the current problems in delay testing. Delay Fault Testing for VLSI Circuits covers some basic topics such as fault modeling and test application schemes for detecting delay defects. It also presents summaries and conclusions of several recent case studies and experiments related to delay testing. A selection of delay testing issues and test techniques such as delay fault simulation, test generation, design for testability and synthesis for testability are also covered. Delay Fault Testing for VLSI Circuits is intended for use by CAD and test engineers, researchers, tool developers and graduate students. It requires a basic background in digital testing. The book can be used as supplementary material for a graduate-level course on VLSI testing.
Knowledge in its pure state is tacit in nature, difficult to formalize and communicate, but it can be converted into codified form and shared through both social interactions and the use of IT-based applications and systems. Even though there seem to be considerable synergies between the resulting huge data and the convertible knowledge, there is still a debate on how the increasing amount of data captured by corporations could improve decision making and foster innovation through effective knowledge-sharing practices. Big Data and Knowledge Sharing in Virtual Organizations provides innovative insights into the influence of big data analytics and artificial intelligence and the tools, methods, and techniques for knowledge-sharing processes in virtual organizations. The content within this publication examines cloud computing, machine learning, and knowledge sharing. It is designed for government officials and organizations, policymakers, academicians, researchers, technology developers, and students.
Enterprises have made amazing advances by taking advantage of data about their business to provide predictions and understanding of their customers, markets, and products. But as the world of business becomes more interconnected and global, enterprise data is no longer a monolith; it is just a part of a vast web of data. Managing data on a world-wide scale is a key capability for any business today. The Semantic Web treats data as a distributed resource on the scale of the World Wide Web, and incorporates features to address the challenges of massive data distribution as part of its basic design. The aim of the first two editions was to motivate the Semantic Web technology stack from end to end: to describe not only what the Semantic Web standards are and how they work, but also what their goals are and why they were designed as they are. It tells a coherent story from beginning to end of how the standards work to manage a world-wide distributed web of knowledge in a meaningful way. The third edition builds on this foundation to bring Semantic Web practice to the enterprise. Fabien Gandon joins Dean Allemang and Jim Hendler, bringing with him years of experience in global linked data, to open up the story to a modern view of global linked data. While the overall story is the same, the examples have been brought up to date and applied in a modern setting, where enterprise and global data come together as a living, linked network of data. Also new to the third edition: all of the data sets and queries are available online for study and experimentation at data.world/swwo.
This volume includes chapters presenting applications of different metaheuristics in reliability engineering, including ant colony optimization, great deluge algorithm, cross-entropy method and particle swarm optimization. It also presents chapters devoted to cellular automata and support vector machines, and applications of artificial neural networks, a powerful adaptive technique that can be used for learning, prediction and optimization. Several chapters describe aspects of imprecise reliability and applications of fuzzy and vague set theory.
Over the last five to six years, ontology has received increased attention within the information systems field. Ontology provides a basis for evaluating, analyzing, and engineering business analysis methods, and it is this foundation that has allowed many organizations utilizing ontologies to become more competitive within today's global environment. Business Systems Analysis with Ontologies thoroughly examines the area of ontologies, covering all aspects: analysis, evaluation, and engineering of business systems analysis methods. Readers are shown the world of ontologies through a number of research methods: survey methodologies, case studies, experimental methodologies, analytical modeling, and field studies are all used within this book to help the reader understand the usefulness of ontologies.
Within the last 10-13 years Binary Decision Diagrams (BDDs) have become the state-of-the-art data structure in VLSI CAD for representation and manipulation of Boolean functions. Today, BDDs are widely used and in the meantime have also been integrated in commercial tools, especially in the area of verification and synthesis. The interest in BDDs results from the fact that the data structure is generally accepted as providing a good compromise between conciseness of representation and efficiency of manipulation. With increasing numbers of applications, also in non-CAD areas, classical methods of handling BDDs are being improved and new questions and problems evolve and have to be solved. Binary Decision Diagrams: Theory and Implementation is intended both for newcomers to BDDs and for researchers and practitioners who need to implement them. Apart from giving a quick start for the reader who is not familiar with BDDs (or DDs in general), it also discusses several new aspects of BDDs, e.g. with respect to minimization and implementation of a package. It is an essential bookshelf item for any CAD designer or researcher working with BDDs.
This consistently written book provides a comprehensive presentation of a multitude of results stemming from the author's as well as various researchers' work in the field. It also covers functional decomposition for incompletely specified functions, decomposition for multi-output functions and non-disjoint decomposition.
This book introduces context-aware computing, providing definitions, categories, and characteristics of context awareness itself, and discussing its applications with a particular focus on smart learning environments. It examines the elements of a context-aware system, including acquisition, modelling, reasoning, and distribution of context, and reviews applications of context-aware computing, both past and present, to offer readers the knowledge needed to critically analyse how context awareness can be put to use. It is particularly suited to those new to the subject area who are interested in learning how to develop context-aware computing-oriented applications, as well as postgraduates and researchers in computer engineering, communications engineering, and related areas of information technology (IT). Further, it provides practical know-how for professionals working in IT support and technology, consultants, business decision-makers, and those working in the medical, human, and social sciences.
Neurobiology research suggests that information can be represented by the location of an activity spot in a population of cells ('place coding'), and that this information can be processed by means of networks of interconnections. Place Coding in Analog VLSI defines a representation convention of similar flavor intended for analog integrated-circuit design. It investigates its properties and suggests ways to build circuits on the basis of this coding scheme. In this electronic version of place coding, numbers are represented by the state of an array of nodes called a map, and computation is carried out by a network of links. In the simplest case, a link is just a wire connecting a node of an input map to a node of an output map. In other cases, a link is an elementary circuit cell. Networks of links are somewhat reminiscent of look-up tables in that they hardwire an arbitrary function of one or several variables. Interestingly, these structures are also related to fuzzy rules, as well as some types of artificial neural networks. The place coding approach provides several substantial benefits over conventional analog design: networks of links can be synthesized by a simple procedure whatever the function to be computed; place coding is tolerant to perturbations and noise in current-mode implementations; and tolerance to noise implies that the fundamental power dissipation limits of conventional analog circuits can be overcome by using place coding. The place coding approach is illustrated by three integrated circuits computing non-linear functions of several variables. The simplest one is made up of 80 links and achieves submicrowatt power consumption in continuous operation. The most complex one incorporates about 1800 links for a power consumption of 6 milliwatts, and controls the operation of an active vision system with a moving field of view.
Place Coding in Analog VLSI is primarily intended for researchers and practicing engineers involved in analog and digital hardware design (especially bio-inspired circuits). The book is also a valuable reference for researchers and students in neurobiology, neuroscience, robotics, fuzzy logic and fuzzy control.
Embedded computer systems use both off-the-shelf microprocessors and application-specific integrated circuits (ASICs) to implement specialized system functions. Examples include the electronic systems inside laser printers, cellular phones, microwave ovens, and automobile anti-lock brake controllers. Embedded computing is unique because it is a co-design problem - the hardware engine and application software architecture must be designed simultaneously. Hardware-Software Co-Synthesis of Distributed Embedded Systems proposes new techniques such as fixed-point iterations, phase adjustment, and separation analysis to efficiently estimate tight bounds on the delay required for a set of multi-rate processes preemptively scheduled on a real-time reactive distributed system. Based on the delay bounds, a gradient-search co-synthesis algorithm with new techniques such as sensitivity analysis, priority prediction, and idle processing-element elimination is developed to select the number and types of processing elements in a distributed engine, and determine the allocation and scheduling of processes to processing elements. New communication modeling is also presented to analyze communication delay under interaction of computation and communication, allocate interprocessor communication links, and schedule communication. Hardware-Software Co-Synthesis of Distributed Embedded Systems is the first book to describe techniques for the design of distributed embedded systems, which have arbitrary hardware and software topologies. The book will be of interest to academic researchers, for personal libraries and advanced-topics courses in co-design, as well as industrial designers who are building high-performance, real-time embedded systems with multiple processors.
Probabilistic and Statistical Methods in Computer Science
Advanced Topics in Information Technology Standards and Standardization Research is a series of books which features the most current research findings in all aspects of IT standardization research, from a diversity of angles, traversing the traditional boundaries between individual disciplines. "Advanced Topics in Information Technology Standards and Standardization Research, Volume 1" presents a collection of chapters addressing a variety of aspects related to IT standards and the setting of standards. The book covers topics such as economic aspects of standards, alliances in standardization, and the relation between 'formal' standards bodies and industry consortia. It also offers a glimpse inside a standards working group, as well as a look at applications of standards in different sectors.
Healthcare is significantly affected by technological advancements, as technology both shapes and changes health systems locally and globally. As the areas of computer science, information technology, and healthcare merge, it is important to understand the current and future implications of health informatics. Healthcare and the Effect of Technology: Developments, Challenges and Advancements bridges the gap between today's empirical research findings and healthcare practice. It provides the reader with information on current technological integrations, potential uses for technology in healthcare, and the implications, both positive and negative, of health informatics for one's health. Technology in healthcare can improve efficiency, make patient records more accessible, increase professional communication, create global health networking, and increase access to healthcare. However, it is important to consider the ethical, confidentiality, and cultural implications technology in healthcare may impose. That is what makes this book a must-read for policymakers, human resource professionals, and management personnel, as well as for researchers, scholars, students, and healthcare professionals.
In Silico Chemistry and Biology: Current and Future Prospects provides a compact overview on recent advances in this highly dynamic branch of chemistry. Various methods of protein modelling and computer-assisted drug design are presented, including fragment- and ligand-based approaches. Many successful practical applications of these techniques are demonstrated. The authors also look to the future and describe the main challenges of the field.
In many organizations, Information Technology (IT) has become crucial to the support, sustainability, and growth of the business. This pervasive use of technology has created a critical dependency on IT that calls for a specific focus on IT Governance. IT Governance consists of the leadership and organizational structures, processes and relational mechanisms that ensure that the organization's IT sustains and extends the organization's strategy and objectives. Strategies for Information Technology Governance records and interprets some important existing theories, models and practices in the IT Governance domain and aims to contribute to the understanding of IT Governance.
The rise in population and the concurrently growing consumption rate necessitate the evolution of agriculture to adopt current computational technologies to increase production at greater speed and scale. While existing technologies may help in crop processing, there is a need for studies that seek to understand how modern approaches like artificial intelligence, fuzzy logic, and hybrid algorithms can aid the agricultural process while utilizing energy sources efficiently. The Handbook of Research on Smart Computing for Renewable Energy and Agro-Engineering is an essential publication that examines the benefits and barriers of applying computational models to agricultural production and energy sources, as well as how these models can produce more cost-effective and sustainable solutions. Featuring coverage on a wide range of topics such as bacterial foraging, swarm intelligence, and combinatorial optimization, this book is ideally designed for agricultural engineers, farmers, municipal union leaders, computer scientists, information technologists, sustainable developers, managers, environmentalists, industry professionals, academicians, researchers, and students.
Algorithms for VLSI Physical Design Automation, Third Edition covers all aspects of physical design. The book is a core reference for graduate students and CAD professionals. For students, concepts and algorithms are presented in an intuitive manner. For CAD professionals, the material presents a balance of theory and practice. An extensive bibliography is provided which is useful for finding advanced material on a topic. At the end of each chapter, exercises are provided, which range in complexity from simple to research level. Algorithms for VLSI Physical Design Automation, Third Edition provides a comprehensive background in the principles and algorithms of VLSI physical design. The goal of this book is to serve as a basis for the development of introductory-level graduate courses in VLSI physical design automation. It provides self-contained material for teaching and learning algorithms of physical design. All algorithms which are considered basic have been included, and are presented in an intuitive manner. Yet, at the same time, enough detail is provided so that readers can actually implement the algorithms given in the text and use them. The first three chapters provide the background material, while the focus of each chapter of the rest of the book is on each phase of the physical design cycle. In addition, newer topics such as physical design automation of FPGAs and MCMs have been included. The basic purpose of the third edition is to investigate the new challenges presented by interconnect and process innovations. In 1995, when the second edition of this book was prepared, a six-layer process and 15 million transistor microprocessors were in advanced stages of design. In 1998, six-metal processes and 20 million transistor designs are in production. Two new chapters have been added and new material has been included in almost all other chapters. A new chapter on process innovation and its impact on physical design has been added.
Another focus of the third edition is to promote use of the Internet as a resource, so wherever possible URLs have been provided for further investigation. Algorithms for VLSI Physical Design Automation, Third Edition is an important core reference work for professionals as well as an advanced level textbook for students.
Mathematical Visualization is a young discipline. It offers efficient visualization tools to the classical subjects of mathematics, and applies mathematical techniques to problems in computer graphics and scientific visualization. Originally, it started in the interdisciplinary area of differential geometry, numerical mathematics, and computer graphics. In recent years, the methods developed have found important applications.
This volume brings together recent theoretical work in Learning Classifier Systems (LCS), which is a Machine Learning technique combining Genetic Algorithms and Reinforcement Learning. It includes self-contained background chapters on related fields (reinforcement learning and evolutionary computation) tailored for a classifier systems audience and written by acknowledged authorities in their area - as well as a relevant historical original work by John Holland.