This comprehensive book draws together experts to explore how knowledge technologies can be exploited to create new multimedia applications, and how multimedia technologies can provide new contexts for the use of knowledge technologies. It gives thorough coverage of the relevant topics, and its step-by-step approach guides the reader from the fundamental enabling technologies of ontologies, analysis and reasoning, through to applications which have hitherto received less attention.
Future Data and Knowledge Base Systems will require new functionalities: richer data modelling capabilities, more powerful query languages, and new concepts of query answers. Future query languages will include functionalities such as hypothetical reasoning, abductive reasoning, modal reasoning, and metareasoning involving knowledge and belief. Intensional answers will lead to cooperative query answering, in which the answer to a query takes the user's expectations into consideration. Non-classical logic plays an important role in this book in the formalization of new queries and new answers. It is shown how logic permits precise definitions of concepts like cooperative answers, subjective queries, or reliable sources of information, and gives a precise framework for reasoning about these complex concepts. It is worth noting that advances in knowledge management are not just an application domain for existing results in logic, but also require new developments in logic. The book is organized into 10 chapters covering the areas of cooperative query answering (the first three chapters), metareasoning and abductive reasoning (chapters 5 to 7), and, finally, hypothetical and subjunctive reasoning (the last three chapters).
Adding internet access to embedded systems opens up a whole new world of capabilities. For example, a remote data logging system could automatically send data via the internet and be reconfigured - for instance, to log new types of data or to measure at different intervals - by commands sent over the internet from any computer or device with internet access. Embedded internet and internet appliances are the focus of great attention in the computing industry, as they are seen as the future of computing, but the design of such devices presents many technical challenges. This book describes how to design, build and program embedded systems with internet access, giving special attention to sensors and actuators which gather data for transmission over the internet or execute commands sent over the internet. It shows how to build sensors and control devices that connect to the "tiny internet interface" (TINI) and explains how to write programs that control them in Java. Several design case histories are given, including weather monitoring stations, communications centres, automation systems, and data acquisition systems. The authors discuss how these technologies work and where to get detailed specifications, and they provide ideas for the reader to pursue beyond the book. The accompanying CD-ROM includes Java source code for all the applications described in the book, and an electronic version of the text.
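The TINI-specific I/O libraries are not covered in this summary, but the command-driven pattern the description mentions can be sketched with standard Java networking alone. The following minimal sketch is illustrative only: the class and command names are invented, and a real device would read from actual sensor hardware rather than a placeholder. It accepts simple text commands to report a reading or change the logging interval.

```java
import java.io.*;
import java.net.*;

// Minimal sketch of a command-driven data logger: a client anywhere on the
// internet can read the latest measurement or change the logging interval.
// The sensor read is simulated; on a real device it would come from hardware I/O.
public class TinyLoggerServer {
    private static volatile int intervalSeconds = 60; // reconfigurable over the network

    private static double readSensor() {
        return 20.0 + Math.random() * 5.0; // placeholder for a real sensor read
    }

    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(5000)) {
            while (true) {
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    String cmd = in.readLine();
                    if (cmd == null) continue;
                    if (cmd.startsWith("INTERVAL ")) {   // e.g. "INTERVAL 30"
                        intervalSeconds = Integer.parseInt(cmd.substring(9).trim());
                        out.println("OK interval=" + intervalSeconds);
                    } else if (cmd.equals("READ")) {     // report one measurement
                        out.println("DATA " + readSensor());
                    } else {
                        out.println("ERR unknown command");
                    }
                }
            }
        }
    }
}
```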
SystemC Kernel Extensions for Heterogeneous System Modeling is the result of an almost two-year endeavour on our part to understand how SystemC can be made useful for system-level modeling at higher levels of abstraction. Making it a truly heterogeneous modeling language and platform, for hardware/software co-design as well as for complex embedded hardware designs, has been our focus in the work reported in this book.
This book contains articles written by experts on a wide range of topics that are associated with the analysis and management of biological information at the molecular level. It contains chapters on RNA and protein structure analysis, DNA computing, sequence mapping, genome comparison, gene expression data mining, metabolic network modeling, and phyloinformatics. The important work of some representative researchers in bioinformatics is brought together for the first time in one volume. The topic is treated in depth and is related to, where applicable, other emerging technologies such as data mining and visualization. The goal of the book is to introduce readers to the principle techniques of bioinformatics in the hope that they will build on them to make new discoveries of their own. Contents: Exploring RNA Intermediate Conformations with the Massively Parallel Genetic Algorithm; Introduction to Self-Assembling DNA Nanostructures for Computation and Nanofabrication; Mapping Sequence to Rice FPC; Graph Theoretic Sequence Clustering Algorithms and their Applications to Genome Comparison; The Protein Information Resource for Functional Genomics and Proteomics; High-Grade Ore for Data Mining in 3D Structures; Protein Classification: A Geometric Hashing Approach; Interrelated Clustering: An Approach for Gene Expression Data Analysis; Creating Metabolic Network Models Using Text Mining and Expert Knowledge; Phyloinformatics and Tree Networks. Readership: Molecular biologists who rely on computers and mathematical scientists with interests in biology.
In 2002, the International Conference on Computer Aided Design (ICCAD) celebrates its 20th anniversary. This book commemorates contributions made by ICCAD to the broad field of design automation during that time. The foundation of ICCAD in 1982 coincided with the growth of Large Scale Integration. The sharply increased functionality of board-level circuits led to a major demand for more powerful Electronic Design Automation (EDA) tools. At the same time, LSI grew quickly and advanced circuit integration became widely available. This, in turn, required new tools, using sophisticated modeling, analysis and optimization algorithms, in order to manage the ever more complex design processes. Not surprisingly, during the same period, a number of start-up companies began to commercialize EDA solutions, complementing various existing in-house efforts. The overall increased interest in Design Automation (DA) required a new forum for the emerging community of EDA professionals; one which would be focused on the publication of high-quality research results and provide a structure for the exchange of ideas on a broad scale. Many of the original ICCAD volunteers were also members of CANDE (Computer-Aided Network Design), a workshop of the IEEE Circuits and Systems Society. In fact, it was at a CANDE workshop that Bill McCalla suggested the creation of a conference for the EDA professional. (Bill later developed the name.)
The information infrastructure (comprising computers, embedded devices, networks and software systems) is vital to day-to-day operations in every sector: information and telecommunications, banking and finance, energy, chemicals and hazardous materials, agriculture, food, water, public health, emergency services, transportation, postal and shipping, government and defense. Global business and industry, governments, indeed society itself, cannot function effectively if major components of the critical information infrastructure are degraded, disabled or destroyed. Critical Infrastructure Protection V describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. It also highlights the importance of weaving together science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. Areas of coverage include: Themes and Issues, Control Systems Security, Infrastructure Security, and Infrastructure Modeling and Simulation. This book is the 5th volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of 14 edited papers from the 5th Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection, held at Dartmouth College, Hanover, New Hampshire, USA, in the spring of 2011. Critical Infrastructure Protection V is an important resource for researchers, faculty members and graduate students, as well as for policy makers, practitioners and other individuals with interests in homeland security. Jonathan Butts is an Assistant Professor of Computer Science at the Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, USA. Sujeet Shenoi is the F.P. Walter Professor of Computer Science at the University of Tulsa, Tulsa, Oklahoma, USA.
Because of increased access to high-speed internet and smartphones, many patients have started to use mobile applications to manage various health needs. These devices and mobile apps are now increasingly used and integrated with telemedicine and telehealth via the medical Internet of Things (IoT). The Handbook of Research on Big Data Management and the Internet of Things for Improved Health Systems is a critical scholarly resource that examines the digital transformation of healthcare. Featuring coverage of a broad range of topics, such as brain-computer interfaces, data reduction techniques, and risk factors, this book is geared towards academicians, practitioners, researchers, and students seeking research on health and well-being data.
Contains revised, edited, cross-referenced, and thematically organized selected DumpAnalysis.org blog posts about memory dump and software trace analysis, software troubleshooting and debugging, written in November 2010 - October 2011 for software engineers developing and maintaining products on Windows platforms, quality assurance engineers testing software on Windows platforms, technical support and escalation engineers dealing with complex software issues, and security researchers, malware analysts and reverse engineers. The sixth volume features:
- 56 new crash dump analysis patterns, including 14 new .NET memory dump analysis patterns
- 4 new pattern interaction case studies
- 11 new trace analysis patterns
- New Debugware pattern
- Introduction to UI problem analysis patterns
- Introduction to intelligence analysis patterns
- Introduction to unified debugging pattern language
- Introduction to generative debugging, metadefect template library and DNA of software behavior
- The new school of debugging
- .NET memory dump analysis checklist
- Software trace analysis checklist
- Introduction to close and deconstructive readings of a software trace
- Memory dump analysis compass
- Computical and Stack Trace Art
- The abductive reasoning of Philip Marlowe
- Orbifold memory space and cloud computing
- Memory worldview
- Interpretation of cyberspace
- Relationship of memory dumps to religion
- Fully cross-referenced with Volume 1, Volume 2, Volume 3, Volume 4, and Volume 5
The 20th century saw tremendous achievements and progress in science and …
ISGC 2009, the International Symposium on Grid Computing, was held at Academia Sinica, Taipei, Taiwan, in April 2009, bringing together prestigious scientists and engineers worldwide to exchange ideas, present challenges and solutions, and introduce future developments in the field of Grid Computing. Managed Grids and Cloud Systems in the Asia-Pacific Research Community presents the latest achievements in grid technology, including Cloud Computing. This volume also covers international projects in Grid Operation, Grid Middleware, E-Science applications, technical developments in grid operations and management, Security and Networking, Digital Library, and more. The resources used to support these advances, such as volunteer grids, production managed grids, and cloud systems, are discussed in detail. This book is designed for a professional audience composed of grid users, developers and researchers working in grid computing. Advanced-level students focusing on computer science and engineering will find this book valuable as a reference or secondary textbook.
"Set Theory for Computing" provides a comprehensive account of set-oriented symbolic manipulation methods suitable for automated reasoning. Its main objective is twofold: 1) to provide a flexible formalization for a variety of set languages, and 2) to clarify the semantics of set constructs firmly established in modern specification languages and in the programming practice. Topics include: semantic unification, decision algorithms, modal logics, declarative programming, tableau-based proof techniques, and theory-based theorem proving. The style of presentation is self-contained, rigorous and accurate. Some familiarity with symbolic logic is helpful but not a requirement. This book is a useful resource for all advanced students, professionals, and researchers in computing sciences, artificial intelligence, automated reasoning, logic, and computational mathematics. It will serve to complement their intuitive understanding of set concepts with the ability to master them by symbolic and logically based algorithmic methods and deductive techniques.
The growing commercial market of the microwave/millimeter-wave industry over the past decade has led to an explosion of interest and opportunities for the design and development of microwave components. The design of most microwave components requires the use of commercially available electromagnetic (EM) simulation tools for their analysis. In the design process, simulations are carried out by varying the design parameters until the desired response is obtained. The optimization of design parameters by manual searching is a cumbersome and time-consuming process. Soft computing methods such as Genetic Algorithms (GA), Artificial Neural Networks (ANN) and Fuzzy Logic (FL) have been widely used by EM researchers for microwave design over the last decade. The aim of these methods is to tolerate imprecision, uncertainty, and approximation to achieve robust, low-cost solutions in a small time frame. Modeling and optimization are essential parts of, and powerful tools for, microwave/millimeter-wave design. This book deals with the development and use of soft computing methods for tackling challenging design problems in the microwave/millimeter-wave domain. The aim in developing these methods is to obtain a design in a small time frame while improving its accuracy for a wide range of applications. To achieve this goal, a few diverse design problems of the microwave field, representing varied design challenges, such as different microstrip antennas, microwave filters, a microstrip via, and critical high-power components such as nonlinear tapers and RF windows, are considered as case-study design problems. Different design methodologies are developed for these applications. The book presents soft computing methods, reviews their use for microwave/millimeter-wave design problems, and works through specific case-study problems to give better insight into and understanding of the subject.
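To make the GA idea concrete, here is a deliberately small sketch, not taken from the book: a first-order resonance model f = c / (2L * sqrt(eps_eff)) stands in for the EM simulator, and all parameter values are invented for illustration. It evolves a resonator length toward a target frequency.

```java
import java.util.Random;

// Toy genetic algorithm in the spirit of GA-based microwave design: evolve a
// patch length L (in mm) so that a first-order resonance model hits a target
// frequency. In practice the fitness call would invoke an EM simulator.
public class GaSketch {
    static final double C = 3.0e11;     // speed of light in mm/s
    static final double EPS_EFF = 2.2;  // assumed effective permittivity
    static final double TARGET_GHZ = 10.0;

    static double fitness(double lengthMm) { // smaller is better
        double fGHz = C / (2 * lengthMm * Math.sqrt(EPS_EFF)) / 1e9;
        return Math.abs(fGHz - TARGET_GHZ);
    }

    static double better(double a, double b) { return fitness(a) <= fitness(b) ? a : b; }

    public static void main(String[] args) {
        Random rng = new Random(1);
        int popSize = 30;
        double[] pop = new double[popSize];
        for (int i = 0; i < popSize; i++) pop[i] = 5 + 15 * rng.nextDouble(); // 5-20 mm

        for (int gen = 0; gen < 100; gen++) {
            double[] next = new double[popSize];
            for (int i = 0; i < popSize; i++) {
                // Tournament selection of two parents, arithmetic crossover, Gaussian mutation.
                double p1 = better(pop[rng.nextInt(popSize)], pop[rng.nextInt(popSize)]);
                double p2 = better(pop[rng.nextInt(popSize)], pop[rng.nextInt(popSize)]);
                next[i] = (p1 + p2) / 2 + rng.nextGaussian() * 0.1;
            }
            pop = next;
        }
        double best = pop[0];
        for (double x : pop) best = better(best, x);
        System.out.printf("Best length: %.3f mm, |f - target| = %.4f GHz%n", best, fitness(best));
    }
}
```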
'Managing Technology in The Operations Function' looks at issues in technology from the operations function rather than from an IT perspective. It explores the use of technology for processing, provision of client services, risk management and business management. The authors analyse the benefits of straight-through processing and the practical implications of managing technology products in operations. System risks and opportunities are explored, and case studies are examined along with industry trends to assess upcoming developments and their impacts.
Fluids play an important role in environmental systems, appearing as surface water in rivers, lakes, and coastal regions, in the subsurface, and in the atmosphere. The mechanics of environmental fluids is concerned with fluid motion and the associated mass and heat transport, in addition to deformation processes in subsurface systems. In this textbook the fundamental modelling approaches based on continuum mechanics for fluids in the environment are described, including porous media and turbulence. Numerical methods for solving the governing equations, and their object-oriented computer implementation, are discussed and illustrated with examples. Finally, the application of computer models in civil and environmental engineering is demonstrated.
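As a hedged illustration of the kind of numerical treatment described (a sketch of the general approach, not code from the book), the following program advances the 1-D diffusion equation u_t = D u_xx with an explicit finite-difference scheme, respecting the stability condition D*dt/dx^2 <= 1/2:

```java
// Explicit (FTCS) finite-difference solver for 1-D diffusion, u_t = D u_xx,
// with fixed boundary values. Purely illustrative parameter choices.
public class Diffusion1D {
    public static void main(String[] args) {
        int n = 101;
        double dx = 0.01, D = 1e-4;
        double dt = 0.4 * dx * dx / D; // respects the stability limit D*dt/dx^2 <= 0.5
        double[] u = new double[n];
        u[n / 2] = 1.0;                // initial concentration spike in the middle

        for (int step = 0; step < 1000; step++) {
            double[] next = u.clone();
            for (int i = 1; i < n - 1; i++) {
                next[i] = u[i] + D * dt / (dx * dx) * (u[i + 1] - 2 * u[i] + u[i - 1]);
            }
            u = next;                  // boundary cells keep their initial values
        }
        System.out.printf("Peak after diffusion: %.4f%n", u[n / 2]);
    }
}
```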
This book presents the essential background for understanding semantic theories of mood. Mood as a category is widely used in the description of languages and the formal analysis of their grammatical properties. It typically refers to the features of a sentence (individual morphemes or grammatical patterns) that reflect how the sentence contributes to the modal meaning of a larger phrase, or that indicate the type of fundamental pragmatic function that it has in conversation. In this volume, Paul Portner discusses the most significant semantic theories relating to the two main subtypes of mood: verbal mood, including the categories of indicative and subjunctive subordinate clauses, and sentence mood, encompassing declaratives, interrogatives, and imperatives. He evaluates those theories, compares them, and draws connections between seemingly disparate approaches, and he formalizes some of the literature's most important ideas in new ways in order to draw out their most significant insights. Ultimately, this work shows that there are crucial connections between verbal mood and sentence mood which point the way towards a more general understanding of how mood works and its relation to other topics in linguistics; it also outlines the type of semantic and pragmatic theory which will make it possible to explain these relations. The book will be a valuable resource for researchers and students from advanced undergraduate level upwards in the fields of semantics and pragmatics, philosophy, computer science, and psychology.
As a progressive field of study, end-user computing is continually becoming a significant focus area for businesses, since refining end-user practices to enhance productivity contributes greatly to positioning organizations for strategic and competitive advantage in the global economy. "Evolutionary Concepts in End User Productivity and Performance: Applications for Organizational Progress" represents the most current investigations into a wide range of end-user computing issues. This book enhances the field with new insights useful for researchers, educators, and professionals in the end-user domain.
The Workshop on the Economics of Information Security (WEIS) was established in 2002 to bring together computer scientists and economists to understand and improve the poor state of information security practice. WEIS was born out of a realization that security often fails for non-technical reasons; rather, the incentives of both defender and attacker must be considered. Earlier workshops have answered questions ranging from finding optimal levels of security investment to understanding why privacy has been eroded. In the process, WEIS has attracted participation from diverse fields such as law, management and psychology. WEIS has now established itself as the leading forum for interdisciplinary scholarship on information security. The eighth installment of the conference returned to the United Kingdom, hosted by University College London on June 24-25, 2009. Approximately 100 researchers, practitioners and government officials from across the globe convened in London to hear presentations from the authors of 21 peer-reviewed papers, in addition to a panel and keynote lectures from Hal Varian (Google), Bruce Schneier (BT Counterpane), Martin Sadler (HP Labs), and Robert Coles (Merrill Lynch). Angela Sasse and David Pym chaired the conference, while Christos Ioannidis and Tyler Moore chaired the program committee.
This book contains extended and revised versions of the best papers that were presented during the 16th edition of the IFIP/IEEE WG10.5 International Conference on Very Large Scale Integration, a global System-on-a-Chip Design & CAD conference. The 16th conference was held at the Grand Hotel of Rhodes Island, Greece (October 13-15, 2008). Previous conferences have taken place in Edinburgh, Trondheim, Vancouver, Munich, Grenoble, Tokyo, Gramado, Lisbon, Montpellier, Darmstadt, Perth, Nice and Atlanta. VLSI-SoC 2008 was the 16th in a series of international conferences sponsored by IFIP TC 10 Working Group 10.5 and IEEE CEDA that explore the state of the art and new developments in the field of VLSI systems and their designs. The purpose of the conference was to provide a forum to exchange ideas and to present industrial and research results in the fields of VLSI/ULSI systems, embedded systems, and microelectronic design and test.
Queueing network models have been widely applied as a powerful tool for modelling, performance evaluation, and prediction of discrete flow systems, such as computer systems, communication networks, production lines, and manufacturing systems. Queueing network models with finite capacity queues and blocking have been introduced and applied as even more realistic models of systems with finite capacity resources and with population constraints. In recent years, research in this field has grown rapidly. Analysis of Queueing Networks with Blocking introduces queueing network models with finite capacity and various types of blocking mechanisms. It gives a comprehensive definition of the analytical model underlying these blocking queueing networks. It surveys exact and approximate analytical solution methods and algorithms and their relevant properties. It also presents various application examples of queueing networks to model computer systems and communication networks. This book is organized in three parts. Part I introduces queueing networks with blocking and various application examples. Part II deals with exact and approximate analysis of queueing networks with blocking and the condition under which the various techniques can be applied. Part III presents a review of various properties of networks with blocking, describing several equivalence properties both between networks with and without blocking and between different blocking types. Approximate solution methods for the buffer allocation problem are presented.
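For a flavour of the quantities such models yield, consider the simplest finite-capacity queue, M/M/1/K, whose blocking probability has the closed form (1 - rho) * rho^K / (1 - rho^(K+1)) with rho = lambda/mu, for rho != 1. A minimal sketch of this single-queue special case (not drawn from the book, which treats full networks):

```java
// Blocking probability of an M/M/1/K queue: the steady-state probability that
// an arriving customer finds the buffer full and is lost.
public class MM1K {
    static double blockingProbability(double lambda, double mu, int k) {
        double rho = lambda / mu;
        if (Math.abs(rho - 1.0) < 1e-12) {
            return 1.0 / (k + 1);  // limiting case rho == 1
        }
        return (1 - rho) * Math.pow(rho, k) / (1 - Math.pow(rho, k + 1));
    }

    public static void main(String[] args) {
        // Arrival rate 0.8/s, service rate 1.0/s, capacity 5 (including the one in service).
        System.out.printf("P(block) = %.4f%n", blockingProbability(0.8, 1.0, 5));
    }
}
```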
This title provides a survey on approaches to information systems supporting sustainable development in the private or public sector. It also documents and encourages the first steps of environmental information processing towards this more comprehensive goal.
This book presents the refereed proceedings of the Twelfth International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing, held at Stanford University (California) in August 2016. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems arising, in particular, in finance, statistics, computer graphics and the solution of PDEs.
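As a hedged illustration of the distinction the conference name draws (a sketch of the general idea, not material from the proceedings), the following program compares a pseudorandom Monte Carlo estimate with a quasi-Monte Carlo estimate using a 2-D Halton sequence on a toy integral whose exact value is 1/4:

```java
import java.util.Random;

// Compare Monte Carlo and quasi-Monte Carlo estimates of the integral of
// f(x,y) = x*y over the unit square (exact value 1/4).
public class McVsQmc {
    // Radical-inverse function: the core of the Halton low-discrepancy sequence.
    static double radicalInverse(int base, long index) {
        double fraction = 1.0, result = 0.0;
        while (index > 0) {
            fraction /= base;
            result += fraction * (index % base);
            index /= base;
        }
        return result;
    }

    public static void main(String[] args) {
        int n = 100_000;
        Random rng = new Random(42);
        double mc = 0.0, qmc = 0.0;
        for (long i = 1; i <= n; i++) {
            mc  += rng.nextDouble() * rng.nextDouble();
            qmc += radicalInverse(2, i) * radicalInverse(3, i); // Halton bases 2 and 3
        }
        System.out.printf("MC  error: %.2e%n", Math.abs(mc / n - 0.25));
        System.out.printf("QMC error: %.2e%n", Math.abs(qmc / n - 0.25));
    }
}
```

With these settings the quasi-Monte Carlo error is typically orders of magnitude smaller, which is the practical motivation for the methods the proceedings study.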
According to the Semiconductor Industry Association's 1999 International Technology Roadmap for Semiconductors, by the year 2008 the integration of more than 500 million transistors will be possible on a single chip. Integrating transistors on silicon will depend increasingly on design reuse. Design reuse techniques have become the subject of books, conferences, and podium discussions over the last few years. However, most discussions focus on higher-level abstractions such as RTL descriptions, which can be synthesized. Design reuse is often seen as an add-on to normal design activity, or a special design task that is not an integrated part of the existing design flow. This may all be true for the ASIC world, but not for high-speed, high-performance microprocessors. In the field of high-speed microprocessors, design reuse is an integrated part of the design flow. The method of choice in this demanding field was, and remains, physical design reuse at the layout level. In the past, the practical implementations of this method were linear shrinks and the lambda approach. With the scaling of process technology down to 0.18 micron and below, this approach lost steam and became inefficient. Automatic Layout Modification, Including Design Reuse of the Alpha CPU in 0.13 Micron SOI Technology is a welcome effort toward improving some of the practices in chip design today. It is a comprehensive reference work on automatic layout modification which will be valuable for VLSI courses at universities, and for CAD and circuit engineers and engineering managers.