Welcome to Loot.co.za!
We are pleased to present this comprehensive book, a collection of research papers resulting from the Second IFIP TC 13.6 Working Group conference on Human Work Interaction Design, HWID2009. The conference was held in Pune, India, during October 7-8, 2009. It was hosted by the Centre for Development of Advanced Computing, India, and jointly organized with Copenhagen Business School, Denmark; Aarhus University, Denmark; and the Indian Institute of Technology, Guwahati, India. The theme of HWID2009 was Usability in Social, Cultural and Organizational Contexts. The conference was held under the auspices of IFIP TC 13 on Human-Computer Interaction. The committees under IFIP include the Technical Committee TC13 on Human-Computer Interaction, within which the work of this volume has been conducted. TC13 aims to encourage theoretical and empirical human science research to promote the design and evaluation of human-oriented ICT. Within TC13 there are working groups concerned with different aspects of human-computer interaction. The flagship event of TC13 is the biennial international conference INTERACT, at which both invited and contributed papers are presented. Contributed papers are rigorously refereed and the rejection rate is high.
This is the first book to treat two areas of speech synthesis: natural language processing and the inherent problems it presents for speech synthesis; and digital signal processing, with an emphasis on the concatenative approach. The text guides the reader through the material in a step-by-step, easy-to-follow way. The book will be of interest to researchers and students in phonetics and speech communication, in both academia and industry.
Peter A. Corning, Palo Alto, CA, November 2000. This volume represents a distillation of the plenary sessions at a unique millennium-year event: a World Congress of the Systems Sciences in conjunction with the 44th annual meeting of the International Society for the Systems Sciences (ISSS). The overall theme of the conference was "Understanding Complexity in the New Millennium." Held at Ryerson Polytechnic University in Toronto, Canada, from July 16-22, 2000, the conference included some 350 participants from over 30 countries, many of whom were representatives of the 21 organizations and groups that co-hosted this landmark event. Each of these co-host organizations/groups also presented a segment of the program, including a plenary speech. In addition, the conference featured a number of distinguished keynote speeches related to the three daily World Congress themes: (1) The Evolution of Complex Systems, (2) The Dynamics of Complex Systems, and (3) Human Systems in the 21st Century. There were also seven special plenary-level symposia on a range of timely topics, including: "The Art and Science of Forecasting in the Age of Global Warming"; "Capitalism in the New Millennium: The Challenge of Sustainability"; "The Future of the Systems Sciences"; "Global Issues in the New Millennium"; "Resources and the Environment in the New Millennium"; "The Lessons of Y2K"; and "Can There Be a Reconciliation Between Science and Religion?" Included in this special commemorative volume is a cross-section of these presentations.
Algorithms for VLSI Physical Design Automation, Third Edition covers all aspects of physical design. The book is a core reference for graduate students and CAD professionals. For students, concepts and algorithms are presented in an intuitive manner. For CAD professionals, the material presents a balance of theory and practice. An extensive bibliography is provided which is useful for finding advanced material on a topic. At the end of each chapter, exercises are provided, ranging in complexity from simple to research level. Algorithms for VLSI Physical Design Automation, Third Edition provides a comprehensive background in the principles and algorithms of VLSI physical design. The goal of this book is to serve as a basis for the development of introductory-level graduate courses in VLSI physical design automation. It provides self-contained material for teaching and learning algorithms of physical design. All algorithms which are considered basic have been included, and are presented in an intuitive manner. Yet, at the same time, enough detail is provided so that readers can actually implement the algorithms given in the text and use them. The first three chapters provide the background material, while each remaining chapter focuses on one phase of the physical design cycle. In addition, newer topics such as physical design automation of FPGAs and MCMs have been included. The basic purpose of the third edition is to investigate the new challenges presented by interconnect and process innovations. In 1995, when the second edition of this book was prepared, a six-layer process and 15-million-transistor microprocessors were in advanced stages of design; by 1998, six-layer metal processes and 20-million-transistor designs were in production. Two new chapters have been added and new material has been included in almost all other chapters, including a new chapter on process innovation and its impact on physical design.
Another focus of the third edition is to promote use of the Internet as a resource, so wherever possible URLs have been provided for further investigation. Algorithms for VLSI Physical Design Automation, Third Edition is an important core reference work for professionals as well as an advanced level textbook for students.
Within the last 10-13 years, Binary Decision Diagrams (BDDs) have become the state-of-the-art data structure in VLSI CAD for the representation and manipulation of Boolean functions. Today, BDDs are widely used and have meanwhile also been integrated into commercial tools, especially in the areas of verification and synthesis. The interest in BDDs results from the fact that the data structure is generally accepted as providing a good compromise between conciseness of representation and efficiency of manipulation. With a growing number of applications, including in non-CAD areas, classical methods of handling BDDs are being improved, and new questions and problems continue to arise and demand solutions. Binary Decision Diagrams: Theory and Implementation is intended both for newcomers to BDDs and for researchers and practitioners who need to implement them. Apart from giving a quick start for the reader who is not familiar with BDDs (or DDs in general), it also discusses several new aspects of BDDs, e.g. with respect to minimization and the implementation of a package. It is an essential bookshelf item for any CAD designer or researcher working with BDDs.
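The compromise between conciseness and efficiency mentioned above rests on two reduction rules applied during Shannon expansion. A minimal reduced ordered BDD (ROBDD) sketch is given below; the class layout, node ids, and the parity example are illustrative, not the package design the book describes:

```python
class BDD:
    """Reduced ordered BDDs over variables 0..num_vars-1.
    Terminal nodes are the ids 0 and 1; internal nodes are (var, low, high)."""

    def __init__(self, num_vars):
        self.num_vars = num_vars
        self.unique = {}   # unique table: (var, low, high) -> node id
        self.nodes = {}    # node id -> (var, low, high)
        self.next_id = 2   # ids 0 and 1 are reserved for the terminals

    def mk(self, var, low, high):
        # Reduction rule 1: eliminate redundant tests.
        if low == high:
            return low
        # Reduction rule 2: share isomorphic subgraphs via the unique table.
        key = (var, low, high)
        if key not in self.unique:
            self.unique[key] = self.next_id
            self.nodes[self.next_id] = key
            self.next_id += 1
        return self.unique[key]

    def build(self, f, assignment=()):
        # Shannon expansion f = x'.f|x=0 + x.f|x=1 down the variable order.
        # `f` maps a full tuple of num_vars bits to 0 or 1.
        var = len(assignment)
        if var == self.num_vars:
            return 1 if f(assignment) else 0
        low = self.build(f, assignment + (0,))
        high = self.build(f, assignment + (1,))
        return self.mk(var, low, high)

    def evaluate(self, node, assignment):
        # Follow one root-to-terminal path: at most num_vars steps.
        while node > 1:
            var, low, high = self.nodes[node]
            node = high if assignment[var] else low
        return node

# Odd parity of three inputs: its ROBDD shares subgraphs heavily,
# needing only 2n-1 internal nodes instead of a 2^n-entry truth table.
bdd = BDD(3)
parity = bdd.build(lambda bits: sum(bits) % 2)
```

The unique table is what makes manipulation efficient in practice: structurally equal subfunctions get the same node id, so equivalence checks reduce to id comparison.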
The latest edition of a classic text on concurrency and distributed programming - from a winner of the ACM/SIGCSE Award for Outstanding Contribution to Computer Science Education.
This consistently written book provides a comprehensive presentation of a multitude of results stemming from the author's as well as various researchers' work in the field. It also covers functional decomposition for incompletely specified functions, decomposition for multi-output functions and non-disjoint decomposition.
Neurobiology research suggests that information can be represented by the location of an activity spot in a population of cells ('place coding'), and that this information can be processed by means of networks of interconnections. Place Coding in Analog VLSI defines a representation convention of similar flavor intended for analog integrated circuit design. It investigates its properties and suggests ways to build circuits on the basis of this coding scheme. In this electronic version of place coding, numbers are represented by the state of an array of nodes called a map, and computation is carried out by a network of links. In the simplest case, a link is just a wire connecting a node of an input map to a node of an output map. In other cases, a link is an elementary circuit cell. Networks of links are somewhat reminiscent of look-up tables in that they hardwire an arbitrary function of one or several variables. Interestingly, these structures are also related to fuzzy rules, as well as to some types of artificial neural networks. The place coding approach provides several substantial benefits over conventional analog design: networks of links can be synthesized by a simple procedure whatever the function to be computed; place coding is tolerant to perturbations and noise in current-mode implementations; and tolerance to noise implies that the fundamental power dissipation limits of conventional analog circuits can be overcome. The approach is illustrated by three integrated circuits computing non-linear functions of several variables. The simplest one is made up of 80 links and achieves submicrowatt power consumption in continuous operation. The most complex one incorporates about 1800 links for a power consumption of 6 milliwatts, and controls the operation of an active vision system with a moving field of view.
Place Coding in Analog VLSI is primarily intended for researchers and practicing engineers involved in analog and digital hardware design (especially bio-inspired circuits). The book is also a valuable reference for researchers and students in neurobiology, neuroscience, robotics, fuzzy logic and fuzzy control.
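As a rough software analogue of the scheme described above (not the book's circuits), a value can be place-coded as the position of the active node in a map, and a function hardwired as a table of links from input nodes to output nodes. All map sizes, value ranges, and function names below are illustrative:

```python
def encode(value, lo, hi, size):
    """Place-code `value` as a one-hot activity pattern over `size` nodes."""
    idx = round((value - lo) / (hi - lo) * (size - 1))
    pattern = [0.0] * size
    pattern[idx] = 1.0
    return pattern

def decode(pattern, lo, hi):
    """Read the value back from the location of the activity spot."""
    idx = max(range(len(pattern)), key=lambda i: pattern[i])
    return lo + idx / (len(pattern) - 1) * (hi - lo)

def make_links(func, in_lo, in_hi, out_lo, out_hi, in_size, out_size):
    """Hardwire an arbitrary function as a table of links (input node ->
    output node) -- the software analogue of wiring a look-up table."""
    links = [[0.0] * in_size for _ in range(out_size)]
    for i in range(in_size):
        x = in_lo + i / (in_size - 1) * (in_hi - in_lo)
        j = round((func(x) - out_lo) / (out_hi - out_lo) * (out_size - 1))
        links[j][i] = 1.0
    return links

def apply_links(links, pattern):
    # Each output node sums the activity routed to it by its incoming links.
    return [sum(w * a for w, a in zip(row, pattern)) for row in links]

# Compute y = x**2 on [0, 1] purely by routing activity through links.
links = make_links(lambda x: x * x, 0.0, 1.0, 0.0, 1.0, 101, 101)
y = decode(apply_links(links, encode(0.5, 0.0, 1.0, 101)), 0.0, 1.0)
```

Note how the synthesis procedure is the same whatever the function: only the wiring table changes, which mirrors the book's claim that link networks can be synthesized uniformly.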
Embedded computer systems use both off-the-shelf microprocessors and application-specific integrated circuits (ASICs) to implement specialized system functions. Examples include the electronic systems inside laser printers, cellular phones, microwave ovens, and an automobile anti-lock brake controller. Embedded computing is unique because it is a co-design problem - the hardware engine and application software architecture must be designed simultaneously. Hardware-Software Co-Synthesis of Distributed Embedded Systems proposes new techniques such as fixed-point iterations, phase adjustment, and separation analysis to efficiently estimate tight bounds on the delay required for a set of multi-rate processes preemptively scheduled on a real-time reactive distributed system. Based on the delay bounds, a gradient-search co-synthesis algorithm with new techniques such as sensitivity analysis, priority prediction, and idle- processing elements elimination are developed to select the number and types of processing elements in a distributed engine, and determine the allocation and scheduling of processes to processing elements. New communication modeling is also presented to analyze communication delay under interaction of computation and communication, allocate interprocessor communication links, and schedule communication. Hardware-Software Co-Synthesis of Distributed Embedded Systems is the first book to describe techniques for the design of distributed embedded systems, which have arbitrary hardware and software topologies. The book will be of interest to: academic researchers for personal libraries and advanced-topics courses in co-design as well as industrial designers who are building high-performance, real-time embedded systems with multiple processors.
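The fixed-point flavour of such delay-bound estimation can be illustrated with the classic worst-case response-time iteration for fixed-priority preemptive scheduling on a single processor. This is a standard textbook recurrence used here as a simplified stand-in, not the book's multi-rate distributed analysis, and the task set is invented:

```python
import math

def response_time(tasks, i):
    """Worst-case response time of task i under fixed-priority preemptive
    scheduling, via the classic fixed-point iteration
        R = C_i + sum_{j in hp(i)} ceil(R / T_j) * C_j.
    `tasks` is a list of (C, T) pairs sorted highest-priority first;
    deadlines are assumed equal to periods."""
    c_i, t_i = tasks[i]
    r = c_i
    while True:
        # Interference from every higher-priority task released during R.
        interference = sum(math.ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
        r_next = c_i + interference
        if r_next == r:   # fixed point reached: R is a tight bound
            return r
        if r_next > t_i:  # bound exceeds the period: deadline miss
            return None
        r = r_next

# Three periodic tasks in rate-monotonic priority order, as (C, T) pairs.
tasks = [(1, 4), (2, 6), (3, 12)]
bounds = [response_time(tasks, i) for i in range(len(tasks))]
```

The iteration converges because the right-hand side is monotone in R and bounded by the period; the distributed, multi-processor version the book develops must additionally account for communication delay and allocation.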
This volume includes chapters presenting applications of different metaheuristics in reliability engineering, including ant colony optimization, great deluge algorithm, cross-entropy method and particle swarm optimization. It also presents chapters devoted to cellular automata and support vector machines, and applications of artificial neural networks, a powerful adaptive technique that can be used for learning, prediction and optimization. Several chapters describe aspects of imprecise reliability and applications of fuzzy and vague set theory.
This volume is a post-conference publication of the 4th World Congress on Social Simulation (WCSS), with contents selected from among the 80 papers originally presented at the conference. WCSS is a biennial event, jointly organized by three scientific communities in computational social science, namely, the Pacific-Asian Association for Agent-Based Approach in Social Systems Sciences (PAAA), the European Social Simulation Association (ESSA), and the Computational Social Science Society of the Americas (CSSSA). It is, therefore, currently the most prominent conference in the area of agent-based social simulation. The papers selected for this volume give a holistic view of the current development of social simulation, indicating the directions for future research and creating an important archival document and milestone in the history of computational social science. Specifically, the papers included here cover substantial progress in artificial financial markets, macroeconomic forecasting, supply chain management, bank networks, social networks, urban planning, social norms and group formation, cross-cultural studies, political party competition, voting behavior, computational demography, computational anthropology, evolution of languages, public health and epidemics, AIDS, security and terrorism, methodological and epistemological issues, empirical-based agent-based modeling, modeling of experimental social science, gaming simulation, cognitive agents, and participatory simulation. Furthermore, pioneering studies in some new research areas, such as the theoretical foundations of social simulation and categorical social science, also are included in the volume.
The second volume of this work contains Parts 2 and 3 of the "Handbook of Coding Theory". Part 2, "Connections", is devoted to connections between coding theory and other branches of mathematics and computer science. Part 3, "Applications", deals with a variety of applications for coding.
Public and situated display technologies can have an important impact on individual and social behaviour and present us with particularly interesting new design considerations and challenges. While there is a growing body of research exploring these design considerations and their social impact, this work remains somewhat disparate, making it difficult to assimilate in a coherent manner. This book brings together the perspectives of key researchers in the area of public and situated display technology. The chapters detail research representing the social, technical and interactional aspects of public and situated display technologies. The underlying concern common to these chapters is how these displays can best be designed for collaboration, coordination, community building and mobility. Presenting them together allows the reader to examine everyday display activities within the context of emerging technological possibilities.
Mathematical Visualization is a young discipline. It offers efficient visualization tools for the classical subjects of mathematics, and applies mathematical techniques to problems in computer graphics and scientific visualization. It originated in the interdisciplinary area of differential geometry, numerical mathematics, and computer graphics. In recent years, the methods developed have found important applications.
Knowledge discovery is today a significant area of study and research. In finding answers to many research questions in this area, the ultimate hope is that knowledge can be extracted from the various forms of data around us. This book covers recent advances in unsupervised and supervised data analysis methods in Computational Intelligence for knowledge discovery. In its first part, the book provides a collection of recent research on distributed clustering, self-organizing maps and their recent extensions. If labeled data or data with known associations are available, we may be able to use supervised data analysis methods, such as classifying neural networks, fuzzy rule-based classifiers, and decision trees. The book therefore also presents a collection of important methods of supervised data analysis. "Classification and Clustering for Knowledge Discovery" also includes a variety of applications of knowledge discovery in health, safety, commerce, mechatronics, sensor networks, and telecommunications.
Probabilistic and Statistical Methods in Computer Science
Protein informatics is a newer name for an already existing discipline. It encompasses the techniques used in bioinformatics and molecular modeling that are related to proteins. While bioinformatics is mainly concerned with the collection, organization, and analysis of biological data, molecular modeling is devoted to the representation and manipulation of the structure of proteins. Protein informatics requires substantial prerequisites in computer science, mathematics, and molecular biology. The approach chosen here allows a direct and rapid grasp of the subject starting from basic knowledge of algorithm design, calculus, linear algebra, and probability theory. An Introduction to Protein Informatics is a professional monograph that provides the reader with a comprehensive introduction to the field. The text emphasizes mathematical and computational methods to tackle the central problems of alignment, phylogenetic reconstruction, and prediction and sampling of protein structure. It is designed for a professional audience of researchers and practitioners within bioinformatics, molecular modeling, algorithm design, optimization, and pattern recognition, and is also suitable as a graduate-level text for students in computer science, mathematics, and biomedicine.
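One of the central problems named above, alignment, is classically solved by dynamic programming. A minimal global-alignment (Needleman-Wunsch) scorer is sketched below; the scoring parameters are illustrative, whereas real protein work would use a substitution matrix such as BLOSUM and affine gap penalties:

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    """Global sequence alignment score by dynamic programming.
    score[i][j] holds the best score for aligning a[:i] with b[:j]."""
    n, m = len(a), len(b)
    score = [[0] * (m + 1) for _ in range(n + 1)]
    # Aligning a prefix against nothing costs one gap per symbol.
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Best of: substitute/match, gap in b, gap in a.
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[n][m]

alignment_score = needleman_wunsch("HEAGAWGHEE", "PAWHEAE")
```

Tracing back through the table (omitted here) recovers the alignment itself; the O(nm) time and space of this recurrence is the baseline that heuristic tools in the field try to beat.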
This volume brings together recent theoretical work in Learning Classifier Systems (LCS), which is a Machine Learning technique combining Genetic Algorithms and Reinforcement Learning. It includes self-contained background chapters on related fields (reinforcement learning and evolutionary computation) tailored for a classifier systems audience and written by acknowledged authorities in their area - as well as a relevant historical original work by John Holland.
This book provides a framework for the design of competent optimization techniques by combining advanced evolutionary algorithms with state-of-the-art machine learning techniques. The primary focus of the book is on two algorithms that replace traditional variation operators of evolutionary algorithms, by learning and sampling Bayesian networks: the Bayesian optimization algorithm (BOA) and the hierarchical BOA (hBOA). They provide a scalable solution to a broad class of problems. The book provides an overview of evolutionary algorithms that use probabilistic models to guide their search, motivates and describes BOA and hBOA in a way accessible to a wide audience, and presents numerous results confirming that they are revolutionary approaches to black-box optimization.
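The model-building-and-sampling loop that BOA substitutes for crossover and mutation can be shown with its simplest relative, the univariate marginal distribution algorithm (UMDA). BOA itself fits a Bayesian network rather than independent bitwise frequencies; every name and parameter below is illustrative:

```python
import random

def umda_onemax(n_bits=20, pop_size=100, n_select=50, generations=40, seed=1):
    """Univariate EDA on the OneMax problem. Each generation:
    (1) select the best individuals,
    (2) fit a probabilistic model (independent per-bit frequencies here;
        BOA would learn a Bayesian network capturing dependencies),
    (3) sample the model to produce the next population.
    Steps (2)-(3) replace the traditional variation operators."""
    rng = random.Random(seed)
    fitness = sum  # OneMax: count of ones
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        selected = sorted(pop, key=fitness, reverse=True)[:n_select]
        # Model building: marginal probability of a 1 at each position,
        # clamped away from 0/1 so sampling never fixates completely.
        p = [min(0.95, max(0.05, sum(ind[i] for ind in selected) / n_select))
             for i in range(n_bits)]
        # Model sampling in place of crossover and mutation.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
    return max(pop, key=fitness)

best = umda_onemax()
```

OneMax has no interactions between bits, so the univariate model suffices; the book's central point is that deceptive and hierarchical problems break this independence assumption, which is exactly what BOA's and hBOA's learned network structures repair.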
Rem tene, verba sequentur (Gaius J. Victor, Rome, VI century BC). The ultimate goal of this book is to bring the fundamental issues of information granularity, inference tools and problem-solving procedures into a coherent, unified, and fully operational framework. The objective is to offer the reader a comprehensive, self-contained, and uniform exposure to the subject. The strategy is to isolate some fundamental bricks of Computational Intelligence in terms of key problems and methods, and to discuss their implementation and underlying rationale within a well-structured and rigorous conceptual framework, carefully related to various application facets. The main assumption is that a deep understanding of the key problems will allow the reader to compose into a meaningful mosaic the puzzle pieces represented by the immense variety of approaches present in the literature and in computational practice. All in all, the approach advocated in the monograph consists of a sequence of steps offering solid conceptual fundamentals, presenting a carefully selected collection of design methodologies, discussing a wealth of development guidelines, and exemplifying them with pertinent, accurately selected illustrative material.
The evolution of modern computers began more than 50 years ago and has been driven to a large extent by rapid advances in electronic technology during that period. The first computers ran one application (user) at a time. Without the benefit of operating systems or compilers, the application programmers were responsible for managing all aspects of the hardware. The introduction of compilers allowed programmers to express algorithms in abstract terms without being concerned with the bit-level details of their implementation. Time-sharing operating systems took computing one step further and allowed several users and/or applications to share the computing services of computers. With the advances of networks and software tools, users and applications were able to share logical and physical services geographically dispersed across one or more networks. The Virtual Computing (VC) concept aims at providing ubiquitous open computing services in a way analogous to the services offered by telephone and electrical (utility) companies. The VC environment should be dynamically set up to meet the requirements of a single user and/or application. The design and development of dynamically programmable virtual computing environments is a challenging research problem. However, recent advances in processing and network technology and software tools have successfully removed many of the obstacles facing the wide deployment of virtual computing environments, as will be outlined next.
Organizational Semiotics occupies an important niche in the research community of human communication and information systems. It opens up new ways of understanding the functioning of information and information resources in organised behaviour. In recent years, a number of workshops and conferences have provided researchers and practitioners opportunities to discuss their theories, methods and practices and to assess the benefits and potential of this approach. Literature in this field is much in demand but still difficult to find, so we are pleased to offer a third volume in the miniseries of Studies in Organizational Semiotics. This book is based on the papers and discussions of the fifth workshop on Organizational Semiotics, held in Delft, June 13-15, 2002, hosted by Groningen University and Delft Technical University in the Netherlands. The topic of this workshop was dynamics and change in organizations. The chapters in this book reflect recent developments in theory and applications and demonstrate the significance of Organizational Semiotics to information systems, human communication and coordination, and organizational analysis and modelling. In particular, it provides a framework that accommodates both the technical and social aspects of information systems. The miniseries presents the frontier of research in this area and shows how the theory and techniques enhance the quality of work on information systems.
Games and simulations are not only a rapidly growing source of entertainment in today's world; they are also quite beneficial. They enable players to develop quick-reaction and motor skills, engage cognitive processes, and interact with peers around the globe, thereby enhancing social skills. However, as a result of the rise of games and simulations, educators are struggling to engage their students through more traditional ways of learning. Educational Gameplay and Simulation Environments: Case Studies and Lessons Learned presents a remarkable collection of cases demonstrating how to conceptualize, design, and implement games and simulations effectively for learning. This paramount publication will aid educators, researchers, and game developers in broadening their work to effectively create and implement engaging learning environments for present and future students.
The application of Computational Intelligence in emerging research areas such as Granular Computing, Mechatronics, and Bioinformatics shows its usefulness, as often emphasized by Prof. Lotfi Zadeh, the inventor of fuzzy logic, and many others. This book contains recent advances in Computational Intelligence methods for modeling, optimization and prediction, and covers a large number of applications. It presents new Computational Intelligence theory and methods for modeling and prediction. The range of applications is captured with 5 chapters on image processing, 2 chapters on audio processing, 3 chapters on commerce and finance, 2 chapters on communication networks, and 6 chapters containing other applications.