Welcome to Loot.co.za!
This proceedings volume documents recent cutting-edge developments in multi-robot systems research and is the result of the Second International Workshop on Multi-Robot Systems, held in March 2003 at the Naval Research Laboratory in Washington, D.C. The workshop brought together top researchers working in areas relevant to designing teams of autonomous vehicles, including robots and unmanned ground, air, surface, and undersea vehicles. It focused on the challenging issues of team architectures, vehicle learning and adaptation, heterogeneous group control and cooperation, task selection, dynamic autonomy, mixed initiative, and human-robot team interaction. A broad range of applications of this technology is presented, including UCAVs (Unmanned Combat Air Vehicles), micro-air vehicles, UUVs (Unmanned Underwater Vehicles), UGVs (Unmanned Ground Vehicles), planetary exploration, assembly in space, clean-up, and urban search and rescue. The volume collects the contributions of leading researchers and serves as a valuable reference for professionals in this interdisciplinary field.
Auctions have long been a popular method for allocation and procurement of products and services. Traditional auctions are constrained by time, place, number of bidders, number of bids, and the bidding experience. With the advent of internet communication technologies, the online auction environment has blossomed to support a bustling enterprise. Up until this time, the functional inner workings of these online exchange mechanisms have only been described using anecdotal accounts. Best Practices for Online Procurement Auctions offers a systematic approach to auction examination that will become invaluable to both practitioners and researchers alike.
This book covers the wide-ranging scientific areas of computational science, from basic research fields such as algorithms and soft-computing to diverse applied fields targeting macro, micro, nano, genome and complex systems. It presents the proceedings of the International Symposium on Frontiers of Computational Science 2005, held in Nagoya in December 2005.
Facing the challenge of a fast-changing technological environment, many companies are developing an interest in the field of technology intelligence. Their aim is to support decision-making through the well-timed provision of relevant information by means of systematic identification, collection, analysis, dissemination, and application of that information. This book fills the gap in the literature by showing how a technology intelligence system can be designed and implemented.
With the development of networked computing and the increased complexity of applications and software systems development, the importance of computer-supported collaborative work (CSCW) has dramatically increased. Globalization has further accentuated the necessity of collaboration, while the Web has made geographically distributed collaborative systems technologically feasible in a manner that was impossible until recently. The software environments needed to support such distributed teams are referred to as groupware. Groupware is intended to address the logistical, managerial, social, organizational, and cognitive difficulties that arise in the application of distributed expertise; these issues represent the fundamental challenges to the next generation of process management. Computer-Supported Collaboration with Applications to Software Development reviews the theory of collaborative groups and the factors that affect collaboration, particularly collaborative software development. The influences considered derive from diverse sources: social and cognitive psychology, media characteristics, the problem-solving behavior of groups, process management, group information processing, and organizational effects. It also surveys empirical studies of computer-supported problem solving, especially for software development. The concluding chapter describes a collaborative model for program development. The book is designed for professionals and researchers in software engineering, collaborative development, management information systems, problem solving, and cognitive and social psychology, and also meets the needs of graduate-level students in computer science and information systems.
Do Smart Adaptive Systems Exist? is intended as a reference and a guide summarising and focusing on best practices when using intelligent techniques and building systems requiring a degree of adaptation and intelligence. It is therefore not intended as a collection of the most recent research results, but as a practical guide for experts from other areas and industrial users interested in building solutions to their problems using intelligent techniques. One of the main issues covered is an attempt to answer the question of how to select and/or combine suitable intelligent techniques from a large pool of potential solutions. Another attractive feature of the book is that it brings together experts from the neural network, fuzzy, machine learning, evolutionary, and hybrid systems communities, who provide their views on how these different intelligent technologies have contributed and will contribute to the creation of smart adaptive systems of the future.
As more and more hardware platforms support parallelism, parallel programming is gaining momentum. Applications can only leverage the performance of multi-core processors or graphics processing units if they are able to split a problem into smaller ones that can be solved in parallel. The challenges emerging from the development of parallel applications have led to the development of a great number of tools for debugging, performance analysis and other tasks. The proceedings of the 3rd International Workshop on Parallel Tools for High Performance Computing provide a technical overview in order to help engineers, developers and computer scientists decide which tools are best suited to enhancing their current development processes.
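The decomposition the blurb describes can be sketched in a few lines of Python. This is not drawn from the workshop proceedings; the function names are illustrative. The pattern is simply: split the input into chunks, solve each sub-problem independently, and combine the partial results.

```python
# A minimal sketch of the split/solve/combine pattern behind parallel
# programming: divide the input into chunks, solve each sub-problem
# independently, then merge the partial results.
from concurrent.futures import ThreadPoolExecutor

def partial_sum_of_squares(chunk):
    # Worker: solve one sub-problem on its own.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the problem into roughly equal chunks, one per worker.
    step = max(1, len(data) // workers)
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    # Solve the sub-problems concurrently and combine the results.
    # For CPU-bound work in CPython, ProcessPoolExecutor would be the
    # usual choice; a thread pool keeps the sketch self-contained.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum_of_squares, chunks))
```

The same three-step structure (partition, independent solve, combine) underlies most of the tools the workshop surveys, whatever the parallel substrate.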
We are extremely pleased to present a comprehensive book comprising a collection of research papers that are the outcome of the Second IFIP TC 13.6 Working Group conference on Human Work Interaction Design, HWID2009. The conference was held in Pune, India, during October 7-8, 2009. It was hosted by the Centre for Development of Advanced Computing, India, and jointly organized with Copenhagen Business School, Denmark; Aarhus University, Denmark; and Indian Institute of Technology, Guwahati, India. The theme of HWID2009 was Usability in Social, Cultural and Organizational Contexts. The conference was held under the auspices of IFIP TC 13 on Human-Computer Interaction. The committees under IFIP include the Technical Committee TC13 on Human-Computer Interaction, within which the work of this volume has been conducted. TC13 aims to encourage theoretical and empirical human science research to promote the design and evaluation of human-oriented ICT. Within TC13 there are different working groups concerned with different aspects of human-computer interaction. The flagship event of TC13 is the biennial international conference called INTERACT, at which both invited and contributed papers are presented. Contributed papers are rigorously refereed and the rejection rate is high.
Information engineering and applications is the field of study concerned with information computing, intelligent systems, mathematical models, numerical solution techniques, and the use of computers and other electronic devices to analyze and solve problems in the natural sciences, social sciences, and engineering. Information engineering is an important underpinning for techniques used in information and computational science, and many unresolved problems in the field are worth studying. The Proceedings of the 2nd International Conference on Information Engineering and Applications (IEA 2012), held in Chongqing, China, from October 26-28, 2012, discusses the most innovative research and developments, including technical challenges and social, legal, political, and economic issues. A forum for engineers and scientists in academia, industry, and government, the proceedings present ideas, results, works in progress, and experience in all aspects of information engineering and applications.
Technology in today's world has continued to develop into multifaceted structures. The performance of computers, in particular, has significantly increased, leading to varied and complex problems regarding the dependability of these systems. Recent solutions for these issues have been based on soft computing methods; however, there is little research on the application of these techniques to system dependability. Soft Computing Methods for System Dependability is a collection of innovative research on the application of these processing techniques to solving problems in the dependability of computer system performance. The book features comparative experiences shared by researchers regarding the development of these technological solutions. Highlighting topics including evolutionary computing, chaos theory, and artificial neural networks, it is ideally designed for researchers, data scientists, computing engineers, industrialists, students, and academicians in the field of computer science.
This volume is a post-conference publication of the 4th World Congress on Social Simulation (WCSS), with contents selected from among the 80 papers originally presented at the conference. WCSS is a biennial event, jointly organized by three scientific communities in computational social science, namely, the Pacific-Asian Association for Agent-Based Approach in Social Systems Sciences (PAAA), the European Social Simulation Association (ESSA), and the Computational Social Science Society of the Americas (CSSSA). It is, therefore, currently the most prominent conference in the area of agent-based social simulation. The papers selected for this volume give a holistic view of the current development of social simulation, indicating the directions for future research and creating an important archival document and milestone in the history of computational social science. Specifically, the papers included here cover substantial progress in artificial financial markets, macroeconomic forecasting, supply chain management, bank networks, social networks, urban planning, social norms and group formation, cross-cultural studies, political party competition, voting behavior, computational demography, computational anthropology, evolution of languages, public health and epidemics, AIDS, security and terrorism, methodological and epistemological issues, empirical-based agent-based modeling, modeling of experimental social science, gaming simulation, cognitive agents, and participatory simulation. Furthermore, pioneering studies in some new research areas, such as the theoretical foundations of social simulation and categorical social science, also are included in the volume.
Computer-based information technologies have been extensively used to help industries manage their processes, and information systems have thereby become their nervous center. More specifically, databases are designed to support the data storage, processing, and retrieval activities related to data management in information systems. Database management systems provide efficient task support, and database systems are the key to implementing industrial data management. Industrial data management requires database technique support. Industrial applications, however, are typically data- and knowledge-intensive and have some unique characteristics that make their management difficult. Besides, some new techniques such as the Web and artificial intelligence have been introduced into industrial applications. These unique characteristics and the usage of new technologies have placed many potential requirements on industrial data management, which challenge today's database systems and promote their evolution. Viewed from database technology, information modeling in databases can be identified at two levels: (conceptual) data modeling and (logical) database modeling. This results in a conceptual (semantic) data model and a logical database model. Generally, a conceptual data model is designed and then transformed into a chosen logical database schema. Database systems based on the logical database model are used to build information systems for data management. Much attention has been directed at conceptual data modeling of industrial information systems. Product data models, for example, can be viewed as a class of semantic data models.
The field of high performance computing achieved prominence through advances in electronic and integrated technologies beginning in the 1940s. Current times are very exciting, and the years to come will witness a proliferation in the use of parallel and distributed systems. The scientific and engineering application domains have a key role in shaping future research and development activities in academia and industry, especially when the solution of large and complex problems must cope with ever tighter timing constraints.
Peter A. Corning, Palo Alto, CA, November 2000. This volume represents a distillation of the plenary sessions at a unique millennium year event: a World Congress of the Systems Sciences in conjunction with the 44th annual meeting of the International Society for the Systems Sciences (ISSS). The overall theme of the conference was "Understanding Complexity in the New Millennium." Held at Ryerson Polytechnic University in Toronto, Canada, from July 16-22, 2000, the conference included some 350 participants from over 30 countries, many of whom were representatives of the 21 organizations and groups that co-hosted this landmark event. Each of these co-host organizations/groups also presented a segment of the program, including a plenary speech. In addition, the conference featured a number of distinguished "keynote" speeches related to the three daily World Congress themes: (1) The Evolution of Complex Systems, (2) The Dynamics of Complex Systems, and (3) Human Systems in the 21st Century. There were also seven special plenary-level symposia on a range of timely topics, including: "The Art and Science of Forecasting in the Age of Global Warming"; "Capitalism in the New Millennium: The Challenge of Sustainability"; "The Future of the Systems Sciences"; "Global Issues in the New Millennium"; "Resources and the Environment in the New Millennium"; "The Lessons of Y2K"; and "Can There Be a Reconciliation Between Science and Religion?" Included in this special commemorative volume is a cross-section of these presentations.
Over the last five to six years, ontology has received increased attention within the information systems field. Ontology provides a basis for evaluating, analyzing, and engineering business systems analysis methods, and this theoretical grounding has allowed many organizations utilizing ontologies to become more competitive in today's global environment. Business Systems Analysis with Ontologies examines the area of ontologies thoroughly, covering the analysis, evaluation, and engineering of business systems analysis methods. Readers are shown the world of ontologies through a number of research methods: survey methodologies, case studies, experimental methodologies, analytical modeling, and field studies are all used within this book to help the reader understand the usefulness of ontologies.
This book represents the compilation of papers presented at the IFIP Working Group 8.2 conference entitled "Information Technology in the Service Economy: Challenges and Possibilities for the 21st Century." The conference took place at Ryerson University, Toronto, Canada, on August 10-13, 2008. Participation in the conference spanned the continents from Asia to Europe, with paper submissions global in focus as well. Conference submissions included completed research papers and research-in-progress reports. Papers submitted to the conference went through a double-blind review process in which the program co-chairs, an associate editor, and reviewers provided assessments and recommendations. The editorial efforts of the associate editors and reviewers in this process were outstanding. To foster high-quality research publications in this field of study, authors of accepted papers were then invited to revise and resubmit their work. Through this rigorous review and revision process, 12 completed research papers and 11 research-in-progress reports were accepted for presentation and publication. Paper workshop sessions were also established to provide authors of emergent work an opportunity to receive feedback from the IFIP 8.2 community. Abstracts of these new projects are included in this volume. Four panels were presented at the conference to provide discussion forums for the varied aspects of IT, service, and globalization. Panel abstracts are also included here.
Over the past years, business schools have been experimenting with distance learning and online education. In many cases this new technology has not brought the anticipated results. Questions raised by online education can be linked to the fundamental problem of education and teaching, and more specifically to the models and philosophy of education and teaching. Virtual Corporate Universities: A Matrix of Knowledge and Learning for the New Digital Dawn offers a source for new thoughts about those processes in view of the use of new technologies. Learning is considered a key strategic tool for new strategies, innovation, and significantly improving organizational effectiveness. The book blends the elements of knowledge management with organizational and individual learning. It is not just a treatment of technology, but a fusion of a novel, learner-driven learning concept, the management and creation of dynamic knowledge, and next-generation technologies, applied to generic business, organizational, and managerial processes and to the development of human capital. Obviously, the implications of online learning go far beyond the field of business as presented in this book.
In the last few decades, multiscale algorithms have become a dominant trend in large-scale scientific computation. Researchers have successfully applied these methods to a wide range of simulation and optimization problems. This book gives a general overview of multiscale algorithms; applications to general combinatorial optimization problems such as graph partitioning and the traveling salesman problem; and VLSI CAD applications, including circuit partitioning, placement, and VLSI routing. Additional chapters discuss optimization in reconfigurable computing, convergence in multilevel optimization, and model problems with PDE constraints. Audience: Written at the graduate level, the book is intended for engineers and mathematical and computational scientists studying large-scale optimization in electronic design automation.
The building blocks of today's embedded systems-on-a-chip are complex IP components and programmable processor cores. This means that more and more system functionality is implemented in software rather than in custom hardware. In turn, this indicates a growing need for high-level language compilers capable of generating efficient code for embedded processors. However, traditional compiler technology hardly keeps pace with new developments in embedded processor architectures. Many existing compilers for DSPs and multimedia processors therefore produce code of insufficient quality with respect to performance and/or code size, and a large part of software for embedded systems is still being developed in assembly languages. As both embedded software and processor architectures become more and more complex, assembly programming clearly violates the demands for a short time-to-market and high dependability in embedded system design. The goal of this book is to provide new methods and techniques to software and compiler developers that help make the necessary step from assembly programming to the use of compilers in embedded system design. Code Optimization Techniques for Embedded Processors discusses the state of the art in the area of compilers for embedded processors. It presents a collection of new code optimization techniques dedicated to DSP and multimedia processors. These include: compiler support for DSP address generation units, efficient mapping of data flow graphs to irregular architectures, exploitation of SIMD and conditional instructions, and function inlining under code size constraints. Comprehensive experimental evaluations are given for real-life processors, indicating the code quality improvements that can be achieved compared to earlier techniques. In addition, C compiler frontend issues are discussed from a practical viewpoint.
Code Optimization Techniques for Embedded Processors is intended for researchers and engineers active in software development for embedded systems, and for compiler developers in academia and industry.
A presentation of the central and basic concepts, techniques, and tools of computer science, with the emphasis on a problem-solving approach and on surveying the most important topics covered in degree programmes. Scheme is used throughout as the programming language, and the author stresses a functional programming approach, creating simple functions to achieve the desired programming goal. Such simple functions are easily tested individually, which greatly helps in producing programs that work correctly first time. Throughout, the author provides aids to writing programs and makes liberal use of boxes with "Mistakes to Avoid." Programming examples include: * abstracting a problem; * creating pseudo code as an intermediate solution; * top-down and bottom-up design; * building procedural and data abstractions; * writing programs in modules which are easily testable. Numerous exercises help readers test their understanding of the material and develop ideas in greater depth, making this an ideal first course for students new to computer science.
This is the first book to treat two areas of speech synthesis: natural language processing and the inherent problems it presents for speech synthesis; and digital signal processing, with an emphasis on the concatenative approach. The text guides the reader through the material in a step-by-step easy-to-follow way. The book will be of interest to researchers and students in phonetics and speech communication, in both academia and industry.
With the ever-increasing speed of integrated circuits, violations of the performance specifications are becoming a major factor affecting the product quality level. The need for testing timing defects is further expected to grow with the current design trend of moving towards deep submicron devices. After a long period of prevailing belief that high stuck-at fault coverage is sufficient to guarantee high quality of shipped products, the industry is now forced to rethink other types of testing. Delay testing has been a topic of extensive research both in industry and in academia for more than a decade. As a result, several delay fault models and numerous testing methodologies have been proposed. Delay Fault Testing for VLSI Circuits presents a selection of existing delay testing research results. It combines introductory material with state-of-the-art techniques that address some of the current problems in delay testing. Delay Fault Testing for VLSI Circuits covers some basic topics such as fault modeling and test application schemes for detecting delay defects. It also presents summaries and conclusions of several recent case studies and experiments related to delay testing. A selection of delay testing issues and test techniques such as delay fault simulation, test generation, design for testability, and synthesis for testability are also covered. Delay Fault Testing for VLSI Circuits is intended for use by CAD and test engineers, researchers, tool developers, and graduate students. It requires a basic background in digital testing. The book can be used as supplementary material for a graduate-level course on VLSI testing.
Algorithms for VLSI Physical Design Automation, Third Edition covers all aspects of physical design. The book is a core reference for graduate students and CAD professionals. For students, concepts and algorithms are presented in an intuitive manner. For CAD professionals, the material presents a balance of theory and practice. An extensive bibliography is provided which is useful for finding advanced material on a topic. At the end of each chapter, exercises are provided, which range in complexity from simple to research level. Algorithms for VLSI Physical Design Automation, Third Edition provides a comprehensive background in the principles and algorithms of VLSI physical design. The goal of this book is to serve as a basis for the development of introductory-level graduate courses in VLSI physical design automation. It provides self-contained material for teaching and learning algorithms of physical design. All algorithms which are considered basic have been included, and are presented in an intuitive manner. Yet, at the same time, enough detail is provided so that readers can actually implement the algorithms given in the text and use them. The first three chapters provide the background material, while the focus of each chapter of the rest of the book is on each phase of the physical design cycle. In addition, newer topics such as physical design automation of FPGAs and MCMs have been included. The basic purpose of the third edition is to investigate the new challenges presented by interconnect and process innovations. In 1995, when the second edition of this book was prepared, a six-layer process and 15-million-transistor microprocessors were in advanced stages of design. In 1998, six-metal processes and 20-million-transistor designs are in production. Two new chapters have been added and new material has been included in almost all other chapters. A new chapter on process innovation and its impact on physical design has been added.
Another focus of the third edition is to promote use of the Internet as a resource, so wherever possible URLs have been provided for further investigation. Algorithms for VLSI Physical Design Automation, Third Edition is an important core reference work for professionals as well as an advanced level textbook for students.
You may like...
Continued Fractions with Applications… by L. Lorentzen, H. Waadeland. Hardcover, R1,386 (Discovery Miles 13 860)
PVD for Microelectronics: Sputter… by Stephen M. Rossnagel, Ronald Powell, … Hardcover, R3,338 (Discovery Miles 33 380)
Physical Examination Procedures for… by Zoe Rawles, Beth Griffiths, … Hardcover, R5,478 (Discovery Miles 54 780)
Foundations of Finitely Supported… by Andrei Alexandru, Gabriel Ciobanu. Hardcover, R2,660 (Discovery Miles 26 600)
Routledge Handbook of Health and Media by Lester D. Friedman, Therese Jones. Hardcover, R5,944 (Discovery Miles 59 440)
Disability in Pregnancy and Childbirth by Stella Frances McKay-Moffat. Paperback, R993 (Discovery Miles 9 930)