This book addresses the question of how to achieve social coordination in Socio-Cognitive Technical Systems (SCTS). SCTS are a class of Socio-Technical Systems: complex, open systems in which several humans and digital entities interact in order to achieve some collective endeavour. The book approaches the question from the conceptual background of regulated open multiagent systems, with the question being motivated by their design and construction requirements. The book captures the collective effort of eight groups from leading research centres and universities, each of which has developed a conceptual framework for the design of regulated multiagent systems; most have also developed technological artefacts that support the processes from specification to implementation of such systems. The first, introductory part of the book describes the challenge of developing frameworks for SCTS and articulates the premises and the main concepts involved in those frameworks. The second part discusses the eight frameworks and contrasts their main components. The final part maps the new field by discussing the types of activities in which SCTS are likely to be used, the features that such uses will exhibit, and the challenges that will drive the evolution of this field.
Tearing and interconnecting methods, such as FETI, FETI-DP, BETI, etc., are among the most successful domain decomposition solvers for partial differential equations. The purpose of this book is to give a detailed and self-contained presentation of these methods, including the corresponding algorithms as well as a rigorous convergence theory. In particular, two issues are addressed that have not yet been covered in any monograph: the coupling of finite and boundary elements within the tearing and interconnecting framework, including exterior problems, and the case of highly varying (multiscale) coefficients not resolved by the subdomain partitioning. In this context, the book offers a detailed view of an active and up-to-date area of research.
This research volume presents a sample of recent contributions on quality assessment for Web-based information in the context of information access, retrieval, and filtering systems. The advent of the Web and the uncontrolled process of document generation have raised the problem of assessing the quality of information on the Web, by considering the nature of documents (texts, images, video, sounds, and so on), the genre of documents (news, geographic information, ontologies, medical records, product records, and so on), the reputation of information sources and sites, and, last but not least, the actions performed on documents (content indexing, retrieval and ranking, collaborative filtering, and so on). The volume constitutes a compendium of heterogeneous approaches and sample applications, each focusing on specific aspects of quality assessment for Web-based information, addressed to researchers, PhD students, and practitioners carrying out their research activity in the fields of Web information retrieval and filtering, Web information mining, and information quality representation and management.
This book is a compilation of Mark Pelczarski's "Graphically Speaking" tutorial columns that appeared in Softalk magazine. Using the included programs, you will be able to create art, do animation for games, and have a bunch of fun on your Apple II computer. Once you learn the fundamentals of creating hi-res, 3-D, and animation, you will be limited only by your imagination. Originally published in 1983, this Enhanced Edition features a new preface from the author and a refreshed design with a lot of art from Penguin Software. Mark taught computer science, programming, and mathematics at the high school and university levels. He founded Penguin Software and created such classic software as Graphics Magician, Complete Graphics System, and Special Effects. Penguin published over 45 popular titles such as: The Coveted Mirror, Expedition Amazon, Oo-Topos, The Quest, The Spy's Adventures Around the World, Spy's Demise, Sword of Kadash, Transylvania, and Xyphus.
Increasing numbers of businesses and Information Technology firms are outsourcing their software and Web development tasks. It has been estimated that half of the Fortune 500 companies currently use outsourcing for their development needs, and that by the end of 2008, 40% of U.S. companies will develop, test, support, or store software overseas, with another 40% considering doing the same. Several industries, from computer software to telemarketing, have begun aggressively shifting white-collar work out of the United States. The United States currently accounts for more than half of worldwide spending on IT outsourcing, with a growing portion of this spending going to countries such as India, Russia, and the Philippines, and this trend will continue. Research has indicated that the primary problem is language, because of idiomatic expressions and the subtle cultural nuances associated with the use of particular words. Thus communication frequently breaks down when dealing with overseas companies.
Enabling information interoperability, fostering legal knowledge usability and reuse, enhancing legal information search, in short, formalizing the complexity of legal knowledge to enhance legal knowledge management are challenging tasks, for which different solutions and lines of research have been proposed. During the last decade, research and applications based on the use of legal ontologies as a technique to represent legal knowledge have raised a very interesting debate about their capacity and limitations in representing conceptual structures in the legal domain. Making conceptual legal knowledge explicit would support the development of a web of legal knowledge, improve communication, create trust, and enable and support open data, e-government and e-democracy activities. Moreover, this explicit knowledge is also relevant to the formalization of software agents and the shaping of virtual institutions and multi-agent systems or environments. This book explores the use of ontologies in legal knowledge representation for semantically enhanced legal knowledge systems and web-based applications. In it, current methodologies, tools and languages used for ontology development are reviewed, and the book includes an exhaustive review of existing ontologies in the legal domain. The development of the Ontology of Professional Judicial Knowledge (OPJK) is presented as a case study.
Scheduling theory has attracted growing interest since its origins in the second half of the 20th century. Developed initially for the study of scheduling problems with a single objective, the theory has recently been extended to problems involving multiple criteria. However, this extension has still left a gap between the classical multi-criteria approaches and some real-life problems in which not all jobs contribute to the evaluation of each criterion. In this book, we close this gap by presenting and developing multi-agent scheduling models in which subsets of jobs sharing the same resources are evaluated by different criteria. Several scenarios are introduced, depending on the definition and the intersection structure of the job subsets. Complexity results, approximation schemes, heuristics and exact algorithms are discussed for single-machine and parallel-machine scheduling environments. Definitions and algorithms are illustrated with the help of examples and figures.
"Computer Science and Convergence" is the proceedings of the 3rd FTRA International Conference on Computer Science and its Applications (CSA-11) and the 2011 FTRA World Convergence Conference (FTRA WCC 2011). The topics of CSA and WCC cover current hot topics addressing worldwide, ever-changing needs. CSA-11 is the most comprehensive conference focused on the various aspects of advances in computer science and its applications, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of CSA. In addition, the conference publishes high-quality papers which are closely related to the various theories and practical applications in CSA. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject. The main scope of CSA-11 includes mobile and ubiquitous computing. WCC-11 is a major conference for scientists, engineers, and practitioners throughout the world to present the latest research, results, ideas, developments and applications in all areas of convergence technologies. The main scope of WCC-11 includes cryptography and security for converged environments, and wireless sensor networks for converged environments.
'Rough Computing' explores the application of rough set theory, which has attracted attention for its ability to enhance databases by allowing the management of uncertainty. It also offers a comparative analysis between rough sets and other forms of intelligent data analysis.
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
In recent years, IT standardization has become increasingly complex as a result of globalization, widespread Internet use, and the economic importance of standards. New Applications in IT Standards: Developments and Progress unites contributions on all facets of standards research, providing essential research on developing, teaching, and implementing standards in global organizations and institutions. Researchers can benefit from specific cases, frameworks, and new theories in IT standards studies.
Organizations of all types are consistently working on new initiatives, product lines, or implementation of new workflows as a way to remain competitive in the modern business environment. No matter the type of project, employing the best methods for effective execution and timely completion of the task at hand is essential to project success. The implementation of computer technology has provided further opportunities for innovation and progress in the daily operations and initiatives of corporations. Knowledge Management and Innovation in Network Organizations: Emerging Research and Opportunities is an essential scholarly resource that explores the use of information communication technologies in management models and the development of network organizations operating in various sectors of the economy. Highlighting coverage on a wide range of topics such as cloud computing, organizational development, and business management, this book is ideal for business professionals, organizational researchers, and academicians interested in the latest research on network organizations.
Modelling for Business Improvement contains the proceedings of the First International Conference on Process Modelling and Process Management (MMEP 2010) held in Cambridge, England, in March 2010. It contains contributions from an international group of leading researchers in the fields of process modelling and process management. This conference will showcase recent trends in the modelling and management of engineering processes, explore potential synergies between different modelling approaches, gather and discuss future challenges for the management of engineering processes and discuss future research areas and topics. Modelling for Business Improvement is divided into three main parts: theoretical foundation of modelling and management of engineering processes, and achievements in theory; experiences from management practice using various modelling methods and tools, and their future challenges; and, new perspectives on modelling methods, techniques and tools. Based on the latest achievements in this and related fields, the editors aim to landmark the research map for modelling and management of engineering processes for 2020.
As population increases, the need for energy becomes a crisis of great importance. Technologies for Electrical Power Conversion, Efficiency, and Distribution: Methods and Processes combines unparalleled research, contemporary achievements, and emerging trends within electrical energy conversion technologies and renewable energy sources. The scholarly findings compiled provide a background for discussion of the problems and opportunities of power efficiency and energy conversion in order to develop innovative ways to implement such cutting-edge technologies in the future.
Prolog Programming at its best! Discover a book that tells you what you should do and how! Instead of jumping right into the instructions, this book first provides all the necessary concepts you need to learn in order to make the learning process a whole lot easier. This way, you're sure not to get lost in confusion once you reach the more complex lessons in the later chapters. Graphs and flowcharts, as well as sample code, are provided for a more visual approach to your learning. You will also learn the designs and forms of Parallel, and what's more convenient than getting to know both sides! Want to know more? Buy now!
Biological and biomedical studies have entered a new era over the past two decades thanks to the wide use of mathematical models and computational approaches. The boom in computational biology, which was sheer theoretician's fantasy twenty years ago, has become a reality. The prominence of computational biology and theoretical approaches is evidenced in articles hailing the arrival of what are variously called quantitative biology, bioinformatics, theoretical biology, and systems biology. New technologies and data resources in genetics, such as the International HapMap project, enable large-scale studies, such as genome-wide association studies, which could potentially identify most common genetic variants as well as rare variants of the human DNA that may alter an individual's susceptibility to disease and the response to medical treatment. Meanwhile, multi-electrode recording from behaving animals makes it feasible to monitor the animal's mental activity, which could potentially lead to the development of useful brain-machine interfaces. Embracing the sheer volume of genetic, genomic, and other types of data, an essential approach is, first of all, to avoid drowning the true signal in the data. It has been witnessed that theoretical approaches to biology have emerged as a powerful and stimulating research paradigm in biological studies, which in turn leads to a new research paradigm in mathematics, physics, and computer science and moves forward with the interplay among experimental studies and outcomes, simulation studies, and theoretical investigations.
This book is for developers who are looking for an overview of basic concepts in Natural Language Processing using R. It casts a wide net of techniques to help developers who have a range of technical backgrounds. Numerous code samples and listings are included to support myriad topics. The final chapter presents the Transformer Architecture, BERT-based models, and the GPT family of models, all of which were developed during the past three years. Companion files with source code and figures are included. Features Covers extensive topics related to natural language processing using R Features companion files with source code and figures from the book
New Edition: Introduction to Computational Earthquake Engineering (3rd Edition). Introduction to Computational Earthquake Engineering covers solid continuum mechanics, the finite element method, and stochastic modeling comprehensively, with the second and third chapters explaining the numerical simulation of strong ground motion and faulting, respectively. Stochastic modeling is used for uncertain underground structures, and advanced analytical methods for linear and non-linear stochastic models are presented. The verification of these methods by comparing simulation results with observed data is then presented, and examples of numerical simulations which apply these methods to practical problems are generously provided. Furthermore, three advanced topics of computational earthquake engineering are covered, detailing examples of applying computational science technology to earthquake engineering problems.
You may like...
- Polymer Gels - Science and Fundamentals, Vijay Kumar Thakur, Manju Kumari Thakur (Hardcover, R5,227)
- Melt Rheology and its Applications in…, John M. Dealy, Jian Wang (Hardcover, R5,659)
- Sustainable Polylactide-Based Composites, Suprakas Sinha Ray, Ritima Banerjee (Paperback, R4,663)
- Lignin - Biosynthesis and Transformation…, Swati Sharma, Ashok Kumar (Hardcover, R4,722)
- Handbook of Research on Designing User…, Abhijit Narayanrao Banubakode, Haris Abd Wahab, … (Hardcover, R6,648)
- Self Organized Nanostructures of…, Axel H.E. Muller, Oleg Borisov (Hardcover, R5,165)
- New Perspectives in End-User Development, Fabio Paterno, Volker Wulf (Hardcover, R4,882)
- Acrylate Polymers for Advanced…, Angel Serrano-Aroca, Sanjukta Deb (Hardcover, R3,050)
- Advances in Production Management…, Bruno Vallespir, Thecle Alix (Hardcover, R2,787)
- Information Systems for eGovernment - A…, Gianluigi Viscusi, Carlo Batini, … (Hardcover, R1,437)