The book presents the proceedings of two conferences: the 16th International Conference on Data Science (ICDATA 2020) and the 19th International Conference on Information & Knowledge Engineering (IKE 2020), which took place in Las Vegas, NV, USA, July 27-30, 2020. The conferences are part of the larger 2020 World Congress in Computer Science, Computer Engineering, & Applied Computing (CSCE'20), which features 20 major tracks. Papers cover all aspects of data science, data mining, machine learning, artificial and computational intelligence (ICDATA) and information retrieval systems, information & knowledge engineering, management and cyber-learning (IKE). Authors include academics, researchers, professionals, and students.
Reachable Sets of Dynamic Systems: Uncertainty, Sensitivity, and Complex Dynamics introduces differential inclusions, providing an overview as well as multiple examples of their interdisciplinary applications. The design of dynamic systems of any type is an important issue, as is the influence of uncertainty in model parameters and model sensitivity. The possibility of calculating the reachable sets may be a powerful additional tool in such tasks. This book can help graduate students, researchers, and engineers working in the field of computer simulation and model building to calculate the reachable sets of dynamic models.
Cooperating Heterogeneous Systems provides an in-depth introduction to the issues and techniques surrounding the integration and control of diverse and independent software components. Organizations increasingly rely upon diverse computer systems to perform a variety of knowledge-based tasks. This presents technical issues of interoperability and integration, as well as philosophical issues of how cooperation and interaction between computational entities is to be realized. Cooperating systems are systems that work together towards a common end. The concepts of cooperation must be realized in technically sound system architectures, having a uniform meta-layer between knowledge sources and the rest of the system. The layer consists of a family of interpreters, one for each knowledge source, and meta-knowledge. A system architecture to integrate and control diverse knowledge sources is presented. The architecture is based on the meta-level properties of the logic programming language Prolog. An implementation of the architecture is described, a Framework for Logic Programming Systems with Distributed Execution (FLiPSiDE). Knowledge-based systems play an important role in any up-to-date arsenal of decision support tools. The tremendous growth of computer communications infrastructure has made distributed computing a viable option, and often a necessity in geographically distributed organizations. It has become clear that to take knowledge-based systems to their next useful level, it is necessary to get independent knowledge-based systems to work together, much as we put together ad hoc work groups in our organizations to tackle complex problems. The book is for scientists and software engineers who have experience in knowledge-based systems and/or logic programming and seek a hands-on introduction to cooperating systems. 
Researchers investigating autonomous agents, distributed computation, and cooperating systems will find fresh ideas and new perspectives on well-established approaches to control, organization, and cooperation.
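The architecture the blurb describes — a uniform meta-layer of per-source interpreters sitting between independent knowledge sources and the rest of the system — can be sketched in a few lines. This is an illustrative sketch only, not code from the book; the class and method names (`KnowledgeSource`, `MetaLayer`, `can_answer`) are invented for the example, and the book's actual implementation (FLiPSiDE) is built on Prolog's meta-level facilities.

```python
# Illustrative sketch (not from the book): a uniform meta-layer that routes
# queries to heterogeneous knowledge sources through per-source interpreters.

class KnowledgeSource:
    """Interface each interpreter in the meta-layer implements."""
    def can_answer(self, query): ...
    def answer(self, query): ...

class RuleSource(KnowledgeSource):
    """A knowledge source backed by simple query -> answer rules."""
    def __init__(self, rules):
        self.rules = rules
    def can_answer(self, query):
        return query in self.rules
    def answer(self, query):
        return self.rules[query]

class TableSource(KnowledgeSource):
    """A knowledge source backed by a lookup table."""
    def __init__(self, table):
        self.table = table
    def can_answer(self, query):
        return query in self.table
    def answer(self, query):
        return self.table[query]

class MetaLayer:
    """Meta-knowledge here is just a routing policy: ask sources in order."""
    def __init__(self, sources):
        self.sources = sources
    def query(self, q):
        for src in self.sources:
            if src.can_answer(q):
                return src.answer(q)
        return None  # no cooperating source could answer

meta = MetaLayer([RuleSource({"status": "ok"}), TableSource({"count": 42})])
print(meta.query("count"))  # dispatched to TableSource
```

The point of the design is that the dispatcher knows nothing about how each source computes its answers; cooperation is realized entirely in the meta-layer's routing policy.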
I want to express my sincere thanks to all authors who submitted research papers to support the Third IFIP International Conference on Computer and Computing Technologies in Agriculture and the Third Symposium on Development of Rural Information (CCTA 2009) held in China, during October 14-17, 2009. This conference was hosted by the CICTA (EU-China Centre for Information & Communication Technologies, China Agricultural University), China National Engineering Research Center for Information Technology in Agriculture, Asian Conference on Precision Agriculture, International Federation for Information Processing, Chinese Society of Agricultural Engineering, Beijing Society for Information Technology in Agriculture, and the Chinese Society for Agricultural Machinery. The platinum sponsors include the Ministry of Science and Technology of China, the Ministry of Agriculture of China, and the Ministry of Education of China, among others. The CICTA focuses on research and development of advanced and practical technologies applied in agriculture and on promoting international communication and cooperation. It has successfully held three International Conferences on Computer and Computing Technologies in Agriculture, namely CCTA 2007, CCTA 2008 and CCTA 2009. Sustainable agriculture is currently the focus of the whole world, and the application of information technology in agriculture is therefore becoming more and more important. 'Informatized agriculture' has been pursued by many countries recently in order to manage agriculture scientifically and achieve low costs and high incomes.
Initially conceived as a methodology for the representation and manipulation of imprecise and vague information, fuzzy computation has found wide use in problems that fall well beyond its originally intended scope of application. Many scientists and engineers now use the paradigms of fuzzy computation to tackle problems that are either intractable or unrealistically time consuming. The extraordinary growth of fuzzy computation in recent years has led to the development of numerous systems of major practical importance in fields ranging from medical diagnosis and automated learning to image understanding and systems control.
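The basic building block of the fuzzy computation described above is the membership function, which grades how strongly a crisp value belongs to a vague concept. A minimal sketch, with invented linguistic terms ("cold", "warm") and invented breakpoints chosen purely for illustration:

```python
# Hedged illustration: triangular membership functions, the elementary
# building block of fuzzy computation. Terms and breakpoints are invented.

def triangular(a, b, c):
    """Return a membership function peaking at b, zero outside (a, c)."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

cold = triangular(-10.0, 0.0, 15.0)
warm = triangular(10.0, 20.0, 30.0)

temp = 12.0
# At 12 degrees the input is partly "cold" and partly "warm" at once --
# exactly the graded, overlapping membership that crisp sets cannot express.
print(cold(temp), warm(temp))
```

A fuzzy controller or diagnostic system combines many such graded memberships through if-then rules and then defuzzifies the result into a crisp output.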
In 1985 it was 20 years since Nobel Laureate Herbert A. Simon published 'The Shape of Automation: For Men and Management'. This short but important and still topical book dwells on three subjects: The Long-Range Economic Effects of Automation; Will the Corporation be Managed by Machines?; and The New Science of Management Decision. In contrast with George Orwell, who was a critic of contemporary political systems rather than a prophet, Simon portrays a far rosier picture of our 'brave new world'. Simon's work breathes optimism. First, computer technology: looking back, it is doubtful whether even the professor expected the hardware development we have witnessed. Secondly, our ability to 'tame the beast': there is not much reason for complacency and satisfaction. Offices and factories can by no means be called automated, at most semi-automated. Thirdly, the organizational and social implications of these rapid technological developments, referring to what he then called 'the computer and the new decision-making techniques'. Concerning this last point, there is little need to emphasize that there has been less practical application in organizations than the often impressive theoretical developments would lead one to believe. In Europe this situation is even more acute than in the USA and Japan. The ESPRIT programme of the EEC and many similar national programmes intend to bridge the gap.
This book addresses current technology trends and requirements leading towards the next era in mobile communication handsets and, beyond that, proposes innovative solutions that could be candidates for 5G phones. It adopts a multidisciplinary and interdisciplinary stance towards handset design, a necessary ingredient if 5th-generation handsets and services are really to take off. The scope of the book therefore targets a broad range of subjects, including energy efficiency, RF design, cooperation, context-aware systems, roaming, and short-range networking, all working in synergy to provide seamless mobility and high-speed connectivity within a HetNet environment. Specifically, the authors investigate how the cooperation paradigm and context-aware mechanisms, working in synergy, can be exploited to provide energy-compliant phones that introduce power savings of up to 50% over the state of the art. Going beyond this, a chapter on business modeling approaches is also included, based on incentive mechanisms for cooperation that will provide the necessary leverage to promote the uptake of the proposed technology.
In today's business arena, information is one of the most important resources possessed by enterprises. In order to support proper information flow, businesses deploy transactional systems, build decision support systems or launch management information systems. Unfortunately, the majority of information systems do not take advantage of recent developments in knowledge management, thus exposing companies to the risk of missing important information or, what is even worse, leading them to misinterpret information. Knowledge-Based Information Retrieval and Filtering from the Web contains fifteen chapters, contributed by leading international researchers, addressing the matter of information retrieval, filtering and management of information on the Internet. The research presented in these chapters deals with the need to find proper solutions for the description of the information found on the Internet, the description of the information consumers need, the algorithms for retrieving documents (and, indirectly, the information embedded in them), and the presentation of the information found.
This volume investigates our ability to capture, and then apply, expertise. In recent years, expertise has come to be regarded as an increasingly valuable and surprisingly elusive resource. Experts, who were the sole active dispensers of certain kinds of knowledge in the days before AI, have themselves become the objects of empirical inquiry, in which their knowledge is elicited and studied -- by knowledge engineers, experimental psychologists, applied psychologists, or other experts -- involved in the development of expert systems. This book achieves a marriage between experimentalists, applied scientists, and theoreticians who deal with expertise. It envisions the benefits to society of an advanced technology for capturing and disseminating the knowledge and skills of the best corporate managers, the most seasoned pilots, and the most renowned medical diagnosticians. This book should be of interest to psychologists as well as to knowledge engineers who are "out in the trenches" developing expert systems, and anyone pondering the nature of expertise and the question of how it can be elicited and studied scientifically. The book's scope and the pivotal concepts that it elucidates and appraises, as well as the extensive categorized bibliographies it includes, make this volume a landmark in the field of expert systems and AI as well as the field of applied experimental psychology.
This edited book is published in honor of Dr. George J. Vachtsevanos, our Dr. V, currently Professor Emeritus, School of Electrical and Computer Engineering, Georgia Institute of Technology, on the occasion of his 70th birthday and for his more than 30 years of contribution to the discipline of Intelligent Control and its application to a wide spectrum of engineering and bioengineering systems. The book is nothing but a very small token of appreciation from Dr. V's former graduate students, his peers and colleagues in the profession - and not only - to the Scientist, the Engineer, the Professor, the mentor, but most important of all, to the friend and human being. All those who have met Dr. V over the years and have interacted with him in some professional and/or social capacity understand this statement: George never made anybody feel inferior to him, he helped and supported everybody, and he was there when anybody needed him. I was not Dr. V's student. I first met him and his wife Athena more than 26 years ago during one of their visits to RPI, in the house of my late advisor, Dr. George N. Saridis. Since then, I have been very fortunate to have had, and continue to have, interactions with him. It is not an exaggeration if I say that we all learned a lot from him.
Embedded systems have long become essential in application areas in which human control is impossible or infeasible. The development of modern embedded systems is becoming increasingly difficult and challenging because of their overall system complexity, their tighter and cross-functional integration, the increasing requirements concerning safety and real-time behavior, and the need to reduce development and operation costs. This book provides a comprehensive overview of the Software Platform Embedded Systems (SPES) modeling framework and demonstrates its applicability in embedded system development in various industry domains such as automation, automotive, avionics, energy, and healthcare. In SPES 2020, twenty-one partners from academia and industry have joined forces in order to develop and evaluate in different industrial domains a modeling framework that reflects the current state of the art in embedded systems engineering. The content of this book is structured in four parts. Part I "Starting Point" discusses the status quo of embedded systems development and model-based engineering, and summarizes the key requirements faced when developing embedded systems in different application domains. Part II "The SPES Modeling Framework" describes the SPES modeling framework. Part III "Application and Evaluation of the SPES Modeling Framework" reports on the validation steps taken to ensure that the framework met the requirements discussed in Part I. Finally, Part IV "Impact of the SPES Modeling Framework" summarizes the results achieved and provides an outlook on future work. The book is mainly aimed at professionals and practitioners who deal with the development of embedded systems on a daily basis. Researchers in academia and industry may use it as a compendium for the requirements and state-of-the-art solution concepts for embedded systems development.
Creating scientific workflow applications is a very challenging task due to the complexity of the distributed computing environments involved, the complex control and data flow requirements of scientific applications, and the lack of high-level languages and tools support. Particularly, sophisticated expertise in distributed computing is commonly required to determine the software entities to perform computations of workflow tasks, the computers on which workflow tasks are to be executed, the actual execution order of workflow tasks, and the data transfer between them. Qin and Fahringer present a novel workflow language called Abstract Workflow Description Language (AWDL) and the corresponding standards-based, knowledge-enabled tool support, which simplifies the development of scientific workflow applications. AWDL is an XML-based language for describing scientific workflow applications at a high level of abstraction. It is designed in a way that allows users to concentrate on specifying such workflow applications without dealing with either the complexity of distributed computing environments or any specific implementation technology. This research monograph is organized into five parts: overview, programming, optimization, synthesis, and conclusion, and is complemented by an appendix and an extensive reference list. The topics covered in this book will be of interest to both computer science researchers (e.g. in distributed programming, grid computing, or large-scale scientific applications) and domain scientists who need to apply workflow technologies in their work, as well as engineers who want to develop distributed and high-throughput workflow applications, languages and tools.
The Knowledge Seeker is a useful system for developing various intelligent applications such as ontology-based search engines, ontology-based text classification systems, ontological agent systems, and semantic web systems. The Knowledge Seeker contains four different ontological components. First, it defines the knowledge representation model, the Ontology Graph. Second, an ontology learning process based on chi-square statistics is proposed for automatically learning an Ontology Graph from texts for different domains. Third, it defines an ontology generation method that transforms the learning outcome into the Ontology Graph format for machine processing, which can also be visualized for human validation. Fourth, it defines different ontological operations (such as similarity measurement and text classification) that can be carried out with the use of generated Ontology Graphs. The final goal of the KnowledgeSeeker system framework is to improve traditional information systems with higher efficiency. In particular, it can increase the accuracy of a text classification system and enhance the search intelligence of a search engine. This can be done by enhancing the system with machine-processable ontology.
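The chi-square statistic mentioned above measures how strongly a term is associated with a domain, typically from a 2x2 contingency table of document counts. The following is a minimal sketch of that general idea, not the book's actual learning algorithm; the function name and the document counts are invented for illustration.

```python
# Illustrative sketch of chi-square term-domain association scoring from a
# 2x2 contingency table of document counts. Counts below are invented.

def chi_square(n11, n10, n01, n00):
    """n11: in-domain docs containing the term, n10: in-domain without it,
    n01: out-of-domain with it, n00: out-of-domain without it."""
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00)
    return num / den if den else 0.0

# A term concentrated inside the domain scores much higher than a term
# spread evenly across domains, so it is kept as an ontology concept.
print(chi_square(40, 10, 5, 45))   # strongly domain-associated term
print(chi_square(20, 30, 25, 25))  # weakly associated term
```

Ranking terms by such a score and linking the top-scoring ones is one common way an ontology graph can be grown automatically from a domain corpus.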
This book constitutes the refereed post-conference proceedings of the Second IFIP International Cross-Domain Conference on Internet of Things, IFIPIoT 2019, held in Tampa, USA, in October/November 2019. The 11 full papers presented were carefully reviewed and selected from 22 submissions. Also included in this volume are 8 invited papers. The papers are organized in the following topical sections: IoT applications; context reasoning and situational awareness; IoT security; smart and low power IoT; smart network architectures; and smart system design and IoT education.
This open access book provides a comprehensive view of data ecosystems and platform economics, from methodical and technological foundations up to reports from practical implementations and applications in various industries. To this end, the book is structured in four parts: Part I "Foundations and Contexts" provides a general overview of building, running, and governing data spaces and an introduction to the IDS and GAIA-X projects. Part II "Data Space Technologies" subsequently details various implementation aspects of IDS and GAIA-X, including, e.g., data usage control, the usage of blockchain technologies, or semantic data integration and interoperability. Next, Part III describes various "Use Cases and Data Ecosystems" from application areas such as agriculture, healthcare, industry, energy, and mobility. Finally, Part IV offers an overview of several "Solutions and Applications", including, e.g., products and experiences from companies like Google, SAP, Huawei, T-Systems, Innopay and many more. Overall, the book provides professionals in industry with an encompassing overview of the technological and economic aspects of data spaces, based on the International Data Spaces and Gaia-X initiatives. It presents implementations and business cases and gives an outlook on future developments. In doing so, it aims at proliferating the vision of a social data market economy based on data spaces which embrace trust and data sovereignty.
The rapid advances in performance and miniaturisation in microtechnology are constantly opening up new markets for the programmable logic controller (PLC). Specially designed controller hardware or PC-based controllers, extended by hardware and software with real-time capability, now control highly complex automation processes. This has been extended by the new subject of "safety-related controllers," aimed at preventing injury by machines during the production process. The different types of PLC cover a wide task spectrum - ranging from small network node computers and distributed compact units right up to modular, fault-tolerant, high-performance PLCs. They differ in performance characteristics such as processing speed, networking ability or the selection of I/O modules they support. Throughout this book, the term PLC is used to refer to the technology as a whole, both hardware and software, and not merely to the hardware architecture. The IEC 61131 programming languages can be used for programming classical PLCs, embedded controllers, industrial PCs and even standard PCs, if suitable hardware (e.g. a fieldbus board) for connecting sensors and actuators is available.
Frontiers in Belief Revision is a unique collection of leading edge research in Belief Revision. It contains the latest innovative ideas of highly respected and pioneering experts in the area, including Isaac Levi, Krister Segerberg, Sven Ove Hansson, Didier Dubois, and Henri Prade. The book addresses foundational issues of inductive reasoning and minimal change, generalizations of the standard belief revision theories, strategies for iterated revisions, probabilistic beliefs, multiagent environments and a variety of data structures and mechanisms for implementations. This book is suitable for students and researchers interested in knowledge representation and in the state of the art of the theory and practice of belief revision.
Speech-to-Speech Translation: A Massively Parallel Memory-Based Approach describes one of the world's first successful speech-to-speech machine translation systems. This system accepts speaker-independent continuous speech, and produces translations as audio output. Subsequent versions of this machine translation system have been implemented on several massively parallel computers, and these systems have attained translation performance in the milliseconds range. The success of this project triggered several massively parallel projects, as well as other massively parallel artificial intelligence projects throughout the world. Dr. Hiroaki Kitano received the distinguished 'Computers and Thought Award' from the International Joint Conferences on Artificial Intelligence in 1993 for his work in this area, and that work is reported in this book.
With this book, Christopher Kormanyos delivers a highly practical guide to programming real-time embedded microcontroller systems in C++. It is divided into three parts plus several appendices. Part I provides a foundation for real-time C++ by covering language technologies, including object-oriented methods, template programming and optimization. Next, part II presents detailed descriptions of a variety of C++ components that are widely used in microcontroller programming. It details some of C++'s most powerful language elements, such as class types, templates and the STL, to develop components for microcontroller register access, low-level drivers, custom memory management, embedded containers, multitasking, etc. Finally, part III describes mathematical methods and generic utilities that can be employed to solve recurring problems in real-time C++. The appendices include a brief C++ language tutorial, information on the real-time C++ development environment and instructions for building GNU GCC cross-compilers and a microcontroller circuit. For this fourth edition, the most recent specification of C++20 is used throughout the text. Several sections on new C++20 functionality have been added, and various others reworked to reflect changes in the standard. Also several new example projects ranging from introductory to advanced level are included and existing ones extended, and various reader suggestions have been incorporated. Efficiency is always in focus and numerous examples are backed up with runtime measurements and size analyses that quantify the true costs of the code down to the very last byte and microsecond. The target audience of this book mainly consists of students and professionals interested in real-time C++. Readers should be familiar with C or another programming language and will benefit most if they have had some previous experience with microcontroller electronics and the performance and size issues prevalent in embedded systems programming.
Pervasive healthcare is the conceptual system of providing healthcare to anyone, at anytime, and anywhere by removing restraints of time and location while increasing both the coverage and the quality of healthcare. Pervasive Healthcare Monitoring is at the forefront of this research, and presents the ways in which mobile and wireless technologies can be used to implement the vision of pervasive healthcare. This vision includes prevention, healthcare maintenance and checkups; short-term monitoring (home healthcare monitoring), long-term monitoring (nursing home), and personalized healthcare monitoring; and incidence detection and management, emergency intervention, and transportation and treatment. The pervasive healthcare applications include pervasive health monitoring, intelligent emergency management system, pervasive healthcare data access, and ubiquitous mobile telemedicine. Pervasive Healthcare Monitoring fills the need for a research-oriented book on the wide array of emerging healthcare applications and services, including the treatment of several new wireless technologies and the ways in which they will implement the vision of pervasive healthcare. This book is written primarily for university faculty and graduate students in the field of healthcare technologies, and industry professionals involved in healthcare IT research, design, and development.
Thinking in terms of facts and rules is perhaps one of the most common ways of approaching problem definition and problem solving, both in everyday life and under more formal circumstances. The best-known set of rules, the Ten Commandments, has been accompanying us since the times of Moses; the Decalogue proved to be simple but powerful, concise and universal. It is logically consistent and complete. There are also many other attempts to impose rule-based regulations in almost all areas of life, including professional work, education, medical services, taxes, etc. Some typical examples include various codes (e.g. legal or traffic codes), regulations (especially military ones), and many systems of customary or informal rules. The universal nature of rule-based formulations of behavior or inference principles follows from rules being a simple and intuitive yet powerful concept of very high expressive power. Moreover, rules as such in fact encode functional aspects of behavior and can be used for modeling numerous phenomena.
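The facts-and-rules inference style described above can be sketched as naive forward chaining: repeatedly fire any rule whose antecedents are all known facts until nothing new can be derived. This is a generic illustration of the paradigm, not code from the book, and the example facts and rules are invented.

```python
# Minimal sketch (not from the book): naive forward chaining over facts
# and if-then rules, the inference style rule-based systems are built on.

def forward_chain(facts, rules):
    """rules: list of (antecedents, conclusion) pairs; fire to a fixpoint."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, conclusion in rules:
            # A rule fires when all its antecedents are established facts.
            if conclusion not in facts and set(antecedents) <= facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    (["rain"], "wet_road"),
    (["wet_road", "fast"], "danger"),
]
print(forward_chain({"rain", "fast"}, rules))
```

Note that "danger" is only derivable through the intermediate fact "wet_road", which the first rule adds in an earlier pass; chaining through such intermediate conclusions is what gives rules their expressive power.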
Blockchain Technology for Emerging Applications: A Comprehensive Approach explores recent theories and applications of blockchain technology. Chapters look at a wide range of application areas, including healthcare, cyber-physical frameworks, the web of things, smart transportation frameworks, intrusion detection frameworks, ballot-casting, architecture, smart urban communities, and digital rights administration. The book addresses the engineering, design objectives, difficulties, constraints, and potential answers for blockchain-based frameworks. It also looks at blockchain-based design perspectives of these intelligent architectures for evaluating and interpreting real-world trends. Chapters expand on different models which have shown considerable success in dealing with an extensive range of applications, including their ability to extract complex hidden features and learn efficient representations in unsupervised environments for blockchain security pattern analysis.
This open access book provides an in-depth description of the EU project European Language Grid (ELG). Its motivation lies in the fact that Europe is a multilingual society with 24 official European Union Member State languages and dozens of additional languages including regional and minority languages. The only meaningful way to enable multilingualism and to benefit from this rich linguistic heritage is through Language Technologies (LT) including Natural Language Processing (NLP), Natural Language Understanding (NLU), Speech Technologies and language-centric Artificial Intelligence (AI) applications. The European Language Grid provides a single umbrella platform for the European LT community, including research and industry, effectively functioning as a virtual home, marketplace, showroom, and deployment centre for all services, tools, resources, products and organisations active in the field. Today the ELG cloud platform already offers access to more than 13,000 language processing tools and language resources. It enables all stakeholders to deposit, upload and deploy their technologies and datasets. The platform also supports the long-term objective of establishing digital language equality in Europe by 2030 - to create a situation in which all European languages enjoy equal technological support. This is the very first book dedicated to Language Technology and NLP platforms. Cloud technology has only recently matured enough to make the development of a platform like ELG feasible on a larger scale. The book comprehensively describes the results of the ELG project. Following an introduction, the content is divided into four main parts: (I) ELG Cloud Platform; (II) ELG Inventory of Technologies and Resources; (III) ELG Community and Initiative; and (IV) ELG Open Calls and Pilot Projects.