This book presents the technical program of the International Embedded Systems Symposium (IESS) 2009. Timely topics, techniques and trends in embedded system design are covered by the chapters in this volume, including modelling, simulation, verification, test, scheduling, platforms and processors. Particular emphasis is paid to automotive systems and wireless sensor networks. Sets of actual case studies in the area of embedded system design are also included. Over recent years, embedded systems have gained an enormous amount of processing power and functionality and now enter numerous application areas, due to the fact that many of the formerly external components can now be integrated into a single System-on-Chip. This tendency has resulted in a dramatic reduction in the size and cost of embedded systems. As a unique technology, the design of embedded systems is an essential element of many innovations. Embedded systems meet their performance goals, including real-time constraints, through a combination of special-purpose hardware and software components tailored to the system requirements. Both the development of new features and the reuse of existing intellectual property components are essential to keeping up with ever more demanding customer requirements. Furthermore, design complexities are steadily growing with an increasing number of components that have to cooperate properly. Embedded system designers have to cope with multiple goals and constraints simultaneously, including timing, power, reliability, dependability, maintenance, packaging and, last but not least, price.
This book is dedicated to Prof. Dr. Heinz Gerhäuser on the occasion of his retirement both from the position of Executive Director of the Fraunhofer Institute for Integrated Circuits IIS and from the Endowed Chair of Information Technologies with a Focus on Communication Electronics (LIKE) at the Friedrich-Alexander-Universität Erlangen-Nürnberg. Heinz Gerhäuser's vision and entrepreneurial spirit have made the Fraunhofer IIS one of the most successful and renowned German research institutions. He has been Director of the Fraunhofer IIS since 1993, and under his leadership it has grown to become the largest of Germany's 60 Fraunhofer Institutes, a position it retains to this day, currently employing over 730 staff. Likely his most important scientific as well as application-related contribution was his pivotal role in the development of the MP3 format, which would later become a worldwide success. The contributions to this Festschrift were written by both Fraunhofer IIS staff and external project team members in appreciation of Prof. Dr. Gerhäuser's lifetime academic achievements and his inspiring leadership at the Fraunhofer IIS. The papers reflect the broad spectrum of the institute's research activities and are grouped into sections on circuits, information systems, visual computing, and audio and multimedia. They provide academic and industrial researchers in fields like signal processing, sensor networks, microelectronics, and integrated circuits with an up-to-date overview of research results that have a huge potential for cutting-edge industrial applications.
I3E 2009 was held in Nancy, France, during September 23-25, hosted by Nancy University and INRIA Grand-Est at LORIA. The conference provided scientists and practitioners of academia, industry and government with a forum where they presented their latest findings concerning the application of e-business, e-services and e-society, and the underlying technology to support these applications. The 9th IFIP Conference on e-Business, e-Services and e-Society, sponsored by IFIP WG 6.1 of Technical Committee TC6 in cooperation with TC11 and TC8, represents the continuation of previous events held in Zurich (Switzerland) in 2001, Lisbon (Portugal) in 2002, Sao Paulo (Brazil) in 2003, Toulouse (France) in 2004, Poznan (Poland) in 2005, Turku (Finland) in 2006, Wuhan (China) in 2007 and Tokyo (Japan) in 2008. The call for papers attracted papers from 31 countries from the five continents. As a result, the I3E 2009 program offered 12 sessions of full-paper presentations. The 31 selected papers cover a wide and important variety of issues in e-business, e-services and e-society, including security, trust, and privacy, ethical and societal issues, business organization, provision of services as software and software as services, and others. Extended versions of selected papers submitted to I3E 2009 will be published in the International Journal of e-Adoption and in AIS Transactions on Enterprise Systems. In addition, a 500-euro prize was awarded to the authors of the best paper selected by the Program Committee. We thank all authors who submitted their papers, the Program Committee members and external reviewers for their excellent work.
Intelligent Systems can be defined as systems whose design, mainly based on computational techniques, is supported, in some parts, by operations and processing skills inspired by human reasoning and behaviour. Intelligent Systems must typically operate in a scenario in which non-linearities are the rule rather than a disturbing effect to be corrected. Finally, Intelligent Systems also have to incorporate advanced sensory technology in order to simplify man-machine interactions. Several algorithms are currently the ordinary tools of Intelligent Systems. This book contains a selection of contributions regarding Intelligent Systems by experts in diverse fields. Topics discussed in the book are: Applications of Intelligent Systems in Modelling and Prediction of Environmental Changes, Cellular Neural Networks for Nonlinear Filtering, NNs for Signal Processing, Image Processing, Transportation Intelligent Systems, Intelligent Techniques in Power Electronics, Applications in Medicine and Surgery, Hardware Implementation and Learning of NNs.
In this book, the authors first address the research issues by providing a motivating scenario, followed by the exploration of the principles and techniques of the challenging topics. Then they solve the raised research issues by developing a series of methodologies. More specifically, the authors study query optimization and tackle query performance prediction for knowledge retrieval. They also handle unstructured data processing and data clustering for knowledge extraction. To optimize the queries issued through interfaces against knowledge bases, the authors propose a cache-based optimization layer between consumers and the querying interface to facilitate the querying and solve the latency issue. The cache depends on a novel learning method that considers the querying patterns from individuals' historical queries without having knowledge of the backing systems of the knowledge base. To predict the query performance for appropriate query scheduling, the authors examine the queries' structural and syntactical features and apply multiple widely adopted prediction models. Their feature modelling approach eschews the knowledge requirement on both the querying languages and the system. To extract knowledge from unstructured Web sources, the authors examine two kinds of Web sources containing unstructured data: the source code from Web repositories and the posts in programming question-answering communities. They use natural language processing techniques to pre-process the source code and obtain the natural language elements. Then they apply traditional knowledge extraction techniques to extract knowledge. For the data from programming question-answering communities, the authors make an attempt towards building a programming knowledge base by starting with paraphrase identification problems and develop novel features to accurately identify duplicate posts.
For domain-specific knowledge extraction, the authors propose to use a clustering technique to separate knowledge into different groups. They focus on developing a new clustering algorithm that uses manifold constraints in the optimization task and achieves fast and accurate performance. For each model and approach presented in this dissertation, the authors have conducted extensive experiments to evaluate it using either public datasets or synthetic data they generated.
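The cache-based optimization layer described above is not specified in detail in this blurb; as a rough illustration of the general idea of caching query results between consumers and a querying interface, here is a minimal LRU (least-recently-used) cache sketch. The queries and results are hypothetical, and the book's actual approach uses a learned model of querying patterns rather than plain recency:

```python
from collections import OrderedDict

class QueryCache:
    """Minimal LRU cache for query results (illustrative only)."""
    def __init__(self, capacity=2):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, query):
        if query in self.store:
            self.store.move_to_end(query)   # mark as recently used
            return self.store[query]
        return None                          # cache miss

    def put(self, query, result):
        self.store[query] = result
        self.store.move_to_end(query)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used

cache = QueryCache(capacity=2)
cache.put("SELECT ?s WHERE { ?s a :Person }", ["alice", "bob"])
cache.put("SELECT ?s WHERE { ?s a :City }", ["nancy"])
cache.get("SELECT ?s WHERE { ?s a :Person }")     # hit: refreshes recency
cache.put("SELECT ?s WHERE { ?s a :Film }", ["amelie"])  # evicts the City query
```

A learned policy, as the book proposes, would replace the recency heuristic with predictions derived from users' historical query patterns.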
This book contains a selection of papers from the 16th International Symposium on Spatial Data Handling (SDH), the premier long-running forum in geographical information science. This collection offers readers exemplary contributions to geospatial scholarship and practice from the conference's 30th anniversary.
This book focuses on the implementation of Quality Function Deployment (QFD) in the construction industry as a tool to help building designers arrive at optimal decisions for external envelope systems with sustainable and buildable design goals. In particular, the book integrates special features into the conventional QFD tool to enhance its performance. These features include a fuzzy multi-criteria decision-making method, fuzzy consensus scheme, and Knowledge Management System (KMS). This integration results in a more robust decision support tool, known as the Knowledge-based Decision Support System QFD (KBDSS-QFD) tool. As an example, the KBDSS-QFD tool is used for the assessment of building envelope materials and designs for high-rise residential buildings in Singapore in the early design stage. The book provides the reader with a conceptual framework for understanding the development of the KBDSS-QFD tool. The framework is presented in a generalized form in order to benefit building professionals, decision makers, analysts, academics and researchers, who can use the findings as guiding principles to achieve optimal solutions and boost efficiency.
The tremendous growth in the availability of inexpensive computing power and the easy availability of computers have generated tremendous interest in the design and implementation of Complex Systems. Computer-based solutions offer great support in the design of Complex Systems. Furthermore, Complex Systems are becoming increasingly complex themselves. This research book comprises a selection of state-of-the-art contributions to topics dealing with Complex Systems in a Knowledge-based Environment. Complex systems are ubiquitous. Examples comprise, but are not limited to, System of Systems, Service-oriented Approaches, Agent-based Systems, and Complex Distributed Virtual Systems. These are application domains that require knowledge of engineering and management methods and are beyond the scope of traditional systems. The chapters in this book deal with a selection of topics which range from uncertainty representation and management to the use of ontological means which support large-scale business integration. All contributions were invited and are based on the recognition of the expertise of the contributing authors in the field. By collecting these sources together in one volume, the intention was to present a variety of tools to the reader to assist in both study and work. The second intention was to show how the different facets presented in the chapters are complementary and contribute towards this emerging discipline designed to aid in the analysis of complex systems.
With the proliferation of VHDL, the reference material has grown in the same order. Today there is a good amount of scholarly literature, including many books describing various aspects of VHDL. However, an in-depth review of these books reveals a different story. Many of them have emerged simply as improved versions of the manual. While some of them deal with system design issues, they lack appropriate examples to illustrate the concepts. Others give a large number of examples, but lack the VLSI system design issues. In a nutshell, the fact that has gone unnoticed by most of the books is that the growth of VLSI is due not merely to the language itself, but more to the development of a large number of third-party tools useful from the FPGA or semicustom ASIC realization point of view. In this book, the authors have synergized VHDL programming with appropriate EDA tools so as to present a complete system design flow to the readers. Along with the VHDL coding issues, the simulation and synthesis with the various toolsets enable the potential reader to visualize the final design. The VHDL design codes have been synthesized using different third-party tools such as Xilinx WebPack Ver. 11, ModelSim PE, Leonardo Spectrum and Synplify Pro. The mixed flow illustrated by using the above-mentioned tools presents an insight into optimizing the design with reference to spatial, temporal and power metrics.
Computer architecture presently faces an unprecedented revolution: the step from monolithic processors towards multi-core ICs, motivated by the ever-increasing need for power and energy efficiency in nanoelectronics. Whether you prefer to call it MPSoC (multi-processor system-on-chip) or CMP (chip multiprocessor), no doubt this revolution affects large domains of both computer science and electronics, and it poses many new interdisciplinary challenges. For instance, efficient programming models and tools for MPSoC are largely an open issue: "Multi-core platforms are a reality - but where is the software support?" (R. Lauwereins, IMEC). Solving it will require enormous research efforts as well as the education of a whole new breed of software engineers that bring the results from universities into industrial practice. At the same time, the design of complex MPSoC architectures is an extremely time-consuming task, particularly in the wireless and multimedia application domains, where heterogeneous architectures are predominant. Due to the exploding NRE and mask costs most companies are now following a platform approach: invest a large (but one-time) design effort into a proper core architecture, and create easy-to-design derivatives for new standards or product features. Needless to say, only the most efficient MPSoC platforms have a real chance to enjoy a multi-year lifetime on the highly competitive semiconductor market for embedded systems.
The main aim of this volume has been to gather together a selection of recent papers providing new ideas and solutions for a wide spectrum of Knowledge-Driven Computing approaches. More precisely, the ultimate goal has been to collect new knowledge representation, processing and computing paradigms which could be useful to practitioners involved in the area of discussion. To this end, contributions covering both theoretical aspects and practical solutions were preferred.
This is the first monograph on the emerging area of linguistic linked data. Presenting a combination of background information on linguistic linked data and concrete implementation advice, it introduces and discusses the main benefits of applying linked data (LD) principles to the representation and publication of linguistic resources, arguing that LD does not look at a single resource in isolation but seeks to create a large network of resources that can be used together and uniformly, and so making more of the single resource. The book describes how the LD principles can be applied to modelling language resources. The first part provides the foundation for understanding the remainder of the book, introducing the data models, ontology and query languages used as the basis of the Semantic Web and LD and offering a more detailed overview of the Linguistic Linked Data Cloud. The second part of the book focuses on modelling language resources using LD principles, describing how to model lexical resources using Ontolex-lemon, the lexicon model for ontologies, and how to annotate and address elements of text represented in RDF. It also demonstrates how to model annotations, and how to capture the metadata of language resources. Further, it includes a chapter on representing linguistic categories. In the third part of the book, the authors describe how language resources can be transformed into LD and how links can be inferred and added to the data to increase connectivity and linking between different datasets. They also discuss using LD resources for natural language processing. The last part describes concrete applications of the technologies: representing and linking multilingual wordnets, applications in digital humanities and the discovery of language resources. 
Given its scope, the book is relevant for researchers and graduate students interested in topics at the crossroads of natural language processing / computational linguistics and the Semantic Web / linked data. It appeals to Semantic Web experts who are not proficient in applying the Semantic Web and LD principles to linguistic data, as well as to computational linguists who are used to working with lexical and linguistic resources and want to learn about a new paradigm for modelling, publishing and exploiting linguistic resources.
Cyberspace security is a critical subject of our times. On the one hand, the development of the Internet, mobile communications, distributed computing, and computer software and databases storing essential enterprise information has helped people to conduct business and communicate with one another. On the other hand, it has created many opportunities for abuse, fraud and expensive damage. This book is a selection of the best papers presented at the NATO Advanced Research Workshop dealing with the subject of cyberspace security and defense. The level of the individual contributions in the volume is advanced and suitable for senior and graduate students, researchers and technologists who wish to get a feel for the state of the art in several sub-disciplines of cyberspace security. Several papers provide a broad-brush description of national security issues and brief summaries of technology states. These papers can be read and appreciated by technically enlightened managers and executives who want to understand security issues and approaches to technical solutions. An important question of our times is not "Should we do something to enhance the security of our digital assets?" but rather "How do we do it?"
This book contains innovative research from leading researchers who presented their work at the 17th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2013, held in Kitakyushu, Japan, in September 2013. The conference drew a competitive field of 236 contributors, from which 38 authors expanded their contributions and only 21 were published. A plethora of techniques and innovative applications are represented within this volume. The chapters are organized using four themes: data mining, knowledge management, advanced information processes and system modelling applications. Each topic contains multiple contributions, and many offer case studies or innovative examples. Anyone who wants to work with information repositories or process knowledge should consider reading one or more chapters focused on their technique of choice. They may also benefit from reading other chapters to assess whether an alternative technique represents a more suitable approach. This book will benefit anyone already working with Knowledge-Based or Intelligent Information Systems, but it is also suitable for students and researchers seeking to learn more about modern Artificial Intelligence techniques.
The book presents a conceptually novel oscillations based paradigm, the Oscillation-Based Multi-Agent System (OSIMAS), aimed at the modelling of agents and their systems as coherent, stylized, neurodynamic processes. This paradigm links emerging research domains via coherent neurodynamic oscillation based representations of the individual human mind and society (as a coherent collective mind) states. Thus, this multidisciplinary paradigm delivers an empirical and simulation research framework that provides a new way of modelling the complex dynamics of individual and collective mind states. This book addresses a conceptual problem - the lack of a multidisciplinary, connecting paradigm, which could link fragmented research in the fields of neuroscience, artificial intelligence (AI), multi-agent system (MAS) and the social network domains. The need for a common multidisciplinary research framework essentially arises because these fields share a common object of investigation and simulation, i.e., individual and collective human behavior. Although the fields of research mentioned above all approach this from different perspectives, their common object of investigation unites them. By putting the various pathways of research as they are interrelated into perspective, this book provides a philosophical underpinning, experimental background and modelling tools that the author anticipates will reveal new frontiers in multidisciplinary research. Fundamental investigation of the implicit oscillatory nature of agents' mind states and social mediums in general can reveal some new ways of understanding the periodic and nonperiodic fluctuations taking place in real life. For example, via agent states-related diffusion properties, we could investigate complex economic phenomena like the spread of stock market crashes, currency crises, speculative oscillations (bubbles and crashes), social unrest, recessionary effects, sovereign defaults, etc. 
All these effects are closely associated with social fragility, which follows and is affected by production, political, business and financial cycles. Thus, the multidisciplinary OSIMAS paradigm can yield new knowledge and research perspectives, allowing for a better understanding of social agents and their social organization principles.
This book shows you - through examples and puzzles and intriguing questions - how to make your computer reason logically. To help you, the book includes a CD-ROM with OTTER, the world's most powerful general-purpose reasoning program. The automation of reasoning has advanced markedly in the past few decades, and this book discusses some of the remarkable successes that automated reasoning programs have had in tackling challenging problems in mathematics, logic, program verification, and circuit design. Because the intended audience includes students and teachers, the book provides many exercises (with hints and also answers), as well as tutorial chapters that gently introduce readers to the field of logic and to automated reasoning in general. For more advanced researchers, the book presents challenging questions, many of which are still unsolved.
Solving modern biological problems requires advanced computational methods. Bioinformatics evolved from the active interaction of two fast-developing disciplines, biology and information technology. The central issue of this emerging field is the transformation of often distributed and unstructured biological data into meaningful information. This book describes the application of well-established concepts and techniques from areas like data mining, machine learning, database technologies, and visualization techniques to problems like protein data analysis, genome analysis and sequence databases. Chen has collected contributions from leading researchers in each area. The chapters can be read independently, as each offers a complete overview of its specific area, or, combined, this monograph is a comprehensive treatment that will appeal to students, researchers, and R&D professionals in industry who need a state-of-the-art introduction into this challenging and exciting young field.
In the course of fuzzy technological development, fuzzy graph theory was identified quite early on for its importance in making things work. Two very important and useful concepts are those of granularity and of nonlinear approximations. The concept of granularity has evolved as a cornerstone of Lotfi A. Zadeh's theory of perception, while the concept of nonlinear approximation is the driving force behind the success of consumer electronics manufacturing. It is fair to say fuzzy graph theory paved the way for engineers to build many rule-based expert systems. In the open literature, there are many papers written on the subject of fuzzy graph theory. However, there are relatively few books available on the very same topic. Professors Mordeson and Nair have made a real contribution in putting together a very comprehensive book on fuzzy graphs and fuzzy hypergraphs. In particular, the discussion on hypergraphs certainly is an innovative idea. For an experienced engineer who has spent a great deal of time in the laboratory, it is usually a good idea to revisit the theory. Professors Mordeson and Nair have created such a volume, which enables engineers and designers to benefit from having the references in one place. In addition, this volume is a testament to the numerous contributions Professor John N. Mordeson and his associates have made to mathematical studies in so many different topics of fuzzy mathematics.
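For readers unfamiliar with fuzzy graphs, a standard notion from the field (not taken from this particular book) is that each edge carries a membership degree in [0, 1]; the strength of a path is the minimum membership along its edges, and the strength of connectedness between two vertices is the maximum strength over all connecting paths. A minimal sketch with made-up membership values:

```python
# Edge membership degrees of a small fuzzy graph (hypothetical values).
edges = {
    ("a", "b"): 0.8,
    ("b", "c"): 0.5,
    ("a", "c"): 0.3,
}

def path_strength(path, edges):
    """Strength of a path = minimum membership of its edges."""
    return min(edges[(u, v)] for u, v in zip(path, path[1:]))

# Strength of connectedness between a and c: best over the available paths.
direct  = path_strength(["a", "c"], edges)        # 0.3
via_b   = path_strength(["a", "b", "c"], edges)   # min(0.8, 0.5) = 0.5
conn_ac = max(direct, via_b)                      # 0.5
```

Note that the indirect path through b is stronger than the direct edge, a characteristic effect of the min-max composition used throughout fuzzy graph theory.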
This book demonstrates how to apply modern approaches to complex system control in practical applications involving knowledge-based systems. The dimensions of knowledge-based systems are extended by incorporating new perspectives from control theory, multimodal systems and simulation methods. The book is divided into three parts: theory, production system and information system applications. One of its main focuses is on an agent-based approach to complex system analysis. Moreover, specialised forms of knowledge-based systems (like e-learning, social network, and production systems) are introduced with a new formal approach to knowledge system modelling. The book, which offers a valuable resource for researchers engaged in complex system analysis, is the result of a unique cooperation between scientists from applied computer science (mainly from Poland) and leading system control theory researchers from the Russian Academy of Sciences' Trapeznikov Institute of Control Sciences.
Knowledge-Based Software Engineering brings together in one place important contributions and up-to-date research results in this important area. Knowledge-Based Software Engineering serves as an excellent reference, providing insight into some of the most important research issues in the field.
This book is about Granular Computing (GC) - an emerging conceptual and computing paradigm of information processing. As the name suggests, GC concerns the processing of complex information entities - information granules. In essence, information granules arise in the process of abstraction of data and derivation of knowledge from information. Information granules are everywhere. We commonly use granules of time (seconds, months, years). We granulate images; millions of pixels manipulated individually by computers appear to us as granules representing physical objects. In natural language, we operate on the basis of word-granules that become crucial entities used to realize interaction and communication between humans. Intuitively, we sense that information granules are at the heart of all our perceptual activities. In the past, several formal frameworks and tools, geared for processing specific information granules, have been proposed. Interval analysis, rough sets and fuzzy sets have all played an important role in knowledge representation and processing. Subsequently, information granulation and information granules arose in numerous application domains. Well-known ideas of rule-based systems dwell inherently on information granules. Qualitative modeling, being one of the leading threads of AI, operates on a level of information granules. Multi-tier architectures and hierarchical systems (such as those encountered in control engineering), planning and scheduling systems all exploit information granularity. We also utilize information granules when it comes to functionality granulation, reusability of information and efficient ways of developing underlying information infrastructures.
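Rough sets, one of the granular frameworks the passage names, approximate a target concept by unions of granules (equivalence classes of indistinguishable objects). A minimal sketch with hypothetical data, showing the standard lower/upper approximation construction rather than anything specific to this book:

```python
# Granules: equivalence classes of objects indistinguishable by their attributes.
granules = [{1, 2}, {3, 4}, {5}]
target = {1, 2, 3}   # the concept we want to approximate

lower = set()        # certain members: granules entirely inside the concept
upper = set()        # possible members: granules overlapping the concept
for g in granules:
    if g <= target:
        lower |= g
    if g & target:
        upper |= g

boundary = upper - lower   # the region of uncertainty
```

Here objects 1 and 2 certainly belong to the concept, while 3 and 4 fall in the boundary because their granule straddles the concept's edge; the concept is "rough" precisely when this boundary is non-empty.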
This is the first book to provide a step-by-step guide to the methods and practical aspects of acquiring, modelling, storing and sharing knowledge. The reader is led through 47 steps from the inception of a project to its conclusion. Each is described in terms of reasons, required resources, activities, and solutions to common problems. In addition, each step has a checklist which tracks the key items that should be achieved.
Brain-computer interfaces (BCIs) are devices that enable people to communicate via thought alone. Brain signals can be directly translated into messages or commands. Until recently, these devices were used primarily to help people who could not move. However, BCIs are now becoming practical tools for a wide variety of people, in many different situations. What will BCIs in the future be like? Who will use them, and why? This book, written by many of the top BCI researchers and developers, reviews the latest progress in the different components of BCIs. Chapters also discuss practical issues in an emerging BCI enabled community. The book is intended both for professionals and for interested laypeople who are not experts in BCI research.
Since the publication of the first edition of this book, advances in algorithms, logic and software tools have transformed the field of data fusion. The latest edition covers these areas as well as smart agents, human computer interaction, cognitive aides to analysis and data system fusion control. Besides aiding you in selecting the appropriate algorithm for implementing a data fusion system, this book guides you through the process of determining the trade-offs among competing data fusion algorithms, selecting commercial off-the-shelf (COTS) tools, and understanding when data fusion improves systems processing. Completely new chapters in this second edition explain data fusion system control, DARPA's recently developed TRIP model, and the latest applications of data fusion in data warehousing and medical equipment, as well as defence systems.
Reasoning with Complex Cases emphasizes case retrieval methods based on structured cases as they are relevant for planning, configuration, and design, and provides a systematic view of the case reuse phase, centering on complex situations. So far, books on case-based reasoning considered comparatively simple situations only. This book is a coherent work, not a selection of separate contributions, and consists largely of original research results using examples taken from industrial design, biology, medicine, jurisprudence and other areas. Reasoning with Complex Cases is suitable as a secondary text for graduate-level courses on case-based reasoning and as a reference for practitioners applying conventional CBR systems or techniques.