Welcome to Loot.co.za!
Intelligent systems are required to facilitate the use of information provided by the internet and other computer-based technologies. This book describes the state of the art in intelligent automation and systems engineering. Topics covered include intelligent decision making, automation, robotics, expert systems, fuzzy systems, knowledge-based systems, knowledge extraction, large database management, data analysis tools, computational biology, optimization algorithms, experimental designs, complex system identification, computational modeling, systems simulation, decision modeling, and industrial applications.
In reinforcement learning (RL) problems, learning agents sequentially execute actions with the goal of maximizing a reward signal. The RL framework has gained popularity with the development of algorithms capable of mastering increasingly complex problems, but learning difficult tasks is often slow or infeasible when RL agents begin with no prior knowledge. The key insight behind "transfer learning" is that generalization may occur not only within tasks, but also across tasks. While transfer has been studied in the psychological literature for many years, the RL community has only recently begun to investigate the benefits of transferring knowledge. This book provides an introduction to the RL transfer problem and discusses methods which demonstrate the promise of this exciting area of research. The key contributions of this book are:
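The central idea of the blurb, reusing knowledge across tasks rather than learning each one from scratch, can be illustrated with a toy sketch. This is a generic illustration of value-table transfer, not any specific algorithm from the book; the environment and all names here are hypothetical:

```python
import random

random.seed(0)

def q_learning(n_states, n_actions, episodes, step, q_init=None,
               alpha=0.5, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning; q_init lets values learned on a source
    task seed the learner instead of an all-zero table (transfer)."""
    q = ([row[:] for row in q_init] if q_init
         else [[0.0] * n_actions for _ in range(n_states)])
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # epsilon-greedy action selection
            if random.random() < epsilon:
                a = random.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda x: q[s][x])
            s2, r, done = step(s, a)
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

# Toy 3-state chain: action 1 moves right, action 0 moves left;
# reaching state 2 yields reward 1 and ends the episode.
def chain_step(s, a):
    s2 = min(s + 1, 2) if a == 1 else max(s - 1, 0)
    return s2, (1.0 if s2 == 2 else 0.0), s2 == 2

q_source = q_learning(3, 2, 200, chain_step)
# Transfer: reuse q_source as the starting point for further training,
# so the "target" learner begins with informative values, not zeros.
q_target = q_learning(3, 2, 10, chain_step, q_init=q_source)
```

In a real transfer setting the source and target tasks differ, and the transferred values (or policies, or features) must be mapped between them; the sketch only shows the mechanical reuse of prior knowledge.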
By way of the research presented in this book, the author has established himself as the pre-eminent worldwide expert on transfer learning in sequential decision making tasks. A particular strength of the research is its very thorough and methodical empirical evaluation, which Matthew presents, motivates, and analyzes clearly in prose throughout the book. Whether this is your initial introduction to the concept of transfer learning, or whether you are a practitioner in the field looking for nuanced details, I trust that you will find this book to be an enjoyable and enlightening read. Peter Stone, Associate Professor of Computer Science
This book is a delight for academics, researchers and professionals working in evolutionary and swarm computing, computational intelligence, machine learning and engineering design, as well as search and optimization in general. It provides an introduction to the design and development of a number of popular and recent swarm and evolutionary algorithms with a focus on their applications in engineering problems in diverse domains. The topics discussed include particle swarm optimization, the artificial bee colony algorithm, Spider Monkey optimization algorithm, genetic algorithms, constrained multi-objective evolutionary algorithms, genetic programming, and evolutionary fuzzy systems. A friendly and informative treatment of the topics makes this book an ideal reference for beginners and those with experience alike.
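The first topic listed, particle swarm optimization, admits a compact sketch. This is the canonical textbook global-best PSO, not code from the book; parameter values are common defaults:

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0)):
    """Global-best PSO: each particle is pulled toward its own best
    position (cognitive term) and the swarm's best (social term)."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(42)
best, best_val = pso(lambda x: sum(c * c for c in x), dim=2)  # sphere function
```

The inertia weight w balances exploration against convergence; the engineering applications discussed in the book layer constraint handling and problem-specific encodings on top of this skeleton.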
This superb book focuses on various techniques of computational intelligence, both individual ones and those combined into hybrid methods. These techniques are today commonly applied to problems in artificial intelligence. The book presents methods of knowledge representation using different techniques, namely rough sets, type-1 fuzzy sets and type-2 fuzzy sets. Next, various neural network architectures are presented and their learning algorithms are derived. Then the family of evolutionary algorithms is discussed, including connections between these techniques and neural networks and fuzzy systems. Finally, various methods of data partitioning and algorithms for automatic data clustering are given, and new neuro-fuzzy architectures are studied and compared.
This book describes models of the neuron and multilayer neural structures, with a particular focus on mathematical models. It also discusses electronic circuits used as models of the neuron and the synapse, and analyses the relations between the circuits and mathematical models in detail. The first part describes the biological foundations and provides a comprehensive overview of artificial neural networks. The second part then presents mathematical foundations, reviewing elementary topics as well as lesser-known problems such as topological conjugacy of dynamical systems and the shadowing property. The final two parts describe the models of the neuron and the mathematical analysis of the properties of artificial multilayer neural networks. Combining biological, mathematical and electronic approaches, this multidisciplinary book is useful for mathematicians interested in artificial neural networks and models of the neuron, for computer scientists interested in the formal foundations of artificial neural networks, and for biologists interested in mathematical and electronic models of neural structures and processes.
Incorporating intelligence in industrial systems can help to increase productivity, cut production costs, and improve working conditions and safety in industrial environments. This need has resulted in the rapid development of modeling and control methods for industrial systems and robots, of fault detection and isolation methods for the prevention of critical situations in industrial work-cells and production plants, of optimization methods aiming at a more profitable functioning of industrial installations and robotic devices, and of machine intelligence methods aiming at reducing human intervention in the operation of industrial systems. To this end, the book analyzes and extends some main directions of research in modeling and control for industrial systems: (i) industrial robots, (ii) mobile robots and autonomous vehicles, (iii) adaptive and robust control of electromechanical systems, (iv) filtering and stochastic estimation for multisensor fusion and sensorless control of industrial systems, (v) fault detection and isolation in robotic and industrial systems, (vi) optimization in industrial automation and robotic systems design, and (vii) machine intelligence for robot autonomy. The book will be a useful companion to engineers and researchers, since it covers a wide spectrum of problems in the area of industrial systems. Moreover, the book is addressed to undergraduate and postgraduate students as a supplement to upper-level courses in automatic control and robotics.
The field of Soft Computing in Humanities and Social Sciences is at a turning point. The strong distinction between "science" and "humanities" has been criticized from many fronts and, at the same time, increasing cooperation between the so-called "hard sciences" and "soft sciences" is taking place in a wide range of scientific projects dealing with very complex and interdisciplinary topics. In the last fifteen years the area of Soft Computing has also experienced a gradual rapprochement with disciplines in the Humanities and Social Sciences, and also with the fields of Medicine, Biology and even the Arts, a phenomenon that did not occur much in previous years. This book presents a generous sampling of the new and burgeoning field of Soft Computing in Humanities and Social Sciences, bringing together a wide array of authors and subject matters from different disciplines. Some of the contributors belong to the scientific and technical areas of Soft Computing, while others come from various fields in the humanities and social sciences such as Philosophy, History, Sociology or Economics. Rudolf Seising received a Ph.D. degree in philosophy of science and a postdoctoral lecture qualification (PD) in history of science from the Ludwig Maximilians University of Munich. He is an Adjoint Researcher at the European Centre for Soft Computing in Mieres (Asturias), Spain. Veronica Sanz earned a Ph.D. in Philosophy at the University Complutense of Madrid (Spain). At the moment she is a Postdoctoral Researcher at the Science, Technology and Society Center at the University of California at Berkeley.
This volume draws mostly on papers presented at the TRENTO 2009 international workshop on Preferences and Decisions, jointly organized by the University of Trento and the University of Sannio at Benevento (Italy). Since its first edition in 1997, the renowned international workshop series TRENTO has aimed at providing an informal but effective opportunity for sharing and discussing recent research developments in the field of preference modeling and decision theory, bringing together some of the world's leading experts in this active interdisciplinary area of research. In particular, the scope of TRENTO 2009 covered a wide range of topics, such as preference representation and rationality, machine intelligence and automation in decision making, uncertainty modeling, probabilistic and possibilistic decision models, cooperative game theory and coalition formation, aggregation functions and multicriteria decision making, fuzzy set theory and fuzzy logic for decision making, algebraic structures, quantum dynamics, complex network models and negotiation, interactive dynamics and consensus reaching in multiagent decisions, and optimization and operational research for decision making. The contributions come from authors who are among the most recognized scientists in their respective research domains. This volume also provides an opportunity for colleagues and friends of Mario Fedrizzi, Benedetto Matarazzo, and Aldo Ventre to celebrate and thank them for their continuing and stimulating scientific work.
During the past decades scheduling has been among the most studied optimization problems, and it is still an active area of research. Scheduling appears in many areas of science, engineering and industry and takes different forms depending on the restrictions and optimization criteria of the operating environments [8]. For instance, in optimization and computer science, scheduling has been defined as "the allocation of tasks to resources over time in order to achieve optimality in one or more objective criteria in an efficient way", and in production as "production schedule, i.e., the planning of the production or the sequence of operations according to which jobs pass through machines and is optimal with respect to certain optimization criteria." Although there is a standardized form of stating any scheduling problem, namely "efficient allocation of n jobs on m machines (which can process no more than one activity at a time) with the objective to optimize some objective function of the job completion times", scheduling is in fact a family of problems. Indeed, several parameters intervene in the problem definition: (a) job characteristics (preemptive or not, precedence constraints, release dates, etc.); (b) resource environment (single vs. parallel machines, unrelated machines, identical or uniform machines, etc.); (c) optimization criteria (minimize total tardiness, the number of late jobs, makespan, flowtime, etc.; maximize resource utilization, etc.); and (d) scheduling environment (static vs. dynamic; in the former the number of jobs to be considered and their ready times are available, while in the latter the number of jobs and their characteristics change over time).
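The standard "n jobs on m machines" statement can be made concrete with the classic LPT (Longest Processing Time first) heuristic for makespan minimization on identical parallel machines. This is a standard textbook illustration of one point in the problem family, not code from the book:

```python
import heapq

def lpt_makespan(jobs, m):
    """LPT heuristic: sort jobs by decreasing processing time and
    assign each to the currently least-loaded of m identical machines.
    Returns the makespan (maximum machine completion time) and plan."""
    loads = [(0, i) for i in range(m)]        # (current load, machine id)
    heapq.heapify(loads)
    assignment = {i: [] for i in range(m)}
    for p in sorted(jobs, reverse=True):
        load, i = heapq.heappop(loads)        # least-loaded machine
        assignment[i].append(p)
        heapq.heappush(loads, (load + p, i))
    return max(load for load, _ in loads), assignment

makespan, plan = lpt_makespan([4, 3, 3, 2], m=2)   # makespan 6: {4,2} and {3,3}
```

Changing any of the parameters listed in the blurb (preemption, precedence constraints, unrelated machines, a tardiness objective, dynamic job arrivals) yields a different member of the scheduling family, usually requiring a different algorithm.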
Humans and machines are very different in their approaches to game playing. Humans use intuition, perception mechanisms, selective search, creativity, abstraction, heuristic abilities and other cognitive skills to compensate for their (comparably) slow information processing speed, relatively low memory capacity, and limited search abilities. Machines, on the other hand, are extremely fast and infallible in calculations, capable of effective brute-force-type search, and use "unlimited" memory resources, but at the same time are poor at using reasoning-based approaches and abstraction-based methods. The above major discrepancies in human and machine problem solving methods underlined the development of traditional machine game playing as being focused mainly on engineering advances rather than cognitive or psychological developments. In other words, as described by Winkler and Fürnkranz [347, 348] with respect to chess, the human and machine axes of game playing development are perpendicular, but the most interesting, most promising, and probably also most difficult research area lies at the junction between human-compatible knowledge and machine-compatible processing. I undoubtedly share this point of view and strongly believe that the future of machine game playing lies in the implementation of human-type abilities (abstraction, intuition, creativity, selective attention, and others) while still taking advantage of intrinsic machine skills. The book is focused on the developments and prospective challenging problems in the area of mind game playing (i.e., playing games that require mental skills) using Computational Intelligence (CI) methods, mainly neural networks, genetic/evolutionary programming and reinforcement learning.
This book covers the underlying science and application issues related to aggregation operators, focusing on tools used in practical applications that involve numerical information. It will thus be required reading for engineers, statisticians and computer scientists of all kinds. Starting with detailed introductions to information fusion and integration, measurement and probability theory, fuzzy sets, and functional equations, the authors then cover numerous topics in detail, including the synthesis of judgements, fuzzy measures, weighted means and fuzzy integrals.
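Two of the operators named here, the weighted mean and the OWA (ordered weighted averaging) operator, can be sketched compactly. The OWA operator is a weighted mean applied to the sorted inputs, so its weights attach to rank positions rather than to particular inputs. These are the generic textbook definitions, not the book's notation:

```python
def weighted_mean(xs, ws):
    """Weighted arithmetic mean: weights attach to particular inputs."""
    assert abs(sum(ws) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * x for w, x in zip(ws, xs))

def owa(xs, ws):
    """Ordered Weighted Averaging: weights attach to rank positions
    (largest input first), so the operator is symmetric in its inputs."""
    assert abs(sum(ws) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * x for w, x in zip(ws, sorted(xs, reverse=True)))

scores = [3.0, 9.0, 6.0]
wm = weighted_mean(scores, [0.5, 0.3, 0.2])   # 0.5*3 + 0.3*9 + 0.2*6 = 5.4
agg = owa(scores, [0.5, 0.3, 0.2])            # 0.5*9 + 0.3*6 + 0.2*3 = 6.9
```

With weights (1, 0, ..., 0) the OWA operator reduces to the maximum, with (0, ..., 0, 1) to the minimum, and with uniform weights to the plain average, which is why it is a convenient family for synthesizing judgements.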
Biological and natural processes have been a continuous source of inspiration for the sciences and engineering. For instance, the work of Wiener in cybernetics was influenced by feedback control processes observable in biological systems; McCulloch and Pitts' description of the artificial neuron was instigated by biological observations of neural mechanisms; and the idea of survival of the fittest inspired the field of evolutionary algorithms; similarly, artificial immune systems, ant colony optimisation, automated self-assembling programming, membrane computing, etc. also have their roots in natural phenomena. The second International Workshop on Nature Inspired Cooperative Strategies for Optimization (NICSO) was held in Acireale, Italy, during November 8-10, 2007. The aim of NICSO 2007 was to provide a forum where the latest ideas and state-of-the-art research related to cooperative strategies for problem solving arising from Nature could be discussed. The contributions collected in this book were strictly peer reviewed by at least three members of the international programme committee, to whom we are indebted for their support and assistance. The topics covered by the contributions include several well established nature inspired techniques like Genetic Algorithms, Ant Colonies, Artificial Immune Systems, Evolutionary Robotics, Evolvable Systems, Membrane Computing, Quantum Computing, Software Self Assembly, Swarm Intelligence, etc.
In this book, a generic model, in closed mathematical form as far as possible, is developed that predicts the behavior of large self-organizing robot groups (robot swarms) based on their control algorithm. In addition, an extensive survey of the relatively young and distinctive interdisciplinary research field of swarm robotics is provided. The connection to many related fields is highlighted, and the concepts and methods borrowed from these fields are briefly described.
The World Wide Web can be considered a huge library that in consequence needs a capable librarian responsible for the classification and retrieval of documents as well as the mediation between library resources and users. Based on this idea, the concept of the "Librarian of the Web" is introduced which comprises novel, librarian-inspired methods and technical solutions to decentrally search for text documents in the web using peer-to-peer technology. The concept's implementation in the form of an interactive peer-to-peer client, called "WebEngine", is elaborated on in detail. This software extends and interconnects common web servers creating a fully integrated, decentralised and self-organising web search system on top of the existing web structure. Thus, the web is turned into its own powerful search engine without the need for any central authority. This book is intended for researchers and practitioners having a solid background in the fields of Information Retrieval and Web Mining.
This book provides a thorough treatment of privacy and security issues for researchers in the fields of smart grids, engineering, and computer science. It presents comprehensive insight into the big picture of privacy and security challenges in both the physical and informational aspects of smart grids. The authors take an interdisciplinary approach to the existing security and privacy issues and propose legitimate countermeasures for each of them from the standpoints of both computing and electrical engineering. The proposed methods are theoretically proved with mathematical tools and illustrated by real-world examples.
Evolutionary Computation (EC) includes a number of techniques such as Genetic Algorithms which have been used in a diverse range of highly successful applications. This book brings together some of these EC applications in fields including electronics, telecommunications, health, bioinformatics, supply chain and other engineering domains, to give the audience, including both EC researchers and practitioners, a glimpse of this exciting and rapidly-evolving field.
"Computational Analysis of Terrorist Groups: Lashkar-e-Taiba" provides an in-depth look at web intelligence, and at how advanced mathematics and modern computing technology can influence the insights we have on terrorist groups. The book focuses primarily on one famous terrorist group, Lashkar-e-Taiba (LeT), and how it operates. After 10 years of counter-Al-Qaeda operations, LeT is considered by many in the counter-terrorism community to be an even greater threat to the US and world peace than Al Qaeda. This is the first book to demonstrate how to use modern computational analysis techniques, including methods for "big data" analysis. It shows how to quantify both the environment in which LeT operates and the actions it took over a 20-year period, and to represent them as a relational database table. This table is then mined using sophisticated data mining algorithms in order to gain detailed mathematical, computational and statistical insights into LeT and its operations. The book also provides a detailed history of Lashkar-e-Taiba based on extensive analysis of open source information and public statements. Each chapter includes a case study, as well as a slide deck summarizing the key results, available on the authors' web sites. The book is designed for a professional market of government and military workers, researchers and computer scientists working in the web intelligence field. Advanced-level students in computer science will also find it valuable as a reference book.
Knowledge existing in modern information systems usually comes from many sources and is mapped in many ways. There is a real need for representing "knowledge pieces" as rather universal objects that fit multi-purpose reasoning systems. Given the great number of information system tasks, knowledge representation is more or less detailed (e.g., some level of its granularity is assumed). The main goal of this paper is to present chosen aspects of expressing the granularity of knowledge implemented in intelligent systems. One of the main reasons for granularity phenomena is the diversification of knowledge sources; therefore the next section is devoted to this issue. 2. Heterogeneous Knowledge as a Source for Intelligent Systems. Knowledge, the main element of so-called intelligent applications and systems, is very often heterogeneous. This heterogeneity concerns the origin of knowledge and its sources, as well as its final forms of presentation. In this section, selected criteria of knowledge differentiation will be presented in the context of potential sources of knowledge acquisition. In Fig. 1 an environment of intelligent systems is shown, divided into different knowledge sources for the system. Fig. 1. Potential knowledge sources for intelligent information/reasoning systems. Source: own elaboration based on (Mach, 2007), p. 24.
Metaheuristics are a relatively new but already established approach to combinatorial optimization. A metaheuristic is a generic algorithmic template that can be used for finding high quality solutions of hard combinatorial optimization problems. To arrive at a functioning algorithm, a metaheuristic needs to be configured: typically some modules need to be instantiated and some parameters need to be tuned. I call these two problems "structural" and "parametric" tuning, respectively. More generally, I refer to the combination of the two problems as "tuning." Tuning is crucial to metaheuristic optimization both in academic research and for practical applications. Nevertheless, relatively little research has been devoted to the issue. This book shows that the problem of tuning a metaheuristic can be described and solved as a machine learning problem. Using the machine learning perspective, it is possible to give a formal definition of the tuning problem and to develop a generic algorithm for tuning metaheuristics. Moreover, from the machine learning perspective it is possible to highlight some flaws in the current research methodology and to state some guidelines for future empirical analysis in metaheuristics research. This book is based on my doctoral dissertation and contains results I have obtained starting from 2001 while working within the Metaheuristics Network. During these years I have been affiliated with two research groups: INTELLEKTIK, Technische Universität Darmstadt, Darmstadt, Germany, and IRIDIA, Université Libre de Bruxelles, Brussels, Belgium. I am therefore grateful to the research directors of these two groups: Prof. Wolfgang Bibel, Dr. Thomas Stützle, Prof. Philippe Smets, Prof. Hugues Bersini, and Prof. Marco Dorigo.
In this book, the following three approaches to data analysis are presented: Test Theory, founded by Sergei V. Yablonskii (1924-1998), whose first publications appeared in 1955 and 1958; Rough Sets, founded by Zdzisław I. Pawlak (1926-2006), whose first publications appeared in 1981 and 1982; and Logical Analysis of Data, founded by Peter L. Hammer (1936-2006), whose first publications appeared in 1986 and 1988. These three approaches have much in common, but researchers active in one of these areas often have limited knowledge about the results and methods developed in the other two. On the other hand, each of the approaches shows some originality, and we believe that the exchange of knowledge can stimulate further development of each of them. This can lead to new theoretical results and real-life applications; in particular, new results based on a combination of these three data analysis approaches can be expected.
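Of the three approaches, rough sets admit a particularly compact illustration: a target concept is bracketed between a lower and an upper approximation induced by an indiscernibility partition of the universe. This is a generic sketch of Pawlak's definitions, not code from the book:

```python
def rough_approximations(partition, target):
    """Pawlak rough-set approximations of `target` with respect to an
    indiscernibility partition: the lower approximation unions the
    blocks fully inside the target, the upper those that touch it."""
    target = set(target)
    lower, upper = set(), set()
    for block in partition:
        block = set(block)
        if block <= target:     # block certainly belongs to the concept
            lower |= block
        if block & target:      # block possibly belongs to the concept
            upper |= block
    return lower, upper

# Objects {1..5}, indiscernible in pairs {1,2} and {3,4}; concept {1,2,3}.
lo_approx, up_approx = rough_approximations([{1, 2}, {3, 4}, {5}], {1, 2, 3})
# lower = {1, 2} (certainly in), upper = {1, 2, 3, 4} (possibly in)
```

The gap between the two approximations (here {3, 4}) is the boundary region; a non-empty boundary is exactly what makes the concept "rough" with respect to the available attributes.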
This book is devoted to the state of the art in all aspects of the fireworks algorithm (FWA), with particular emphasis on efficient improved versions of FWA. It presents the most substantial theoretical analysis, including the basic principle and implementation of FWA and its modeling and theoretical analysis. It covers exhaustively the key recent research on improvements of FWA so far. In addition, the book describes a few advanced topics in FWA research, including multi-objective optimization (MOO), discrete FWA (DFWA) for combinatorial optimization, and GPU-based FWA for parallel implementation. Subsequently, several successful applications of FWA to non-negative matrix factorization (NMF), text clustering, pattern recognition, the seismic inversion problem, and swarm robotics are illustrated in detail, which might shed new light on more real-world applications in the future. Addressing a multidisciplinary topic, it will appeal to researchers and professionals in the areas of metaheuristics, swarm intelligence, evolutionary computation, complex optimization solving, etc.
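The basic explosion mechanism of FWA can be caricatured in a few lines: better fireworks emit more sparks within a smaller amplitude, worse ones fewer sparks within a larger one, and the best candidate is carried across generations. The following is a drastically simplified sketch of that core idea; the parameter choices, clamping, and selection rule here are illustrative, not the versions analyzed in the book:

```python
import random

def fwa_sketch(f, dim, n_fireworks=5, n_sparks=30, iters=60,
               bounds=(-5.0, 5.0), amp_max=4.0, amp_min=0.01):
    """Minimal fireworks-algorithm sketch for minimizing f.
    Spark counts grow, and explosion amplitudes shrink, with fitness."""
    lo, hi = bounds
    eps = 1e-12
    pop = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_fireworks)]
    for _ in range(iters):
        vals = [f(x) for x in pop]
        worst, best_val = max(vals), min(vals)
        sparks = []
        for x, v in zip(pop, vals):
            # fitter fireworks: larger spark share, smaller amplitude
            share = (worst - v + eps) / (sum(worst - u for u in vals) + eps)
            amp = max(amp_min, amp_max * (v - best_val + eps)
                      / (sum(u - best_val for u in vals) + eps))
            for _ in range(max(1, round(n_sparks * share))):
                sparks.append([min(hi, max(lo, xi + random.uniform(-amp, amp)))
                               for xi in x])
        cand = sorted(pop + sparks, key=f)
        # elitist selection: keep the best, sample the rest for diversity
        pop = [cand[0]] + random.sample(cand[1:], n_fireworks - 1)
    return min(pop, key=f)

random.seed(7)
best = fwa_sketch(lambda x: sum(c * c for c in x), dim=2)  # sphere function
```

Much of the research the book surveys concerns exactly the pieces this sketch oversimplifies: the spark-count and amplitude laws, mutation sparks, and the selection operator.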
Psychophysics and Experimental Phenomenology of Pattern Cognition examines the cognitive transformations that underlie this cognitive system and the specialized subsystems for processing these transformations. Sections cover symmetry cognition, contour perception and geometric illusion. Weight sensation is also discussed, as are repetitive and dot patterns. By incorporating elements of both psychophysics and experimental phenomenology, pattern cognition is examined from both the physical and mental sensory perspectives, thus providing a comprehensive view of this cognitive system.
This book introduces a novel approach to discrete optimization, providing both theoretical insights and algorithmic developments that lead to improvements over state-of-the-art technology. The authors present chapters on the use of decision diagrams for combinatorial optimization and constraint programming, with attention to general-purpose solution methods as well as problem-specific techniques. The book will be useful for researchers and practitioners in discrete optimization and constraint programming. "Decision Diagrams for Optimization is one of the most exciting developments emerging from constraint programming in recent years. This book is a compelling summary of existing results in this space and a must-read for optimizers around the world." [Pascal Van Hentenryck]
The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence quickly and with high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and worldwide distribution, which permit a rapid and broad dissemination of research results. The purpose of the first ACIS International Symposium on Software and Network Engineering, held on December 19-20, 2012 on the Seoul National University campus, Seoul, Korea, was to bring together scientists, engineers, computer users, and students to share their experiences, exchange new ideas and research results about all aspects (theory, applications and tools) of software and network engineering, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The symposium organizers selected the best 12 papers from those accepted for presentation at the symposium in order to publish them in this volume. The papers were chosen based on review scores submitted by members of the program committee, and underwent further rigorous rounds of review.
You may like...
Edge/Fog Computing Paradigm: The…
Pethuru Raj, Kavita Saini, …
Hardcover
R3,966
Discovery Miles 39 660
Embedded and Real Time System…
Mohammad Ayoub Khan, Saqib Saeed, …
Hardcover
R3,445
Discovery Miles 34 450
Edsger Wybe Dijkstra - His Life, Work…
Krzysztof R. Apt, Tony Hoare
Hardcover
R2,920
Discovery Miles 29 200
Data Prefetching Techniques in Computer…
Pejman Lotfi-Kamran, Hamid Sarbazi-Azad
Hardcover
R3,923
Discovery Miles 39 230
Clean Architecture - A Craftsman's Guide…
Robert Martin
Paperback
Code Nation - Personal Computing and the…
Michael J. Halvorson
Hardcover
R1,602
Discovery Miles 16 020
Essential Java for Scientists and…
Brian Hahn, Katherine Malan
Paperback
R1,266
Discovery Miles 12 660