Evolutionary Algorithms, in particular Evolution Strategies, Genetic Algorithms, and Evolutionary Programming, have found wide acceptance as robust optimization algorithms over the last ten years. Compared with their broad propagation and the resulting practical success in different scientific fields, the theory has not progressed as much. This monograph provides the framework and the first steps toward the theoretical analysis of Evolution Strategies (ES). The main emphasis is on understanding the functioning of these probabilistic optimization algorithms in real-valued search spaces by investigating the dynamical properties of some well-established ES algorithms. The book introduces the basic concepts of this analysis, such as progress rate, quality gain, and self-adaptation response, and describes how to calculate these quantities. Based on the analysis, functioning principles are derived, aiming at a qualitative understanding of why and how ES algorithms work.
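To make the objects of this analysis concrete, the following is a minimal sketch of a (1,λ)-ES with log-normal step-size self-adaptation, run on the sphere model that serves as the standard test case in ES theory. The function name, parameter values, and the sphere objective are illustrative assumptions, not code or settings from the book.

```python
import numpy as np

def one_comma_lambda_es(f, x0, sigma0=1.0, lam=10, generations=200, seed=0):
    """Minimal (1,lambda)-ES with log-normal step-size self-adaptation.

    Illustrative sketch only; the book studies such algorithms
    theoretically (progress rate, self-adaptation response).
    """
    rng = np.random.default_rng(seed)
    n = len(x0)
    tau = 1.0 / np.sqrt(n)                     # common step-size learning rate
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(generations):
        # Each offspring first mutates the step size, then the search point.
        sigmas = sigma * np.exp(tau * rng.standard_normal(lam))
        offspring = x + sigmas[:, None] * rng.standard_normal((lam, n))
        # Comma selection: the best of the lambda offspring becomes the parent.
        best = min(range(lam), key=lambda i: f(offspring[i]))
        x, sigma = offspring[best], sigmas[best]
    return x, sigma

# Sphere model f(x) = ||x||^2, the standard setting of ES theory.
x_best, sigma_best = one_comma_lambda_es(lambda x: float(np.sum(x**2)), np.ones(10))
print(float(np.sum(x_best**2)), sigma_best)
```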
This volume showcases contributions from internationally known researchers in the field of information management. Most of the approaches presented here make use of fuzzy logic, introduced by L.A. Zadeh almost 50 years ago, which constitutes a powerful tool for modeling and handling gradual concepts. What all of these contributions have in common is placing the user at the center of the information system, be it to help him or her query a data set, handle imperfect information, or discover useful knowledge in a massive collection of data. Researchers working in data and knowledge management will greatly benefit from this collection of up-to-date studies, and it may also be an invaluable source of information for postgraduate students interested in advanced information management techniques.
In recent years, cryptology problems such as designing good cryptographic systems and analyzing them have challenged researchers. Many algorithms that take advantage of approaches based on computational intelligence techniques, such as genetic algorithms and genetic programming, have been proposed to solve these issues. Implementing Computational Intelligence Techniques for Security Systems Design is an essential research book that explores the application of computational intelligence and other advanced techniques to information security, contributing to a better understanding of the factors that influence successful security systems design. Featuring a range of topics such as encryption, self-healing systems, and cyber fraud, this book is ideal for security analysts, IT specialists, computer engineers, software developers, technologists, academicians, researchers, practitioners, and students.
This book focuses on metaheuristic methods and their applications to real-world problems in Engineering. The first part describes some key metaheuristic methods, such as Bat Algorithms, Particle Swarm Optimization, Differential Evolution, and Particle Collision Algorithms. Improved versions of these methods and strategies for parameter tuning are also presented, both of which are essential for the practical use of these important computational tools. The second part then applies metaheuristics to problems, mainly in Civil, Mechanical, Chemical, Electrical, and Nuclear Engineering. Other methods, such as the Flower Pollination Algorithm, Symbiotic Organisms Search, Cross-Entropy Algorithm, Artificial Bee Colonies, Population-Based Incremental Learning, Cuckoo Search, and Genetic Algorithms, are also presented. The book is rounded out by recently developed strategies, or hybrid improved versions of existing methods, such as the Lightning Optimization Algorithm, Differential Evolution with Particle Collisions, and Ant Colony Optimization with Dispersion: state-of-the-art approaches for the application of computational intelligence to engineering problems. The wide variety of methods and applications, as well as the original results on problems of practical engineering interest, represent the primary differentiation and distinctive quality of this book. Furthermore, it gathers contributions by authors from four countries, some of whom are the original proponents of the methods presented, and 18 research centers around the globe.
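As a concrete illustration of one of the methods named above, here is a minimal sketch of classic Differential Evolution (DE/rand/1/bin). The control parameters F and CR are common textbook defaults, and the sphere test function is an assumption made for demonstration; none of this reflects the book's improved variants.

```python
import numpy as np

def differential_evolution(f, bounds, pop=20, F=0.8, CR=0.9, gens=200, seed=0):
    """Classic DE/rand/1/bin for box-constrained minimization (sketch only)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    X = rng.uniform(lo, hi, (pop, dim))        # random initial population
    fX = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            # Mutation: perturb one random member by a scaled difference vector.
            a, b, c = X[rng.choice([j for j in range(pop) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover, forcing at least one component from the mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, X[i])
            # Greedy selection: keep the trial only if it is no worse.
            f_trial = f(trial)
            if f_trial <= fX[i]:
                X[i], fX[i] = trial, f_trial
    return X[np.argmin(fX)], fX.min()

best, value = differential_evolution(lambda x: float(np.sum(x**2)), [(-5, 5)] * 5)
print(value)
```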
The book presents automatic and reproducible methods for the analysis of medical infrared images. All methods highlighted here have been practically implemented in Matlab, and the source code is presented and discussed in detail. Further, all methods have been verified with medical specialists, making the book an ideal resource for all IT specialists, bioengineers and physicians who wish to broaden their knowledge of tailored methods for medical infrared image analysis and processing.
Reflects a decade of leading-edge research on intelligence and security informatics. Dr Chen is a researcher at the Artificial Intelligence Laboratory and the NSF COPLINK Center for Homeland Security Information Technology Research. Describes real-world community situations. Targets a wide-ranging audience: from researchers in computer science, information management and information science, via analysts and policy makers in federal departments and national laboratories, to consultants in IT hardware, communication, and software companies.
A detailed description of a new approach to the perceptual analysis and processing of medical images is given. Instead of traditional pattern recognition, a new method of image analysis is presented, based on a syntactic description of the shapes selected in the image and on graph-grammar parsing algorithms. This method of "image understanding" can be viewed as a model of man's cognitive image-understanding processes. Its usefulness for automatically understanding the content of medical images is demonstrated, as well as its ability to give useful diagnostic descriptions of illnesses. As an application, the production of a content-based, automatically generated index for arranging and searching medical images in multimedia medical databases is presented.
This book highlights novel research in Knowledge Discovery and Management (KDM), gathering the extended, peer-reviewed versions of outstanding papers presented at the annual conferences EGC'2017 & EGC'2018. The EGC conference cycle was founded by the International French-speaking EGC society ("Extraction et Gestion des Connaissances") in 2003, and has since become a respected fixture among the French-speaking community. In addition to the annual conference, the society organizes various other events in order to promote exchanges between researchers and companies concerned with KDM and its applications to business, administration, industry and public organizations. Addressing novel research in data science, semantic Web, clustering, and classification, the content presented here will chiefly benefit researchers interested in these fields, including Ph.D./M.Sc. students, at public and private laboratories alike.
This book is the outcome of a decade's research into a specific architecture and associated learning mechanism for an artificial neural network: the architecture involves negative feedback and the learning mechanism is simple Hebbian learning. The research began with my own thesis at the University of Strathclyde, Scotland, under Professor Douglas McGregor, which culminated with me being awarded a PhD in 1995 [52], the title of which was "Negative Feedback as an Organising Principle for Artificial Neural Networks". Naturally enough, having established this theme, when I began to supervise PhD students of my own, we continued to develop this concept, and this book owes much to the research and theses of these students at the Applied Computational Intelligence Research Unit in the University of Paisley. Thus we discuss work from Dr. Darryl Charles [24] in Chapter 5; Dr. Stephen McGlinchey [127] in Chapter 7; Dr. Donald MacDonald [121] in Chapters 6 and 8; and Dr. Emilio Corchado [29] in Chapter 8. We briefly discuss one simulation from the thesis of Dr. Mark Girolami [58] in Chapter 6 but do not discuss any of the rest of his thesis since it has already appeared in book form [59]. We also must credit Cesar Garcia Osorio, a current PhD student, for the comparative study of the two Exploratory Projection Pursuit networks in Chapter 8. All of Chapters 3 to 8 deal with single-stream artificial neural networks.
This book discusses the development of a theory of info-statics as a sub-theory of the general theory of information. It describes the factors required to establish a definition of the concept of information that fixes the applicable boundaries of the phenomenon of information, its linguistic structure and scientific applications. The book establishes the definitional foundations of information and how the concepts of uncertainty, data, fact, evidence and evidential things are sequential derivatives of information as the primary category, which is a property of matter and energy. The sub-definitions are extended to include the concepts of possibility, probability, expectation, anticipation, surprise, discounting, forecasting, prediction and the nature of past-present-future information structures. It shows that the factors required to define the concept of information are those that allow differences and similarities to be established among universal objects over the ontological and epistemological spaces in terms of varieties and identities. These factors are characteristic and signal dispositions on the basis of which general definitional foundations are developed to construct the general information definition (GID). The book then demonstrates that this definition is applicable to all types of information over the ontological and epistemological spaces. It also defines the concepts of uncertainty, data, fact, evidence and knowledge based on the GID. Lastly, it uses set-theoretic analytics to enhance the definitional foundations, and shows the value of the theory of info-statics to establish varieties and categorial varieties at every point of time and thus initializes the construct of the theory of info-dynamics.
Evolutionary computing paradigms offer robust and powerful adaptive search mechanisms for system design. This book's thirteen chapters cover a wide area of topics in evolutionary computing and applications, including an introduction to evolutionary computing in system design; evolutionary neuro-fuzzy systems; and evolution of fuzzy controllers. The book will be useful to researchers in intelligent systems with interest in evolutionary computing, as well as application engineers and system designers.
This book is about model-based diagnosis of a class of discrete-event systems called active systems. Roughly, model-based diagnosis is the task of finding the faulty components of a physical system based on the observed behavior and the system model. An active system is the abstraction of a physical artefact that is modeled as a network of communicating automata. For example, the protection apparatus of a power transmission network can be conveniently modeled as an active system, where breakers, protection devices, and lines are naturally described by finite state machines. The asynchronous occurrence of a short circuit on a line or a bus-bar causes the reaction of the protection devices, which aims to isolate the shorted line. This reaction can be faulty, and several lines might eventually be isolated rather than the shorted line only. The diagnostic problem to be solved is uncovering the faulty devices based on the visible part of the reaction. Once the diagnosis task has been accomplished, the produced results are exploited to fix the apparatus (and also to localize the short circuit, in this sample case). Interestingly, the research presented in this book was triggered a decade ago by a project on short-circuit localization, conducted by ENEL, the Italian electricity board, along with other industrial and academic European partners.
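The flavor of the diagnostic task can be conveyed by a toy consistency-based sketch: enumerate candidate sets of faulty devices and keep the smallest sets whose predicted behavior matches the observation. The device names and the trip-on-short-circuit behavior model below are invented for illustration and are far simpler than the book's communicating automata.

```python
from itertools import combinations

DEVICES = ["breaker1", "breaker2", "relay"]    # hypothetical protection devices

def predicted_trips(faulty):
    # Normal behavior: every device trips on the short circuit;
    # a faulty device fails to trip.
    return {d for d in DEVICES if d not in faulty}

observed = {"breaker1"}                        # only breaker1 was seen to trip

# Search for minimal-cardinality diagnoses consistent with the observation.
for k in range(len(DEVICES) + 1):
    diagnoses = [set(c) for c in combinations(DEVICES, k)
                 if predicted_trips(set(c)) == observed]
    if diagnoses:
        print("minimal diagnoses:", diagnoses)   # [{'breaker2', 'relay'}]
        break
```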
The book provides a sample of research on the innovative theory and applications of soft computing paradigms. The idea of Soft Computing was initiated in 1981, when Professor Zadeh published his first paper on soft data analysis, and has constantly evolved ever since. Professor Zadeh defined Soft Computing as the fusion of the fields of fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning (PR), with the latter subsuming belief networks, evolutionary computing including DNA computing, chaos theory, and parts of learning theory into one multidisciplinary system. As Zadeh said, the essence of soft computing is that, unlike traditional hard computing, it aims at an accommodation with the pervasive imprecision of the real world. Thus, the guiding principle of soft computing is to exploit the tolerance for imprecision, uncertainty, and partial truth to achieve tractability, robustness, low solution cost, and better rapport with reality. In the final analysis, the role model for soft computing is the human mind. We hope that the reader will share our excitement and find this volume both useful and inspiring.
This book gives an overview of constraint satisfaction problems (CSPs), adapts related search algorithms and consistency algorithms for applications to multi-agent systems, and consolidates recent research devoted to cooperation in such systems. The techniques introduced are applied to various problems in multi-agent systems. Among the new approaches is a hybrid-type algorithm for weak-commitment search combining backtracking and iterative improvement; also, an extension of the basic CSP formalization called partial CSP is introduced in order to handle over-constrained CSPs. The book is written for advanced students and professionals interested in multi-agent systems or, more generally, in distributed artificial intelligence and constraint satisfaction. Researchers active in the area will appreciate this book as a valuable source of reference.
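To illustrate the iterative-improvement ingredient of such a hybrid, here is a minimal min-conflicts sketch on the n-queens CSP. This is not the book's weak-commitment algorithm, which additionally records abandoned partial solutions in a backtracking-like fashion to guarantee completeness; the sketch only shows the repair-based search style.

```python
import random

def min_conflicts_nqueens(n=8, max_steps=10_000, seed=0):
    """Iterative improvement on n-queens: repeatedly repair a conflicted row."""
    rng = random.Random(seed)
    cols = [rng.randrange(n) for _ in range(n)]    # one queen per row

    def conflicts(row, col):
        return sum(1 for r in range(n) if r != row and
                   (cols[r] == col or abs(cols[r] - col) == abs(r - row)))

    for _ in range(max_steps):
        conflicted = [r for r in range(n) if conflicts(r, cols[r]) > 0]
        if not conflicted:
            return cols                            # consistent assignment found
        row = rng.choice(conflicted)
        # Move this queen to the column that minimizes its conflicts.
        cols[row] = min(range(n), key=lambda c: conflicts(row, c))
    return None                                    # may fail: the method is incomplete

print(min_conflicts_nqueens(8))
```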
This book describes the struggle to introduce a mechanism that enables next-generation information systems to maintain themselves. Our generation has observed the birth and growth of information systems, and of the Internet in particular. Surprisingly, information systems are quite different from conventional (energy- and material-intensive) artificial systems, and rather resemble biological (information-intensive) systems. Many artificial systems are designed based on (Newtonian) physics, assuming that every element obeys simple and static rules; the experience of the Internet, however, suggests a different way of designing, in which growth cannot be controlled but is self-organized by autonomous and selfish agents. This book suggests using game theory, and mechanism design in particular, for designing next-generation information systems that will be self-organized by the collective acts of autonomous components. The challenge of mapping a probability to time appears repeatedly in many forms throughout the book. The book contains interdisciplinary research encompassing game theory, complex systems, reliability theory, and particle physics, all devoted to its central theme: what happens if systems repair themselves?
The articles in this book present advanced soft methods related to genetic and evolutionary algorithms, immune systems, and the formulation of deterministic and Bayesian neural networks. Much attention is paid to hybrid systems for inverse analysis that fuse soft methods with the finite element method. The numerical efficiency of these soft methods is illustrated on the analysis and design of complex engineering structures.
Global capital markets have undergone fundamental transformations in recent years and, as a result, have become extraordinarily complex and opaque. Trading space is no longer measured in minutes or seconds but in time units beyond human perception: milliseconds, microseconds, and even nanoseconds. Technological advances have thus scaled up imperceptible and previously irrelevant time differences into operationally manageable and enormously profitable business opportunities for those with the proper high-tech trading tools. These tools include the fastest private communication and trading lines, the most powerful computers and sophisticated algorithms capable of speedily analysing incoming news and trading data and determining optimal trading strategies in microseconds, as well as the possession of gigantic collections of historic and real-time market data. Fragmented capital markets are also becoming a rapidly growing reality in Europe and Asia, and are an established feature of U.S. trading. This raises urgent market governance issues that have largely been overlooked. Global Algorithmic Capital Markets seeks to understand how recent market transformations are affecting core public policy objectives such as investor protection and reduction of systemic risk, as well as fairness, efficiency, and transparency. The operation and health of capital markets affect all of us and have profound implications for equality and justice in society. This unique set of chapters by leading scholars, industry insiders, and regulators discusses ways to strengthen market governance for the benefit of society as a whole.
Sloshing causes liquid to fluctuate, making accurate level readings difficult to obtain in dynamic environments. The measurement system described in "A Neural Network Approach to Fluid Quantity Measurement in Dynamic Environments" uses a single-tube capacitive sensor to obtain an instantaneous level reading of the fluid surface, thereby accurately determining the fluid quantity in the presence of slosh. A neural network based classification technique is applied to predict the actual quantity of the fluid contained in a tank under sloshing conditions. The effects of temperature variations and contamination on the capacitive sensor are also discussed, and the authors propose that these effects can likewise be eliminated with the proposed neural network based classification system. To examine the performance of the classification system, many field trials were carried out on a running vehicle at various tank volumes ranging from 5 L to 50 L. The effectiveness of signal enhancement on the neural network based signal classification system is also investigated. Results obtained from the investigation are compared with traditionally used statistical averaging methods, and prove that the neural network based measurement system can produce highly accurate fluid quantity measurements in a dynamic environment. Although in this case a capacitive sensor was used to demonstrate the measurement system, the methodology is valid for all types of electronic sensors. The approach can be applied to a wide range of fluid quantity measurement applications in the automotive, naval, and aviation industries to produce accurate fluid level readings. Students, lecturers, and experts will find this description of current research on accurate fluid level measurement in dynamic environments using a neural network approach useful.
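A minimal sketch of the general idea follows, using synthetic slosh data and a small scikit-learn MLP. The sinusoid-plus-noise slosh model, the feature choice, and the network size are assumptions made for illustration; the study itself used field data from a capacitive sensor on a running vehicle.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
levels = np.arange(5, 55, 5)                  # candidate tank volumes, litres

def sensor_window(level):
    # Hypothetical slosh model: sinusoidal fluctuation plus sensor noise.
    t = np.linspace(0.0, 2.0, 50)
    slosh = 0.15 * level * np.sin(2 * np.pi * rng.uniform(0.5, 2.0) * t)
    return level + slosh + rng.normal(0.0, 0.5, t.size)

y = rng.choice(levels, size=2000)             # true volume for each trial
X = np.array([sensor_window(lv) for lv in y])
# Summary features of each window of raw readings (one plausible encoding).
feats = np.column_stack([X.mean(1), X.std(1), X.min(1), X.max(1)])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(feats[:1500], y[:1500])
print("held-out accuracy:", clf.score(feats[1500:], y[1500:]))
```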
This book presents the synthesis and analysis of fuzzy controllers and their application to a class of mechanical systems. It mainly focuses on the use of type-2 fuzzy controllers to account for disturbances known as hard or nonsmooth nonlinearities. The book, which summarizes the authors' research on type-2 fuzzy logic and the control of mechanical systems, presents models, simulations, and experiments towards the control of servomotors with dead-zone and Coulomb friction, and the control of both wheeled mobile robots and a biped robot. Closed-loop systems are analyzed in the framework of smooth and nonsmooth Lyapunov functions.
Safety is a paradoxical system property. It remains immaterial, intangible and invisible until a failure, an accident or a catastrophe occurs and, too late, reveals its absence. And yet, a system cannot be relied upon unless its safety can be explained, demonstrated and certified. The practical and difficult questions which motivate this study concern the evidence and the arguments needed to justify the safety of a computer-based system, or more generally its dependability. Dependability is a broad concept integrating properties such as safety, reliability, availability, maintainability and other related characteristics of the behaviour of a system in operation. How can we give the users the assurance that the system enjoys the required dependability? How should evidence be presented to certification bodies or regulatory authorities? What best practices should be applied? How should we decide whether there is enough evidence to justify the release of the system? To help answer these daunting questions, a method and a framework are proposed for the justification of the dependability of a computer-based system. The approach specifically aims at dealing with the difficulties raised by the validation of software. Hence, it should be of wide applicability despite being mainly based on the experience of assessing Nuclear Power Plant instrumentation and control systems important to safety. To be viable, a method must rest on a sound theoretical background.
Super-Intelligent Machines combines neuroscience and computer science to analyze future intelligent machines. It describes how they will mimic the learning structures of human brains to serve billions of people via the network, and the superior level of consciousness this will give them. Whereas human learning is reinforced by self-interest, this book describes the selfless and compassionate values that must drive machine learning in order to protect human society. Technology will change life much more in the twenty-first century than it did in the twentieth, and Super-Intelligent Machines explains how that can be an advantage.
This book is a collection of selected papers presented at the Annual Meeting of the European Academy of Management and Business Economics (AEDEM), held at the Faculty of Economics and Business of the University of Barcelona, 5-7 June 2012. This edition of the conference was presented under the slogan "Creating new opportunities in an uncertain environment". There are different ways of assessing uncertainty in management, but this book mainly focuses on soft computing theories and their role in assessing uncertainty in a complex world. The book gives a comprehensive overview of general management topics and discusses some of the most recent developments in all areas of business and management, including management, marketing, business statistics, innovation and technology, finance, sports, and tourism. It should be of great interest to anyone working in the area of management and business economics, and especially useful for scientists and graduate students doing research in these fields.
You may like...
Becoming a Reading Teacher - Connecting…
Jane Spiro, Amos Paran
Hardcover
R4,197
Discovery Miles 41 970
Critical Pedagogy in the Language and…
Gloria Park, Sarah Bogdan, …
Hardcover
R4,197
Discovery Miles 41 970
How Young Adult Literature Gets Taught…
Steven Bickmore, T. Hunter Strickland, …
Paperback
R1,216
Discovery Miles 12 160