The need for video compression in the modern age of visual communication cannot be overemphasized. This monograph provides useful information to postgraduate students and researchers who wish to work in the domain of VLSI design for video processing applications. The book offers an in-depth discussion of several motion estimation algorithms and their VLSI implementations as conceived and developed by the authors. It records an account of research involving the fast three-step search, successive elimination, one-bit transformation and its effective combination with diamond search, and dynamic pixel truncation techniques. Two appendices provide a number of proof-of-concept instances through Matlab and Verilog program segments. In this respect, the book can be considered the first of its kind. The architectures have been developed with an eye to their applicability in everyday low-power handheld appliances, including video camcorders and smartphones.
This book explores various renewal processes in the context of probability theory, uncertainty theory and chance theory. It also covers the applications of these renewal processes in maintenance models and insurance risk models. The methods used to derive the limit of the renewal rate, the reward rate, and the availability rate are of particular interest, as they can easily be extended to the derivation of other models. Its comprehensive and systematic treatment of renewal processes, renewal reward processes and the alternating renewal process is one of the book's major features, making it particularly valuable for readers who are interested in learning about renewal theory. Given its scope, the book will benefit researchers, engineers, and graduate students in the fields of mathematics, information science, operations research, industrial engineering, etc.
Machine learning is concerned with the analysis of large data sets with multiple variables. However, it is often also more sensitive than traditional statistical methods in the analysis of small data. The first volume reviewed subjects such as optimal scaling, neural networks, factor analysis, partial least squares, discriminant analysis, canonical analysis, and fuzzy modeling. This second volume includes various clustering models, support vector machines, Bayesian networks, discrete wavelet analysis, genetic programming, association rule learning, anomaly detection, correspondence analysis, and other subjects. Both the theoretical bases and the step-by-step analyses are described for the benefit of non-mathematical readers. Each chapter can be studied without the need to consult other chapters. Traditional statistical tests sometimes serve as precursors to machine learning methods, and they are also sometimes used as contrast tests. To those wishing to learn more about them, we recommend additionally studying (1) Statistics Applied to Clinical Studies 5th Edition 2012, (2) SPSS for Starters Part One and Two 2012, and (3) Statistical Analysis of Clinical Data on a Pocket Calculator Part One and Two 2012, written by the same authors and published by Springer, New York.
The book reports on the latest advances and applications of nonlinear control systems. It consists of 30 contributed chapters by subject experts specialized in the various topics addressed in this book. The chapters span broad areas of nonlinear control systems such as robotics, nonlinear circuits, power systems, memristors, underwater vehicles, chemical processes, observer design, output regulation, backstepping control, sliding mode control, time-delayed control, variable structure control, robust adaptive control, fuzzy logic control, chaos, hyperchaos, jerk systems, hyperjerk systems, chaos control, and chaos synchronization. Special importance was given to chapters offering practical solutions, modeling and novel control methods for recent research problems in nonlinear control systems. This book will serve as a reference for graduate students and researchers with a basic knowledge of electrical and control systems engineering. The design procedures for the nonlinear control systems are illustrated using MATLAB software.
This book focuses on computational intelligence techniques and their applications, fast-growing and promising research topics that have drawn a great deal of attention from researchers over the years. It brings together many different aspects of current research on intelligence technologies such as neural networks, support vector machines, fuzzy logic and evolutionary computation, and covers a wide range of applications, from pattern recognition and system modeling to intelligent control problems and biomedical applications. Fundamental concepts and essential analysis of various computational techniques are presented to offer a systematic and effective tool for better treatment of different applications, and simulation and experimental results are included to illustrate the design procedure and the effectiveness of the approaches.
In the mid-1990s, Tim Berners-Lee had the idea of developing the World Wide Web into a "Semantic Web", a web of information that could be interpreted by machines in order to allow the automatic exploitation of data, which until then had to be done manually by humans. One of the first people to research topics related to the Semantic Web was Professor Rudi Studer. From the beginning, Rudi drove projects like ONTOBROKER and On-to-Knowledge, which later resulted in W3C standards such as RDF and OWL. By the late 1990s, Rudi had established a research group at the University of Karlsruhe, which later became the nucleus and breeding ground for Semantic Web research, and many of today's well-known research groups were either founded by his disciples or benefited from close cooperation with this think tank. In this book, published in celebration of Rudi's 60th birthday, many of his colleagues look back on the main research results achieved during the last 20 years. Under the editorship of Dieter Fensel, once one of Rudi's early PhD students, an impressive list of contributors and contributions has been collected, covering areas like Knowledge Management, Ontology Engineering, Service Management, and Semantic Search. Overall, this book provides an excellent overview of the state of the art in Semantic Web research, combining historical roots with the latest results, which may finally make the dream of a "Web of knowledge, software and services" come true.
The book consists of 19 extended and revised chapters based on original works presented during a poster session organized within the 5th International Conference on Computational Collective Intelligence, held from 11 to 13 September 2013 in Craiova, Romania. The book is divided into three parts. The first part, titled "Agents and Multi-Agent Systems", consists of 8 chapters that concentrate on problems related to agent and multi-agent systems, including: formal models, agent autonomy, emergent properties, agent programming, agent-based simulation and planning. The second part, titled "Intelligent Computational Methods", consists of 6 chapters. The authors present applications of various intelligent computational methods, such as neural networks, mathematical optimization and multistage decision processes, in areas like cooperation, character recognition, wireless networks, transport, and metal structures. The third part, titled "Language and Knowledge Processing Systems", consists of 5 papers devoted to processing methods for knowledge and language information in various applications, including: language identification, corpus comparison, opinion classification, group decision making, and rule bases.
Recent advancements in the fields of telecommunications, medical imaging and signal processing deal with signals that are inherently time-varying, nonlinear and complex-valued. The time-varying, nonlinear characteristics of these signals can be effectively analyzed using artificial neural networks. Furthermore, to efficiently preserve the physical characteristics of these complex-valued signals, it is important to develop complex-valued neural networks and derive learning algorithms that represent these signals at every step of the learning process. This monograph comprises a collection of new supervised learning algorithms along with novel architectures for complex-valued neural networks. Meta-cognition coupled with self-regulated learning is known to be among the most effective human learning strategies. In this monograph, the principles of meta-cognition are introduced for complex-valued neural networks in both the batch and sequential learning modes. For applications where the computation time of the training process is critical, a fast-learning complex-valued neural network, called a fully complex-valued relaxation network, is presented along with its learning algorithm. The presence of orthogonal decision boundaries helps complex-valued neural networks outperform real-valued networks in classification tasks, an aspect that is also highlighted. The performances of various complex-valued neural networks are evaluated on a set of benchmark and real-world function approximation and real-valued classification problems.
One of the most successful methodologies to arise from the worldwide diffusion of Fuzzy Logic is Fuzzy Control. After the first attempts in the seventies, this methodology has been widely exploited for controlling many industrial components and systems. At the same time, and quite independently of Fuzzy Logic or Fuzzy Control, the birth of the Web has impacted almost all aspects of the computing discipline. The evolution of the Web through Web 2.0 and Web 3.0 has made scenarios of ubiquitous computing much more feasible; consequently, information technology has been thoroughly integrated into everyday objects and activities. What happens when Fuzzy Logic meets Web technology? Interesting results can come out, as you will discover in this book. The Fuzzy Markup Language is an offspring of this synergistic view, in which some technological issues of the Web are re-interpreted in light of the transparent notion of Fuzzy Control, as discussed here. The concept of a Fuzzy Control that is conceived and modeled in terms of native web wisdom represents another step towards the full picture of Pervasive Web Intelligence.
This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data projections. Topics and features: discusses machine learning frameworks based on artificial neural networks, statistical learning theory and kernel-based methods, and tree-based methods; examines the application of machine learning to steady state and dynamic operations, with a focus on unsupervised learning; describes the use of spectral methods in process fault diagnosis.
This book presents mathematical models of mob control with threshold (conformity) collective decision-making of the agents. Based on the results of analysis of the interconnection between the micro- and macromodels of active network structures, it considers the static (deterministic, stochastic and game-theoretic) and dynamic (discrete- and continuous-time) models of mob control, and highlights models of informational confrontation. Many of the results are applicable not only to mob control problems, but also to control problems arising in social groups, online social networks, etc. Aimed at researchers and practitioners, it is also a valuable resource for undergraduate and postgraduate students as well as doctoral candidates specializing in the field of collective behavior modeling.
This book chiefly presents a novel approach referred to as backward fuzzy rule interpolation and extrapolation (BFRI). BFRI allows observations that directly relate to the conclusion to be inferred or interpolated from other antecedents and conclusions. Based on the scale and move transformation interpolation, this approach supports both interpolation and extrapolation, which involve multiple hierarchical intertwined fuzzy rules, each with multiple antecedents. As such, it offers a means of broadening the applications of fuzzy rule interpolation and fuzzy inference. The book deals with the general situation, in which there may be more than one antecedent value missing for a given problem. Two techniques, termed the parametric approach and the feedback approach, are proposed in an attempt to perform backward interpolation with multiple missing antecedent values. In addition, to further enhance the versatility and potential of BFRI, the backward fuzzy interpolation method is extended to support α-cut based interpolation by employing a fuzzy interpolation mechanism for multi-dimensional input spaces (IMUL). Finally, from an integrated application analysis perspective, experimental studies based upon a real-world scenario of terrorism risk assessment are provided in order to demonstrate the potential and efficacy of the hierarchical fuzzy rule interpolation methodology.
Posited by Professor Leon Chua at UC Berkeley more than 40 years ago, memristors, nonlinear elements in electrical circuitry, are set to revolutionize computing technology. Finally discovered by scientists at Hewlett-Packard in 2008, memristors generate huge interest because they can facilitate nanoscale, real-time computer learning, as well as for their potential to serve as instant memories. This edited volume bottles some of the excitement about memristors, providing a state-of-the-art overview of neuromorphic memristor theory, as well as its technological and practical aspects. Based on work presented at specialist memristor seminars organized by the editors, the volume takes readers from a general introduction to the fundamental concepts involved, to specialized analyses of computational modeling, hardware, and applications. The latter include the ground-breaking potential of memristors in facilitating hybrid wetware-hardware technologies for in-vitro experiments. The book evinces, and devotes space to the discussion of, the socially transformative potential of memristors, which could be as pervasive as the invention of the silicon chip: machines that learn in the style of brains are a computational Holy Grail. With contributions from key players in a fast-moving field, this edited volume is the first to cover memristors in the depth needed to trigger the further advances that surely lie around the corner.
This book presents an overview of a variety of contemporary statistical, mathematical and computer science techniques used to further knowledge in the medical domain. The authors focus on applying data mining to the medical domain, including mining the sets of clinical data typically found in patients' medical records, image mining, medical mining, and data mining and machine learning applied to generic genomic data, among other topics. This work also introduces the modeling of cancer cell behavior, multi-scale computational models, and simulations of blood flow through vessels using patient-specific models. The authors cover different imaging techniques used to generate patient-specific models, which are then used in computational fluid dynamics software to analyze fluid flow. Case studies are provided at the end of each chapter. Professionals and researchers with quantitative backgrounds will find Computational Medicine in Data Mining and Modeling useful as a reference. Advanced-level students studying computer science, mathematics, statistics and biomedicine will also find this book valuable as a reference or secondary textbook.
Walmsley offers a succinct introduction to major philosophical issues in artificial intelligence for advanced students of philosophy of mind, cognitive science and psychology. Whilst covering essential topics, the book also gives students the chance to engage with cutting-edge debates.
Computer and Machine Vision: Theory, Algorithms, Practicalities (previously entitled Machine Vision) clearly and systematically presents the basic methodology of computer and machine vision, covering the essential elements of the theory while emphasizing algorithmic and practical design constraints. This fully revised fourth edition has brought in more of the concepts and applications of computer vision, making it a very comprehensive and up-to-date tutorial text suitable for graduate students, researchers and R&D engineers working in this vibrant subject. Key features include: Practical examples and case studies give the ins and outs of developing real-world vision systems, giving engineers the realities of implementing the principles in practice. New chapters containing case studies on surveillance and driver assistance systems give practical methods on these cutting-edge applications in computer vision. Necessary mathematics and essential theory are made approachable by careful explanations and well-illustrated examples. Updated content and new sections cover topics such as human iris location, image stitching, line detection using RANSAC, performance measures, and hyperspectral imaging. The recent developments section now included in each chapter will be useful in bringing students and practitioners up to date with the subject. Roy Davies is Emeritus Professor of Machine Vision at Royal Holloway, University of London. He has worked on many aspects of vision, from feature detection to robust, real-time implementations of practical vision tasks. His interests include automated visual inspection, surveillance, vehicle guidance and crime detection. 
He has published more than 200 papers and three books - Machine Vision: Theory, Algorithms, Practicalities (1990), Electronics, Noise and Signal Recovery (1993), and Image Processing for the Food Industry (2000); the first of these has been widely used internationally for more than 20 years, and is now out in this much enhanced fourth edition. Roy holds a DSc from the University of London, and has been awarded Distinguished Fellowship of the British Machine Vision Association and Fellowship of the International Association for Pattern Recognition.
Counting is among the most elementary and frequent mental activities of human beings. Its results form a basis for decisions in many situations and dimensions of our lives. This book presents a novel approach to the advanced and sophisticated case, called intelligent counting, in which the objects of counting are imprecisely, fuzzily specified. Formally, this reduces to counting in fuzzy sets, interval-valued fuzzy sets or I-fuzzy sets (Atanassov's intuitionistic fuzzy sets). The monograph is the first to show and emphasize that the presented methods of intelligent counting are human-consistent: they are reflections and formalizations of real human counting procedures performed under imprecision and, possibly, incompleteness of information. Other applications of intelligent counting in various areas of intelligent systems and decision support are discussed, too. The whole presentation is self-contained, systematic, and equipped with many examples, figures and tables. Computer and information scientists, researchers, engineers and practitioners, applied mathematicians, and postgraduate students interested in information imprecision are the target readers.
This book lies at the interface of machine learning - a subfield of computer science that develops algorithms for challenging tasks such as shape or image recognition, where traditional algorithms fail - and photonics - the physical science of light, which underlies many of the optical communications technologies used in our information society. It provides a thorough introduction to reservoir computing and field-programmable gate arrays (FPGAs). Recently, photonic implementations of reservoir computing (a machine learning algorithm based on artificial neural networks) have made a breakthrough in optical computing possible. In this book, the author pushes the performance of these systems significantly beyond what was achieved before. By interfacing a photonic reservoir computer with a high-speed electronic device (an FPGA), the author successfully interacts with the reservoir computer in real time, allowing him to considerably expand its capabilities and range of possible applications. Furthermore, the author draws on his expertise in machine learning and FPGA programming to make progress on a very different problem, namely the real-time image analysis of optical coherence tomography for atherosclerotic arteries.
Imagine yourself as a military officer in a conflict zone trying to identify locations of weapons caches supporting road-side bomb attacks on your country's troops. Or imagine yourself as a public health expert trying to identify the location of contaminated water that is causing diarrheal diseases in a local population. Geospatial abduction is a new technique introduced by the authors that allows such problems to be solved. Geospatial Abduction provides the mathematics underlying geospatial abduction and the algorithms to solve such problems in practice; the technique has wide applicability and can be used by practitioners and researchers in many different fields. Real-world applications of geospatial abduction to military problems are included. Compelling examples drawn from other domains as diverse as criminology, epidemiology and archaeology are covered as well. This book also includes access to a dedicated website on geospatial abduction hosted by the University of Maryland. Geospatial Abduction targets practitioners working in general AI, game theory, linear programming, data mining, machine learning, and more. Those working in the fields of computer science, mathematics, geoinformation, and geological and biological science will also find this book valuable.
Is meaningful communication possible between two intelligent parties who share no common language or background? In this work, a theoretical framework is proposed in which it is possible to address when and to what extent such semantic communication is possible: such problems can be rigorously addressed by explicitly focusing on the goals of the communication. Under this framework, it is possible to show that for many goals, communication without any common language or background is possible using universal protocols. This work should be accessible to anyone with an undergraduate-level knowledge of the theory of computation. The theoretical framework presented here is of interest to anyone wishing to design systems with flexible interfaces, either among computers or between computers and their users.
In Artificial Intelligence in Finance and Investing, authors Robert Trippi and Jae Lee explain this fascinating new technology in terms that portfolio managers, institutional investors, investment analysts, and information systems professionals can understand. Using real-life examples and a practical approach, this rare and readable volume discusses the entire field of artificial intelligence of relevance to investing, so that readers can realize the benefits and evaluate the features of existing or proposed systems, and ultimately construct their own systems. Topics include using Expert Systems for Asset Allocation, Timing Decisions, Pattern Recognition, and Risk Assessment; an overview of Popular Knowledge-Based Systems; construction of Synergistic Rule Bases for Securities Selection; incorporating the Markowitz Portfolio Optimization Model into Knowledge-Based Systems; Bayesian Theory and Fuzzy Logic System Components; Machine Learning in Portfolio Selection and Investment Timing, including Pattern-Based Learning and Genetic Algorithms; and Neural Network-Based Systems. To illustrate the concepts presented in the book, the authors conclude with a valuable practice session and analysis of a typical knowledge-based system for investment management, K-FOLIO. For those who want to stay on the cutting edge of the "application" revolution, Artificial Intelligence in Finance and Investing offers a pragmatic introduction to the use of knowledge-based systems in securities selection and portfolio management.
The volume, complexity, and irregularity of computational data in modern algorithms and simulations necessitate an unorthodox approach to computing. Understanding the facets and possibilities of soft computing algorithms is necessary for the accurate and timely processing of complex data. Research Advances in the Integration of Big Data and Smart Computing builds on the available literature in the realm of Big Data while providing further research opportunities in this dynamic field. This publication provides the resources necessary for technology developers, scientists, and policymakers to adopt and implement new paradigms in computational methods across the globe. The chapters in this publication advance the body of knowledge on soft computing techniques through topics such as transmission control protocol for mobile ad hoc networks, feature extraction, comparative analysis of filtering techniques, big data in economic policy, and advanced dimensionality reduction methods.
This research volume presents a sample of recent contributions on quality assessment for Web-based information in the context of information access, retrieval, and filtering systems. The advent of the Web and the uncontrolled process of document generation have raised the problem of assessing the quality of information on the Web, considering the nature of documents (texts, images, video, sounds, and so on), the genre of documents (news, geographic information, ontologies, medical records, product records, and so on), the reputation of information sources and sites, and, last but not least, the actions performed on documents (content indexing, retrieval and ranking, collaborative filtering, and so on). The volume constitutes a compendium of heterogeneous approaches and sample applications focusing on specific aspects of quality assessment for Web-based information, aimed at researchers, PhD students and practitioners carrying out research in the fields of Web information retrieval and filtering, Web information mining, and information quality representation and management.
This volume introduces new approaches in the area of intelligent control, from the viewpoints of both theory and application. It consists of eleven contributions by prominent authors from all over the world and an introductory chapter. This volume is strongly connected to another volume entitled "New Approaches in Intelligent Image Analysis" (Eds. Roumen Kountchev and Kazumi Nakamatsu). The chapters of this volume are self-contained and include a summary, conclusions and future work. Some of the chapters introduce specific case studies of various intelligent control systems; others focus on intelligent-theory-based control techniques with applications. A notable feature of this volume is that three chapters deal with intelligent control based on paraconsistent logics.