Sensorimotor integration, the dynamic process by which the sensory and motor systems communicate with each other, is crucial to humans' and animals' ability to explore and react to their environment. This book summarizes the main aspects of our current understanding of sensorimotor integration in 10 chapters written by leading scientists in this active and ever-growing field. This volume focuses on the whisker system, which is an exquisite model to experimentally approach sensorimotor integration in the mammalian brain. In this book, authors examine the whisker system on many different levels, ranging from the building blocks and neuronal circuits to sensorimotor behavior. Neuronal coding strategies, comparative analysis as well as robotics illustrate the multiple facets of this research and its broad impact on fundamental questions about the neurobiology of the mammalian brain.
This book explores the significant role of granular computing in advancing machine learning towards in-depth processing of big data. It begins by introducing the main characteristics of big data, i.e., the five Vs: Volume, Velocity, Variety, Veracity and Variability. The book explores granular computing as a response to the fact that learning tasks have become increasingly more complex due to the vast and rapid increase in the size of data, and that traditional machine learning has proven too shallow to adequately deal with big data. Some popular types of traditional machine learning are presented in terms of their key features and limitations in the context of big data. Further, the book discusses why granular-computing-based machine learning is called for, and demonstrates how granular computing concepts can be used in different ways to advance machine learning for big data processing. Several case studies involving big data are presented by using biomedical data and sentiment data, in order to show the advances in big data processing through the shift from traditional machine learning to granular-computing-based machine learning. Finally, the book stresses the theoretical significance, practical importance, methodological impact and philosophical aspects of granular-computing-based machine learning, and suggests several further directions for advancing machine learning to fit the needs of modern industries. This book is aimed at PhD students, postdoctoral researchers and academics who are actively involved in fundamental research on machine learning or applied research on data mining and knowledge discovery, sentiment analysis, pattern recognition, image processing, computer vision and big data analytics. It will also benefit a broader audience of researchers and practitioners who are actively engaged in the research and development of intelligent systems.
The Semantic Web is characterized by the existence of a very large number of distributed semantic resources, which together define a network of ontologies. These ontologies in turn are interlinked through a variety of different meta-relationships such as versioning, inclusion, and many more. This scenario is radically different from the relatively narrow contexts in which ontologies have been traditionally developed and applied, and thus calls for new methods and tools to effectively support the development of novel network-oriented semantic applications. This book by Suarez-Figueroa et al. provides the necessary methodological and technological support for the development and use of ontology networks, which ontology developers need in this distributed environment. After an introduction, in its second part the authors describe the NeOn Methodology framework. The book's third part details the key activities relevant to the ontology engineering life cycle. For each activity, a general introduction, methodological guidelines, and practical examples are provided. The fourth part then presents a detailed overview of the NeOn Toolkit and its plug-ins. Lastly, case studies from the pharmaceutical and the fishery domain round out the work. The book primarily addresses two main audiences: students (and their lecturers) who need a textbook for advanced undergraduate or graduate courses on ontology engineering, and practitioners who need to develop ontologies in particular or Semantic Web-based applications in general. Its educational value is maximized by its structured approach to explaining guidelines and combining them with case studies and numerous examples. The description of the open source NeOn Toolkit provides an additional asset, as it allows readers to easily evaluate and apply the ideas presented.
This book highlights technical advances in knowledge management and their applications across a diverse range of domains. It explores the applications of knowledge computing methodologies in image processing, pattern recognition, health care and industrial contexts. The chapters also examine the knowledge engineering process involved in information management. Given its interdisciplinary nature, the book covers methods for identifying and acquiring valid, potentially useful knowledge sources. The ideas presented in the respective chapters illustrate how to effectively apply the perspectives of knowledge computing in specialized domains.
This book is an authoritative handbook of current topics, technologies and methodological approaches that may be used for the study of scholarly impact. The included methods cover a range of fields such as statistical sciences, scientific visualization, network analysis, text mining, and information retrieval. The techniques and tools enable researchers to investigate metric phenomena and to assess scholarly impact in new ways. Each chapter offers an introduction to the selected topic and outlines how the topic, technology or methodological approach may be applied to metrics-related research. Comprehensive and up-to-date, Measuring Scholarly Impact: Methods and Practice is designed for researchers and scholars interested in informetrics, scientometrics, and text mining. The hands-on perspective is also beneficial to advanced-level students in fields from computer science and statistics to information science.
Walmsley offers a succinct introduction to major philosophical issues in artificial intelligence for advanced students of philosophy of mind, cognitive science and psychology. Whilst covering essential topics, the book also provides the student with the chance to engage with cutting-edge debates.
Complex Automated Negotiations represent an important, emerging area in the field of Autonomous Agents and Multi-Agent Systems. Automated negotiations can be complex, since there are many factors that characterize such negotiations. These factors include the number of issues, dependencies between these issues, representation of utilities, the negotiation protocol, the number of parties in the negotiation (bilateral or multi-party), time constraints, etc. Software agents can support automation or simulation of such complex negotiations on behalf of their owners, and can provide them with efficient bargaining strategies. To realize such complex automated negotiation, we have to incorporate advanced Artificial Intelligence technologies, including search, CSP, graphical utility models, Bayes nets, auctions, utility graphs, and predicting and learning methods. Applications could include e-commerce tools, decision-making support tools, negotiation support tools, collaboration tools, etc. This book aims to provide a description of the new trends in Agent-based, Complex Automated Negotiation, based on papers from leading researchers. Moreover, it gives an overview of the latest scientific efforts in this field, such as the platforms and strategies of automated negotiating techniques.
The ink and stylus tablets discovered at the Roman fort of Vindolanda are a unique resource for scholars of ancient history. However, the stylus tablets in particular are extremely difficult to read. This book details the development of what appears to be the first system constructed to aid experts in the process of reading an ancient document, exploring the extent to which techniques from Artificial Intelligence can be used to develop a system that could aid historians in reading the stylus texts. Image to Interpretation includes a model of how experts read ancient texts, a corpus of letter forms from the Vindolanda texts, and a detailed description of the architecture of the system. It will be of interest to papyrologists, researchers in Roman history and palaeography, computer and engineering scientists working in the field of Artificial Intelligence and image processing, and those interested in the use of computing in the humanities.
The need for video compression in the modern age of visual communication cannot be over-emphasized. This monograph provides useful information to postgraduate students and researchers who wish to work in the domain of VLSI design for video processing applications. In this book, one can find an in-depth discussion of several motion estimation algorithms and their VLSI implementation as conceived and developed by the authors. It records an account of research done involving the fast three step search, successive elimination, one-bit transformation and its effective combination with diamond search and dynamic pixel truncation techniques. Two appendices provide a number of instances of proof of concept through Matlab and Verilog program segments. In this respect, the book can be considered the first of its kind. The architectures have been developed with an eye to their applicability in everyday low-power handheld appliances, including video camcorders and smartphones.
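The three step search mentioned above can be illustrated with a brief, hypothetical sketch (in Python rather than the Matlab/Verilog of the book's appendices; all names here are illustrative, not the authors' code): block-matching motion estimation that refines a displacement coarse-to-fine, halving the search step each round instead of scanning every candidate exhaustively.

```python
# Illustrative sketch of three step search (TSS) block matching.
# The cost function is the sum of absolute differences (SAD).

def sad(cur, ref, bx, by, dx, dy, n=8):
    """SAD between the n x n block of `cur` at (bx, by) and the
    block of `ref` displaced by (dx, dy)."""
    total = 0
    for y in range(n):
        for x in range(n):
            total += abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx])
    return total

def three_step_search(cur, ref, bx, by, n=8, step=4):
    """Test 9 candidates around the current best displacement, then
    halve the step (4, 2, 1) and repeat around the new best point."""
    best_dx = best_dy = 0
    best = sad(cur, ref, bx, by, 0, 0, n)
    h, w = len(ref), len(ref[0])
    while step >= 1:
        cx, cy = best_dx, best_dy  # centre of this round's 3 x 3 pattern
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                cdx, cdy = cx + dx, cy + dy
                # keep the displaced block inside the reference frame
                if 0 <= by + cdy and by + cdy + n <= h and 0 <= bx + cdx and bx + cdx + n <= w:
                    cost = sad(cur, ref, bx, by, cdx, cdy, n)
                    if cost < best:
                        best, best_dx, best_dy = cost, cdx, cdy
        step //= 2
    return best_dx, best_dy

# Toy check: matching a frame against itself yields zero motion.
frame = [[(x * 31 + y * 17 + x * y) % 251 for x in range(16)] for y in range(16)]
motion = three_step_search(frame, frame, bx=4, by=4)
```

With a step sequence of 4, 2, 1 the search evaluates at most 25 candidates per block instead of the 81 of an exhaustive scan over the same window, which is why such algorithms suit low-power VLSI implementations.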
The book reports on the latest advances and applications of nonlinear control systems. It consists of 30 contributed chapters by subject experts specializing in the various topics addressed in this book. The chapters cover broad areas of nonlinear control systems, such as robotics, nonlinear circuits, power systems, memristors, underwater vehicles, chemical processes, observer design, output regulation, backstepping control, sliding mode control, time-delayed control, variable structure control, robust adaptive control, fuzzy logic control, chaos, hyperchaos, jerk systems, hyperjerk systems, chaos control, chaos synchronization, etc. Special importance was given to chapters offering practical solutions, modeling and novel control methods for recent research problems in nonlinear control systems. This book will serve as a reference for graduate students and researchers with a basic knowledge of electrical and control systems engineering. The design procedures for the nonlinear control systems are illustrated using MATLAB software.
This book focuses on computational intelligence techniques and their applications, fast-growing and promising research topics that have drawn a great deal of attention from researchers over the years. It brings together many different aspects of the current research on intelligence technologies such as neural networks, support vector machines, fuzzy logic and evolutionary computation, and covers a wide range of applications from pattern recognition and system modeling to intelligent control problems and biomedical applications. Fundamental concepts and essential analysis of various computational techniques are presented to offer a systematic and effective tool for better treatment of different applications, and simulation and experimental results are included to illustrate the design procedure and the effectiveness of the approaches.
This book provides a description of advanced multi-agent and artificial intelligence technologies for the modeling and simulation of complex systems, as well as an overview of the latest scientific efforts in this field. A complex system features a large number of interacting components, whose aggregate activities are nonlinear and self-organized. A multi-agent system is a group or society of agents which interact with others cooperatively and/or competitively in order to reach their individual or common goals. Multi-agent systems are suitable for modeling and simulation of complex systems, which is difficult to accomplish using traditional computational approaches.
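The idea that aggregate, self-organized behaviour emerges from local agent interactions can be shown with a minimal, hypothetical sketch (not from the book; all names are illustrative): agents that repeatedly average their state with a randomly chosen partner converge to a common value that no single agent computed.

```python
# Toy multi-agent simulation: pairwise averaging ("gossip") among agents.
# Each interaction is purely local, yet the population self-organizes
# towards the global average of the initial states.

import random

class Agent:
    def __init__(self, value):
        self.value = value

    def interact(self, other):
        # Local, pairwise rule: both agents move to their mutual average.
        mean = (self.value + other.value) / 2
        self.value = other.value = mean

def simulate(values, steps, seed=0):
    rng = random.Random(seed)
    agents = [Agent(v) for v in values]
    for _ in range(steps):
        a, b = rng.sample(agents, 2)   # pick a random interacting pair
        a.interact(b)
    return [a.value for a in agents]

# Four agents starting far apart; their mean (15.0) is conserved by
# every interaction, so they converge to it using only local messages.
final = simulate([0.0, 10.0, 20.0, 30.0], steps=500)
```

The same agent/interaction skeleton scales to competitive or goal-directed rules, which is where traditional closed-form approaches to complex systems become difficult and simulation takes over.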
This book covers a wide spectrum of systems such as linear and nonlinear multivariable systems, as well as control problems such as disturbance, uncertainty and time-delays. Its purpose is to provide researchers and practitioners with a manual for the design and application of advanced discrete-time controllers. The book presents six different control approaches depending on the type of system and control problem. The first and second approaches are based on sliding mode control (SMC) theory and are intended for linear systems with exogenous disturbances. The third and fourth approaches are based on adaptive control theory and are aimed at linear/nonlinear systems with periodically varying parametric uncertainty or systems with input delay. The fifth approach is based on iterative learning control (ILC) theory and is aimed at uncertain linear/nonlinear systems with repeatable tasks, and the final approach is based on fuzzy logic control (FLC) and is intended for highly uncertain systems with heuristic control knowledge. Detailed numerical examples are provided in each chapter to illustrate the design procedure for each control method. A number of practical control applications are also presented to show the problem-solving process and the effectiveness of the advanced discrete-time control approaches introduced in this book.
This book covers the latest advances in Big Data technologies and provides the readers with a comprehensive review of the state-of-the-art in Big Data processing, analysis, analytics, and other related topics. It presents new models, algorithms, software solutions and methodologies, covering the full data cycle, from data gathering to their visualization and interaction, and includes a set of case studies and best practices. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data are also identified and presented throughout the book, which is intended for researchers, scholars, advanced students, software developers and practitioners working at the forefront in their field.
Is meaningful communication possible between two intelligent parties who share no common language or background? In this work, a theoretical framework is proposed in which it is possible to address when and to what extent such semantic communication is possible: such problems can be rigorously addressed by explicitly focusing on the goals of the communication. Under this framework, it is possible to show that for many goals, communication without any common language or background is possible using universal protocols. This work should be accessible to anyone with an undergraduate-level knowledge of the theory of computation. The theoretical framework presented here is of interest to anyone wishing to design systems with flexible interfaces, either among computers or between computers and their users.
This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data projections. Topics and features: discusses machine learning frameworks based on artificial neural networks, statistical learning theory and kernel-based methods, and tree-based methods; examines the application of machine learning to steady state and dynamic operations, with a focus on unsupervised learning; describes the use of spectral methods in process fault diagnosis.
This book presents the latest research advances in complex network structure analytics based on computational intelligence (CI) approaches, particularly evolutionary optimization. Most, if not all, network issues are actually optimization problems, which are mostly NP-hard and challenge conventional optimization techniques. To effectively and efficiently solve these hard optimization problems, CI-based network structure analytics offer significant advantages over conventional network analytics techniques. Meanwhile, using CI techniques may facilitate smart decision making by providing multiple options to choose from, while conventional methods can only offer a decision maker a single suggestion. In addition, CI-based network structure analytics can greatly facilitate network modeling and analysis. Moreover, employing CI techniques to resolve network issues is likely to inspire other fields of study such as recommender systems and systems biology, which will in turn expand CI's scope and applications. As a comprehensive text, the book covers a range of key topics, including network community discovery, evolutionary optimization, network structure balance analytics, network robustness analytics, community-based personalized recommendation, influence maximization, and biological network alignment. Offering a rich blend of theory and practice, the book is suitable for students, researchers and practitioners interested in network analytics and computational intelligence, both as a textbook and as a reference work.
Posited by Professor Leon Chua at UC Berkeley more than 40 years ago, memristors, nonlinear elements in electrical circuitry, are set to revolutionize computing technology. Finally discovered by scientists at Hewlett-Packard in 2008, memristors generate huge interest because they can facilitate nanoscale, real-time computer learning, as well as for their potential to serve as instant memories. This edited volume bottles some of the excitement about memristors, providing a state-of-the-art overview of neuromorphic memristor theory, as well as its technological and practical aspects. Based on work presented to specialist memristor seminars organized by the editors, the volume takes readers from a general introduction to the fundamental concepts involved, to specialized analysis of computational modeling, hardware, and applications. The latter include the ground-breaking potential of memristors in facilitating hybrid wetware-hardware technologies for in-vitro experiments. The book evinces, and devotes space to the discussion of, the socially transformative potential of memristors, which could be as pervasive as was the invention of the silicon chip: machines that learn in the style of brains are a computational Holy Grail. With contributions from key players in a fast-moving field, this edited volume is the first to cover memristors in the depth needed to trigger the further advances that surely lie around the corner.
This book contains the proceedings of the 11th FSR (Field and Service Robotics), which is the leading single-track conference on applications of robotics in challenging environments. This conference was held in Zurich, Switzerland from 12-15 September 2017. The book contains 45 full-length, peer-reviewed papers organized into a variety of topics: Control, Computer Vision, Inspection, Machine Learning, Mapping, Navigation and Planning, and Systems and Tools. The goal of the book and the conference is to report and encourage the development and experimental evaluation of field and service robots, and to generate a vibrant exchange and discussion in the community. Field robots are non-factory robots, typically mobile, that operate in complex and dynamic environments: on the ground (Earth or other planets), under the ground, underwater, in the air or in space. Service robots are those that work closely with humans to help them with their lives. The first FSR was held in Canberra, Australia, in 1997. Since that first meeting, FSR has been held roughly every two years, cycling through Asia, Americas, and Europe.
Although NP-hard problems are believed to be intractable in general, tractability results suggest that some practical instances can be solved efficiently. Combinatorial search algorithms are designed to efficiently explore the usually large solution space of these instances by reducing the search space to feasible regions and using heuristics to efficiently explore those regions. Various mathematical formalisms may be used to express and tackle combinatorial problems, among them the constraint satisfaction problem (CSP) and the propositional satisfiability problem (SAT). These algorithms, or constraint solvers, apply search space reduction through inference techniques, use activity-based heuristics to guide exploration, diversify the searches through frequent restarts, and often learn from their mistakes. In this book the author focuses on knowledge sharing in combinatorial search, the capacity to generate and exploit meaningful information, such as redundant constraints, heuristic hints, and performance measures, during search, which can dramatically improve the performance of a constraint solver. Information can be shared between multiple constraint solvers simultaneously working on the same instance, or information can help achieve good performance while solving a large set of related instances. In the first case, information sharing has to be balanced against the underlying search effort, since a solver has to stop its main effort to prepare and communicate the information to other solvers; on the other hand, not sharing information can incur a cost for the whole system, with solvers potentially exploring infeasible spaces already discovered by other solvers. In the second case, sharing performance measures can be done with little overhead, and the goal is to be able to tune a constraint solver in relation to the characteristics of a new instance - this corresponds to the selection of the most suitable algorithm for solving a given instance.
The book is suitable for researchers, practitioners, and graduate students working in the areas of optimization, search, constraints, and computational complexity.
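The inference and heuristic ingredients of a constraint solver described above can be sketched in miniature (a hypothetical illustration, not code from the book): a backtracking CSP solver combining forward checking (search-space reduction) with a most-constrained-variable heuristic (guided exploration).

```python
# Minimal backtracking CSP solver with forward checking and a
# most-constrained-variable heuristic (illustrative sketch only).

def solve(variables, domains, constraint):
    """variables: list of names; domains: dict name -> set of values;
    constraint(a, va, b, vb) -> bool for every pair of variables."""
    def backtrack(assignment, domains):
        if len(assignment) == len(variables):
            return assignment
        # Heuristic: branch on the variable with the fewest remaining values.
        var = min((v for v in variables if v not in assignment),
                  key=lambda v: len(domains[v]))
        for val in sorted(domains[var]):
            # Inference (forward checking): prune values of the other
            # unassigned variables that conflict with this choice.
            pruned = {v: {w for w in domains[v] if constraint(var, val, v, w)}
                      for v in variables if v not in assignment and v != var}
            pruned[var] = {val}
            # If any domain is wiped out, this value is a dead end.
            if all(pruned.get(v, domains[v]) for v in variables if v not in assignment):
                result = backtrack({**assignment, var: val}, {**domains, **pruned})
                if result is not None:
                    return result
        return None  # backtrack
    return backtrack({}, domains)

# Toy instance: 3-colour a triangle graph (adjacent vertices must differ).
edges = {("A", "B"), ("B", "C"), ("A", "C")}
def differ(a, va, b, vb):
    return va != vb if (a, b) in edges or (b, a) in edges else True

colouring = solve(["A", "B", "C"], {v: {0, 1, 2} for v in "ABC"}, differ)
```

The knowledge-sharing theme of the book goes a step further: the pruned domains and conflicts such a solver discovers are exactly the kind of information that could be communicated to other solvers working on the same instance.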
This collection of papers, published in honour of Hector J. Levesque on the occasion of his 60th birthday, addresses a number of core areas in the field of knowledge representation and reasoning. In a broad sense, the book is about knowledge and belief, tractable reasoning, and reasoning about action and change. More specifically, the book contains contributions to Description Logics, the expressiveness of knowledge representation languages, limited forms of inference, satisfiability (SAT), the logical foundations of BDI architectures, only-knowing, belief revision, planning, causation, the situation calculus, the action language Golog, and cognitive robotics.
Computer and Machine Vision: Theory, Algorithms, Practicalities (previously entitled Machine Vision) clearly and systematically presents the basic methodology of computer and machine vision, covering the essential elements of the theory while emphasizing algorithmic and practical design constraints. This fully revised fourth edition has brought in more of the concepts and applications of computer vision, making it a very comprehensive and up-to-date tutorial text suitable for graduate students, researchers and R&D engineers working in this vibrant subject. Key features include: Practical examples and case studies give the ins and outs of developing real-world vision systems, showing engineers the realities of implementing the principles in practice. New chapters containing case studies on surveillance and driver assistance systems give practical methods on these cutting-edge applications in computer vision. Necessary mathematics and essential theory are made approachable by careful explanations and well-illustrated examples. Updated content and new sections cover topics such as human iris location, image stitching, line detection using RANSAC, performance measures, and hyperspectral imaging. The recent developments section now included in each chapter will be useful in bringing students and practitioners up to date with the subject. Roy Davies is Emeritus Professor of Machine Vision at Royal Holloway, University of London. He has worked on many aspects of vision, from feature detection to robust, real-time implementations of practical vision tasks. His interests include automated visual inspection, surveillance, vehicle guidance and crime detection.
He has published more than 200 papers, and three books - Machine Vision: Theory, Algorithms, Practicalities (1990), Electronics, Noise and Signal Recovery (1993), and Image Processing for the Food Industry (2000); the first of these has been widely used internationally for more than 20 years, and is now out in this much enhanced fourth edition. Roy holds a DSc from the University of London, and has been awarded Distinguished Fellow of the British Machine Vision Association, and Fellow of the International Association of Pattern Recognition.
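Line detection using RANSAC, one of the topics listed above, can be sketched in a few lines (a hypothetical illustration, not code from the book): repeatedly hypothesize a line through two random points and keep the hypothesis that agrees with the most data.

```python
# Illustrative RANSAC line fitting: robust to gross outliers because
# each hypothesis needs only a 2-point sample, and outliers rarely
# produce a line with many inliers.

import random

def ransac_line(points, iters=200, tol=0.5, seed=0):
    """Fit a 2-D line ax + by + c = 0 (with a^2 + b^2 = 1) by sampling
    point pairs and keeping the line with the most inliers within `tol`."""
    rng = random.Random(seed)
    best_line, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        a, b = y2 - y1, x1 - x2          # normal of the line through the pair
        norm = (a * a + b * b) ** 0.5
        if norm == 0:
            continue                     # degenerate sample
        a, b = a / norm, b / norm
        c = -(a * x1 + b * y1)
        inliers = [p for p in points if abs(a * p[0] + b * p[1] + c) <= tol]
        if len(inliers) > len(best_inliers):
            best_line, best_inliers = (a, b, c), inliers
    return best_line, best_inliers

# Toy data: 20 points on the line y = x plus three gross outliers.
pts = [(i, i) for i in range(20)] + [(3, 15), (17, 2), (5, 11)]
line, inliers = ransac_line(pts)
```

A least-squares fit to the same data would be dragged off the true line by the outliers; RANSAC simply never lets them vote unless they fall within the tolerance band.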
This book lies at the interface of machine learning - a subfield of computer science that develops algorithms for challenging tasks such as shape or image recognition, where traditional algorithms fail - and photonics - the physical science of light, which underlies many of the optical communications technologies used in our information society. It provides a thorough introduction to reservoir computing and field-programmable gate arrays (FPGAs). Recently, photonic implementations of reservoir computing (a machine learning algorithm based on artificial neural networks) have made a breakthrough in optical computing possible. In this book, the author pushes the performance of these systems significantly beyond what was achieved before. By interfacing a photonic reservoir computer with a high-speed electronic device (an FPGA), the author successfully interacts with the reservoir computer in real time, allowing him to considerably expand its capabilities and range of possible applications. Furthermore, the author draws on his expertise in machine learning and FPGA programming to make progress on a very different problem, namely the real-time image analysis of optical coherence tomography for atherosclerotic arteries.