This book aims to present a viable alternative to the Hopfield Neural Network (HNN) model for analog computation. It is well known that the standard HNN suffers from convergence to local minima and requires a large number of neurons and synaptic weights; improved solutions are therefore needed. The non-linear synapse neural network (NoSyNN) is one such possibility and is discussed in detail in this book. The book also discusses applications to computationally intensive tasks such as graph coloring, ranking, and linear as well as quadratic programming. The material in the book is useful to students, researchers and academicians working in the area of analog computation.
Proceedings of the 2013 Chinese Intelligent Automation Conference presents selected research papers from CIAC'13, held in Yangzhou, China. Topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, and reconfigurable control. Engineers and researchers from academia, industry, and government can gain an inside view of new solutions combining ideas from multiple disciplines in the field of intelligent automation. Zengqi Sun and Zhidong Deng are professors at the Department of Computer Science, Tsinghua University, China.
This book contains the full papers presented at the MICCAI 2013 workshop on Bio-Imaging and Visualization for Patient-Customized Simulations (MWBIVPCS 2013). MWBIVPCS 2013 brought together researchers representing several fields, such as Biomechanics, Engineering, Medicine, Mathematics, Physics and Statistics. The contributions included in this book present and discuss new trends in those fields, using several methods and techniques, including the finite element method, similarity metrics, optimization processes, graphs, hidden Markov models, sensor calibration, fuzzy logic, data mining, cellular automata, active shape models, template matching and level sets. These serve as tools to address a range of timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery, and image-based diagnosis. This book will appeal to researchers, PhD students and graduate students with multidisciplinary interests related to the areas of medical imaging, image processing and analysis, computer vision, image segmentation, image registration and fusion, scientific data visualization, and image-based modeling and simulation.
Case-based reasoning paradigms offer automatic reasoning capabilities which are useful for the implementation of human-like machines in a limited sense. This research book is the second volume in a series devoted to presenting case-based reasoning (CBR) applications. The first volume, published in 2010, testified to the flexibility of CBR and its applicability in all those fields where experiential knowledge is available. This second volume further witnesses the heterogeneity of the domains in which CBR can be exploited, but also reveals some common directions that have clearly emerged in recent years. This book will prove useful to application engineers, scientists, professors and students who wish to develop successful case-based reasoning applications.
The field of data mining has made significant and far-reaching advances over the past three decades. Because of its potential power for solving complex problems, data mining has been successfully applied to diverse areas such as business, engineering, social media, and biological science. Many of these applications search for patterns in complex structural information. In biomedicine, for example, modeling complex biological systems requires linking knowledge across many levels of science, from genes to disease. Further, the data characteristics of the problems have grown from static to dynamic and spatiotemporal, from complete to incomplete, and from centralized to distributed, and have grown in scope and size (this is known as big data). The effective integration of big data for decision-making also requires privacy preservation. The contributions to this monograph summarize the advances of data mining in the respective fields. This volume consists of nine chapters that address subjects ranging from opinion mining, spatiotemporal databases, discriminative subgraph patterns, path knowledge discovery, social media, and privacy issues to computation reduction via binary matrix factorization.
The book describes a system for visual surveillance using intelligent cameras. The cameras use robust techniques for detecting and tracking moving objects. Real-time captures of the objects are stored in a database, and the stored tracking data is analysed to study the camera view, detect and track objects, and study object behavior. This set of models provides a robust framework for coordinating the tracking of objects between overlapping and non-overlapping cameras, and for recording the activity of objects detected by the system.
In this thesis, the author explains the background of problems in quantum estimation and the necessary conditions required for estimation precision benchmarks that are applicable and meaningful for evaluating data in quantum information experiments, and provides examples of such benchmarks. The author develops mathematical methods in quantum estimation theory and uses them to analyze the benchmarks in tests of Bell-type correlation and quantum tomography. Above all, a set of explicit formulae for evaluating the estimation precision in quantum tomography with finite data sets is derived, in contrast to standard quantum estimation theory, which can deal only with infinite samples. This is the first result directly applicable to the evaluation of estimation errors in quantum tomography experiments, allowing experimentalists to guarantee estimation precision and to verify quantitatively that their preparation is reliable.
The aim of the book is to give an accessible introduction to mathematical models and signal processing methods in speech and hearing sciences for senior undergraduate and beginning graduate students with basic knowledge of linear algebra, differential equations, numerical analysis, and probability. Speech and hearing sciences are fundamental to numerous technological advances of the digital world in the past decade, from music compression in MP3 to digital hearing aids, from network-based voice-enabled services to speech interaction with mobile phones. Mathematics and computation are intimately related to these leaps and bounds. On the other hand, speech and hearing are strongly interdisciplinary areas where dissimilar scientific and engineering publications and approaches often coexist, making it difficult for newcomers to enter the field.
This monograph by Florian Röhrbein, Germano Veiga and Ciro Natale is an edited collection of 15 authoritative contributions in the area of robot technology transfer between academia and industry. It comprises three parts, on Future Industrial Robotics, Robotic Grasping, and Human-Centered Robots. The book chapters cover almost all the topics nowadays considered 'hot' within the robotics community, from reliable object recognition to dexterous grasping, from speech recognition to intuitive robot programming, from mobile robot navigation to aerial robotics, from safe physical human-robot interaction to body extenders. All contributions stem from the results of ECHORD, the European Clearing House for Open Robotics Development, a large-scale integrating project funded by the European Commission within the 7th Framework Programme from 2009 to 2013. ECHORD's two main pillars were the so-called experiments, 51 small-sized industry-driven research projects, and the structured dialogue, a powerful interaction instrument between the stakeholders. The results described in this volume are expected to shed new light on innovation and technology transfer from academia to industry in the field of robotics.
This book contains extended versions of selected papers from the 3rd edition of the International Symposium CompIMAGE. These contributions cover methods of signal and image processing and analysis for tackling problems found in medicine, material science, surveillance, biometrics, robotics, defence, satellite data, traffic analysis and architecture, as well as image segmentation, 2D and 3D reconstruction, data acquisition, interpolation and registration, data visualization, motion and deformation analysis, and 3D vision.
This book describes recent radiotherapy technologies, including tools for measuring target position during radiotherapy and tracking-based delivery systems. It presents a customized prediction of respiratory motion with clustering from multiple patient interactions. The proposed method contributes to the improvement of patient treatments by considering breathing patterns for accurate dose calculation in radiotherapy systems. Real-time tumor tracking, where the prediction of irregularities becomes relevant, has yet to be clinically established. Statistical quantitative modeling for irregular breathing classification, in which commercial respiration traces are retrospectively categorized into several classes based on breathing pattern, is discussed as well. The proposed statistical classification may provide clinical advantages for adjusting the dose rate before and during external beam radiotherapy to minimize the safety margin. In the first chapter following the Introduction, we review three prediction approaches for respiratory motion: model-based methods, model-free heuristic learning algorithms, and hybrid methods. In the following chapter, we present a phantom study—prediction of human motion with distributed body sensors—using a Polhemus Liberty AC magnetic tracker. Next we describe respiratory motion estimation with a hybrid implementation of the extended Kalman filter. The given method assigns the recurrent neural network the role of the predictor and the extended Kalman filter the role of the corrector. After that, we present customized prediction of respiratory motion with clustering from multiple patient interactions. For the customized prediction, we construct the clustering based on breathing patterns of multiple patients using feature selection metrics composed of a variety of breathing features. We have evaluated the new algorithm by comparing the prediction overshoot and the tracking estimation value. The experimental results of 448 patients' breathing patterns validated the proposed irregular breathing classifier in the last chapter.
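The predictor-corrector pairing described above can be sketched in a few lines. This is a simplified illustration under stated assumptions, not the book's implementation: a linear extrapolation stands in for the recurrent neural network predictor, a scalar Kalman-style update stands in for the extended Kalman filter corrector, and the breathing trace is simulated rather than measured.

```python
import numpy as np

def predictor(history):
    # Stand-in for the RNN predictor: linear extrapolation
    # from the last two corrected estimates.
    return 2 * history[-1] - history[-2]

def kalman_correct(x_pred, z, P, Q=1e-3, R=1e-2):
    # Scalar Kalman-style correction: blend prediction and measurement z.
    P = P + Q                        # propagate estimate variance
    K = P / (P + R)                  # Kalman gain
    x_new = x_pred + K * (z - x_pred)
    P = (1 - K) * P                  # updated variance
    return x_new, P

# Simulated breathing trace: a sinusoid plus noise (hypothetical data).
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
trace = np.sin(2 * np.pi * 0.25 * t) + 0.05 * rng.standard_normal(t.size)

estimates, P = [trace[0], trace[1]], 1.0
for z in trace[2:]:
    x_pred = predictor(estimates)             # predictor step (RNN in the book)
    x_corr, P = kalman_correct(x_pred, z, P)  # corrector step (EKF in the book)
    estimates.append(x_corr)

err = np.mean((np.array(estimates) - trace) ** 2)  # mean squared tracking error
```

The corrector keeps the extrapolating predictor anchored to the measurements, which is the essential division of labor the blurb describes, though the real system predicts ahead of the measurement latency.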
This book provides readers with a thorough and systematic introduction to hesitant fuzzy theory. It presents the most recent research results and advanced methods in the field. These include: hesitant fuzzy aggregation techniques, hesitant fuzzy preference relations, hesitant fuzzy measures, hesitant fuzzy clustering algorithms and hesitant fuzzy multi-attribute decision making methods. Since their introduction by Torra and Narukawa in 2009, hesitant fuzzy sets have become more and more popular and have been used for a wide range of applications, from decision-making problems to cluster analysis, from medical diagnosis to personnel appraisal and information retrieval. This book offers a comprehensive report on the state of the art in hesitant fuzzy set theory and applications, aiming to become a reference guide for both researchers and practitioners in the area of fuzzy mathematics and other applied research fields (e.g. operations research, information science, management science and engineering) characterized by uncertain ("hesitant") information. Because of its clarity and self-contained explanations, the book can also be adopted as a textbook by graduate and advanced undergraduate students.
FSR, the International Conference on Field and Service Robotics, is the leading single-track conference on robotics for field and service applications. This book presents the results of FSR2012, the eighth conference on Field and Service Robotics, which was originally planned for 2011 with the venue of Matsushima in the Tohoku region of Japan. However, on March 11, 2011, a magnitude M9.0 earthquake occurred off the Pacific coast of Tohoku, and the resulting tsunami caused a large-scale disaster; the conference was therefore postponed by one year to July 2012. In fact, this earthquake raised issues concerning the contribution of field and service robotics technology to emergency scenarios, and a number of precious lessons were learned from the operation of robots in the resulting, very real and challenging, disaster environments. Up-to-date studies on disaster response, relief and recovery were then featured in the conference. This book offers 43 papers on a broad range of topics including: Disaster Response, Service/Entertainment Robots, Inspection/Maintenance Robots, Mobile Robot Navigation, Agricultural Robots, Robots for Excavation, Planetary Exploration, Large Area Mapping, SLAM for Outdoor Robots, and Elemental Technology for Mobile Robots.
IAENG Transactions on Engineering Technologies contains forty-nine revised and extended research articles written by prominent researchers participating in the conference. Topics covered include circuits, engineering mathematics, control theory, communications systems, systems engineering, manufacturing engineering, computational biology, chemical engineering, and industrial applications. This book presents the state of the art in engineering technologies and physical science and applications, and also serves as an excellent source of reference for researchers and graduate students working on engineering technologies and physical science and applications.
This book provides an alternative approach to studying the pre-kernel solution of transferable utility games, based on a generalized conjugation theory from convex analysis. Although the pre-kernel solution possesses an appealing axiomatic foundation that lets one consider this solution concept as a standard of fairness, the pre-kernel and its related solutions are regarded as obscure and too technically complex to be treated as a real alternative to the Shapley value. Comprehensible and efficient computability is widely regarded as a desirable feature qualifying a solution concept apart from its axiomatic foundation as a standard of fairness. We review and then improve an approach to computing the pre-kernel of a cooperative game by the indirect function. The indirect function is known as the Fenchel-Moreau conjugation of the characteristic function. Extending the approach with the indirect function, we are able to characterize the pre-kernel of the grand coalition simply by the solution sets of a family of quadratic objective functions.
The methodology described in this book is the result of many years of research experience in the field of synthesizable VHDL design targeting FPGA-based platforms. VHDL was first conceived as a documentation language for ASIC designs. Afterwards, the language was used for the behavioral simulation of ASICs, and also as a design input for synthesis tools. VHDL is a rich language, but only a small subset of it can be used to write synthesizable code, from which a physical circuit can be obtained. VHDL books usually describe both the synthesis and simulation aspects of the language, but in this book the reader is guided only through the features accepted by synthesis tools. The book introduces the subjects in a gradual and concise way, providing just enough information for the reader to develop their synthesizable digital systems in VHDL. The examples in the book were planned targeting an FPGA platform widely used around the world.
The intensity of global competition and ever-increasing economic uncertainties have led organizations to search for more efficient and effective ways to manage their business operations. Data envelopment analysis (DEA) has been widely used as a conceptually simple yet powerful tool for evaluating organizational productivity and performance. Fuzzy DEA (FDEA) is a promising extension of conventional DEA proposed for dealing with imprecise and ambiguous data in performance measurement problems. This book is the first volume in the literature to present the state-of-the-art developments and applications of FDEA. It is designed for students, educators, researchers, consultants and practicing managers in business, industry, and government with a basic understanding of DEA and fuzzy logic concepts.
This volume provides the audience with updated, in-depth and highly coherent material on the conceptually appealing and practically sound information technology of Computational Intelligence applied to the analysis, synthesis and evaluation of social networks. The volume involves studies devoted to key issues of social networks, including community structure detection in networks, online social networks, knowledge growth and evaluation, and diversity of collaboration mechanisms. The book engages a wealth of methods of Computational Intelligence along with well-known techniques of linear programming, Formal Concept Analysis, machine learning, and agent modeling. Human-centricity is of paramount relevance, and this facet manifests in many ways, including personalized semantics, trust metrics, and personal knowledge management, to highlight just a few of these aspects. The contributors to this volume report on various essential applications, including cyber attack detection, building enterprise social networks, business intelligence and forming collaboration schemes. Given the subject area, this book is aimed at a broad audience of researchers and practitioners. Owing to the nature of the material being covered and the way it is organized, the volume will appeal to well-established communities, including those active in various disciplines in which social networks, their analysis and optimization are of genuine relevance. Those involved in operations research, management, various branches of engineering, and economics will benefit from exposure to the subject matter.
Forecasting is a crucial function for companies in the fashion industry, but for many real-life forecasting applications in the industry, the data patterns are notorious for being highly volatile, and it is very difficult, if not impossible, to analytically learn the underlying patterns. As a result, many traditional methods (such as pure statistical models) fail to make a sound prediction. Over the past decade, advances in artificial intelligence and computing technologies have provided an alternative way of generating precise and accurate forecasting results for fashion businesses. Despite being an important and timely topic, there is currently an absence of a comprehensive reference source that provides up-to-date theoretical and applied research findings on the subject of intelligent fashion forecasting systems. This three-part handbook fulfills this need and covers material ranging from introductory studies and technical reviews, through theoretical modeling research, to intelligent fashion forecasting applications and analysis. This book is suitable for academic researchers, graduate students, senior undergraduate students and practitioners who are interested in the latest research on fashion forecasting.
This book is concerned with recent advances in fitness landscapes. The concept of fitness landscapes originates from theoretical biology and refers to a framework for analysing and visualizing the relationships between genotypes, phenotypes and fitness. These relationships lie at the centre of attempts to mathematically describe evolutionary processes and evolutionary dynamics. The book addresses recent advances in the understanding of fitness landscapes in evolutionary biology and evolutionary computation. In the volume, experts in the field of fitness landscapes present these findings in an integrated way to make them accessible to a number of audiences: senior undergraduate and graduate students in computer science, theoretical biology, physics, applied mathematics and engineering, but also researchers looking for a reference and/or an entry point into using fitness landscapes for analysing algorithms. Practitioners wanting to employ fitness landscape techniques for evaluating bio- and nature-inspired computing algorithms can also find valuable material in the book. For teaching purposes, the book could also be used as a reference handbook.
This book, now in its third edition, addresses the main control aspects of underwater manipulation tasks. The mathematical model with significant impact on the control strategy is discussed. The problem of controlling a 6-degree-of-freedom autonomous underwater vehicle is investigated in depth, and a survey of fault detection/tolerant strategies for unmanned underwater vehicles is provided. Inverse kinematics, dynamic and interaction control for underwater vehicle-manipulator systems are then discussed. The code used to generate most of the numerical simulations is made available and briefly discussed.
The aim of this book is to convey the state-of-the-art theoretical and practical advances of swarm intelligence. It comprises seven contemporary, relevant chapters. In chapter 1, a review of Bacteria Foraging Optimization (BFO) techniques for both single- and multiple-criterion problems is presented. A survey on swarm intelligence for multi- and many-objective optimization is presented in chapter 2, along with a topical study on EEG signal analysis. A comparative study of variants of MOPSO, backed by an extensive simulation study, is provided in chapter 3. Intractable problems like subset and job scheduling problems are discussed in chapters 4 and 7 using different hybrid swarm intelligence techniques. An attempt to study image enhancement by ant colony optimization is made in chapter 5. Finally, chapter 7 covers the aspect of uncertainty in data via hybrid PSO.
Propagation, which looks at spreading in complex networks, can be seen from many viewpoints: it may be undesirable or desirable, controllable or not, and the mechanisms generating the propagation can themselves be the topic of interest; in the end, all depends on the setting. This book covers leading research on a wide spectrum of propagation phenomena and the techniques currently used in their modelling, prediction, analysis and control. Fourteen papers range over topics including epidemic models, models for trust inference, coverage strategies for networks, vehicle flow propagation, bio-inspired routing algorithms, P2P botnet attacks and defences, fault propagation in gene-cellular networks, malware propagation for mobile networks, information propagation in crisis situations, financial contagion in interbank networks, and finally how to maximize the spread of influence in social networks. The compendium will be of interest to researchers and those working in social networking, communications and finance, and is aimed at providing a base point for further studies on current research. Above all, by bringing together research from such diverse fields, the book seeks to cross-pollinate ideas and give the reader a glimpse of the breadth of current research.
The contributions in this volume focus on the Bayesian interpretation of natural languages, which is widely used in areas of artificial intelligence, cognitive science, and computational linguistics. This is the first volume to take up topics in Bayesian Natural Language Interpretation and make proposals based on information theory, probability theory, and related fields. The methodologies offered here extend to semantic and pragmatic analyses in computational natural language interpretation. Bayesian approaches to natural language semantics and pragmatics are based on methods from signal processing and the causal Bayesian models pioneered especially by Pearl. In signal processing, the Bayesian method finds the most probable interpretation as the one that maximizes the product of the prior probability and the likelihood of the interpretation. It thus stresses the importance of a production model for interpretation, as in Grice's contributions to pragmatics or in interpretation by abduction.
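The decision rule described above, choosing the interpretation that maximizes prior times likelihood, can be sketched in a few lines. The interpretations and probability values below are hypothetical illustration values, not drawn from the book:

```python
# P(interpretation): prior plausibility of each reading (hypothetical values).
priors = {
    "literal": 0.7,
    "ironic": 0.3,
}
# P(utterance | interpretation): the production model, i.e. how likely a
# speaker with that intent would produce this utterance (hypothetical values).
likelihoods = {
    "literal": 0.2,
    "ironic": 0.9,
}

def best_interpretation(priors, likelihoods):
    # argmax over interpretations of prior * likelihood,
    # i.e. the unnormalized posterior from Bayes' rule.
    return max(priors, key=lambda i: priors[i] * likelihoods[i])

choice = best_interpretation(priors, likelihoods)
# literal: 0.7 * 0.2 = 0.14; ironic: 0.3 * 0.9 = 0.27 -> "ironic"
```

Even though the literal reading is a priori more plausible, the production model makes the utterance far more likely under the ironic intent, so the posterior favors it; this is the sense in which the production model drives interpretation.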