Data-driven Artificial Intelligence (AI) and Machine Learning (ML) in digital pathology, radiology, and dermatology are very promising; in specific cases, for example Deep Learning (DL), they even exceed human performance. However, in the context of medicine it is important for a human expert to verify the outcome. Consequently, there is a need for transparency and traceability in state-of-the-art solutions to make them usable for ethically responsible medical decision support. Moreover, big data is required for training, covering a wide spectrum of human diseases in different organ systems. These data sets must meet top-quality and regulatory criteria and must be well annotated for ML at the patient, sample, and image level. Here biobanks play a central role, now and in the future, in providing large collections of high-quality, well-annotated samples and data. The main challenges are finding biobanks containing "fit-for-purpose" samples, providing quality-related meta-data, gaining access to standardized medical data and annotations, and mass scanning of whole slides, including efficient data management solutions.
This book presents the latest advances in remote sensing and geographic information systems and their applications. It is divided into four parts, focusing on Airborne Light Detection and Ranging (LiDAR) and Optical Measurements of Forests; Individual Tree Modelling; Landscape Scene Modelling; and Forest Ecosystem Modelling. Given the scope of its coverage, the book offers a valuable resource for students, researchers, practitioners, and educators interested in remote sensing and geographic information systems and their applications.
This book presents the algorithms used to provide recommendations by exploiting matrix factorization and tensor decomposition techniques. It highlights well-known decomposition methods for recommender systems, such as Singular Value Decomposition (SVD), UV-decomposition, Non-negative Matrix Factorization (NMF), etc., and describes in detail the pros and cons of each method for matrices and tensors. The book provides a detailed theoretical mathematical background of matrix/tensor factorization techniques and a step-by-step analysis of each method on the basis of an integrated toy example that runs throughout all its chapters and helps the reader to understand the key differences among methods. It also contains two chapters in which different matrix and tensor methods are compared experimentally on real data sets, such as Epinions, GeoSocialRec, Last.fm, BibSonomy, etc., providing further insights into the advantages and disadvantages of each method. The book offers a rich blend of theory and practice, making it suitable for students, researchers and practitioners interested in both recommender systems and factorization methods. Lecturers can also use it for classes on data mining, recommender systems and dimensionality reduction methods.
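The factorization idea behind such recommenders can be sketched in a few lines: a low-rank truncated SVD of a sparse ratings matrix yields a dense approximation whose entries serve as rating predictions. The ratings matrix, rank, and indices below are invented for illustration and are not the book's running toy example:

```python
import numpy as np

# Hypothetical user-item ratings matrix (rows: users, cols: items);
# 0.0 marks an unrated entry.
R = np.array([
    [5.0, 3.0, 0.0, 1.0],
    [4.0, 0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0, 5.0],
    [1.0, 0.0, 0.0, 4.0],
    [0.0, 1.0, 5.0, 4.0],
])

# Rank-2 truncated SVD: R ~ U_k * diag(s_k) * V_k^T gives a dense
# low-rank approximation usable as a table of predicted ratings.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Predicted score for user 0 on the third (unrated) item.
print(round(float(R_hat[0, 2]), 3))
```

In practice one would factorize only the observed entries (as in the book's UV-decomposition and NMF discussions), but the truncated-SVD sketch shows the core low-rank reconstruction step.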
This book commemorates the 65th birthday of Dr. Boris Kovalerchuk, and reflects many of the research areas covered by his work. It focuses on data processing under uncertainty, especially fuzzy data processing, when uncertainty comes from the imprecision of expert opinions. The book includes 17 authoritative contributions by leading experts.
This book presents the latest insights and developments in the field of socio-cultural inspired algorithms. Akin to evolutionary and swarm-based optimization algorithms, socio-cultural algorithms belong to the category of metaheuristics (problem-independent computational methods) and are inspired by natural and social tendencies observed in humans by which they learn from one another through social interactions. This book is an interesting read for engineers, scientists, and students studying/working in the optimization, evolutionary computation, artificial intelligence (AI) and computational intelligence fields.
We delight in using our eyes, particularly when puzzling over pictures. Art and Illusionists is a celebration of pictures and the multiple modes of manipulating them to produce illusory worlds on flat surfaces. This has proved fascinating to humankind since the dawning of depiction. Art and Illusionists is also a celebration of the ways we see pictures, and of our ability to distil meaning from arrays of contours and colours. Pictures are a source of fascination not only for artists, who produce them, but also for scientists, who analyse the perceptual effects they induce. Illusions provide the glue to cement the art and science of vision. Painters plumb the art of observation itself, whereas scientists peer into the processes of perception. Both visual artists and scientists have produced patterns that perplex our perceptions and present us with puzzles that we are pleased to peruse. Art and Illusionists presents these two poles of pictorial representation, as well as novel 'perceptual portraits' of the artists and scientists who have augmented the art of illusion. The reader can experience the paradoxes of pictures as well as produce their own, using the stereoscopic glasses enclosed and the transparent overlay for making dynamic moire patterns.
This book includes papers presented at the Second International Conference on Electronic Engineering and Renewable Energy (ICEERE 2020), which focus on the application of artificial intelligence techniques, emerging technology and the Internet of things in electrical and renewable energy systems, including hybrid systems, micro-grids, networking, smart health applications, smart grid, mechatronics and electric vehicles. It particularly focuses on new renewable energy technologies for agricultural and rural areas to promote the development of the Euro-Mediterranean region. Given its scope, the book is of interest to graduate students, researchers and practicing engineers working in the fields of electronic engineering and renewable energy.
Mismatch or best match? This book demonstrates that best matching of individual entities to each other is essential to ensure smooth conduct and successful competitiveness in any distributed system, natural or artificial. Interactions must be optimized through best matching in planning and scheduling, enterprise network design, transportation and construction planning, recruitment, problem solving, selective assembly, team formation, sensor network design, and more. Fundamentals of best matching in distributed and collaborative systems are explained by providing:
- Methodical analysis of various multidimensional best matching processes
- Comprehensive taxonomy, comparing different best matching problems and processes
- Systematic identification of systems' hierarchy, nature of interactions, and distribution of decision-making and control functions
- Practical formulation of solutions based on a library of best matching algorithms and protocols, ready for direct application and app development.
Designed for both academics and practitioners, and oriented to systems engineers and applied operations researchers, diverse types of best matching processes are explained in production, manufacturing, business and service settings, based on a new reference model developed at the Purdue University PRISM Center: "The PRISM Taxonomy of Best Matching". The book concludes with major challenges and guidelines for future basic and applied research in the area of best matching.
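As a minimal illustration of what "best matching" means formally, the one-to-one pairing of entities can be posed as an assignment problem that minimizes total mismatch cost. The workers, tasks, and cost values below are hypothetical, and the brute-force search merely stands in for the book's library of matching algorithms and protocols:

```python
from itertools import permutations

# Hypothetical mismatch costs: cost[i][j] is the cost of pairing
# entity i (e.g. a worker) with entity j (e.g. a task).
cost = [
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
]

def best_matching(cost):
    """Exhaustively find the one-to-one assignment minimizing total cost."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

assignment, total = best_matching(cost)
print(assignment, total)  # assignment[i] is the partner chosen for entity i
```

Brute force is exponential in the number of entities; polynomial-time methods such as the Hungarian algorithm solve the same problem at scale.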
The book is based on the research papers presented at the Second International Conference on Recent Advances in Information Technology (RAIT 2014), held at the Indian School of Mines, Dhanbad, India. It provides the latest developments in the area of information technology and covers a variety of topics, including Advanced Algorithm Design and Analysis, Algorithmic Graph Theory, Artificial Intelligence, Bioinformatics, Circuit Design Automation, Computational Biology, Computational Mathematics, Cryptology, Data Compression, Database Management Systems, Data Mining, E-Applications, Embedded Systems, Information and Network Security, Information Retrieval, Internet Computing, etc. The objective is to familiarize the reader with the latest scientific developments taking place in these fields, and with the sophisticated problem-solving tools being developed to deal with complex and intricate problems that are otherwise difficult to solve by traditional methods.
The focus of this book is on three influential cognitive motives: achievement, affiliation, and power motivation. Incentive-based theories of achievement, affiliation and power motivation are the basis for competence-seeking behaviour, relationship-building, leadership, and resource-controlling behaviour in humans. In this book we show how these motives can be modelled and embedded in artificial agents to achieve behavioural diversity. Theoretical issues are addressed for representing and embedding computational models of motivation in rule-based agents, learning agents, crowds and evolution of motivated agents. Practical issues are addressed for defining games, mini-games or in-game scenarios for virtual worlds in which computer-controlled, motivated agents can participate alongside human players. The book is structured into four parts: game playing in virtual worlds by humans and agents; comparing human and artificial motives; game scenarios for motivated agents; and evolution and the future of motivated game-playing agents. It will provide game programmers, and those with an interest in artificial intelligence, with the knowledge required to develop diverse, believable game-playing agents for virtual worlds.
This book thoroughly discusses computationally efficient (suboptimal) Model Predictive Control (MPC) techniques based on neural models. The subjects treated include:
- A few types of suboptimal MPC algorithms in which a linear approximation of the model or of the predicted trajectory is successively calculated on-line and used for prediction.
- Implementation details of the MPC algorithms for feedforward perceptron neural models, neural Hammerstein models, neural Wiener models and state-space neural models.
- The MPC algorithms based on neural multi-models (inspired by the idea of predictive control).
- The MPC algorithms with neural approximation, requiring no on-line linearization.
- The MPC algorithms with guaranteed stability and robustness.
- Cooperation between the MPC algorithms and set-point optimization.
Thanks to linearization (or neural approximation), the presented suboptimal algorithms do not require demanding on-line nonlinear optimization. The presented simulation results demonstrate high accuracy and computational efficiency of the algorithms. For a few representative nonlinear benchmark processes, such as chemical reactors and a distillation column, for which the classical MPC algorithms based on linear models do not work properly, the trajectories obtained in the suboptimal MPC algorithms are very similar to those given by the "ideal" MPC algorithm with on-line nonlinear optimization repeated at each sampling instant. At the same time, the suboptimal MPC algorithms are significantly less computationally demanding.
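A minimal sketch of the successive on-line linearization idea, using a toy scalar model y(k+1) = 0.8*y(k) + tanh(u(k)) that is not taken from the book: at each sampling instant the input nonlinearity is linearized around the previous control, so the one-step tracking problem can be solved in closed form instead of by nonlinear optimization:

```python
import math

# Hypothetical nonlinear plant/model: y(k+1) = 0.8*y(k) + tanh(u(k)).
def model(y, u):
    return 0.8 * y + math.tanh(u)

def mpc_step(y, u0, y_ref):
    """One suboptimal predictive-control step: linearize tanh(u) around
    the previous control u0 and solve the linearized one-step tracking
    problem analytically (no nonlinear programming)."""
    gain = 1.0 - math.tanh(u0) ** 2          # d/du tanh(u) at u0
    return u0 + (y_ref - 0.8 * y - math.tanh(u0)) / gain

y, u, y_ref = 0.0, 0.0, 0.5
for _ in range(30):                           # closed-loop simulation
    u = mpc_step(y, u, y_ref)
    y = model(y, u)

print(round(y, 4))
```

Because the linearization is refreshed at every step, the closed loop settles at the set-point even though each control move is computed from a linear approximation only, which is the essence of the suboptimal schemes described above.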
This book introduces multiagent planning under uncertainty as formalized by decentralized partially observable Markov decision processes (Dec-POMDPs). The intended audience is researchers and graduate students working in the fields of artificial intelligence related to sequential decision making: reinforcement learning, decision-theoretic planning for single agents, classical multiagent planning, decentralized control, and operations research.
The LNCS journal Transactions on Large-Scale Data- and Knowledge-Centered Systems focuses on data management, knowledge discovery, and knowledge processing, which are core and hot topics in computer science. Since the 1990s, the Internet has become the main driving force behind application development in all domains. An increase in the demand for resource sharing across different sites connected through networks has led to an evolution of data- and knowledge-management systems from centralized systems to decentralized systems enabling large-scale distributed applications providing high scalability. Current decentralized systems still focus on data and knowledge as their main resource. Feasibility of these systems relies basically on P2P (peer-to-peer) techniques and the support of agent systems with scaling and decentralized control. Synergy between grids, P2P systems, and agent technologies is the key to data- and knowledge-centered systems in large-scale environments. This volume, the 26th issue of Transactions on Large-Scale Data- and Knowledge-Centered Systems, focuses on Data Warehousing and Knowledge Discovery from Big Data, and contains extended and revised versions of four papers selected as the best papers from the 16th International Conference on Data Warehousing and Knowledge Discovery (DaWaK 2014), held in Munich, Germany, during September 1-5, 2014. The papers focus on data cube computation, the construction and analysis of a data warehouse in the context of cancer epidemiology, pattern mining algorithms, and frequent item-set border approximation.
This book covers recent advances in Complex Automated Negotiations, a widely studied emerging area in the field of Autonomous Agents and Multi-Agent Systems. The book includes selected revised and extended papers from the 7th International Workshop on Agent-Based Complex Automated Negotiation (ACAN2014), which was held in Paris, France, in May 2014. It also includes brief introductions to agent-based complex automated negotiation, based on tutorials provided at the workshop, as well as brief summaries and descriptions of the ANAC'14 (Automated Negotiating Agents Competition) competition, in which authors of selected finalist agents explain the strategies and ideas they used. The book is targeted at academic and industrial researchers in various communities of autonomous agents and multi-agent systems, such as agreement technology, mechanism design, electronic commerce and related areas, as well as at graduate, undergraduate, and PhD students working in, or interested in, those areas.
This book discusses soft computing, which provides an efficient platform to deal with imprecision, uncertainty, vagueness and approximation in order to attain robust and reliable computing. It explores two major concepts of soft computing: fuzzy set theory and neural networks, which relate to uncertainty handling and machine learning techniques respectively. Generally, fuzzy sets are considered vague or uncertain sets with membership functions taking values between 0 and 1, while an artificial neural network (ANN) is a type of artificial intelligence that attempts to imitate the way a human brain works in specific applications, for instance pattern recognition or data classification, through learning processes. The book also presents C/MATLAB programming codes related to the basics of fuzzy sets, interval arithmetic and ANNs in a concise, practical and adaptable manner, along with simple examples and, in a few cases, self-validation unsolved practice questions.
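The membership-function idea mentioned above can be shown with a short sketch (written in Python rather than the book's C/MATLAB, and with made-up set boundaries): a triangular membership function maps each crisp input to a degree of membership between 0 and 1.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a, rising to 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# "Warm" temperature as a fuzzy set peaking at 25 degrees C
# (the boundaries 15/25/35 are illustrative only).
print(triangular(25, 15, 25, 35))  # full membership at the peak
print(triangular(20, 15, 25, 35))  # partial membership on the rising edge
```

Unlike a classical (crisp) set, where 20 degrees would be either "warm" or "not warm", here it belongs to the set to degree 0.5.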
This book offers a self-study program on how mathematics, computer science and science can be profitably and seamlessly intertwined. It focuses on two-variable ODE models, both linear and nonlinear, and highlights theoretical and computational tools, using MATLAB to explain their solutions. It also shows how to solve cable models using separation of variables and Fourier series.
This book shows cognitive scientists in training how mathematics, computer science and science can be usefully and seamlessly intertwined. It is a follow-up to the first two volumes on mathematics for cognitive scientists, and includes the mathematics and computational tools needed to understand how to compute the terms in the Fourier series expansions that solve the cable equation. The latter is derived from first principles by going back to cellular biology and the relevant biophysics. The book includes a detailed discussion of ion movement through cellular membranes and an explanation of how the equations governing such ion movement lead to the standard transient cable equation. There are also solutions for the cable model using separation of variables, as well as an explanation of why Fourier series converge and a description of the MATLAB tools implemented to compute the solutions. Finally, the standard Hodgkin-Huxley model is developed for an excitable neuron and is solved using MATLAB.
This book provides a self-study program on how mathematics, computer science and science can be usefully and seamlessly intertwined. Learning to use ideas from mathematics and computation is essential for understanding approaches to cognitive and biological science. As such, the book covers calculus in one and two variables and works through a number of interesting first-order ODE models. It uses MATLAB in computational exercises where the models cannot be solved by hand, and also helps readers to understand that approximations cause errors, a fact that must always be kept in mind.
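A first-order ODE model of the kind mentioned above can be solved numerically when no closed form is at hand. This forward-Euler sketch (written in Python rather than MATLAB, with an invented test equation y' = -2y whose exact solution is known) also makes the point about approximation error concrete:

```python
import math

def euler(f, y0, t0, t1, n):
    """Forward-Euler integration of y' = f(t, y) from t0 to t1 in n steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# Test equation y' = -2y, y(0) = 1, with exact solution exp(-2*t).
approx = euler(lambda t, y: -2.0 * y, 1.0, 0.0, 1.0, 1000)
exact = math.exp(-2.0)
print(round(approx, 4), round(exact, 4))
```

The numerical and exact values differ slightly even with 1000 steps; halving the step size roughly halves the error, since forward Euler is first-order accurate.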
This book presents recent research on the recognition of vulnerabilities of national systems and assets, which has gained special attention for Critical Infrastructures over the last two decades. The book concentrates on R&D activities relating to Critical Infrastructures, focusing on enhancing both the performance of services and the level of security. The objectives of the book are based on a project entitled "Critical Infrastructure Protection Researches" (TAMOP-4.2.1.B-11/2/KMR-2011-0001), which concentrated on innovative UAV solutions, robotics, cybersecurity, surface engineering, and mechatronics, as well as on technologies providing safe operation of essential assets. The book summarizes the methodologies and efforts undertaken to fulfill the goals defined. The project was performed by a consortium of Obuda University and the National University of Public Service.
This book presents advances in alternative Evolutionary Computation (EC) developments and non-conventional operators that have proved effective in solving several complex problems. The book has been structured so that each chapter can be read independently of the others. It contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, and 9) the constrained SSO method.
This volume, the 23rd issue of Transactions on Large-Scale Data- and Knowledge-Centered Systems, focuses on information and security engineering. It contains five revised and extended papers selected from the proceedings of the First International Conference on Future Data and Security Engineering, FDSE 2014, held in Ho Chi Minh City, Vietnam, November 19-21, 2014. The titles of the five papers are as follows: A Natural Language Processing Tool for White Collar Crime Investigation; Data Leakage Analysis of the Hibernate Query Language on a Propositional Formulae Domain; An Adaptive Similarity Search in Massive Datasets; Semantic Attack on Anonymized Transactions; and Private Indexes for Mixed Encrypted Databases.
Cognitive Neuroscience Robotics is the first introductory book on this new interdisciplinary area. This book consists of two volumes, the first of which, Synthetic Approaches to Human Understanding, advances human understanding from a robotics or engineering point of view. The second, Analytic Approaches to Human Understanding, addresses related subjects in cognitive science and neuroscience. These two volumes are intended to complement each other in order to more comprehensively investigate human cognitive functions, to develop human-friendly information and robot technology (IRT) systems, and to understand what kind of beings we humans are. Volume A describes how human cognitive functions can be replicated in artificial systems such as robots, and investigates how artificial systems could acquire intelligent behaviors through interaction with others and their environment.
This volume is an initiative undertaken by the IEEE Computational Intelligence Society's Task Force on Security, Surveillance and Defense to consolidate and disseminate the role of CI techniques in the design, development and deployment of security and defense solutions. Applications range from the detection of buried explosive hazards in a battlefield to the control of unmanned underwater vehicles, the delivery of superior video analytics for protecting critical infrastructures or the development of stronger intrusion detection systems and the design of military surveillance networks. Defense scientists, industry experts, academicians and practitioners alike will all benefit from the wide spectrum of successful applications compiled in this volume. Senior undergraduate or graduate students may also discover uncharted territory for their own research endeavors.
This Springer Brief introduces wireless sensor networks (WSNs) and the need for resilient WSN operations in application domains. It presents traditional approaches to providing resilient operation of WSNs to ensure continuity of data delivery even when some network sensors fail, and discusses the pros and cons of each of these approaches. Also included is an overview of network coding basics, motivating the use of network coding-based protection in order to combine the advantages, while avoiding the disadvantages, of the traditional approaches. The authors cover the design and analysis of a centralized approach to network coding-based protection of WSNs. The coverage also includes practical and realistic network situations and the coding strategies employed. Next discussed is how to recover from data losses using a distributed approach, which is more practical for large-scale WSNs. Algorithms for scheduling transmissions when implementing network coding-based protection are covered for the two cases of digital network coding and analog network coding. Resilient Wireless Sensor Networks offers concise but in-depth coverage of the use of network coding to implement agile and resource-efficient protection in WSNs. It will be of interest to professionals and researchers working in the areas of fault tolerance, network coding, and deployment of WSNs in harsh environments. Advanced-level students in electrical engineering and computer science will also find this research valuable as a reference.