These papers on Intelligent Data Analysis and Management (IDAM) examine issues in the research and application of Artificial Intelligence techniques for data analysis and management across a variety of disciplines. The papers derive from the 2013 IDAM conference in Kaohsiung, Taiwan. IDAM is an interdisciplinary research field involving academic researchers in information technologies, computer science, public policy, bioinformatics, medical informatics, and social and behavioral studies. The techniques studied include (but are not limited to) data visualization, data pre-processing, data engineering, database mining techniques, tools and applications, evolutionary algorithms, machine learning, neural nets, fuzzy logic, statistical pattern recognition, knowledge filtering, and post-processing.
This special book is dedicated to the memory of Professor Zdzislaw Pawlak, the father of rough set theory, in order to commemorate both the 10th anniversary of his passing and 35 years of rough set theory. The book consists of 20 chapters distributed into four sections, which focus in turn on a historical review of Professor Zdzislaw Pawlak and rough set theory; a review of the theory of rough sets; the state of the art of rough set theory; and major developments in rough set based data mining approaches. Beyond Professor Pawlak's contributions to rough set theory, the book also covers other areas in which he was interested. Moreover, recent theoretical studies and advances in applications are presented. The book will offer a useful guide for researchers in knowledge engineering and data mining by suggesting new approaches to solving the problems they encounter.
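Pawlak's central construction can be illustrated in a few lines of code. The sketch below is a minimal illustration with hypothetical toy data, not code from the book: it computes the lower and upper approximations of a target set from the equivalence classes induced by a set of attributes.

```python
from collections import defaultdict

def partition(objects, attrs):
    """Equivalence classes of objects indiscernible on the given attributes."""
    classes = defaultdict(set)
    for obj, values in objects.items():
        classes[tuple(values[a] for a in attrs)].add(obj)
    return list(classes.values())

def approximations(classes, target):
    """Pawlak's lower and upper approximations of the target set."""
    lower, upper = set(), set()
    for c in classes:
        if c <= target:          # class certainly inside the target
            lower |= c
        if c & target:           # class possibly inside the target
            upper |= c
    return lower, upper

# Hypothetical toy decision table: patients described by two attributes.
objects = {
    "p1": {"temp": "high", "cough": "yes"},
    "p2": {"temp": "high", "cough": "yes"},
    "p3": {"temp": "normal", "cough": "no"},
    "p4": {"temp": "high", "cough": "no"},
}
flu = {"p1", "p4"}               # the set we want to approximate
lower, upper = approximations(partition(objects, ["temp", "cough"]), flu)
```

Here p1 and p2 are indiscernible on the chosen attributes, so the set is "rough": the boundary region (upper minus lower) contains exactly those undecidable objects.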
This book provides a systematic and comprehensive description of Non-Axiomatic Logic, the result of the author's research over about three decades. Non-Axiomatic Logic is designed to provide a uniform logical foundation for Artificial Intelligence, as well as an abstract description of the "laws of thought" followed by the human mind. Unlike "mathematical" logic, which focuses on the regularity required when demonstrating mathematical conclusions, Non-Axiomatic Logic is an attempt to return to the original aim of logic: to formulate the regularity in actual human thinking. To achieve this goal, the logic is designed under the assumption that the system has insufficient knowledge and resources with respect to the problems to be solved, so that the "logical conclusions" are only valid with respect to the available knowledge and resources. Reasoning processes according to this logic cover cognitive functions such as learning, planning, decision making, and problem solving. This book is written for researchers and students in Artificial Intelligence and Cognitive Science, and can be used as a textbook for graduate or upper-level undergraduate courses on Non-Axiomatic Logic.
This authored monograph presents key aspects of signal processing analysis in the biomedical arena. Unlike wireless communication systems, biological entities produce signals whose underlying nonlinear, chaotic nature eludes classification by the standard signal processing techniques, which have been developed over the past several decades primarily for standard communication systems. This book separates what is truly random from what is deterministic yet random in appearance. At its core, this work gives the reader a perspective on biomedical signals and the means to classify and process such signals. In particular, a review of random processes, along with means to assess the behavior of random signals, is provided. The book also includes a general discussion of biological signals in order to demonstrate the inefficacy of the well-known techniques for correctly extracting meaningful information from such signals. Finally, a thorough discussion of recently proposed signal processing tools and methods for addressing biological signals is included. The target audience primarily comprises researchers and expert practitioners, but the book may also be beneficial for graduate students.
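The distinction drawn here, between signals that are truly random and signals that only look random, can be demonstrated with a toy test (an illustrative sketch, not a method from the book): fit a low-order return map x[n+1] = g(x[n]) and compare the residual for a chaotic logistic-map series against white noise.

```python
import numpy as np

def return_map_fit_error(x):
    """Fit x[n+1] = poly(x[n]) of degree 2; return the RMS residual."""
    coeffs = np.polyfit(x[:-1], x[1:], 2)
    pred = np.polyval(coeffs, x[:-1])
    return float(np.sqrt(np.mean((x[1:] - pred) ** 2)))

rng = np.random.default_rng(0)

# Deterministic but random-looking: the chaotic logistic map.
chaos = np.empty(2000)
chaos[0] = 0.123
for n in range(1999):
    chaos[n + 1] = 4.0 * chaos[n] * (1.0 - chaos[n])

# Genuinely random: uniform white noise on the same range.
noise = rng.uniform(0.0, 1.0, 2000)

err_chaos = return_map_fit_error(chaos)   # tiny: hidden deterministic rule
err_noise = return_map_fit_error(noise)   # large: no underlying structure
```

Both series have similar histograms, yet the chaotic one is perfectly predicted one step ahead by a quadratic map, while the noise is not.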
Search is not just a box and ten blue links. Search is a journey: an exploration where what we encounter along the way changes what we seek. But in order to guide people along this journey, designers must understand both the art and science of search. In "Designing the Search Experience," authors Tony Russell-Rose and Tyler Tate weave together the theories of information seeking with the practice of user interface design. Understand how people search, and how the concepts of information seeking, information foraging, and sensemaking underpin the search process. Apply the principles of user-centered design to the search box, search results, faceted navigation, mobile interfaces, social search, and much more. Design the cross-channel search experiences of tomorrow that span desktop, tablet, mobile, and other devices.
Snake Robots is a novel treatment of theoretical and practical topics related to snake robots: robotic mechanisms designed to move like biological snakes and able to operate in challenging environments in which human presence is either undesirable or impossible. Future applications of such robots include search and rescue, inspection and maintenance, and subsea operations. Locomotion in unstructured environments is a focus for this book. The text brings order to the disparate approaches to modelling, development and control of snake robots in the current literature, giving a unified presentation of recent research results on snake robot locomotion to increase the reader's basic understanding of these mechanisms and their motion dynamics and to clarify the state of the art in the field. The book is a complete treatment of snake robotics, with topics ranging from mathematical modelling techniques, through mechatronic design and implementation, to control design strategies. The development of two snake robots is described, and both are used to provide experimental validation of many of the theoretical results. Snake Robots is written in a clear and easily understandable manner which makes the material accessible to specialists in the field and non-experts alike. Numerous illustrative figures and images help readers to visualize the material. The book is particularly useful to new researchers taking on a topic related to snake robots because it provides an extensive overview of the snake robot literature and also represents a suitable starting point for research in this area.
Call Admission Control (CAC) and Dynamic Channel Assignment (DCA) are important decision-making problems in mobile cellular communication systems. Current research in mobile communication treats them as two independent problems, although the former depends greatly on the free channels that result from the latter. This book provides a solution to the CAC problem that considers DCA an integral part of decision-making for call admission. Further, current approaches ignore the movement of mobile stations and fluctuations in network load (incoming calls) in the control strategy used for call admission. In addition, present call admission techniques offer solutions globally for the entire network, instead of considering the cells independently. CAC is formulated here by two alternative approaches. The first approach aims to handle the uncertainty in the CAC problem by employing fuzzy comparators. The second approach formulates CAC as an optimization problem that minimizes call drops while satisfying a set of constraints on the feasibility and availability of channels, the hotness of cells, and the velocity and angular displacement of mobile stations. Evolutionary techniques, including the Genetic Algorithm and Biogeography-Based Optimization, are employed to solve the optimization problems. The proposed approaches outperform traditional methods with respect to grade and quality of service.
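As a rough illustration of the second approach, channel assignment can be cast as constrained optimization and searched with a genetic algorithm. The sketch below is a hedged toy version with hypothetical calls, conflict pairs and GA settings, not the book's exact formulation: it minimizes the number of interfering call pairs (calls in the same or adjacent cells sharing a channel).

```python
import random

CHANNELS = 3
N_CALLS = 6
# Hypothetical pairs of calls that may not share a channel
# (e.g. calls in the same or adjacent cells).
EDGES = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3)]

def conflicts(assign):
    """Number of interfering call pairs under a channel assignment."""
    return sum(assign[i] == assign[j] for i, j in EDGES)

def ga(pop_size=60, gens=200, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    # Each individual assigns one channel to each call.
    pop = [[rng.randrange(CHANNELS) for _ in range(N_CALLS)]
           for _ in range(pop_size)]
    best = min(pop, key=conflicts)
    for _ in range(gens):
        nxt = [best[:]]                                 # elitism
        while len(nxt) < pop_size:
            a = min(rng.sample(pop, 3), key=conflicts)  # tournament selection
            b = min(rng.sample(pop, 3), key=conflicts)
            cut = rng.randrange(1, N_CALLS)             # one-point crossover
            child = [rng.randrange(CHANNELS) if rng.random() < p_mut else g
                     for g in a[:cut] + b[cut:]]        # per-gene mutation
            nxt.append(child)
        pop = nxt
        best = min(pop, key=conflicts)
    return best

best = ga()
```

The real problem adds the constraints listed above (cell hotness, station velocity, angular displacement) as penalty terms in the fitness; the search loop itself stays the same.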
Some of the fundamental constraints of automated machine vision have been the inability to automatically adapt parameter settings or to utilize previous adaptations in changing environments. Symbolic Visual Learning presents research which adds visual learning capabilities to computer vision systems. Using this state-of-the-art recognition technology, the outcome is a range of adaptive recognition systems that can measure their own performance, learn from their experience and outperform conventional static designs. Written as a companion volume to Early Visual Learning (edited by S. Nayar and T. Poggio), this book is intended for researchers and students in machine vision and machine learning.
Micro/nano robotics and automation technologies have grown rapidly alongside micro- and nanotechnologies. This book presents a summary of the fundamentals of micro/nano-scale engineering and the current state of the art of these technologies. "Micro-Nanorobotic Manipulation Systems and their Applications" introduces these advanced technologies from both the basics and the applications of micro/nano robotics and automation, from the perspective of micro/nano-scale manipulation. The book is organized in 9 chapters, including an overview chapter on micro/nanorobotics and automation technology from a historical view together with important related research works. Further chapters are devoted to the physics of micro/nano-scale fields as well as to materials science, microscopes, fabrication technology, the importance of biological cells, and control techniques. Furthermore, important examples, applications and a concise summary of micro/nanorobotics and automation technologies are given.
The importance of human-computer system interaction problems is increasing due to users' growing expectations of general computer systems' capabilities to facilitate human work and life. Users expect a system that is not only a passive tool in human hands, but rather an active partner equipped with a sort of artificial intelligence, having access to large information resources, and able to adapt its behavior to human requirements and to collaborate with human users. This book collects examples of recent human-computer system solutions. The content of the book is divided into three parts. Part I is devoted to detection, recognition and reasoning in different circumstances and applications. Problems associated with data modeling, acquisition and mining are presented in the papers collected in Part II, and Part III is devoted to optimization.
Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods to optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network based EDAs are reviewed in the book. Hot current research trends and future perspectives in the enhancement and applicability of EDAs are also covered. The contributions included in the book address topics as relevant as the application of probabilistic-based fitness models, the use of belief propagation algorithms in EDAs, and the application of Markov network based EDAs to real-world optimization problems. The book should be of interest to researchers and practitioners in areas such as optimization, evolutionary computation, and machine learning.
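The estimate-then-sample loop common to all EDAs is easy to sketch. The toy below uses the simplest, univariate model (UMDA) on the OneMax problem; the Markov-network EDAs the book studies replace this independent per-bit model with an undirected graphical model, but the loop structure is the same. This is an illustrative sketch, not code from the book.

```python
import numpy as np

def umda(fitness, n_bits=20, pop=100, gens=80, elite_frac=0.5, seed=0):
    """Univariate EDA: estimate per-bit marginals from the elite, resample."""
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)              # marginal P(bit = 1)
    best, best_f = None, -np.inf
    for _ in range(gens):
        # Sample a population from the current probability model.
        X = (rng.random((pop, n_bits)) < p).astype(int)
        f = np.array([fitness(x) for x in X])
        order = np.argsort(f)[::-1]       # best first
        if f[order[0]] > best_f:
            best, best_f = X[order[0]].copy(), f[order[0]]
        # Re-estimate the model from the selected (elite) individuals.
        elite = X[order[: int(pop * elite_frac)]]
        p = np.clip(elite.mean(axis=0), 0.05, 0.95)   # keep some diversity
    return best, best_f

# OneMax: maximize the number of ones in the bit string.
best, best_f = umda(lambda x: int(x.sum()))
```

A Markov-network EDA would replace the single line that re-estimates `p` with structure learning plus parameter estimation of an undirected model, and the sampling line with, for example, Gibbs sampling from that model.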
This book addresses the challenges of designing high-performance analog-to-digital converters (ADCs) based on the "smart data converters" concept, which implies context awareness, on-chip intelligence and adaptation. Readers will learn to exploit various kinds of information, either a priori or a posteriori (obtained from devices, signals, applications or the ambient situation), for circuit and architecture optimization during the design phase or for adaptation during operation, to enhance data converter performance, flexibility, robustness and power efficiency. The authors focus on exploiting a priori knowledge of the system/application to develop enhancement techniques for ADCs, with particular emphasis on improving the power efficiency of high-speed and high-resolution ADCs for broadband multi-carrier systems.
This book presents a comprehensive study of the different tools and techniques available to perform network forensics. It also reviews various aspects of network forensics, along with related technologies and their limitations, helping security practitioners and researchers better understand the problem, the current solution space, and the scope of future research for detecting and investigating network intrusions efficiently. Forensic computing is rapidly gaining importance, since the amount of crime involving digital systems is steadily increasing. Furthermore, the area is still underdeveloped and poses many technical and legal challenges. The rapid development of the Internet over the past decade has facilitated an increase in online attacks. Many factors embolden attackers: the speed with which an attack can be carried out, the anonymity provided by the medium, the nature of the medium, in which digital information can be stolen without actually being removed, the increased availability of potential victims, and the global impact of attacks. Forensic analysis is performed at two different levels: computer forensics and network forensics. Computer forensics deals with the collection and analysis of data from computer systems, networks, communication streams and storage media in a manner admissible in a court of law. Network forensics deals with the capture, recording and analysis of network events in order to discover evidential information about the source of security attacks in a court of law. Network forensics is not another term for network security; it is an extended phase of network security, as the data for forensic analysis are collected from security products such as firewalls and intrusion detection systems. The results of this data analysis are utilized for investigating the attacks.
Network forensics generally refers to the collection and analysis of network data such as network traffic, firewall logs and IDS logs. Technically, it is a branch of the established and expanding field of digital forensics. It has been defined as "the use of scientifically proven techniques to collect, fuse, identify, examine, correlate, analyze, and document digital evidence from multiple, actively processing and transmitting digital sources for the purpose of uncovering facts related to the planned intent, or measured success, of unauthorized activities meant to disrupt, corrupt, and/or compromise system components, as well as providing information to assist in response to or recovery from these activities." Network forensics plays a significant role in the security of today's organizations. On the one hand, it helps in learning the details of external attacks, ensuring that similar future attacks are thwarted. Additionally, network forensics is essential for investigating insider abuse, which constitutes the second costliest type of attack within organizations. Finally, law enforcement requires network forensics for crimes in which a computer or digital system is either the target of a crime or used as a tool in carrying out a crime. Network security protects the system against attack, while network forensics focuses on recording evidence of the attack. Network security products are generalized and look for possible harmful behaviors; this monitoring is a continuous process performed throughout the day. Network forensics, in contrast, involves post-mortem investigation of an attack and is initiated after crime notification. There are many tools which assist in capturing data transferred over networks, so that an attack or the malicious intent of an intrusion may be investigated. Similarly, various network forensic frameworks have been proposed in the literature.
This book offers a self-study program on how mathematics, computer science and science can be profitably and seamlessly intertwined. It focuses on two-variable ODE models, both linear and nonlinear, and highlights theoretical and computational tools, using MATLAB to explain their solutions. It also shows how to solve cable models using separation of variables and the Fourier series.
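The Fourier-series approach to cable models mentioned above can be sketched briefly. The book works in MATLAB; the equivalent illustration below is in Python, and the diffusion equation, boundary conditions and initial condition are illustrative choices (the initial condition is picked so the exact solution is known for comparison).

```python
import numpy as np

L, D = 1.0, 1.0                      # cable length, diffusion coefficient
x = np.linspace(0.0, L, 201)
f = np.sin(np.pi * x / L)            # initial condition u(x, 0)

def fourier_solution(x, t, n_terms=20):
    """Partial Fourier sine series for u_t = D u_xx with u = 0 at both ends.

    Separation of variables gives u = sum b_n sin(n pi x / L) exp(-D (n pi / L)^2 t),
    with b_n the sine coefficients of the initial condition.
    """
    dx = x[1] - x[0]
    u = np.zeros_like(x)
    for n in range(1, n_terms + 1):
        phi = np.sin(n * np.pi * x / L)
        b_n = 2.0 / L * np.sum(f * phi) * dx          # numerical sine coefficient
        u += b_n * phi * np.exp(-D * (n * np.pi / L) ** 2 * t)
    return u

t = 0.1
u_num = fourier_solution(x, t)
# For f = sin(pi x / L) the exact solution is a single decaying mode.
u_exact = np.sin(np.pi * x / L) * np.exp(-np.pi ** 2 * t)
```

With this initial condition only the first sine coefficient is nonzero, so the twenty-term partial sum matches the exact single-mode solution to numerical precision.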
Alan Turing pioneered many research areas, including artificial intelligence, computability, heuristics and pattern formation. In today's information age, it is hard to imagine how the world would be without computers and the Internet. Without Turing's work, especially the core concept of the Turing machine at the heart of every computer, mobile phone and microchip today, so many things on which we depend would be impossible. 2012 was the Alan Turing Year, a centenary celebration of the life and work of Alan Turing. To celebrate Turing's legacy and follow in the footsteps of this brilliant mind, this book reviews the latest developments in artificial intelligence, evolutionary computation and metaheuristics, all areas that can be traced back to Turing's pioneering work. Topics include the Turing test, the Turing machine, artificial intelligence, cryptography, software testing, image processing, neural networks, nature-inspired algorithms such as the bat algorithm and cuckoo search, multiobjective optimization, and many applications. These reviews and chapters not only provide a timely snapshot of state-of-the-art developments, but also offer inspiration for young researchers to carry out potentially ground-breaking research in the active, diverse research areas of artificial intelligence, cryptography, machine learning, evolutionary computation, and nature-inspired metaheuristics. This edited book can serve as a timely reference for graduates, researchers and engineers in artificial intelligence, computer science, computational intelligence, soft computing, optimization, and applied sciences.
Demystify the world of artificial intelligence with this groundbreaking guide featuring over 100 innovative ways to incorporate AI into your daily life. Every day, it seems like there’s a new AI tool on the market and a new, complicated way to use it. But what if you could use AI to make your life easier without the complications? In AI for Life, AI expert and creator of @SmartWorkAI offers over 100 ideas and ready-to-use prompts to get AI beginners started using the technology to actually improve their lives. Beginning with a primer on the basics—including an overview of the popular and free AI tools—you will learn expert-tested tips and tricks to get the most out of your AI use, such as layering prompts to dive deeper into an initial response or asking for the output in different formats. Packed with practical how-to information, AI for Life is the must-have guide for using generative AI to make life easier, more productive, more organized, and more fun!
The work described in this book is an excellent example of interdisciplinary research in systems biology. It shows how concepts and approaches from the field of physics can be efficiently used to answer biological questions and reports on a novel methodology involving creative computer-based analyses of high-throughput biological data. Many of the findings described in the book, which are the result of collaborations between the author (a theoretical scientist) and experimental biologists and between different laboratories, have been published in high-quality peer-reviewed journals such as Molecular Cell and Nature. However, while those publications address different aspects of post-transcriptional gene regulation, this book provides readers with a complete, coherent and logical view of the research project as a whole. The introduction presents post-transcriptional gene regulation from a distinct angle, highlighting aspects of information theory and evolution and laying the groundwork for the questions addressed in the subsequent chapters, which concern the regulation of the transcriptome as the primary functional carrier of active genetic information.
This book chiefly presents a novel approach referred to as backward fuzzy rule interpolation and extrapolation (BFRI). BFRI allows observations that directly relate to the conclusion to be inferred or interpolated from other antecedents and conclusions. Based on the scale and move transformation interpolation, this approach supports both interpolation and extrapolation involving multiple hierarchical intertwined fuzzy rules, each with multiple antecedents. As such, it offers a means of broadening the applications of fuzzy rule interpolation and fuzzy inference. The book deals with the general situation, in which more than one antecedent value may be missing for a given problem. Two techniques, termed the parametric approach and the feedback approach, are proposed in an attempt to perform backward interpolation with multiple missing antecedent values. In addition, to further enhance the versatility and potential of BFRI, the backward fuzzy interpolation method is extended to support α-cut based interpolation by employing a fuzzy interpolation mechanism for multi-dimensional input spaces (IMUL). Finally, from an integrated application analysis perspective, experimental studies based upon a real-world scenario of terrorism risk assessment are provided in order to demonstrate the potential and efficacy of the hierarchical fuzzy rule interpolation methodology.
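The core idea of fuzzy rule interpolation, and of running it backwards, can be shown on a linear toy case with triangular fuzzy sets written as (a, b, c) triples. This is a simplified KH-style linear interpolation for illustration only, not the book's scale-and-move BFRI method; the rule values are hypothetical.

```python
def interpolate(A1, A2, B1, B2, A_star):
    """Forward interpolation: infer B* from an observation between A1 and A2."""
    lam = (A_star[1] - A1[1]) / (A2[1] - A1[1])   # relative position of the core
    return tuple((1 - lam) * p + lam * q for p, q in zip(B1, B2))

def backward(A1, A2, B1, B2, B_star):
    """Backward interpolation: recover the missing antecedent from a conclusion."""
    lam = (B_star[1] - B1[1]) / (B2[1] - B1[1])
    return tuple((1 - lam) * p + lam * q for p, q in zip(A1, A2))

# Two sparse rules A1 => B1 and A2 => B2 with triangular sets (a, b, c).
A1, B1 = (0, 1, 2), (0, 2, 4)
A2, B2 = (4, 5, 6), (8, 10, 12)
B_star = interpolate(A1, A2, B1, B2, (2, 3, 4))   # observation midway between rules
A_star = backward(A1, A2, B1, B2, B_star)         # invert: recover the antecedent
```

In this linear toy the backward step exactly inverts the forward one; the book's contribution is making the backward direction work for the scale-and-move transformation, multiple missing antecedents, and hierarchically intertwined rules, where no such closed-form inverse exists.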
This book presents mathematical models of mob control with threshold (conformity) collective decision-making of the agents. Based on the results of analysis of the interconnection between the micro- and macromodels of active network structures, it considers the static (deterministic, stochastic and game-theoretic) and dynamic (discrete- and continuous-time) models of mob control, and highlights models of informational confrontation. Many of the results are applicable not only to mob control problems, but also to control problems arising in social groups, online social networks, etc. Aimed at researchers and practitioners, it is also a valuable resource for undergraduate and postgraduate students as well as doctoral candidates specializing in the field of collective behavior modeling.
Nowadays, embedded and real-time systems contain complex software. The complexity of embedded systems is increasing, and the amount and variety of software in embedded products are growing. This creates a big challenge for embedded and real-time software development processes, and there is a need to develop separate metrics and benchmarks. "Embedded and Real Time System Development: A Software Engineering Perspective: Concepts, Methods and Principles" presents practical as well as conceptual knowledge of the latest tools, techniques and methodologies of embedded software engineering and real-time systems. Each chapter includes an in-depth investigation of the actual or potential role of software engineering tools in the context of embedded and real-time systems. The book presents state-of-the-art and future perspectives, with industry experts, researchers, and academicians sharing ideas and experiences surrounding frontier technologies, breakthroughs, innovative solutions and applications. The book is organized into four parts, "Embedded Software Development Process", "Design Patterns and Development Methodology", "Modelling Framework" and "Performance Analysis, Power Management and Deployment", with altogether 12 chapters. The book is aimed at (i) undergraduate and postgraduate students conducting research in the areas of embedded software engineering and real-time systems; (ii) researchers at universities and other institutions working in these fields; and (iii) practitioners in the R&D departments of embedded systems. It can be used as an advanced reference for a course taught at the postgraduate level in embedded software engineering and real-time systems.
This book presents an Intelligent Control Architecture (ICA) to enable multiple collaborating marine vehicles to autonomously carry out underwater intervention missions. The presented ICA is generic in nature but aimed at a case study where a marine surface craft and an underwater vehicle are required to work cooperatively. It is shown that they are capable of cooperating autonomously towards the execution of complex activities since they have different but complementary capabilities. The ICA implementation is verified in simulation, and validated in trials by means of a team of autonomous marine robots. This book also presents architectural details and evaluation scenarios of the ICA, results of simulations and trials from different maritime operations, and future research directions.
This book offers a comprehensive and systematic introduction to the latest research on hesitant fuzzy decision-making theory. It includes six parts: the hesitant fuzzy set and its extensions; novel hesitant fuzzy measures; hesitant fuzzy hybrid weighted aggregation operators; hesitant fuzzy multiple-criteria decision-making with incomplete weight information; hesitant fuzzy multiple-criteria decision-making with complete weight information; and hesitant fuzzy preference relation based decision-making theory. These methodologies are implemented in various fields such as decision-making, medical diagnosis, cluster analysis, service quality management, e-learning management and environmental management. A valuable resource for engineers, technicians, and researchers in the fields of fuzzy mathematics, operations research, information science, management science and engineering, it can also be used as a textbook for postgraduate and senior undergraduate students.
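A small illustration of the simplest building block: a hesitant fuzzy element is a set of possible membership degrees, and alternatives can be ranked by a score function. The sketch below uses the common mean-based score with hypothetical alternatives; it is an assumption-laden toy, not code or data from the book.

```python
def score(h):
    """Mean-based score of a hesitant fuzzy element (a list of memberships)."""
    return sum(h) / len(h)

# Hypothetical alternatives, each judged with several possible membership
# degrees because the experts hesitate among values.
alternatives = {"A": [0.3, 0.5, 0.6], "B": [0.4, 0.7], "C": [0.2, 0.8]}
ranking = sorted(alternatives, key=lambda k: score(alternatives[k]),
                 reverse=True)
```

The book's aggregation operators and preference relations refine this idea: instead of a bare mean, memberships from several criteria are combined with weighted operators before ranking.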
Business and medical professionals rely on large data sets to identify trends and other knowledge that can be gleaned from them. New technologies concentrate on data management but do not facilitate users' extraction of meaningful outcomes. Pattern and Data Analysis in Healthcare Settings investigates approaches that shift computing from analysis on demand to knowledge on demand. By providing innovative tactics for applying data and pattern analysis, these practices are turned into pragmatic sources of knowledge for healthcare professionals. This publication is an exhaustive source for policy makers, developers, business professionals, healthcare providers, and graduate students concerned with data retrieval and analysis.