In the mid 1990s, Tim Berners-Lee had the idea of developing the World Wide Web into a "Semantic Web", a web of information that could be interpreted by machines in order to allow the automatic exploitation of data, which until then had to be done by humans manually. One of the first people to research topics related to the Semantic Web was Professor Rudi Studer. From the beginning, Rudi drove projects like ONTOBROKER and On-to-Knowledge, which later resulted in W3C standards such as RDF and OWL. By the late 1990s, Rudi had established a research group at the University of Karlsruhe, which later became the nucleus and breeding ground for Semantic Web research, and many of today's well-known research groups were either founded by his disciples or benefited from close cooperation with this think tank. In this book, published in celebration of Rudi's 60th birthday, many of his colleagues look back on the main research results achieved during the last 20 years. Under the editorship of Dieter Fensel, once one of Rudi's early PhD students, an impressive list of contributors and contributions has been collected, covering areas like Knowledge Management, Ontology Engineering, Service Management, and Semantic Search. Overall, this book provides an excellent overview of the state of the art in Semantic Web research, by combining historical roots with the latest results, which may finally make the dream of a "Web of knowledge, software and services" come true.
Recent advancements in telecommunications, medical imaging and signal processing deal with signals that are inherently time-varying, nonlinear and complex-valued. The time-varying, nonlinear characteristics of these signals can be effectively analyzed using artificial neural networks. Furthermore, to efficiently preserve the physical characteristics of these complex-valued signals, it is important to develop complex-valued neural networks and derive learning algorithms that represent these signals at every step of the learning process. This monograph comprises a collection of new supervised learning algorithms along with novel architectures for complex-valued neural networks. Meta-cognition coupled with self-regulated learning is known to be among the best human learning strategies. In this monograph, the principles of meta-cognition are introduced for complex-valued neural networks in both batch and sequential learning modes. For applications where the computation time of the training process is critical, a fast-learning complex-valued neural network, called a fully complex-valued relaxation network, is presented along with its learning algorithm. The presence of orthogonal decision boundaries helps complex-valued neural networks outperform real-valued networks in classification tasks; this aspect is highlighted. The performances of various complex-valued neural networks are evaluated on a set of benchmark and real-world function approximation and real-valued classification problems.
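The orthogonal-decision-boundary property mentioned above can be illustrated with a minimal sketch of a single fully complex-valued neuron. The phase-encoding convention, parameter values and function names below are illustrative assumptions, not the book's actual algorithms:

```python
import numpy as np

def phase_encode(x, x_min=0.0, x_max=1.0):
    # Map a real-valued feature onto the unit circle (one common
    # convention for feeding real data to complex-valued networks).
    theta = np.pi * (x - x_min) / (x_max - x_min)
    return np.exp(1j * theta)

def cv_neuron(z, w, b):
    # A single fully complex-valued neuron; numpy's tanh accepts
    # complex arguments, acting as a fully complex activation.
    return np.tanh(np.vdot(w, z) + b)

z = phase_encode(np.array([0.2, 0.8]))
w = np.array([0.5 + 0.3j, -0.2 + 0.7j])
out = cv_neuron(z, w, 0.1 - 0.1j)

# The signs of the real and imaginary parts of the output give two
# orthogonal decision boundaries, so a single neuron can distinguish
# up to four classes (one per quadrant of the output plane).
label = (out.real > 0, out.imag > 0)
```

This is where the classification advantage over a real-valued neuron comes from: one complex output carries two independent sign decisions.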
This book describes five qualitative investment decision-making methods based on hesitant fuzzy information. They are: (1) the investment decision-making method based on asymmetric hesitant fuzzy sigmoid preference relations, (2) the investment decision-making method based on hesitant fuzzy trade-off and portfolio selection, (3) the investment decision-making method based on hesitant fuzzy preference envelopment analysis, (4) the investment decision-making method based on hesitant fuzzy peer-evaluation and strategy fusion, and (5) the investment decision-making method based on EHVaR measurement and tail analysis.
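For readers unfamiliar with the underlying machinery: a hesitant fuzzy element collects the membership degrees an expert hesitates between, and alternatives are typically compared via a score function. The mean-based score below is a standard convention from the hesitant fuzzy literature, not necessarily the exact operators used in this book:

```python
def hfe_score(h):
    # Score of a hesitant fuzzy element: the mean of the membership
    # degrees the decision maker hesitates between.
    return sum(h) / len(h)

# Compare two investment alternatives by their hesitant assessments.
a = {0.3, 0.5, 0.6}   # expert hesitates between three degrees
b = {0.4, 0.7}
better = "a" if hfe_score(a) > hfe_score(b) else "b"
```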
This volume contains a contemporary, integrated description of the processes of language. These range from fast scales (fractions of a second) to slow ones (over a million years). The contributors, all experts in their fields, address language in the brain, production of sentences and dialogues, language learning, transmission and evolutionary processes that happen over centuries or millennia, the relation between language and genes, the origins of language, self-organization, and language competition and death. The book as a whole will help to show how processes at different scales affect each other, thus presenting language as a dynamic, complex and profoundly human phenomenon.
Machine learning is concerned with the analysis of large data sets and multiple variables. However, it is also often more sensitive than traditional statistical methods in the analysis of small data sets. The first volume reviewed subjects like optimal scaling, neural networks, factor analysis, partial least squares, discriminant analysis, canonical analysis, and fuzzy modeling. This second volume includes various clustering models, support vector machines, Bayesian networks, discrete wavelet analysis, genetic programming, association rule learning, anomaly detection, correspondence analysis, and other subjects. Both the theoretical bases and the step-by-step analyses are described for the benefit of non-mathematical readers. Each chapter can be studied without the need to consult other chapters. Traditional statistical tests sometimes serve as precursors to machine learning methods, and are also sometimes used as contrast tests. To those wishing to obtain more knowledge of them, we recommend additionally studying (1) Statistics Applied to Clinical Studies 5th Edition 2012, (2) SPSS for Starters Part One and Two 2012, and (3) Statistical Analysis of Clinical Data on a Pocket Calculator Part One and Two 2012, written by the same authors and published by Springer, New York.
Context-aware ranking is an important task with many applications: recommender systems rank items (products, movies, ...) and search engines rank webpages. In all these applications, the ranking is not global (i.e. always the same) but depends on the context. Simple examples of context are the user for recommender systems and the query for search engines; more complicated context includes time, last actions, etc. The major problem is that typically the variable domains (e.g. customers, products) are categorical and huge, the observations are very sparse, and only positive events are observed. In this book, a generic method for context-aware ranking as well as its applications are presented. For modelling, a new factorization based on pairwise interactions is proposed and compared to other tensor factorization approaches. For learning, the 'Bayesian Context-aware Ranking' framework, consisting of an optimization criterion and an algorithm, is developed. The second main part of the book applies this general theory to the three scenarios of item, tag and sequential-set recommendation. Furthermore, extensions to time-variant factors and one-class problems are studied. This book generalizes and builds on work that has received the 'WWW 2010 Best Paper Award', the 'WSDM 2010 Best Student Paper Award' and the 'ECML/PKDD 2009 Best Discovery Challenge Award'.
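The pairwise optimization idea the blurb describes can be sketched as a BPR-style matrix-factorization update: raise the score of an observed item above a sampled unobserved one. This is a minimal sketch consistent with that idea, not the book's full framework; the hyperparameters and toy data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 5, 8, 4
U = rng.normal(0, 0.1, (n_users, k))   # user factors
V = rng.normal(0, 0.1, (n_items, k))   # item factors

def pairwise_step(u, i, j, lr=0.05, reg=0.01):
    # One SGD step on the pairwise criterion ln sigmoid(x_ui - x_uj):
    # push observed item i above sampled unobserved item j for user u.
    wu, hi, hj = U[u].copy(), V[i].copy(), V[j].copy()
    g = 1.0 / (1.0 + np.exp(wu @ (hi - hj)))   # = sigmoid(-(x_ui - x_uj))
    U[u] += lr * (g * (hi - hj) - reg * wu)
    V[i] += lr * (g * wu - reg * hi)
    V[j] += lr * (-g * wu - reg * hj)

# Toy positive-only feedback: user 0 was observed with item 1, not item 3.
for _ in range(200):
    pairwise_step(0, 1, 3)
```

After training, the observed item's score `U[0] @ V[1]` exceeds the unobserved item's `U[0] @ V[3]`, which is exactly the ranking property the criterion optimizes.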
In Artificial Intelligence in Finance and Investing, authors Robert Trippi and Jae Lee explain this fascinating new technology in terms that portfolio managers, institutional investors, investment analysts, and information systems professionals can understand. Using real-life examples and a practical approach, this rare and readable volume discusses the entire field of artificial intelligence of relevance to investing, so that readers can realize the benefits and evaluate the features of existing or proposed systems, and ultimately construct their own systems. Topics include using Expert Systems for Asset Allocation, Timing Decisions, Pattern Recognition, and Risk Assessment; overview of Popular Knowledge-Based Systems; construction of Synergistic Rule Bases for Securities Selection; incorporating the Markowitz Portfolio Optimization Model into Knowledge-Based Systems; Bayesian Theory and Fuzzy Logic System Components; Machine Learning in Portfolio Selection and Investment Timing, including Pattern-Based Learning and Genetic Algorithms; and Neural Network-Based Systems. To illustrate the concepts presented in the book, the authors conclude with a valuable practice session and analysis of a typical knowledge-based system for investment management, K-FOLIO. For those who want to stay on the cutting edge of the "application" revolution, Artificial Intelligence in Finance and Investing offers a pragmatic introduction to the use of knowledge-based systems in securities selection and portfolio management.
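As a flavor of the portfolio mathematics a knowledge-based system can embed, here is the textbook closed-form minimum-variance weight for a two-asset Markowitz portfolio. This is a standard result, not code from K-FOLIO, and the volatility and correlation figures are invented for illustration:

```python
def min_variance_weight(s1, s2, rho):
    # Weight on asset 1 that minimizes portfolio variance for two
    # assets with volatilities s1, s2 and correlation rho:
    # w = (s2^2 - rho*s1*s2) / (s1^2 + s2^2 - 2*rho*s1*s2)
    cov = rho * s1 * s2
    return (s2**2 - cov) / (s1**2 + s2**2 - 2 * cov)

# A volatile stock (20%) paired with a calmer bond (10%), correlation 0.3:
w = min_variance_weight(0.20, 0.10, 0.3)   # weight on the riskier asset
```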
There isn't a facet of human life that has not been touched and influenced by robots and automation. What makes robots and machines versatile is their computational intelligence. While modern intelligent sensors and powerful hardware capabilities have given a huge fillip to the growth of intelligent machines, the progress in the development of algorithms for smart interaction, collaboration and pro-activeness will result in the next quantum jump. This book deals with the recent advancements in design methodologies, algorithms and implementation techniques to incorporate intelligence in robots and automation systems. Several articles deal with navigation, localization and mapping of mobile robots, a problem that engineers and researchers are grappling with all the time. Fuzzy logic, neural networks and neuro-fuzzy based techniques for real-world applications are detailed in a few articles. This edited volume presents the latest state-of-the-art computational intelligence techniques in Robotics and Automation. It is a compilation of the extended versions of the very best papers selected from the many that were presented at the 5th International Conference on Automation, Robotics and Applications (ICARA 2011), held in Wellington, New Zealand from 6-8 December 2011. Scientists and engineers who work with robots and automation systems will find this book very useful and stimulating.
This monograph comprises work on network-based Intrusion Detection (ID) that is grounded in visualisation and hybrid Artificial Intelligence (AI). It has led to the design of MOVICAB-IDS (MObile VIsualisation Connectionist Agent-Based IDS), a novel Intrusion Detection System (IDS), which is comprehensively described in this book. This novel IDS combines different AI paradigms to visualise network traffic for ID at packet level. It is based on a dynamic Multiagent System (MAS), which integrates an unsupervised neural projection model and the Case-Based Reasoning (CBR) paradigm through the use of deliberative agents that are capable of learning and evolving with the environment. The proposed novel hybrid IDS provides security personnel with a synthetic, intuitive snapshot of network traffic and protocol interactions. This visualisation interface supports the straightforward detection of anomalous situations and their subsequent identification. The performance of MOVICAB-IDS was tested through a novel mutation-based testing method in different real domains which entailed several attacks and anomalous situations.
This book contains the combined proceedings of the 4th International Conference on Ubiquitous Computing Application and Wireless Sensor Network (UCAWSN-15) and the 16th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT-15). The combined proceedings present peer-reviewed contributions from academic and industrial researchers in fields including ubiquitous and context-aware computing, context-awareness reasoning and representation, location awareness services, architectures, protocols and algorithms, and energy management and control of wireless sensor networks. The book includes the latest research results, practical developments and applications in parallel/distributed architectures, wireless networks and mobile computing, formal methods and programming languages, network routing and communication algorithms, database applications and data mining, access control and authorization, and privacy-preserving computation.
In this book, Haridimos Tsoukas, one of the most imaginative organization theorists of our time, examines the nature of knowledge in organizations, and how individuals and scholars approach the concept of knowledge. Tsoukas first looks at organizational knowledge and its embeddedness in social contexts and forms of life. He shows that knowledge is not just a collection of free-floating representations of the world to be used at will, but an activity constitutive of the world. On the one hand, the organization as an institutionalized system does produce regularities that can be captured via propositional forms of knowledge. On the other, the organization as practice, as a lifeworld, or as an open-ended system produces stories, values, and shared traditions which can only be captured by narrative forms of knowledge. Secondly, Tsoukas looks at the issue of how individuals deal with the notion of complexity in organizations: our inability to reduce the behavior of complex organizations to their constituent parts. Drawing on concepts such as discourse, narrativity, and reflexivity, he adopts a hermeneutical approach to the issue. Finally, Tsoukas examines the concept of meta-knowledge, and how we know what we know. Arguing that the underlying representationalist epistemology of much of mainstream management causes many problems, he advocates adopting a more discursive approach. He describes what such an epistemology might be, and illustrates it with examples from organization studies and strategic management. An ideal introduction to the thinking of a leading organizational theorist, this book will be essential reading for academics, researchers, and students of Knowledge Management, Organization Studies, Management Studies, Business Strategy and Applied Epistemology.
Over the last several years there has been a growing interest in developing computational methodologies for modeling and analyzing movements and behaviors of 'crowds' of people. This interest spans several scientific areas that include Computer Vision, Computer Graphics, and Pedestrian Evacuation Dynamics. Despite the fact that these different scientific fields are trying to model the same physical entity (i.e. a crowd of people), research ideas have evolved independently. As a result, each discipline has developed techniques and perspectives that are characteristically its own. The goal of this book is to provide readers with a comprehensive map towards the common goal of better analyzing and synthesizing pedestrian movement in dense, heterogeneous crowds. The book is organized into parts that consolidate various aspects of research towards this common goal, namely the modeling, simulation, and visual analysis of crowds. Through this book, readers will see the common ideas and vision as well as the different challenges and techniques that will stimulate novel approaches to fully grasping "crowds."
This book examines modern artificial intelligence to show how it may be applied to computer games. It spans the divide between the academic research community working on advanced artificial intelligence and the games programming community, which must create and release new and interesting games.
Vast amounts of data are nowadays collected, stored and processed, in an effort to assist in making a variety of administrative and governmental decisions. These innovative steps considerably improve the speed, effectiveness and quality of decisions. Analyses are increasingly performed by data mining and profiling technologies that statistically and automatically determine patterns and trends. However, when such practices lead to unwanted or unjustified selections, they may result in unacceptable forms of discrimination. Processing vast amounts of data may lead to situations in which data controllers know many of the characteristics, behaviors and whereabouts of people. In some cases, analysts might know more about individuals than these individuals know about themselves. Judging people by their digital identities sheds a different light on our views of privacy and data protection. This book discusses discrimination and privacy issues related to data mining and profiling practices. It provides technological and regulatory solutions to problems which arise in these innovative contexts. The book explains that common measures for mitigating privacy and discrimination, such as access controls and anonymity, fail to properly resolve privacy and discrimination concerns. Therefore, new solutions, focusing on technology design, transparency and accountability, are called for and set forth.
Imagine yourself as a military officer in a conflict zone trying to identify locations of weapons caches supporting road-side bomb attacks on your country's troops. Or imagine yourself as a public health expert trying to identify the location of contaminated water that is causing diarrheal diseases in a local population. Geospatial abduction is a new technique introduced by the authors that allows such problems to be solved. Geospatial Abduction provides the mathematics underlying geospatial abduction and the algorithms to solve them in practice; it has wide applicability and can be used by practitioners and researchers in many different fields. Real-world applications of geospatial abduction to military problems are included. Compelling examples drawn from other domains as diverse as criminology, epidemiology and archaeology are covered as well. This book also includes access to a dedicated website on geospatial abduction hosted by University of Maryland. Geospatial Abduction targets practitioners working in general AI, game theory, linear programming, data mining, machine learning, and more. Those working in the fields of computer science, mathematics, geoinformation, geological and biological science will also find this book valuable.
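To make the weapons-cache example concrete: in the simplest geospatial abduction problems, a candidate point "explains" an observation when their distance falls within known bounds, and one seeks a small set of candidates that explains every observation. The greedy set-cover heuristic below is an illustrative sketch with invented toy data; the book develops exact and approximation algorithms with formal guarantees:

```python
import math

def greedy_explain(observations, candidates, alpha, beta):
    # Greedily pick candidate points until every observation lies at a
    # distance in [alpha, beta] from some chosen candidate.
    def explains(c, o):
        return alpha <= math.dist(c, o) <= beta

    uncovered = set(range(len(observations)))
    chosen = []
    while uncovered:
        best = max(candidates,
                   key=lambda c: sum(explains(c, observations[i])
                                     for i in uncovered))
        newly = {i for i in uncovered if explains(best, observations[i])}
        if not newly:
            break  # some observation cannot be explained at all
        chosen.append(best)
        uncovered -= newly
    return chosen

# Two attack sites; caches are assumed to sit 1-3 km from an attack.
sites = [(0.0, 0.0), (4.0, 0.0)]
grid = [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
caches = greedy_explain(sites, grid, alpha=1.0, beta=3.0)
```

In this toy instance a single well-placed candidate explains both attacks, so the greedy pass returns one cache location.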
Artificial Intelligence (AI) is playing an increasing role in production and manufacturing engineering. Since a great deal of manufacturing knowledge can be put in the form of rules, expert systems have emerged as a promising practical tool of AI for solving engineering problems. Expert systems allow knowledge to be used for constructing human-machine systems that have specialized methods and techniques for solving problems in practical application areas. This book contains the structure and rules for 15 expert systems dealing with various aspects of production and manufacturing engineering. Topics covered range from casting design evaluation through to quality control. All expert systems included are accompanied by a description of their structure, and the rules are included on floppy disc in ASCII format so they can be readily accessed. These expert systems are generic in nature, and readers should find it possible to modify them for their own specific applications.
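The rule-based mechanism such expert systems rely on is forward chaining: fire (premises, conclusion) rules until no new fact can be derived. A minimal sketch follows; the toy casting-design rules are invented for illustration and are not taken from the book's rule bases:

```python
def forward_chain(facts, rules):
    # Repeatedly fire rules whose premises are all satisfied,
    # until a fixpoint is reached (no rule derives a new fact).
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Illustrative casting-design rules (hypothetical).
rules = [
    ({"thin_wall", "sharp_corner"}, "hot_tear_risk"),
    ({"hot_tear_risk"}, "recommend_fillet_redesign"),
]
derived = forward_chain({"thin_wall", "sharp_corner"}, rules)
```

Chaining matters here because the second rule only fires after the first has derived `hot_tear_risk`.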
Counting is among the most elementary and frequent mental activities of human beings. Its results form a basis for decisions in many situations and dimensions of our lives. This book presents a novel approach to the advanced and sophisticated case, called intelligent counting, in which the objects of counting are imprecisely, fuzzily specified. Formally, this collapses to counting in fuzzy sets, interval-valued fuzzy sets or I-fuzzy sets (Atanassov's intuitionistic fuzzy sets). The monograph is the first to show and emphasize that the presented methods of intelligent counting are human-consistent: they are reflections and formalizations of real, human counting procedures performed under imprecision and, possibly, incompleteness of information. Other applications of intelligent counting in various areas of intelligent systems and decision support are discussed, too. The whole presentation is self-contained, systematic, and equipped with many examples, figures and tables. Computer and information scientists, researchers, engineers and practitioners, applied mathematicians, and postgraduate students interested in information imprecision are the target readers.
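The classical baseline that intelligent counting generalizes is the sigma-count, where the cardinality of a fuzzy set is simply the sum of its membership degrees. The thresholded variant below is one simple human-like heuristic, not the book's full method:

```python
def sigma_count(memberships):
    # Sigma-count: cardinality of a fuzzy set = sum of membership degrees.
    return sum(memberships)

def count_at_least(memberships, t):
    # Crisp count of elements whose membership reaches threshold t.
    return sum(1 for m in memberships if m >= t)

# "How many tall people are in the room?" with fuzzy degrees of tallness:
tall = [1.0, 0.8, 0.5, 0.2, 0.0]
n_fuzzy = sigma_count(tall)          # fuzzy cardinality of "tall people"
n_crisp = count_at_least(tall, 0.5)  # people at least "half tall"
```

The gap between the two answers is exactly the imprecision the book's methods are designed to handle in a human-consistent way.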
In this book, the editors explain how students enrolled in two digital forensic courses at their institution are exposed to experiential learning opportunities, where the students acquire the knowledge and skills of the subject matter while also learning how to adapt to the ever-changing digital forensic landscape. Their findings (e.g., forensic examination of different IoT devices) are also presented in the book. Digital forensics is a topic of increasing importance as our society becomes "smarter", with more of the "things" around us being internet- and inter-connected (e.g., Internet of Things (IoT) and smart home devices); hence the increasing likelihood that we will need to acquire data from these things in a forensically sound manner. This book is of interest to both digital forensic educators and digital forensic practitioners, as well as students seeking to learn about digital forensics.
(Preliminary): The Orthogonal Frequency Division Multiplexing (OFDM) digital transmission technique has several advantages in broadcast and mobile communications applications. The main objective of this book is to give a good insight into these efforts, and provide the reader with a comprehensive overview of the scientific progress which was achieved in the last decade. Besides topics of the physical layer, such as coding, modulation and non-linearities, a special emphasis is put on system aspects and concepts, in particular regarding cellular networks and the use of multiple antenna techniques. The work extensively addresses challenges of link adaptation, adaptive resource allocation and interference mitigation in such systems. Moreover, the domain of cross-layer design, i.e. the combination of physical layer aspects and issues of higher layers, is considered in detail. These results will facilitate and stimulate further innovation and development in the design of modern communication systems, based on the powerful OFDM transmission technique.
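The core of OFDM transmission is an IFFT at the transmitter and an FFT at the receiver, with a cyclic prefix guarding against intersymbol interference. A minimal end-to-end sketch over an ideal channel (subcarrier count, prefix length and the QPSK mapping are illustrative choices, not parameters from the book):

```python
import numpy as np

rng = np.random.default_rng(1)
N, cp = 64, 16                          # subcarriers, cyclic-prefix length

bits = rng.integers(0, 2, 2 * N)
# QPSK: two bits per subcarrier, mapped to {-1, +1} on each axis.
symbols = (2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)

tx = np.fft.ifft(symbols)               # modulation: IFFT across subcarriers
tx_cp = np.concatenate([tx[-cp:], tx])  # prepend the cyclic prefix

rx = tx_cp[cp:]                         # ideal channel; receiver drops prefix
recovered = np.fft.fft(rx)              # demodulation: FFT recovers symbols
```

Over a multipath channel the same FFT step turns convolution into per-subcarrier multiplication, which is precisely why the cyclic prefix and one-tap equalization make OFDM attractive.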
This volume includes papers presented at IIH-MSP 2017, the 13th International Conference on Intelligent Information Hiding and Multimedia Signal Processing, held from 12 to 15 August 2017 in Matsue, Shimane, Japan. The conference addresses topics ranging from information hiding and security, and multimedia signal processing and networking, to bio-inspired multimedia technologies and systems. This volume of Smart Innovation, Systems and Technologies focuses on subjects related to massive image/video compression and transmission for emerging networks, advances in speech and language processing, information hiding and signal processing for audio and speech signals, intelligent distribution systems and applications, recent advances in security and privacy for multimodal network environments, multimedia signal processing, and machine learning. Updated with the latest research outcomes and findings, the papers presented appeal to researchers and students who are interested in the corresponding fields.
This book explores cybersecurity research and development efforts, including ideas that deal with the growing challenge of how computing engineering can merge with neuroscience. The contributing authors, who are renowned leaders in this field, thoroughly examine new technologies that will automate security procedures and perform autonomous functions with decision making capabilities. To maximize reader insight into the range of professions dealing with increased cybersecurity issues, this book presents work performed by government, industry, and academic research institutions working at the frontier of cybersecurity and network sciences. Cybersecurity Systems for Human Cognition Augmentation is designed as a reference for practitioners or government employees working in cybersecurity. Advanced-level students or researchers focused on computer engineering or neuroscience will also find this book a useful resource.
This book presents powerful techniques for solving global optimization problems on manifolds by means of evolutionary algorithms, and shows in practice how these techniques can be applied to solve real-world problems. It describes recent findings and well-known key facts in general and differential topology, revisiting them all in the context of application to current optimization problems. Special emphasis is put on game theory problems. Here, these problems are reformulated as constrained global optimization tasks and solved with the help of Fuzzy ASA. In addition, more abstract examples, including minimizations of well-known functions, are also included. Although the Fuzzy ASA approach has been chosen as the main optimizing paradigm, the book suggests that other metaheuristic methods could be used as well. Some of them are introduced, together with their advantages and disadvantages. Readers should possess some knowledge of linear algebra, and of basic concepts of numerical analysis and probability theory. Many necessary definitions and fundamental results are provided, with the formal mathematical requirements limited to a minimum, while the focus is kept firmly on continuous problems. The book offers a valuable resource for students, researchers and practitioners. It is suitable for university courses on optimization and for self-study.
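Fuzzy ASA itself is beyond a short sketch, but the plain simulated-annealing backbone it adapts can be shown compactly: accept worse moves with probability exp(-delta/T) and cool the temperature geometrically. The step size, cooling schedule and test function below are illustrative assumptions:

```python
import math
import random

random.seed(0)

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000):
    # Generic simulated annealing: always accept improvements, accept
    # worse candidates with probability exp(-(f_cand - f_cur)/T).
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + random.gauss(0.0, step) for xi in x]
        fc = f(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fc < fbest:
                best, fbest = list(cand), fc
        t *= cooling           # geometric cooling schedule
    return best, fbest

# Minimize a simple 2-D quadratic bowl with minimum at (1, -2).
best, fbest = simulated_annealing(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2,
                                  [5.0, 5.0])
```

Adaptive variants such as ASA tune the step sizes and temperatures per dimension during the run, which is the part Fuzzy ASA augments with fuzzy control.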