The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. The study of basic locomotion forms such as walking and running is of particular interest due to the high demands on dynamic coordination, actuator efficiency and balance control. This book presents mathematical models and numerical simulation and optimization techniques which, in combination with experimental data, can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are:
* Modeling techniques for anthropomorphic bipedal walking systems
* Optimized walking motions for different objective functions
* Identification of objective functions from measurements
* Simulation and optimization approaches for humanoid robots
* Biologically inspired control algorithms for bipedal walking
* Generation and deformation of natural walking in computer graphics
* Imitation of human motions on humanoids
* Emotional body language during walking
* Simulation of biologically inspired actuators for bipedal walking machines
* Modeling and simulation techniques for the development of prostheses
* Functional electrical stimulation of walking
This book describes analytical techniques for optimizing knowledge acquisition, processing, and propagation, especially in the contexts of cyber-infrastructure and big data. Further, it presents easy-to-use analytical models of knowledge-related processes and their applications. The need for such methods stems from the fact that, when we have to decide where to place sensors or which algorithm to use for processing the data, we mostly rely on experts' opinions. As a result, the selected knowledge-related methods are often far from ideal. To make better selections, it is necessary to first create easy-to-use models of knowledge-related processes. This is especially important for big data, where traditional numerical methods are unsuitable. The book offers a valuable guide for everyone interested in big data applications: students looking for an overview of related analytical techniques, practitioners interested in applying optimization techniques, and researchers seeking to improve and expand on these techniques.
The integrated and advanced science research topic Man-Machine-Environment System Engineering (MMESE) was first established in China by Professor Shengzhao Long in 1981, with direct support from one of the greatest modern Chinese scientists, Xuesen Qian. In a letter to Shengzhao Long dated October 22, 1993, Xuesen Qian wrote: "You have created a very important modern science and technology in China."
As interaction among communities that use different languages increases, we need services that can effectively support multilingual communication. The Language Grid is an initiative to build an infrastructure that allows end users to create composite language services for intercultural collaboration. The aim is to support communities in creating customized multilingual environments by using language services to overcome local language barriers. The stakeholders of the Language Grid are the language resource providers, the language service users, and the language grid operators who coordinate the former. This book includes 18 chapters in six parts that summarize various research results and associated development activities on the Language Grid. The chapters in Part I describe the framework of the Language Grid, i.e., service-oriented collective intelligence, used to bridge providers, users and operators. Two kinds of software are introduced, the service grid server software and the Language Grid Toolbox, and code for both is available via open source licenses. Part II describes technologies for service workflows that compose atomic language services. Part III reports on research work and activities relating to sharing and using language services. Part IV describes various applications of language services as applicable to intercultural collaboration. Part V contains reports on applying the Language Grid to translation activities, including localization of industrial documents and Wikipedia articles. Finally, Part VI illustrates how the Language Grid can be connected to other service grids, such as DFKI's Heart of Gold and smart classroom services at Tsinghua University in Beijing. The book will be valuable for researchers in artificial intelligence, natural language processing, services computing and human-computer interaction, particularly those who are interested in bridging technologies and user communities.
Semantics in Adaptive and Personalised Services initially strikes one as a specific and perhaps narrow domain. Yet a closer examination of the term reveals much more. On one hand there is the issue of semantics. Nowadays, this most often refers to the use of OWL, RDF or some other XML-based ontology description language in order to represent the entities of a problem. Still, semantics may also very well refer to the consideration of meanings and concepts, rather than arithmetic measures, regardless of the representation used. On the other hand, there is the issue of adaptation, i.e. automated re-configuration based on some context. This could be the network and device context, the application context or the user context; we refer to the latter case as personalization. From a different perspective, there is the issue of the point of view from which to examine the topic. There is the point of view of tools, referring to the algorithms and software tools one can use; the point of view of methods, referring to the abstract methodologies and best practices one can follow; and the point of view of applications, referring to successful and pioneering case studies that lead the way in research and innovation. Or at least so we thought. Based on the above reasoning, the editors identified key researchers and practitioners in each of the aforementioned categories and invited them to contribute a corresponding work to this book. However, as the authors' contributions started to arrive, the editors also started to realize that although these categories participate in each chapter to different degrees, none of them is ever entirely absent from any chapter. Moreover, it seems that theory and methods are inherent in the development of tools and applications and, inversely, application is also inherent in the motivation and presentation of tools and methods.
This book contains selected papers from the International Conference on Extreme Learning Machine 2014, which was held in Singapore, December 8-10, 2014. The conference brought together researchers and practitioners of Extreme Learning Machine (ELM) from a variety of fields to promote research and development of "learning without iterative tuning". The book covers theories, algorithms and applications of ELM, and gives readers a glimpse of the most recent advances in ELM.
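To make the phrase "learning without iterative tuning" concrete for readers unfamiliar with ELM, here is a minimal sketch in plain numpy (not taken from the book; the toy data, layer size and activation are illustrative assumptions): the hidden layer is a fixed random projection, and only the output weights are fitted, in closed form, by least squares.

```python
import numpy as np

# Minimal ELM sketch: a single-hidden-layer network whose hidden weights
# are random and fixed; only the output weights are fitted in closed form
# (least squares) -- hence "learning without iterative tuning".

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration): y = sin(x) with noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))   # random input-to-hidden weights
b = rng.normal(size=n_hidden)                 # random hidden biases

H = np.tanh(X @ W + b)                        # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights by least squares

# Prediction on new inputs reuses the same fixed random projection.
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
y_pred = np.tanh(X_test @ W + b) @ beta
print(y_pred)
```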
"Intelligent Routines II: Solving Linear Algebra and Differential Geometry with Sage" contains numerous of examples and problems as well as many unsolved problems. This book extensively applies the successful software Sage, which can be found free online http: //www.sagemath.org/. Sage is a recent and popular software for mathematical computation, available freely and simple to use. This book is useful to all applied scientists in mathematics, statistics and engineering, as well for late undergraduate and graduate students of above subjects. It is the first such book in solving symbolically with Sage problems in Linear Algebra and Differential Geometry. Plenty of SAGE applications are given at each step of the exposition.
Over the last decade, we have witnessed an increasing use of Business Intelligence (BI) solutions that allow business people to query, understand, and analyze their business data in order to make better decisions. Traditionally, BI applications allow management and decision-makers to acquire useful knowledge about the performance and problems of the business from the data of their organization by means of a variety of technologies, such as data warehousing, data mining, business performance management, OLAP, and periodical business reports. Research in these areas has produced consolidated solutions, techniques, and methodologies, and there are a variety of commercial products available that are based on these results. Business Intelligence Applications and the Web: Models, Systems and Technologies summarizes current research advances in BI and the Web, emphasizing research solutions, techniques, and methodologies that combine both areas in the interest of building better BI solutions. This comprehensive collection emphasizes the interconnections between the two research areas and highlights the benefits of the combined use of BI and Web practices, which so far have acted rather independently, often in cases where their joint application would have been sensible.
This volume presents a collection of research studies on sophisticated and functional computational instruments able to recognize, process, and store relevant situated interactional signals, as well as to interact with people, displaying reactions (under conditions of limited time) that show an ability to appropriately sense and understand environmental changes and to produce suitable, autonomous, and adaptable responses to various social situations. These social robotic autonomous systems will improve the quality of life of their end-users while assisting them with a range of needs, from educational settings and health care assistance to communicative disorders and any disorder impairing their physical, cognitive, or social functional activities. The multidisciplinary themes presented in the volume will be interesting for experts and students coming from different research fields and with different knowledge and backgrounds. The research reported is particularly relevant for academic centers and research and development institutions.
The book provides a comprehensive overview of the theory, methods, applications and tools of cognition and recognition. It is a collection of the best selected papers presented at the International Conference on Cognition and Recognition 2016 (ICCR 2016) and is helpful for scientists and researchers pursuing advanced studies in image processing, pattern recognition and computer vision. Nowadays, researchers are working in interdisciplinary areas, and the proceedings of ICCR 2016 play a major role in gathering such significant work in one place. The chapters included in the proceedings cover both theoretical and practical aspects of different areas such as nature-inspired algorithms, fuzzy systems, data mining, signal processing, image processing, text processing, wireless sensor networks, network security and cellular automata.
This book adheres to the vision that in the future compelling user experiences will be key differentiating benefits of products and services. Evaluating the user experience plays a central role, not only during the design process, but also during regular usage: for instance, a video recorder that recommends TV programs that fit your current mood, a product that measures your current level of relaxation and produces advice on how to balance your life, or a module that alerts a factory operator when he is getting drowsy. Such systems are required to assess and interpret user experiences (almost) in real time, and that is exactly what this book is about. How to achieve this? What are potential applications of psychophysiological measurements? Are real-time assessments based on monitoring of user behavior possible? If so, which elements are critical? Are behavioral aspects important? Which technology can be used? How important are intra-individual differences? What can we learn from products already on the market? The book gathers a group of invited authors from different backgrounds, such as technology, academia and business. It is a mosaic of their work, and that of Philips Research, in the assessment of user experience, covering the full range from academic research to commercial propositions.
Presenting current trends in the development and applications of intelligent systems in engineering, this monograph focuses on recent research results in system identification and control. The recurrent neurofuzzy and the fuzzy cognitive network (FCN) models are presented. Both models are suitable for partially-known or unknown complex time-varying systems. Neurofuzzy Adaptive Control contains rigorous proofs of its statements which result in concrete conclusions for the selection of the design parameters of the algorithms presented. The neurofuzzy model combines concepts from fuzzy systems and recurrent high-order neural networks to produce powerful system approximations that are used for adaptive control. The FCN model stems from fuzzy cognitive maps and uses the notion of concepts and their causal relationships to capture the behavior of complex systems. The book shows how, with the benefit of proper training algorithms, these models are potent system emulators suitable for use in engineering systems. All chapters are supported by illustrative simulation experiments, while separate chapters are devoted to the potential industrial applications of each model, including projects in contemporary power generation, process control and conventional benchmarking problems. Researchers and graduate students working in adaptive estimation and intelligent control will find Neurofuzzy Adaptive Control of interest both for the currency of its models and because it demonstrates their relevance for real systems. The monograph also shows industrial engineers how to test intelligent adaptive control easily using proven theoretical results.
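To illustrate what a fuzzy-cognitive-map-style model looks like in code, here is a minimal sketch (not the book's FCN formulation; the concepts, weights and update rule are generic illustrative assumptions): concepts hold activation levels, signed weights encode causal influence, and the map is iterated until the activations settle.

```python
import numpy as np

# Minimal fuzzy cognitive map (FCM) sketch: concepts hold activation
# levels in [0, 1], signed weights encode causal influence, and the map
# is iterated until the activations settle. The three concepts and the
# causal weights below are invented purely for illustration.

concepts = ["demand", "production", "inventory"]

# W[i, j] = causal influence of concept i on concept j (assumed values).
W = np.array([
    [0.0,  0.7, 0.0],    # higher demand pushes production up
    [0.0,  0.0, 0.6],    # higher production raises inventory
    [-0.4, 0.0, 0.0],    # high inventory dampens demand
])

def sigmoid(x, lam=2.0):
    """Squash aggregated causal input back into [0, 1]."""
    return 1.0 / (1.0 + np.exp(-lam * x))

A = np.array([0.8, 0.3, 0.2])            # initial concept activations
for _ in range(50):                      # iterate one common FCM update rule
    A_next = sigmoid(W.T @ A)
    if np.allclose(A_next, A, atol=1e-4):
        break
    A = A_next

print(dict(zip(concepts, np.round(A, 3))))
```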
Wagman offers a critical analysis of current theory and research in the psychological and computational sciences, directed toward the elucidation of scientific discovery processes and structures. It discusses human scientific discovery processes, analyzes computer scientific discovery processes, and makes a comparative evaluation of the two. This work examines the scientific reasoning of the discoverers of the inhibition mechanism of gene control; scientific discovery heuristics used at different developmental levels; artificial intelligence and mathematical discovery; the ECHO system; the evolution of artificial intelligence discovery systems; the PAULI system; and the KEKADA system. It concludes with an examination of the extent to which computational discovery systems can emulate a set of 10 types of scientific problems.
Global optimization is a branch of applied mathematics and numerical analysis that deals with the task of finding the absolutely best set of admissible conditions to satisfy certain criteria or objective function(s), formulated in mathematical terms. Global optimization includes nonlinear, stochastic and combinatorial programming, multiobjective programming, control, games, geometry, approximation, algorithms for parallel architectures and so on. Due to its wide usage and applications, it has gained the attention of researchers and practitioners from a plethora of scientific domains. Typical practical examples of global optimization applications include the traveling salesman problem and electrical circuit design (minimizing the path length), safety engineering (building and mechanical structures), mathematical problems (the Kepler conjecture), and protein structure prediction (minimizing the energy function). Global optimization algorithms may be categorized into several types: deterministic (for example, branch and bound methods), stochastic (for example, simulated annealing), and heuristic or meta-heuristic (for example, evolutionary algorithms). Recently there has been a growing interest in combining global and local search strategies to solve more complicated optimization problems. This edited volume comprises 17 chapters, including several overview chapters, which provide an up-to-date and state-of-the-art account of research covering the theory and algorithms of global optimization. Besides research articles and expository papers on the theory and algorithms of global optimization, papers on numerical experiments and on real-world applications were also encouraged. The book is divided into two main parts.
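As a concrete taste of the stochastic category mentioned above, the following minimal simulated annealing sketch (not drawn from the book; the objective function, neighbourhood move and cooling schedule are illustrative assumptions) shows how occasionally accepting uphill moves lets the search escape local minima.

```python
import math
import random

# Minimal simulated annealing sketch: accept worsening moves with a
# probability that shrinks as the "temperature" cools, so the search
# can climb out of local minima of a multimodal objective.

def objective(x):
    # A multimodal test function with many local minima (assumed).
    return x * x + 10.0 * math.sin(3.0 * x)

random.seed(0)
x = random.uniform(-5.0, 5.0)          # current solution
best_x, best_f = x, objective(x)
temperature = 5.0

while temperature > 1e-3:
    candidate = x + random.gauss(0.0, 0.5)                 # random neighbour
    delta = objective(candidate) - objective(x)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate                                      # accept the move
        if objective(x) < best_f:
            best_x, best_f = x, objective(x)
    temperature *= 0.995                                   # geometric cooling

print(f"approximate global minimum near x = {best_x:.3f}, f = {best_f:.3f}")
```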
Robotic automation has become ubiquitous in the modern manufacturing landscape, spanning an overwhelming range of processes and applications, from small-scale force-controlled grinding operations for orthopedic joints to large-scale composite manufacturing of aircraft fuselages. Smart factories, seamlessly linked via industrial networks and sensing, have revolutionized mass production, allowing for intelligent, adaptive manufacturing processes across a broad spectrum of industries. Against this background, an emerging group of researchers, designers, and fabricators have begun to apply robotic technology in the pursuit of architecture, art, and design, implementing it in a range of processes and scales. Coupled with computational design tools, the technology is no longer relegated to the repetitive production of the assembly line, and is instead being employed for the mass customization of non-standard components. This radical shift in protocol has been enabled by the development of new design-to-production workflows and the recognition of robotic manipulators as multi-functional fabrication platforms, capable of being reconfigured to suit the specific needs of a process. The emerging discourse surrounding robotic fabrication seeks to question the existing norms of manufacturing and has far-reaching implications for the future of how architects, artists, and designers engage with materialization processes. This book presents the proceedings of Rob-Arch 2014, the second international conference on robotic fabrication in architecture, art, and design. It includes a Foreword by Sigrid Brell-Cokcan and Johannes Braumann, Association for Robots in Architecture. The work contained traverses a wide range of contemporary topics, from methodologies for incorporating dynamic material feedback into existing fabrication processes, to novel interfaces for robotic programming, to new processes for large-scale automated construction. The latent argument behind this research is that the term file-to-factory must not be a reductive celebration of expediency but instead a perpetual challenge to increase the quality of feedback between design, matter, and making.
This book presents and develops new reinforcement learning methods that enable fast and robust learning on robots in real-time. Robots have the potential to solve many problems in society, because of their ability to work in dangerous places doing necessary jobs that no one wants or is able to do. One barrier to their widespread deployment is that they are mainly limited to tasks where it is possible to hand-program behaviors for every situation that may be encountered. For robots to meet their potential, they need methods that enable them to learn and adapt to novel situations that they were not programmed for. Reinforcement learning (RL) is a paradigm for learning sequential decision making processes and could solve the problems of learning and adaptation on robots. This book identifies four key challenges that must be addressed for an RL algorithm to be practical for robotic control tasks. These RL for Robotics Challenges are: 1) it must learn in very few samples; 2) it must learn in domains with continuous state features; 3) it must handle sensor and/or actuator delays; and 4) it should continually select actions in real time. This book focuses on addressing all four of these challenges. In particular, this book is focused on time-constrained domains where the first challenge is critically important. In these domains, the agent's lifetime is not long enough for it to explore the domains thoroughly, and it must learn in very few samples.
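For readers new to the paradigm, a generic tabular Q-learning sketch follows. It is purely illustrative and not the book's method; the toy chain environment and parameter values are assumptions, and this simple form deliberately sidesteps the four robotics challenges listed above (it is neither sample-efficient nor continuous-state).

```python
import random

# Generic tabular Q-learning on a toy chain environment: the agent learns
# a value for each (state, action) pair by bootstrapping from its own
# estimates, illustrating sequential decision making by trial and error.

n_states, n_actions = 5, 2            # states 0..4; actions: 0 = left, 1 = right
alpha, gamma, epsilon = 0.1, 0.95, 0.1
Q = [[0.0] * n_actions for _ in range(n_states)]

def step(state, action):
    """Toy chain: reaching the right end yields reward 1 and resets to 0."""
    nxt = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    if nxt == n_states - 1:
        return 0, 1.0
    return nxt, 0.0

random.seed(0)
state = 0
for _ in range(5000):
    # epsilon-greedy action selection
    if random.random() < epsilon:
        action = random.randrange(n_actions)
    else:
        action = max(range(n_actions), key=lambda a: Q[state][a])
    nxt, reward = step(state, action)
    # Q-learning update toward the bootstrapped target
    target = reward + gamma * max(Q[nxt])
    Q[state][action] += alpha * (target - Q[state][action])
    state = nxt

print([[round(q, 2) for q in row] for row in Q])
```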
The "Smart Innovation, Systems and Technologies" book series encompasses the topics of knowledge, intelligence, innovation and sustainability. The aim of the series is to make available a platform for the publication of books on all aspects of single and multi-disciplinary research on these themes in order to make the latest results available in a readily-accessible form. This book is devoted to the Intelligent and Adaptive Educational-Learning Systems . It privileges works that highlight key achievements and outline trends to inspire future research. After a rigorous revision process twenty manuscripts were accepted and organized into four parts: "Modeling," "Content, Virtuality" and "Applications." This volume is of interest to researchers, practitioners, professors and postgraduate students aimed to update their knowledge and find out targets for future work in the field of artificial intelligence on education. "
Calculus has been used in solving many scientific and engineering problems. For optimization problems, however, the differential calculus technique sometimes has a drawback when the objective function is step-wise, discontinuous, or multi-modal, or when decision variables are discrete rather than continuous. Thus, researchers have recently turned their interest to metaheuristic algorithms that are inspired by natural phenomena such as evolution, animal behavior, or metallic annealing. This book focuses on a music-inspired metaheuristic algorithm, harmony search. Interestingly, there exists an analogy between music and optimization: each musical instrument corresponds to a decision variable; each musical note corresponds to a variable value; and each harmony corresponds to a solution vector. Just as jazz musicians improvise notes randomly or based on experience in order to find a fantastic harmony, variables in the harmony search algorithm take random values or previously memorized good values in order to find an optimal solution.
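The analogy translates into a very short algorithm. The sketch below is a minimal, generic harmony search (not code from the book; the objective function, bounds and parameter values such as the memory-considering and pitch-adjusting rates are illustrative assumptions).

```python
import random

# Minimal harmony search sketch following the analogy in the text:
# each decision variable is an "instrument", each value a "note", and a
# solution vector a "harmony". Values come either from the harmony
# memory (optionally pitch-adjusted) or are improvised at random.

def objective(x):
    return sum(xi * xi for xi in x)      # simple sphere function to minimize

random.seed(0)
dim, lo, hi = 3, -5.0, 5.0
hm_size, hmcr, par, bandwidth = 10, 0.9, 0.3, 0.2

# Harmony memory: a pool of random candidate solution vectors.
memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hm_size)]

for _ in range(2000):
    new = []
    for i in range(dim):
        if random.random() < hmcr:                       # consider memory
            value = random.choice(memory)[i]
            if random.random() < par:                    # pitch adjustment
                value += random.uniform(-bandwidth, bandwidth)
        else:                                            # random improvisation
            value = random.uniform(lo, hi)
        new.append(min(max(value, lo), hi))
    worst = max(memory, key=objective)
    if objective(new) < objective(worst):                # replace worst harmony
        memory[memory.index(worst)] = new

print(min(memory, key=objective))
```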
Computational intelligence encompasses a wide variety of techniques that allow computation to learn, to adapt, and to seek. That is, they may be designed to learn information without explicit programming regarding the nature of the content to be retained, they may be imbued with the functionality to adapt to maintain their course within a complex and unpredictably changing environment, and they may help us seek out truths about our own dynamics and lives through their inclusion in complex system modeling. These capabilities place our ability to compute in a category apart from our ability to erect suspension bridges, although both are products of technological advancement and reflect an increased understanding of our world. In this book, we show how to unify aspects of learning and adaptation within the computational intelligence framework. While a number of algorithms exist that fall under the umbrella of computational intelligence, with new ones added every year, all of them focus on the capabilities of learning, adapting, and helping us seek. So, the term unified computational intelligence relates not to the individual algorithms but to the underlying goals driving them. This book focuses on the computational intelligence areas of neural networks and dynamic programming, showing how to unify aspects of these areas to create new, more powerful, computational intelligence architectures to apply to new problem domains.
Intelligent paradigms are increasingly finding their way into the design and development of decision support systems. This book presents a sample of recent research results from key researchers. The contributions include: an introduction to intelligent systems in decision making; a new method of ranking intuitionistic fuzzy alternatives; fuzzy rule base model identification by bacterial memetic algorithms; discovering associations with uncertainty from large databases; Dempster-Shafer structures, monotonic set measures and decision making; interpretable decision-making models; a general methodology for managerial decision making; supporting decision making via verbalization of data analysis results using linguistic data summaries; and computational intelligence in medical decision making. This book is directed to researchers, graduate students, professors, decision makers and those who are interested in investigating intelligent paradigms in decision making.
This research volume is a continuation of our previous volumes on intelligent machines. It is divided into three parts. Part I deals with big data and ontologies, and includes examples related to text mining, rule mining and ontology. Part II is on knowledge-based systems, and includes context-centered systems, knowledge discovery, interoperability, consistency and systems of systems. The final part is on applications, which involve prediction, decision optimization and assessment. This book is directed to researchers who wish to explore the field of knowledge engineering further.
This book is a tribute to Professor Jacek Zurada, who is best known for his contributions to computational intelligence and knowledge-based neurocomputing. It is dedicated to Professor Jacek Zurada, Full Professor at the Computational Intelligence Laboratory, Department of Electrical and Computer Engineering, J.B. Speed School of Engineering, University of Louisville, Kentucky, USA, as a token of appreciation for his scientific and scholarly achievements, and for his longstanding service to many communities, notably the computational intelligence community, in particular neural networks, machine learning, data analysis and data mining, but also the fuzzy logic and evolutionary computation communities, to name but a few. At the same time, the book recognizes and honors Professor Zurada's dedication and service to many scientific, scholarly and professional societies, especially the IEEE (Institute of Electrical and Electronics Engineers), the world's largest technical professional organization dedicated to advancing science and technology in a broad spectrum of areas and fields. The volume is divided into five major parts, the first of which addresses theoretic, algorithmic and implementation problems related to the intelligent use of data, in the sense of how to derive practically useful information and knowledge from data. In turn, Part 2 is devoted to various aspects of neural networks and connectionist systems. Part 3 deals with essential tools and techniques for intelligent technologies in systems modeling, Part 4 focuses on intelligent technologies in decision-making, optimization and control, and Part 5 explores the applications of intelligent technologies.
This book is written for linguists and computer scientists working in the field of artificial intelligence, as well as for anyone interested in intelligent text processing. A lexical function is a concept that formalizes semantic and syntactic relations between lexical units. A collocational relation is a type of institutionalized lexical relation which holds between the base and its partner in a collocation. Knowledge of collocation is important for natural language processing because collocation comprises the restrictions on how words can be used together. The book shows how collocations can be annotated with lexical functions in a computer-readable dictionary, allowing their precise semantic analysis in texts and their effective use in natural language applications including parsers, high-quality machine translation, periphrasis systems and computer-aided learning of lexica. The book also shows how to extract collocations from corpora and annotate them with lexical functions automatically. To train the algorithms, the authors created a dictionary of lexical functions containing more than 900 disambiguated and annotated Spanish examples, which is part of this book. The results obtained show that machine learning is feasible for the task of automatic detection of lexical functions.
The theory of formal languages is widely accepted as the backbone of theoretical computer science. It mainly originated from mathematics (combinatorics, algebra, mathematical logic) and generative linguistics. Later, new specializations emerged from areas of either computer science (concurrent and distributed systems, computer graphics, artificial life), biology (plant development, molecular genetics), linguistics (parsing, text searching), or mathematics (cryptography). All human problem solving capabilities can be considered, in a certain sense, as a manipulation of symbols and structures composed of symbols, which is actually the stem of formal language theory. Language, in its two basic forms, natural and artificial, is a particular case of a symbol system. This wide range of motivations and inspirations explains the diverse applicability of formal language theory, and all these together explain the very large number of monographs and collective volumes dealing with formal language theory. In 2004 Springer-Verlag published the volume Formal Languages and Applications, edited by C. Martin-Vide, V. Mitrana and G. Păun in the series Studies in Fuzziness and Soft Computing 148, which was aimed at serving as an overall course-aid and self-study material especially for PhD students in formal language theory and applications. Actually, the volume emerged in such a context: it contains the core information from many of the lectures delivered to the students of the International PhD School in Formal Languages and Applications organized since 2002 by the Research Group on Mathematical Linguistics from Rovira i Virgili University, Tarragona, Spain.