Data management is the process of planning, coordinating and controlling data resources. Increasingly, applications need to store and search large amounts of data. Data management has been continually challenged by demands from various areas and applications, and has evolved in parallel with advances in hardware and computing techniques. This volume focuses on recent advances in the field and is composed of five parts and a total of eighteen chapters. The first part of the book contains five contributions in the area of information retrieval and Web intelligence: a novel approach to the index selection problem, integrated retrieval from the Web of documents and data, bipolarity in database querying, deriving data summarization through ontologies, and granular computing for Web intelligence. The second part of the book contains four contributions in the area of knowledge discovery. The third part contains three contributions in the area of information integration and data security. The remaining two parts of the book contain six contributions in the areas of intelligent agents and applications of data management in the medical domain.
The human ambition to reproduce and improve natural objects and processes has a long history, and ranges from dreams to actual design, from Icarus's wings to modern robotics and bioengineering. This imperative seems to be linked not only to practical utility but also to our deepest psychology. Nevertheless, reproducing something natural is not an easy enterprise, and the actual replication of a natural object or process by means of some technology is impossible. In this book the author uses the term naturoid to designate any real artifact arising from our attempts to reproduce natural instances. He concentrates on activities that involve the reproduction of something existing in nature, and whose reproduction, through construction strategies which differ from natural ones, we consider to be useful, appealing or interesting. The development of naturoids may be viewed as a distinct class of technological activity, and the concept should be useful for methodological research into establishing the common rules, potentialities and constraints that characterize the human effort to reproduce natural objects. The author shows that a naturoid is always the result of a reduction of the complexity of natural objects, due to an unavoidable multiple selection strategy. Nevertheless, the reproduction process implies that naturoids take on their own new complexity, resulting in a transfiguration of the natural exemplars and their performances, and leading to a true innovation explosion. While the core performances of contemporary naturoids improve, paradoxically the more a naturoid develops the further it moves away from its natural counterpart. Therefore, naturoids will more and more affect our relationships with advanced technologies and with nature, but in ways quite beyond our predictive capabilities. The book will be of interest to design scholars and researchers of technology, cultural studies, anthropology and the sociology of science and technology.
This book proposes the formulation of an efficient methodology that estimates energy system uncertainty and predicts Remaining Useful Life (RUL) accurately, with significantly reduced RUL prediction uncertainty. Renewable and non-renewable sources of energy are being used to supply the demands of societies worldwide. These sources are mainly thermo-chemo-electro-mechanical systems that are subject to uncertainty in future loading conditions, material properties, process noise, and other design parameters. The book informs the reader of existing and new ideas that will be implemented in RUL prediction of energy systems in the future. The book provides case studies, illustrations, graphs, and charts. Its chapters consider engineering, reliability, prognostics and health management, probabilistic multibody dynamical analysis, peridynamic and finite-element modelling, computer science, and mathematics.
Complex systems and their phenomena are ubiquitous: they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. Like only a few other researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, this volume gathers original papers by leading researchers in the field of computational intelligence.
Video segmentation has become one of the core areas in visual signal processing research. The objective of Video Segmentation and Its Applications is to present the latest advances in video segmentation and analysis techniques while covering the theoretical approaches, real applications and methods being developed in the computer vision and video analysis community. The book also provides researchers and practitioners with a comprehensive understanding of the state of the art in video segmentation techniques and a resource for potential applications and successful practice.
Stochastic global optimization is a very important subject that has applications in virtually all areas of science and technology. Therefore there is nothing more opportune than writing a book about a successful and mature algorithm that turned out to be a good tool in solving difficult problems. Here we present some techniques for solving several problems by means of Fuzzy Adaptive Simulated Annealing (Fuzzy ASA), a fuzzy-controlled version of ASA, and by ASA itself. ASA is a sophisticated global optimization algorithm that is based upon ideas of the simulated annealing paradigm, coded in the C programming language and developed to statistically find the best global fit of a nonlinear constrained, non-convex cost function over a multi-dimensional space. By presenting detailed examples of its application we want to stimulate the reader's intuition and make the use of Fuzzy ASA (or regular ASA) easier for everyone wishing to use these tools to solve problems. We kept formal mathematical requirements to a minimum and focused on continuous problems, although ASA is able to handle discrete optimization tasks as well. This book can be used by researchers and practitioners in engineering and industry, in courses on optimization at advanced undergraduate and graduate levels, and also for self-study.
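As background to the paradigm the book builds on, here is a minimal sketch of plain simulated annealing applied to a continuous, non-convex cost function. It illustrates the general idea only; it is not the book's Fuzzy ASA, nor the C implementation of ASA the blurb mentions, and the test function, cooling schedule and parameter values are assumptions chosen for the example.

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.995, iters=10_000):
    """Minimal simulated-annealing sketch for a continuous cost function.

    Plain SA for illustration only, not the adaptive (ASA) or fuzzy-controlled
    (Fuzzy ASA) variants discussed in the book.
    """
    x, fx = list(x0), cost(x0)
    best_x, best_f = list(x), fx
    t = t0
    for _ in range(iters):
        # Propose a random neighbour of the current point.
        cand = [xi + random.uniform(-step, step) for xi in x]
        fc = cost(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if fc < fx or random.random() < math.exp((fx - fc) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= cooling  # geometric cooling; ASA uses a faster, adaptive schedule
    return best_x, best_f

# Example: the 2-D Rastrigin function, a standard non-convex test problem.
rastrigin = lambda v: 10 * len(v) + sum(vi**2 - 10 * math.cos(2 * math.pi * vi) for vi in v)
print(simulated_annealing(rastrigin, [4.0, -3.0]))
```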
The book offers a comprehensive survey of intuitionistic fuzzy logics. By reporting on both the author's research and others' findings, it provides readers with a complete overview of the field and highlights key issues and open problems, thus suggesting new research directions. Starting with an introduction to the basic elements of intuitionistic fuzzy propositional calculus, it then provides a guide to the use of intuitionistic fuzzy operators and quantifiers, and lastly presents state-of-the-art applications of intuitionistic fuzzy sets. The book is a valuable reference resource for graduate students and researchers alike.
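For readers unfamiliar with the area, the standard definition underlying intuitionistic fuzzy propositional calculus can be sketched as follows; the notation is the conventional one and is not necessarily that used in the book.

```latex
% An intuitionistic fuzzy set A over a universe X assigns each element both a
% membership degree \mu_A and a non-membership degree \nu_A, whose sum may be
% strictly less than 1:
\[
  A = \{\, \langle x,\ \mu_A(x),\ \nu_A(x) \rangle \mid x \in X \,\},
  \qquad 0 \le \mu_A(x) + \nu_A(x) \le 1 .
\]
% The remainder is the hesitation (indeterminacy) degree:
\[
  \pi_A(x) = 1 - \mu_A(x) - \nu_A(x) .
\]
```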
This edited book presents the scientific results of the 12th International Conference on Software Engineering, Artificial Intelligence Research, Management and Applications (SERA 2014), held on August 31 - September 4, 2014 in Kitakyushu, Japan. The aim of the conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science and to share their experiences and exchange new ideas and information in a meaningful way; to present research results on all aspects (theory, applications and tools) of computer and information science; and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. This publication captures 17 of the conference's most promising papers.
This book represents the combined peer-reviewed proceedings. The 41 contributions published in this book address many topics.
There is a basic perplexity in our times. On the one hand, we find a blind trust in technology and rationalism. In our neo-liberalistically dominated world only what can be rapidly exploited and commercialized seems to count. The only opposing reaction to this kind of rationalism is an extreme rejection of all kinds of reasoning, and sometimes attendant religious fundamentalism. But instead of reflecting on the limits and possibilities of reasoning, dialogue is replaced by a demagogic struggle between cultures. One cause of the blind trust in technology is misunderstandings about the significance and the application of theories in the reception of the so-called Enlightenment. The Enlightenment is essentially characterized by two forces: (i) the conception of society as a social contract and (ii) the new science (Newtonian physics, etc.). But as a result we lost ground: atomistic individualism nourished the illusion of a self-contained ego prior to man's entering into a shared inter-subjective world. And in the new science, our constructions of reality became autonomous and independent of our interventions. Thus we became caught in the inherent dynamism of our computational constructions of reality. Science, as it is applied today, operates with far too simple parameters and model-theoretic constructions - erroneously taking the latter (the models) as literal descriptions of reality.
Compared with data from general application domains, modern biological data has many unique characteristics. Biological data are often characterized as having large volumes, complex structures, high dimensionality, evolving biological concepts, and insufficient data modelling practices. Over the past several years, bioinformatics has become an all-encompassing term for everything relating to both computer science and biology. The goal of this book is to cover data and applications, identifying new issues and directions for future research in the biomedical domain. The book will serve as a useful guide for learning about state-of-the-art developments in biomedical data management, data-intensive bioinformatics systems, and other miscellaneous biological database applications. The book addresses various topics in bioinformatics with varying degrees of balance between biomedical data models and their real-world applications.
Let's try to play the music and not the background. Ornette Coleman, liner notes of the LP "Free Jazz" [20]. When I began to create a course on free jazz, the risk of such an enterprise was immediately apparent: I knew that Cecil Taylor had failed to teach such a matter, and that for other, more academic instructors, the topic was still a sort of outlandish adventure. To be clear, we are not talking about teaching improvisation here - a different, and also problematic, matter - rather, we wish to create a scholarly discourse about free jazz as a cultural achievement, and follow its genealogy from the American jazz tradition through its various outbranchings, such as the European and Japanese jazz conceptions and interpretations. We also wish to discuss some of the underlying mechanisms that are extant in free improvisation, things that could be called technical aspects. Such a discourse bears the flavor of a contradictio in adjecto: teaching the unteachable, the very negation of rules, above all those posited by white jazz theorists, and talking about the making of sounds without aiming at so-called factual results and all those intellectual sedimentations: is this not a suicidal topic? My own endeavors as a free jazz pianist have informed and advanced my conviction that this art has never been theorized in a satisfactory way, not even by Ekkehard Jost in his unequaled, phenomenologically precise pioneering book "Free Jazz" [57].
Intelligent information and database systems are two closely related and well-established subfields of modern computer science. They focus on the integration of artificial intelligence and classic database technologies in order to create the class of next generation information systems. The major target of this new generation of systems is to provide end-users with intelligent behavior: simple and/or advanced learning, problem solving, uncertain and certain reasoning, self-organization, cooperation, etc. Such intelligent abilities are implemented in classic information systems to make them autonomous and user oriented, in particular when advanced problems of multimedia information and knowledge discovery, access, retrieval and manipulation are to be solved in the context of large, distributed and heterogeneous environments. It means that intelligent knowledge-based information and database systems are used to solve basic problems of large collections management, carry out knowledge discovery from large data collections, reason about information under uncertain conditions, support users in their formulation of complex queries etc. Topics discussed in this volume include but are not limited to the foundations and principles of data, information, and knowledge models, methodologies for intelligent information and database systems analysis, design, implementation, validation, maintenance and evolution.
This book contains selected papers from the International Conference on Extreme Learning Machine 2014, which was held in Singapore, December 8-10, 2014. The conference brought together researchers and practitioners of the Extreme Learning Machine (ELM) from a variety of fields to promote research on and development of "learning without iterative tuning". The book covers theories, algorithms and applications of ELM, and gives readers a glimpse of its most recent advances.
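As a rough illustration of the "learning without iterative tuning" idea behind ELM, the following NumPy sketch trains a single-hidden-layer network by drawing the hidden weights at random and solving for the output weights in closed form. It is a generic textbook-style example, not code from the conference papers; the sigmoid activation, layer size and toy data are assumptions made for the illustration.

```python
import numpy as np

def elm_train(X, T, n_hidden=50, rng=np.random.default_rng(0)):
    """Train a basic Extreme Learning Machine.

    Hidden weights and biases are random and never tuned; only the output
    weights beta are computed, via the Moore-Penrose pseudo-inverse.
    """
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input->hidden weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoid hidden-layer outputs
    beta = np.linalg.pinv(H) @ T                  # closed-form least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy regression example: learn y = sin(x) from noisy samples.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X) + 0.05 * np.random.default_rng(1).normal(size=X.shape)
params = elm_train(X, T)
print(np.mean((elm_predict(X, *params) - T) ** 2))  # training MSE
```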
This book describes the application of modern information technology to reservoir modeling and well management in shale. While covering Shale Analytics, it focuses on reservoir modeling and production management of shale plays, since conventional reservoir and production modeling techniques do not perform well in this environment. Topics covered include tools for analysis, predictive modeling and optimization of production from shale in the presence of massive multi-cluster, multi-stage hydraulic fractures. Given that the physics of storage and fluid flow in shale are not well understood or well defined, Shale Analytics avoids simplifying assumptions and concentrates on facts (hard data, i.e. field measurements) to reach conclusions. Also discussed are important insights into completion practices and re-frac candidate selection and design. The flexibility and power of the technique are demonstrated in numerous real-world situations.
Computational Intelligence (CI) has emerged as a rapidly growing field over the past decade. Its various techniques have been recognized as powerful tools for intelligent information processing, decision making and knowledge management. "Advances of Computational Intelligence in Industrial Systems" reports the exploration of CI frontiers with an emphasis on a broad spectrum of real-world applications. Section I, Theory and Foundation, presents some of the latest developments in CI, e.g. particle swarm optimization, Web services, data mining with privacy protection, and kernel methods for text analysis. Section II, Industrial Application, covers CI applications in a wide variety of domains, e.g. clinical decision support, process monitoring for industrial CNC machines, novelty detection for jet engines, and ant algorithms for berth allocation. This collection of chapters presents the state of the art of CI applications in industry and will be an essential resource for professionals and researchers who wish to learn about and spot opportunities for applying CI techniques to their particular problems.
Artificial Intelligence (AI) is penetrating all the sciences as a multidisciplinary approach. However, adapting AI theory, including computer vision and computer audition, to the urban intellectual space has always been difficult for architects and urban planners. This book addresses that challenge through a conceptual framework merging computer vision and audition with urban studies, based on a series of workshops called Remorph, conducted by the Tehran Urban Innovation Center (TUIC).
This unique book discusses the latest research, innovative ideas, challenges and computational intelligence (CI) solutions in sustainable computing. It presents novel, in-depth fundamental research on achieving a sustainable lifestyle for society, from either a methodological or an application perspective. Sustainable computing has expanded to become a significant research area covering the fields of computer science and engineering, electrical engineering and other engineering disciplines, and there has been an increase in the amount of literature on aspects of sustainable computing, such as energy efficiency and natural resource conservation, that emphasizes the role of ICT (information and communications technology) in achieving system design and operation objectives. The energy impact and design of more efficient IT infrastructures is a key challenge in realizing new computing paradigms. The book explores the use of CI techniques for intelligent decision support that can be exploited to create effectual computing systems, and addresses sustainability problems in computing and information processing environments and technologies at the different levels of CI paradigms. An excellent guide to the state of the art in computational intelligence applied to challenging real-world problems in sustainable computing, it is intended for scientists, practitioners, researchers and academicians dealing with the new challenges and advances in the area.
Electric power systems are experiencing significant changes on a worldwide scale in order to become cleaner, smarter, and more reliable. This edited book examines a wide range of topics related to these changes, which are primarily driven by the introduction of information technologies, renewable energy penetration, digitalized equipment, new operational strategies, and so forth. The emphasis is on the modeling and control of smart grid systems. The book addresses research topics such as high-efficiency transformers, wind turbines and generators, fuel cells, and high-speed turbines and generators.
This volume features key contributions from the International Conference on Pattern Recognition Applications and Methods (ICPRAM 2012), held in Vilamoura, Algarve, Portugal, from February 6th to 8th, 2012. The conference provided a major point of collaboration between researchers, engineers and practitioners in the area of Pattern Recognition, from both theoretical and applied perspectives, with a focus on mathematical methodologies. Contributions describe applications of pattern recognition techniques to real-world problems, interdisciplinary research, and experimental and theoretical studies yielding new insights that provide key advances in the field. This book will be suitable for scientists and researchers in optimization, numerical methods, computer science and statistics, and for differential geometers and mathematical physicists.
This book offers an inspiring and naive view on language and reasoning. It presents a new approach to ordinary reasoning that follows the author's former work on fuzzy logic. Starting from a pragmatic scientific view on meaning as a quantity, and the common sense reasoning from a primitive notion of inference, which is shared by both laypeople and experts, the book shows how this can evolve, through the addition of more and more suppositions, into various formal and specialized modes of precise, imprecise, and approximate reasoning. The logos are intended here as a synonym for rationality, which is usually shown by the processes of questioning, guessing, telling, and computing. Written in a discursive style and without too many technicalities, the book presents a number of reflections on the study of reasoning, together with a new perspective on fuzzy logic and Zadeh's "computing with words" grounded in both language and reasoning. It also highlights some mathematical developments supporting this view. Lastly, it addresses a series of questions aimed at fostering new discussions and future research into this topic. All in all, this book represents an inspiring read for professors and researchers in computer science, and fuzzy logic in particular, as well as for psychologists, linguists and philosophers.
Condition Monitoring Using Computational Intelligence Methods promotes the various approaches gathered under the umbrella of computational intelligence to show how condition monitoring can be used to avoid equipment failures and lengthen its useful life, minimize downtime and reduce maintenance costs. The text introduces various signal-processing and pre-processing techniques, wavelets and principal component analysis, for example, together with their uses in condition monitoring, and details the development of effective feature extraction techniques classified into frequency-, time-frequency- and time-domain analysis. Data generated by these techniques can then be used for condition classification employing tools such as fuzzy systems; rough and neuro-rough sets; neural and Bayesian networks; hidden Markov and Gaussian mixture models; and support vector machines.
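Purely as an illustrative sketch of the pipeline the book describes (features extracted from a signal in the frequency domain, then fed to a classifier such as a support vector machine), here is a small self-contained example using NumPy and scikit-learn; the synthetic vibration signals, the band-energy features and every parameter value are invented for the example and are not taken from the book.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def vibration_signal(faulty, n=1024, fs=1000.0):
    """Synthetic vibration signal: a faulty machine adds a 120 Hz component."""
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=n)
    if faulty:
        x += 0.8 * np.sin(2 * np.pi * 120 * t)
    return x

def frequency_features(x, n_bands=8):
    """Frequency-domain features: mean FFT magnitude in equal-width bands."""
    spectrum = np.abs(np.fft.rfft(x))
    return np.array([band.mean() for band in np.array_split(spectrum, n_bands)])

# Build a small labelled dataset of healthy (0) vs faulty (1) conditions.
X = np.array([frequency_features(vibration_signal(f)) for f in ([False] * 100 + [True] * 100)])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)   # one of the classifier families the book lists
print("test accuracy:", clf.score(X_te, y_te))
```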
The increasing complexity of our world demands new perspectives on the role of technology in decision making. Human decision making has its limitations in terms of information-processing capacity. We need new technology to cope with the increasingly complex and information-rich nature of our modern society. This is particularly true for critical environments such as crisis management and traffic management, where humans need to engage in close collaborations with artificial systems to observe and understand the situation and respond in a sensible way. We believe that close collaborations between humans and artificial systems will become essential and that the importance of research into Interactive Collaborative Information Systems (ICIS) is self-evident. Developments in information and communication technology have radically changed our working environments. The vast amount of information available nowadays and the wirelessly networked nature of our modern society open up new opportunities to handle difficult decision-making situations such as computer-supported situation assessment and distributed decision making. To make good use of these new possibilities, we need to update our traditional views on the role and capabilities of information systems. The aim of the Interactive Collaborative Information Systems project is to develop techniques that support humans in complex information environments and that facilitate distributed decision-making capabilities. ICIS emphasizes the importance of building actor-agent communities: close collaborations between human and artificial actors that highlight their complementary capabilities, and in which task distribution is flexible and adaptive.
The book offers a new approach to information theory that is more general than the classical approach by Shannon. The classical definition of information is given for an alphabet of symbols or for a set of mutually exclusive propositions (a partition of the probability space Ω) with corresponding probabilities adding up to 1. The new definition is given for an arbitrary cover of Ω, i.e. for a set of possibly overlapping propositions. The generalized information concept is called novelty, and it is accompanied by two new concepts derived from it, designated as information and surprise, which describe "opposite" versions of novelty, information being related more to classical information theory and surprise being related more to the classical concept of statistical significance. In the discussion of these three concepts and their interrelations several properties or classes of covers are defined, which turn out to be lattices. The book also presents applications of these new concepts, mostly in statistics and in neuroscience.
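For orientation, the classical Shannon setting that the book generalizes can be written down as follows; this states only the standard partition-based definitions in common notation, not the book's own definitions of novelty, information and surprise for covers.

```latex
% Classical setting: a partition \{A_1,\dots,A_n\} of the probability space \Omega,
% with probabilities summing to 1; the information of observing A_i and the
% average information (entropy) are
\[
  \sum_{i=1}^{n} p(A_i) = 1, \qquad
  I(A_i) = -\log_2 p(A_i), \qquad
  H = -\sum_{i=1}^{n} p(A_i)\,\log_2 p(A_i) .
\]
% The book replaces the partition by an arbitrary cover of \Omega (a family of
% possibly overlapping propositions) and develops "novelty" as the corresponding
% generalized quantity, with "information" and "surprise" as derived notions.
```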