Books > Computing & IT > Applications of computing > Artificial intelligence
The Finite Element Method (FEM) is a widely used computational tool in structural engineering. For basic design purposes it usually suffices to apply a linear-elastic analysis. Only for special structures and forensic investigations does the analyst need to apply more advanced features such as plasticity and cracking to account for material nonlinearities, or nonlinear relations between strains and displacements to account for geometrical nonlinearity and buckling. Advanced analysis techniques may also be necessary to judge the remaining structural capacity of aging structures. In this book we abstain from such special cases and focus on everyday jobs. Our goal is the worldwide everyday use of linear-elastic analysis, and dimensioning on the basis of these elastic computations. We cover steel and concrete structures, though attention to structural concrete prevails. Structural engineers have access to powerful FEM packages and apply them intensively, yet experience makes clear that they often do not understand the software they are using. This book aims to be a bridge between the software world and structural engineering. Many problems are related to correct input data and the proper interpretation and handling of output. The book is neither a text on the Finite Element Method nor a user manual for the software packages; rather, it aims to be a guide to understanding and handling the results produced by such software. We purposely restrict ourselves to structure types which frequently occur in practice.
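To make the linear-elastic workflow mentioned above concrete, here is a minimal sketch (not taken from the book) of a one-dimensional axially loaded bar: it assembles the global stiffness matrix from two-node elements and solves K u = f for the nodal displacements. The material constants, element count and tip load are illustrative assumptions.

```python
import numpy as np

# Minimal 1D linear-elastic FEM: an axially loaded bar fixed at its left end.
# E (Young's modulus), A (cross-section), L and the tip load are invented values.
E, A, L, n_elems = 210e9, 1e-4, 2.0, 4        # steel-like bar, 4 two-node elements
le = L / n_elems                              # element length
k_e = (E * A / le) * np.array([[1.0, -1.0],
                               [-1.0, 1.0]])  # element stiffness matrix

n_nodes = n_elems + 1
K = np.zeros((n_nodes, n_nodes))
for e in range(n_elems):                      # assemble the global stiffness matrix
    K[e:e + 2, e:e + 2] += k_e

f = np.zeros(n_nodes)
f[-1] = 1e4                                   # 10 kN axial load at the free end

u = np.zeros(n_nodes)                         # fixed support: u[0] = 0
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])     # solve the reduced system K u = f
print("nodal displacements [m]:", u)          # displacement grows along the bar
```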
Humans have an extraordinary capability to combine different types of information into a single meaningful interpretation. The quickness with which interpretation processes evolve suggests the existence of a uniform procedure for all domains. In this book the authors suggest that such a procedure can be found. They concentrate on the introduction of a theory of interpretation, and they define a model that enables a meaningful representation of knowledge, based on a dynamic view of information and a cognitive model of human information processing. The book consists of three parts. The first part focuses on the properties of signs and sign interpretation; in the second part the authors introduce a model that complies with the conditions for sign processing set out in the first part; and in the third part they examine applications of their model in the domains of logic, natural language, reasoning and mathematics. Finally, they show how these domains pop up as perspectives in an overall model of knowledge representation. The reader is assumed to have some interest in human information processing and knowledge modeling. Natural language is considered in the obvious sense; familiarity with linguistic theories is not required. Sign-theoretical concepts are restricted to a manageable subset, which is introduced gently. Finally, some familiarity with basic concepts of propositional and syllogistic logic may be useful.
This book constitutes the refereed proceedings of the 7th IFIP WG 5.5/SOCOLNET Advanced Doctoral Conference on Computing, Electrical and Industrial Systems, DoCEIS 2016, held in Costa de Caparica, Portugal, in April 2016. The 53 revised full papers were carefully reviewed and selected from 112 submissions. The papers present selected results produced in engineering doctoral programs and focus on research, development, and application of cyber-physical systems. Research results and ongoing work are presented, illustrated and discussed in the following areas: enterprise collaborative networks; ontologies; Petri nets; manufacturing systems; biomedical applications; intelligent environments; control and fault tolerance; optimization and decision support; wireless technologies; energy: smart grids, renewables, management, and optimization; bio-energy; and electronics.
This book presents emerging trends in the evolution of service-oriented and enterprise architectures. New architectures and methods of both business and IT are integrating services to support mobility systems, the Internet of Things, Ubiquitous Computing, collaborative and adaptive business processes, Big Data, and Cloud ecosystems. They inspire current and future digital strategies and create new opportunities for digital transformation and for next-generation digital products and services. Service-Oriented Architectures (SOA) and Enterprise Architectures (EA) have emerged as useful frameworks for developing interoperable, large-scale systems, typically implementing various standards such as Web Services, REST, and Microservices. Managing the adaptation and evolution of such systems presents a great challenge. Service-Oriented Architecture enables flexibility through loose coupling, both between the services themselves and between the IT organizations that manage them. Enterprises evolve continuously by transforming and extending their services, processes and information systems. Enterprise Architectures provide a holistic blueprint to help define the structure and operation of an organization, with the goal of determining how an organization can most effectively achieve its objectives. The book proposes several approaches to address the challenges of the service-oriented evolution of digital enterprise and software architectures.
This book discusses human-machine interactions, specifically focusing on making them as natural as human-human interaction. It is based on the premise that to achieve the right connection between humans and machines, it is essential to understand not only the behavior of the person interacting with the machine, but also the limitations of the technology. Firstly, the authors review the evolution of language as a spontaneous, natural phenomenon in the overall scheme of the evolutionary development of living beings. They then go on to examine possible approaches to understanding and representing the meaning and the common aspects of human-human and human-machine interactions, and introduce the key-concept/keyword (also called minimal parsing) approach as a convenient and realistic way to implement usable human-machine interface (HMI) systems. For researchers looking for practical approaches, well beyond the realms of theory, this book is a must-read.
The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and finally an airborne network emulation environment. The two subsequent chapters discuss power distribution networks from different points of view: anticipation and optimization of multi-echelon inventory policy. After that, the book includes a group of chapters discussing mathematical modelling supported by verification simulations, and a set of chapters with models synthesised by means of artificial intelligence tools and a complex automata framework. Lastly, the book includes a chapter introducing the use of a graph-grammar model for the generation of three-dimensional computational meshes and a chapter focused on experimental and computational results regarding the simulation of aero engine vortexes. The authors believe that this book is a valuable reference for researchers and practitioners in the field, as well as an inspiration to those interested in the area of Intelligent Modelling and Simulation.
The noted inventor and futurist's successor to his landmark book The Singularity Is Near explores how technology will transform the human race in the decades to come. Since it was first published in 2005, Ray Kurzweil's The Singularity Is Near and its vision of an exponential future have spawned a worldwide movement. Kurzweil's predictions about technological advancements have largely come true, with concepts like AI, intelligent machines, and biotechnology now widely familiar to the public. In this entirely new book Ray Kurzweil brings a fresh perspective to advances toward the Singularity, assessing his 1999 prediction that AI will reach human-level intelligence by 2029 and examining the exponential growth of technology that, in the near future, will expand human intelligence a millionfold and change human life forever. Among the topics he discusses are rebuilding the world, atom by atom, with devices like nanobots; radical life extension beyond the current age limit of 120; reinventing intelligence by connecting our brains to the cloud; how exponential technologies are propelling innovation forward in all industries and improving all aspects of our well-being, such as declining poverty and violence; and the growth of renewable energy and 3-D printing. He also considers the potential perils of biotechnology, nanotechnology, and artificial intelligence, including such topics of current controversy as how AI will impact employment and the safety of autonomous cars, and "After Life" technology, which aims to virtually revive deceased individuals through a combination of their data and DNA. The culmination of six decades of research on artificial intelligence, The Singularity Is Nearer is Ray Kurzweil's crowning contribution to the story of this science and the revolution that is to come.
Brain Inspired Cognitive Systems 2008 (June 24-27, 2008; São Luís, Brazil) brought together leading scientists and engineers who use analytic, syntactic and computational methods both to understand the prodigious processing properties of biological systems and, specifically, of the brain, and to exploit such knowledge to advance computational methods towards ever higher levels of cognitive competence. This book includes the papers presented at four major symposia: Part I: Cognitive Neuroscience; Part II: Biologically Inspired Systems; Part III: Neural Computation; and Part IV: Models of Consciousness.
This book employs a new eco-cognitive model of abduction to underline the distributed and embodied nature of scientific cognition. Its main focus is on the knowledge-enhancing virtues of abduction and on the productive role of scientific models. What are the distinctive features that define the kind of knowledge produced by science? To provide an answer to this question, the book first addresses the ideas of Aristotle, who stressed the essential inferential and distributed role of external cognitive tools and epistemic mediators in abductive cognition. This is analyzed in depth from both a naturalized logic and an ecology of cognition perspective. It is shown how the maximization of cognition, and of abducibility - two typical goals of science - are related to a number of fundamental aspects: the optimization of the eco-cognitive situatedness; the maximization of changeability for both the input and the output of the inferences involved; a high degree of information-sensitiveness; and the need to record the "past life" of abductive inferential practices. Lastly, the book explains how some impoverished epistemological niches - the result of a growing epistemic irresponsibility associated with the commodification and commercialization of science - are now seriously jeopardizing the flourishing development of human creative abduction.
The rapid increase in computing power and communication speed, coupled with the availability of computer storage facilities, has led to a new age of multimedia applications. Multimedia is practically everywhere, and all around us we can feel its presence in almost all applications, ranging from online video databases, IPTV and interactive multimedia to, more recently, multimedia-based social interaction. These new and growing applications require high-quality data storage, easy access to multimedia content and reliable delivery. Moving ever closer to commercial deployment has also aroused a higher awareness of security and intellectual property management issues. All the aforementioned requirements have resulted in higher demands on various areas of research (signal processing, image/video processing and analysis, communication protocols, content search, watermarking, etc.). This book covers the most prominent research issues in multimedia and is divided into four main sections: i) content-based retrieval, ii) storage and remote access, iii) watermarking and copyright protection and iv) multimedia applications. Chapter 1 of the first section presents an analysis of how color is used and why it is crucial in today's multimedia applications. In chapter 2 the authors give an overview of the advances in video abstraction for fast content browsing, transmission, retrieval and skimming in large video databases, and chapter 3 extends the discussion on video summarization even further. The content retrieval problem is tackled in chapter 4 by describing a novel method for producing meaningful segments suitable for MPEG-7 description based on binary partition trees (BPTs).
The purpose of this book is to present a methodology for designing and tuning fuzzy expert systems in order to identify nonlinear objects; that is, to build input-output models using expert and experimental information. The results of these identifications are used for direct and inverse fuzzy evidence in forecasting and diagnosis problem solving. The book is organised as follows: Chapter 1 presents the basic knowledge about fuzzy sets, genetic algorithms and neural nets necessary for a clear understanding of the rest of this book. Chapter 2 analyzes direct fuzzy inference based on fuzzy if-then rules. Chapter 3 is devoted to the tuning of fuzzy rules for direct inference using genetic algorithms and neural nets. Chapter 4 presents models and algorithms for extracting fuzzy rules from experimental data. Chapter 5 describes a method for solving fuzzy logic equations necessary for the inverse fuzzy inference in diagnostic systems. Chapters 6 and 7 are devoted to inverse fuzzy inference based on fuzzy relations and fuzzy rules. Chapter 8 presents a method for extracting fuzzy relations from data. All the algorithms presented in Chapters 2-8 are validated by computer experiments and illustrated by solving medical and technical forecasting and diagnosis problems. Finally, Chapter 9 includes applications of the proposed methodology in dynamic and inventory control systems, prediction of results of football games, decision making in road accident investigations, project management and reliability analysis.
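To illustrate the direct fuzzy inference from if-then rules that the book's early chapters analyze, here is a minimal Mamdani-style sketch. The membership functions, the two rules and the temperature/power variables are invented for illustration and are not taken from the book.

```python
import numpy as np

# Minimal Mamdani-style direct fuzzy inference from if-then rules.
def ramp_down(x, a, b):   # membership 1 below a, falling to 0 at b
    return np.clip((b - x) / (b - a), 0.0, 1.0)

def ramp_up(x, a, b):     # membership 0 below a, rising to 1 at b
    return np.clip((x - a) / (b - a), 0.0, 1.0)

def tri(x, a, b, c):      # triangular membership with peak at b
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def infer(temp):
    """IF temp is low THEN power is high; IF temp is high THEN power is low."""
    mu_low, mu_high = ramp_down(temp, 0, 20), ramp_up(temp, 10, 30)

    power = np.linspace(0.0, 100.0, 201)                   # output universe
    agg = np.maximum(np.minimum(mu_low,  tri(power, 50, 75, 100)),
                     np.minimum(mu_high, tri(power, 0, 25, 50)))
    return (power * agg).sum() / agg.sum()                 # centroid defuzzification

print(infer(5.0), infer(25.0))   # cold -> high power setting, warm -> low
```

Tuning such a system, as the book describes, amounts to adjusting the membership parameters and rule weights against expert and experimental data.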
This book is about synergy in computational intelligence (CI). It is a collection of chapters that covers a rich and diverse variety of computer-based techniques, all involving some aspect of computational intelligence, but each one taking a somewhat pragmatic view. Many complex problems in the real world require the application of some form of what we loosely call "intelligence" for their solution. Few can be solved by the naive application of a single technique, however good it is. The authors in this collection recognize the limitations of individual paradigms, and propose some practical and novel ways in which different CI techniques can be combined with each other, or with more traditional computational techniques, to produce powerful problem-solving environments which exhibit synergy, i.e., systems in which the whole is greater than the sum of the parts. Computational intelligence is a relatively new term, and there is some disagreement as to its precise definition. Some practitioners limit its scope to schemes involving evolutionary algorithms, neural networks, fuzzy logic, or hybrids of these. For others, the definition is a little more flexible, and will include paradigms such as Bayesian belief networks, multi-agent systems, case-based reasoning and so on. Generally, the term has a similar meaning to the well-known phrase "Artificial Intelligence" (AI), although CI is perceived more as a "bottom-up" approach from which intelligent behaviour can emerge, whereas AI tends to be studied from the "top down," and derives from pondering upon the "meaning of intelligence." (These and other key issues will be discussed in more detail in Chapter 1.)
The main objective of this book is to provide the background necessary to work with big data, by introducing novel optimization algorithms and codes capable of working in the big data setting as well as applications of big data optimization, for academics and practitioners alike, to the benefit of society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, and network analytics, among others, are explored in this book.
This book features papers presented at IIH-MSP 2018, the 14th International Conference on Intelligent Information Hiding and Multimedia Signal Processing. The scope of IIH-MSP included information hiding and security, multimedia signal processing and networking, and bio-inspired multimedia technologies and systems. The book discusses subjects related to massive image/video compression and transmission for emerging networks, advances in speech and language processing, recent advances in information hiding and signal processing for audio and speech signals, intelligent distribution systems and applications, recent advances in security and privacy for multimodal network environments, multimedia signal processing, and machine learning. Presenting the latest research outcomes and findings, it is suitable for researchers and students who are interested in the corresponding fields. IIH-MSP 2018 was held in Sendai, Japan on 26-28 November 2018. It was hosted by Tohoku University and was co-sponsored by the Fujian University of Technology in China, the Taiwan Association for Web Intelligence Consortium in Taiwan, and the Swinburne University of Technology in Australia, as well as the Fujian Provincial Key Laboratory of Big Data Mining and Applications (Fujian University of Technology) and the Harbin Institute of Technology Shenzhen Graduate School in China.
Genetic programming (GP) is a popular heuristic methodology of program synthesis with origins in evolutionary computation. In this generate-and-test approach, candidate programs are iteratively produced and evaluated. The latter involves running programs on tests, where they exhibit complex behaviors reflected in changes of variables, registers, or memory. That behavior not only ultimately determines program output, but may also reveal its 'hidden qualities' and important characteristics of the considered synthesis problem. However, conventional GP is oblivious to most of that information and usually cares only about the number of tests passed by a program. This 'evaluation bottleneck' leaves the search algorithm underinformed about the actual and potential qualities of candidate programs. This book proposes behavioral program synthesis, a conceptual framework that opens GP to detailed information on program behavior in order to make program synthesis more efficient. Several existing and novel mechanisms subscribing to that perspective to varying extents are presented and discussed, including implicit fitness sharing, semantic GP, co-solvability, trace convergence analysis, pattern-guided program synthesis, and behavioral archives of subprograms. The framework involves several concepts that are new to GP, including the execution record, the combined trace, and the search driver, a generalization of the objective function. Empirical evidence gathered in several of the presented experiments clearly demonstrates the usefulness of the behavioral approach. The book also contains an extensive discussion of the implications of the behavioral perspective for program synthesis and beyond.
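To make the 'evaluation bottleneck' tangible, the sketch below contrasts conventional scalar fitness (the number of tests passed) with a behavioral evaluation that keeps the whole per-test outcome vector, in the spirit of execution records and implicit fitness sharing. The toy 'programs' (plain Python functions) and the test suite are invented for illustration.

```python
tests = [(x, 2 * x + 1) for x in range(10)]        # target behavior: f(x) = 2x + 1

def conventional_fitness(program):
    """Scalar fitness: how many tests the program passes."""
    return sum(program(x) == y for x, y in tests)

def behavior(program):
    """Simplified execution record: the per-test outcome vector."""
    return tuple(program(x) == y for x, y in tests)

def shared_fitness(program, population):
    """Implicit fitness sharing: passing a test is worth more when few other
    programs in the population pass that same test."""
    records = [behavior(p) for p in population]
    score = 0.0
    for i, (x, y) in enumerate(tests):
        if program(x) == y:
            solvers = sum(r[i] for r in records)   # how many programs pass test i
            score += 1.0 / max(solvers, 1)
    return score

population = [lambda x: 2 * x + 1, lambda x: 2 * x, lambda x: x + 1]
for p in population:
    print(conventional_fitness(p), round(shared_fitness(p, population), 2))
```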
This book describes new algorithms and ideas for making effective decisions under constraints, including applications in control engineering, manufacturing (how to optimally determine the production level), econometrics (how to better predict stock market behavior), and environmental science and geosciences (how to combine data of different types). It also describes general algorithms and ideas that can be used in other application areas. The book presents extended versions of selected papers from the annual International Workshops on Constraint Programming and Decision Making (CoProd'XX) from 2013 to 2016. These workshops, held in the US (El Paso, Texas) and in Europe (Würzburg, Germany, and Uppsala, Sweden), have attracted researchers and practitioners from all over the world. It is of interest to practitioners who benefit from the new techniques, to researchers who want to extend the ideas from these papers to new application areas and/or further improve the corresponding algorithms, and to graduate students who want to learn more - in short, to anyone who wants to make more effective decisions under constraints.
"Advances in Bio-inspired Combinatorial Optimization Problems" illustrates several recent bio-inspired efficient algorithms for solving NP-hard problems. Theoretical bio-inspired concepts and models, in particular for agents, ants and virtual robots are described. Large-scale optimization problems, for example: the Generalized Traveling Salesman Problem and the Railway Traveling Salesman Problem, are solved and their results are discussed. Some of the main concepts and models described in this book are: inner rule to guide ant search - a recent model in ant optimization, heterogeneous sensitive ants; virtual sensitive robots; ant-based techniques for static and dynamic routing problems; stigmergic collaborative agents and learning sensitive agents. This monograph is useful for researchers, students and all people interested in the recent natural computing frameworks. The reader is presumed to have knowledge of combinatorial optimization, graph theory, algorithms and programming. The book should furthermore allow readers to acquire ideas, concepts and models to use and develop new software for solving complex real-life problems.
The book describes a novel ideology and supporting information technology for the integral management of large, distributed dynamic systems, both civil and defence-orientated. The approach is based on a high-level Spatial Grasp Language, SGL, expressing solutions in physical, virtual, executive and combined environments in the form of active self-evolving and self-propagating patterns spatially matching the systems to be created, modified and controlled. The communicating interpreters of SGL can be installed in key system points, which may be in large numbers (up to millions and billions) and represent equipped humans, robots, laptops, smartphones, smart sensors, etc. Operating under gestalt-inspired scenarios in SGL, initially injected from any points, these systems can be effectively converted into goal-driven spatial machines (rather than computers, since they deal with physical matter too) capable of responding to the numerous challenges posed by growing world dynamics in the 21st century. Including numerous practical examples, the book is a valuable resource for system managers and programmers.
There have been significant developments in the design and application of algorithms for both one-dimensional signal processing and multidimensional signal processing, namely image and video processing. The recent focus has shifted from the step-by-step procedure of designing an algorithm first and following up with in-depth analysis and performance improvement, to instead applying heuristic-based methods to solve signal-processing problems. In this book the contributing authors demonstrate both general-purpose algorithms and those aimed at solving specialized application problems, with a special emphasis on heuristic iterative optimization methods employing modern evolutionary and swarm intelligence based techniques. The applications considered are in domains such as communications engineering, estimation and tracking, digital filter design, wireless sensor networks, bioelectric signal classification, image denoising, and image feature tracking. The book presents interesting, state-of-the-art methodologies for solving real-world problems and is a suitable reference for researchers and engineers in the areas of heuristics and signal processing.
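To give a feel for the heuristic methods discussed, the following sketch applies basic particle swarm optimization (PSO) to a toy digital filter design task. The target response, filter length and all PSO parameters are invented for illustration; it is a generic PSO, not a method from the book.

```python
import numpy as np

# Basic PSO tuning the taps of a short FIR filter toward an ideal lowpass
# magnitude response. All constants below are illustrative choices.
rng = np.random.default_rng(0)
w = np.linspace(0, np.pi, 64)                    # frequency grid
target = (w < np.pi / 2).astype(float)           # ideal lowpass magnitude

def cost(taps):
    """Squared error between the filter's magnitude response and the target."""
    H = np.exp(-1j * np.outer(w, np.arange(len(taps)))) @ taps
    return np.sum((np.abs(H) - target) ** 2)

n_particles, n_taps = 20, 9
x = rng.uniform(-0.5, 0.5, (n_particles, n_taps))   # positions: tap vectors
v = np.zeros_like(x)                                # velocities
p_best = x.copy()                                   # per-particle best positions
p_cost = np.array([cost(p) for p in x])
g_best = p_best[p_cost.argmin()]                    # swarm-wide best position

for _ in range(200):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = 0.7 * v + 1.5 * r1 * (p_best - x) + 1.5 * r2 * (g_best - x)
    x = x + v
    c = np.array([cost(p) for p in x])
    better = c < p_cost                             # update personal bests
    p_best[better], p_cost[better] = x[better], c[better]
    g_best = p_best[p_cost.argmin()]                # update global best

print("best cost:", round(p_cost.min(), 4))
```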
Today's companies and communities of practice are involved in intensive networking and collaborative systems through a great variety of electronic relations and collaborative interactions. This has resulted in entangled systems that need to be managed efficiently and in an autonomous way, and that face many issues and challenges. The extensive research collected in this book will help virtual organizations exploit the latest and most powerful technologies based on Grid and Wireless infrastructures as well as Cloud computing, in order to alleviate the complex issues and challenges arising in networking and collaborative systems, in terms of collaborative applications, resource management, mobility, and security and system resilience. The ultimate aim of the book is to stimulate research that leads to the creation of responsive environments for networking and, in the longer term, the development of adaptive, secure, mobile, and intuitive intelligent systems for collaborative work and learning. Academic researchers, professionals and practitioners in the field will be inspired to put into practice the ideas and experiences proposed in the book, and to evaluate them for their specific research and work.
To deal with the flexible architectures and evolving functionalities of complex modern systems, the agent metaphor and agent-based computing are often the most appropriate software design approach. As a result, a broad range of special-purpose design processes has been developed in the last several years to tackle the challenges of these specific application domains. In this context, in early 2012 the IEEE-FIPA Design Process Documentation Template SC0097B was defined, which facilitates the representation of design processes and method fragments through the use of standardized templates, thus supporting the creation of easily sharable repositories and facilitating the composition of new design processes. Following this standardization approach, this book gathers the documentation of some of the best-known agent-oriented design processes. After an introductory section describing the goal of the book and the existing IEEE FIPA standard for design process documentation, thirteen processes (including the widely known OpenUP, the de facto standard in object-oriented software engineering) are documented by their original creators or other well-known scientists working in the field. As a result, this is the first work to adopt a standard, unified descriptive approach for documenting different processes, making it much easier to study the individual processes, to rigorously compare them, and to apply them in industrial projects. While there are a few books on the market describing the individual agent-oriented design processes, none of them presents all the processes, let alone in the same format. With this handbook, for the first time, researchers as well as professional software developers looking for an overview and for detailed, standardized descriptions of design processes will find a comprehensive presentation of the most important agent-oriented design processes, which will be an invaluable resource when developing solutions in various application areas.
This book focuses on neuro-engineering and neural computing, a multi-disciplinary field of research attracting considerable attention from engineers, neuroscientists, microbiologists and material scientists. It explores a range of topics concerning the design and development of innovative neural and brain interfacing technologies, as well as novel information acquisition and processing algorithms to make sense of the acquired data. The book also highlights emerging trends and advances regarding the applications of neuro-engineering in real-world scenarios, such as neural prostheses, diagnosis of neural degenerative diseases, deep brain stimulation, biosensors, real neural network-inspired artificial neural networks (ANNs) and the predictive modeling of information flows in neuronal networks. The book is broadly divided into three main sections: current trends in technological developments, neural computation techniques to make sense of neural behavioral data, and the application of these technologies and techniques in the medical domain for the treatment of neural disorders.
This book is dedicated to the memory of Professor Zdzisław Pawlak, who passed away almost six years ago. He was the founder of the Polish school of Artificial Intelligence and one of the pioneers in Computer Engineering and Computer Science, with worldwide influence. He was a truly great scientist, researcher, teacher and human being. This book, prepared in two volumes, contains more than 50 chapters. This demonstrates that the scientific approaches discovered by Professor Zdzisław Pawlak, especially the rough set approach as a tool for dealing with imperfect knowledge, are very much alive and intensively explored by many researchers in many places throughout the world. The submitted papers prove that interest in rough set research is growing, and that it is possible to see many excellent new results both on the theoretical foundations and on applications of rough sets, alone or in combination with other approaches. We are proud to offer the readers this book.
This book proposes a consistent methodology for building intelligent systems. It puts forward several formal models for designing and implementing rules-based systems, and presents illustrative case studies of their applications. These include software engineering, business process systems, Semantic Web, and context-aware systems on mobile devices. Rules offer an intuitive yet powerful method for representing human knowledge, and intelligent systems based on rules have many important applications. However, their practical development requires proper techniques and models - a gap that this book effectively addresses.
A principal aim of computer graphics is to generate images that look as real as photographs. Realistic computer graphics imagery has however proven to be quite challenging to produce, since the appearance of materials arises from complicated physical processes that are difficult to analytically model and simulate, and image-based modeling of real material samples is often impractical due to the high-dimensional space of appearance data that needs to be acquired. This book presents a general framework based on the inherent coherency in the appearance data of materials to make image-based appearance modeling more tractable. We observe that this coherence manifests itself as low-dimensional structure in the appearance data, and by identifying this structure we can take advantage of it to simplify the major processes in the appearance modeling pipeline. This framework consists of two key components, namely the coherence structure and the accompanying reconstruction method to fully recover the low-dimensional appearance data from sparse measurements. Our investigation of appearance coherency has led to three major forms of low-dimensional coherence structure and three types of coherency-based reconstruction upon which our framework is built. This coherence-based approach can be comprehensively applied to all the major elements of image-based appearance modeling, from data acquisition of real material samples to user-assisted modeling from a photograph, from synthesis of volumes to editing of material properties, and from efficient rendering algorithms to physical fabrication of objects. In this book we present several techniques built on this coherency framework to handle various appearance modeling tasks both for surface reflections and subsurface scattering, the two primary physical components that generate material appearance. We believe that coherency-based appearance modeling will make it easier and more feasible for practitioners to bring computer graphics imagery to life. This book is aimed towards readers with an interest in computer graphics. In particular, researchers, practitioners and students will benefit from this book by learning about the underlying coherence in appearance structure and how it can be utilized to improve appearance modeling. The specific techniques presented in our manuscript can be of value to anyone who wishes to elevate the realism of their computer graphics imagery. For understanding this book, an elementary background in computer graphics is assumed, such as from an introductory college course or from practical experience with computer graphics.
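To hint at how low-dimensional structure can be exploited, the sketch below builds a synthetic near-low-rank 'appearance' matrix and recovers it with a truncated SVD. This is only a generic low-rank illustration under invented data, not the coherency-based reconstruction methods developed in the book.

```python
import numpy as np

# Synthetic stand-in for appearance data: a matrix that is approximately
# rank 3 (three underlying 'factors') plus a little measurement noise.
rng = np.random.default_rng(0)
B = rng.random((200, 3))
C = rng.random((3, 150))
data = B @ C + 0.01 * rng.standard_normal((200, 150))

U, s, Vt = np.linalg.svd(data, full_matrices=False)
k = 3                                     # keep only the dominant structure
approx = U[:, :k] * s[:k] @ Vt[:k]        # rank-k reconstruction

err = np.linalg.norm(data - approx) / np.linalg.norm(data)
print(f"rank-{k} relative error: {err:.4f}")   # small: the structure suffices
```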
You may like...
Chechen-English and English-Chechen… by Johanna Nichols, Ronald L. Sprouse, … (Hardcover): R5,545 (Discovery Miles 55 450)
The Welsh Academy English-Welsh… by Bruce Griffiths, Dafydd Jones (Hardcover): R1,577 (Discovery Miles 15 770)
Lives of the Most Eminent Painters… by Vincenzo Fortunato Marchese (Paperback): R570 (Discovery Miles 5 700)
Harbour Protection Through Data Fusion… by Elisa Shahbazian, Galina Rogova, … (Hardcover): R5,776 (Discovery Miles 57 760)