Books > Computing & IT > General
The pursuit of artificial intelligence has been a highly active domain of research for decades, yielding exciting scientific insights and productive new technologies. In terms of generating intelligence, however, this pursuit has yielded only limited success. This book explores the hypothesis that adaptive growth is a means of moving forward. By emulating the biological process of development, we can incorporate desirable characteristics of natural neural systems into engineered designs and thus move closer towards the creation of brain-like systems. The particular focus is on how to design artificial neural networks for engineering tasks. The book consists of contributions from 18 researchers, ranging from detailed reviews of recent domains by senior scientists to exciting new contributions representing the state of the art in machine learning research. The book begins with broad overviews of artificial neurogenesis and bio-inspired machine learning, suitable both as an introduction to the domains and as a reference for experts. Several contributions provide perspectives and future hypotheses on recent highly successful lines of research, including deep learning, the HyperNEAT model of developmental neural network design, and a simulation of the visual cortex. Other contributions cover recent advances in the design of bio-inspired artificial neural networks, including the creation of machines for classification, the behavioural control of virtual agents, the design of virtual multi-component robots and morphologies, and the creation of flexible intelligence. Throughout, the contributors share their vast expertise on the means and benefits of creating brain-like machines. This book is appropriate for advanced students and practitioners of artificial intelligence and machine learning.
The book Soft Computing for Business Intelligence is the remarkable output of a program based on the idea of joint trans-disciplinary research, as supported by the Eureka Iberoamerica Network and the University of Oldenburg. It contains twenty-seven papers allocated to three sections: Soft Computing, Business Intelligence and Knowledge Discovery, and Knowledge Management and Decision Making. Although the contents touch different domains, they are similar insofar as they follow the BI principle of “Observation and Analysis” while keeping a practically oriented theoretical eye on sound methodologies, like Fuzzy Logic, Compensatory Fuzzy Logic (CFL), Rough Sets and other soft computing elements. The book tears down the traditional focus on business and extends Business Intelligence techniques in an impressive way to a broad range of fields like medicine, environment, wind farming, social collaboration and interaction, car sharing and sustainability.
Digital Speech Processing Using Matlab deals with digital speech pattern recognition, speech production model, speech feature extraction, and speech compression. The book is written in a manner that is suitable for beginners pursuing basic research in digital speech processing. Matlab illustrations are provided for most topics to enable better understanding of concepts. This book also deals with the basic pattern recognition techniques (illustrated with speech signals using Matlab) such as PCA, LDA, ICA, SVM, HMM, GMM, BPN, and KSOM.
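For readers unfamiliar with what frame-based speech feature extraction involves, here is a minimal, hedged sketch in Python (the book's own illustrations are in MATLAB); the frame length, hop size and toy signal are illustrative assumptions, not material from the book.

```python
# A minimal, illustrative sketch (the book's own examples are in MATLAB):
# two classic frame-level speech features, short-time energy and
# zero-crossing rate, computed over a synthetic signal. Frame length and
# hop size are assumed values, not taken from the book.
import numpy as np

fs = 8000                                                 # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 200 * t) * np.exp(-3 * t)     # toy "voiced" segment

frame_len, hop = 200, 80                                  # 25 ms frames, 10 ms hop
features = []
for start in range(0, len(signal) - frame_len + 1, hop):
    frame = signal[start:start + frame_len]
    energy = np.sum(frame ** 2)                           # short-time energy
    zcr = np.mean(np.abs(np.diff(np.sign(frame))) > 0)    # zero-crossing rate
    features.append((energy, zcr))

features = np.array(features)                             # one row per frame
print(features.shape, features[:3])
```

In practice such frame-level features would feed one of the classifiers the blurb lists (SVM, GMM, HMM, and so on).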
This book describes the challenges that critical infrastructure systems face, and presents state of the art solutions to address them. How can we design intelligent systems or intelligent agents that can make appropriate real-time decisions in the management of such large-scale, complex systems? What are the primary challenges for critical infrastructure systems? The book also provides readers with the relevant information to recognize how important infrastructures are, and their role in connection with a society’s economy, security and prosperity. It goes on to describe state-of-the-art solutions to address these points, including new methodologies and instrumentation tools (e.g. embedded software and intelligent algorithms) for transforming and optimizing target infrastructures. The book is the most comprehensive resource to date for professionals in both the private and public sectors, while also offering an essential guide for students and researchers in the areas of modeling and analysis of critical infrastructure systems, monitoring, control, risk/impact evaluation, fault diagnosis, fault-tolerant control, and infrastructure dependencies/interdependencies. The importance of the research presented in the book is reflected in the fact that currently, for the first time in human history, more people live in cities than in rural areas, and that, by 2050, roughly 70% of the world’s total population is expected to live in cities.
This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue is the ultimate challenge in natural language processing, and the key to a wide range of exciting applications. The breadth and depth of coverage of this book makes it suitable as a reference and overview of the state of the field for researchers in Computational Linguistics, Semantics, Computer Science, Cognitive Science, and Artificial Intelligence.
This book brings together a selection of the best papers from the sixteenth edition of the Forum on specification and Design Languages Conference (FDL), which was held in September 2013 in Paris, France. FDL is a well-established international forum devoted to the dissemination of research results, practical experiences and new ideas in the application of specification, design and verification languages to the design, modeling and verification of integrated circuits, complex hardware/software embedded systems and mixed-technology systems.
Presenting the concept, design and implementation of configurable intelligent optimization algorithms in manufacturing systems, this book provides a new configuration method to optimize manufacturing processes. It provides a comprehensive elaboration of basic intelligent optimization algorithms and demonstrates how their improvement, hybridization and parallelization can be applied to manufacturing. Furthermore, various applications of these intelligent optimization algorithms are exemplified in detail, chapter by chapter. The intelligent optimization algorithm is not just a single algorithm; instead it is a general advanced optimization mechanism which is highly scalable, robust and stochastic. Therefore, this book demonstrates the flexibility of these algorithms, as well as their robustness and reusability, in order to solve a broad range of complicated problems in manufacturing. Since the genetic algorithm was presented decades ago, a large number of intelligent optimization algorithms and their improvements have been developed. However, little work has been done to extend their applications and verify their competence in solving complicated problems in manufacturing. This book will provide an invaluable resource to students, researchers, consultants and industry professionals interested in engineering optimization. It will also be particularly useful to three groups of readers: algorithm beginners, optimization engineers and senior algorithm designers. It offers a detailed description of intelligent optimization algorithms for algorithm beginners, recommends new configurable design methods for optimization engineers, and outlines future trends and challenges of the new configuration mechanism for senior algorithm designers.
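As a rough illustration of the kind of population-based intelligent optimization algorithm the book configures and hybridises, the sketch below implements a plain genetic algorithm in Python for a toy objective; the population size, mutation rate and other settings are assumptions chosen for illustration and do not come from the book.

```python
# A minimal sketch of one "intelligent optimization algorithm" of the kind
# the book configures and hybridises: a plain genetic algorithm minimising
# the sphere function. All parameter settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):
    """Toy objective: sum of squares, minimum 0 at the origin."""
    return np.sum(x ** 2, axis=-1)

dim, pop_size, generations = 10, 40, 200
mutation_sigma, elite = 0.1, 2

pop = rng.uniform(-5, 5, size=(pop_size, dim))
for _ in range(generations):
    fitness = sphere(pop)
    order = np.argsort(fitness)
    parents = pop[order[: pop_size // 2]]            # truncation selection
    # uniform crossover between random parent pairs
    idx = rng.integers(0, len(parents), size=(pop_size, 2))
    mask = rng.random((pop_size, dim)) < 0.5
    children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
    children += rng.normal(0, mutation_sigma, children.shape)  # mutation
    children[:elite] = pop[order[:elite]]            # keep the best (elitism)
    pop = children

print("best objective value:", sphere(pop).min())
```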
What are the limitations of computer models, and why do we still not have working models of people that are recognizably human? This is the principal puzzle explored in this book, which describes ideas behind systems that behave intelligently and touches upon different philosophical issues. The key to human behavior is taken to be intelligence and the ability to reason about the world. A strong scientific approach is taken, but this first requires an understanding of what a scientific approach can mean in the context of both natural and artificial systems. A theory of intelligence is proposed that can be tested and developed in the light of experimental results. The book illustrates that intelligence is much more than just behavior confined to a single person or a single computer program within a fixed time frame. Some answers are unraveled and some puzzles emerge from these investigations and experiments. Natural and Artificial Reasoning provides a few steps of an exciting journey that began many centuries ago with the word ‘why?’
Design thinking as a user-centric innovation method has become increasingly widespread in recent years. An increasing number of people and institutions have experienced its innovative power, while at the same time the demand has grown for a deep, evidence-based understanding of the way design thinking functions. This challenge is addressed by the Design Thinking Research Program between Stanford University, Palo Alto, USA, and the Hasso Plattner Institute, Potsdam, Germany. Summarizing the outcomes of the 5th program year, this book imparts the scientific findings gained by the researchers through their investigations, experiments and studies. The method of design thinking works when applied with diligence and insight. With this book and the underlying research projects, we aim to understand the innovation process of design thinking and the people behind it. The contributions ultimately center on the issue of building innovators. The focus of the investigation is on what people are doing and thinking when engaged in creative design innovation, and how their innovation work can be supported. Therefore, within three topic areas, various frameworks, methodologies, mindsets, systems and tools are explored and further developed. The book begins with an assessment of crucial factors for innovators, such as empathy and creativity; the second part addresses the improvement of team collaboration; and finally we turn to specific tools and approaches which ensure information transfer during the design process. All in all, the contributions shed light on, and offer deeper insights into, how to support the work of design teams in order to systematically and successfully develop innovations and design progressive solutions for tomorrow.
With an emphasis on applications of computational models for solving modern challenging problems in the biomedical and life sciences, this book aims to bring together articles from biologists, medical/biomedical and health science researchers, and computational scientists to focus on problems at the frontier of the biomedical and life sciences. The goals of this book are to build interactions between scientists across several disciplines and to help industrial users apply advanced computational techniques to solving practical biomedical and life science problems. This book is for users in the fields of the biomedical and life sciences who wish to keep abreast of the latest techniques in signal and image analysis. The book presents a detailed description of each of the applications and can be used by readers at both graduate and specialist levels.
This book describes advances in and applications of sliding mode control (SMC), which is widely used as a powerful method for tackling uncertain nonlinear systems. The book is organized into 21 chapters, which the editors have arranged to reflect the various themes of sliding mode control. It provides the reader with a broad range of material, from first principles up to the current state of the art in the area of SMC and observation, presented in a clear, matter-of-fact style. As such it is appropriate for graduate students with a basic knowledge of classical control theory and some knowledge of state-space methods and nonlinear systems. The resulting design procedures are illustrated using MATLAB/Simulink software.
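To make the core idea concrete, here is a hedged Python sketch (the book itself works with MATLAB/Simulink) of a sliding mode regulator for a first-order uncertain system; the plant, disturbance and gain values are assumed purely for illustration.

```python
# A minimal sketch (not from the book) of the core sliding mode idea for a
# first-order uncertain system  x' = a*x + u + d(t):  choose the sliding
# surface s = x and apply a switching control u = -a_nom*x - k*sign(s) with
# k larger than the disturbance bound, which drives x to zero despite d(t).
# The plant parameters and gains below are illustrative assumptions.
import numpy as np

a_true, a_nom = 1.3, 1.0     # true vs. nominal plant parameter (model mismatch)
k, dt, steps = 2.0, 1e-3, 5000
x = 1.0                      # initial state

for i in range(steps):
    t = i * dt
    d = 0.5 * np.sin(5 * t)                 # bounded matched disturbance, |d| <= 0.5
    s = x                                   # sliding surface
    u = -a_nom * x - k * np.sign(s)         # equivalent + switching control
    x += dt * (a_true * x + u + d)          # Euler integration of the plant

print(f"|x| after {steps * dt:.1f} s: {abs(x):.4f}")   # state chatters close to s = 0
```

In practice the discontinuous sign term is usually smoothed (for example by a saturation function) to reduce chattering, one of the themes such books typically discuss.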
This book is about running modern industrial enterprises with the help of information systems. Enterprise resource planning (ERP) is the core of business information processing. An ERP system is the backbone of most companies' information systems landscape. All major business processes are handled with the help of this system. Supply chain management (SCM) looks beyond the individual company, taking into account that enterprises are increasingly concentrating on their core competencies, leaving other activities to suppliers. With the growing dependency on the partners, effective supply chains have become as important for a company's success as efficient in-house processes. This book covers typical business processes and shows how these processes are implemented. Examples are presented using the leading systems on the market – SAP ERP and SAP SCM. In this way, the reader can understand how business processes are actually carried out "in the real world".
The development of innovative drugs is becoming more difficult while it relies on empirical approaches. This has inspired all major pharmaceutical companies to pursue alternative, model-based paradigms. The key question is: how can we find innovative compounds and, subsequently, appropriate dosage regimens? Written from the industry perspective and based on many years of experience, this book offers:
- Concepts for the creation of drug-disease models, introduced and supplemented with extensive MATLAB programs
- Guidance for the exploration and modification of these programs to enhance the understanding of key principles
- Application of differential equations to pharmacokinetic, pharmacodynamic and (patho-)physiologic problems, thereby acknowledging their dynamic nature
- A range of topics from single exponential decay to adaptive dosing, from single-subject exploration to clinical trial simulation, and from empirical to mechanistic disease modeling
Students with an undergraduate mathematical background or equivalent education, an interest in life sciences and skills in a high-level programming language such as MATLAB are encouraged to engage in model-based pharmaceutical research and development.
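As a flavour of the "single exponential decay" starting point mentioned above, the following Python sketch integrates a one-compartment pharmacokinetic model; it is not one of the book's MATLAB programs, and the rate constant and initial concentration are illustrative assumptions.

```python
# A minimal sketch (not the book's MATLAB code): a one-compartment
# pharmacokinetic model with first-order elimination, i.e. the single
# exponential decay case. Parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def one_compartment(t, c, k_el):
    """dC/dt = -k_el * C  (first-order elimination)."""
    return -k_el * c

k_el = 0.173          # assumed elimination rate constant, 1/h (half-life ~ 4 h)
c0 = [10.0]           # assumed initial plasma concentration, mg/L
t_span = (0.0, 24.0)  # simulate one day
t_eval = np.linspace(*t_span, 49)

sol = solve_ivp(one_compartment, t_span, c0, args=(k_el,), t_eval=t_eval)
print(f"C(24 h) = {sol.y[0, -1]:.3f} mg/L "
      f"(analytic: {c0[0] * np.exp(-k_el * 24):.3f})")
```

More elaborate pharmacokinetic/pharmacodynamic models of the kind the book describes add compartments and dosing events to the same differential-equation framework.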
This book celebrates the past, present and future of knowledge management. It offers a timely review of two decades of accumulated history in knowledge management. By tracking its origin and conceptual development, this review contributes to an improved understanding of the field and helps to assess the unresolved questions and open issues. For practitioners, the book provides clear evidence of the value of knowledge management. Lessons learnt from implementations in the business, government and civil sectors help readers to appreciate the field and gain useful reference points. The book also provides guidance for future research by drawing together authoritative views from people currently facing and engaging with the challenge of knowledge management, who signal a bright future for the field.
The book is about user interfaces to applications that have been designed for social and physical interaction. The interfaces are ‘playful’; that is, users feel challenged to engage in social and physical interaction because it will be fun. The topics presented in this book include interactive playgrounds, urban games using mobiles, sensor-equipped environments for playing, child-computer interaction, tangible game interfaces, interactive tabletop technology and applications, full-body interaction, exertion games, persuasion, engagement, evaluation and user experience. Readers will not only get a survey of state-of-the-art research in these areas; the chapters also provide a vision of a future in which playful interfaces are ubiquitous, that is, present and integrated in home, office, recreational, sports and urban environments, where game elements will be integrated and welcomed.
Cyberspace is increasingly important to people in their everyday lives, from purchasing goods on the Internet to energy supplies that are increasingly managed remotely using Internet protocols. Unfortunately, this dependence makes us susceptible to attacks from nation states, terrorists, criminals and hacktivists. We therefore need a better understanding of cyberspace, in which patterns, understood as predictable regularities, may help us to detect, understand and respond to incidents more effectively. The inspiration for the workshop came from existing work on formalising design patterns applied to cybersecurity, but we also need to understand the many other types of patterns that arise in cyberspace.
This book consists of 35 chapters presenting different theoretical and practical aspects of Intelligent Information and Database Systems. Nowadays both intelligent systems and database systems are applied in most areas of human activity, which necessitates further research in these areas. This book presents and discusses, from both practical and theoretical points of view, various interesting issues related to intelligent information models and methods and their advanced applications, database systems applications, data models and their analysis, and digital multimedia methods and applications. The book is organized in four parts, devoted to intelligent systems models and methods, intelligent systems advanced applications, database systems methods and applications, and multimedia systems methods and applications. The book will be of interest to practitioners and researchers, especially graduate and PhD students of information technology and computer science, as well as more experienced academics and specialists interested in developing and verifying models, methods and applications of intelligent information, database and multimedia systems. Readers of this volume will find many inspiring ideas and motivating practical examples that will help them in their current and future work.
This contributed volume explores how data mining, machine learning, and similar statistical techniques can be used to analyze the types of problems arising in Traditional Chinese Medicine (TCM) research. The book focuses on the study of clinical data and the analysis of herbal data. Challenges addressed include diagnosis, prescription analysis, ingredient discovery, network-based mechanism deciphering, pattern-activity relationships, and medical informatics. Each author demonstrates how they made use of machine learning, data mining, statistics and other analytic techniques to resolve their research challenges, how successful these techniques were when applied, what insights were gained, and how these insights define the most appropriate future work. Readers are given an opportunity to understand the complexity of diagnosis and treatment decisions, the difficulty of modeling efficacy in terms of herbs, the identification of constituent compounds in an herb, and the relationship between these compounds and biological outcomes, so that evidence-based predictions can be made. Drawing on a wide range of experienced contributors, Data Analytics for Traditional Chinese Medicine Research is a valuable reference for professionals and researchers working in health informatics and data mining. The techniques are also useful for biostatisticians and health practitioners interested in traditional medicine and data analytics.
This book is devoted to the study of the functional architecture of the visual cortex. Its geometrical structure is the differential geometry of the connectivity between neural cells. This connectivity builds and shapes the hidden brain structures underlying visual perception. The story of the problem runs over the last 30 years, since Hubel and Wiesel's discovery of the modular structure of the primary visual cortex, and has slowly moved towards a theoretical understanding of the experimental data on what we now know as the functional architecture of the primary visual cortex. Experimental data come from several domains: neurophysiology, the phenomenology of perception and neurocognitive imaging. Imaging techniques like functional MRI and diffusion tensor MRI allow researchers to deepen the study of cortical structures. Due to this variety of experimental data, neuromathematics deals with modelling both cortical structures and perceptual spaces. From the mathematical point of view, neuromathematics calls for new instruments of pure mathematics: sub-Riemannian geometry models horizontal connectivity, harmonic analysis in non-commutative groups helps us understand pinwheel structures, and non-linear dimensionality reduction underlies many neural morphologies and possibly the emergence of perceptual units. But at the center of neurogeometry is the problem of harmonizing contemporary mathematical instruments with neurophysiological findings and phenomenological experiments in a unitary science of vision. The contributions to this book come from the very founders of the discipline.
Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it was studied early on in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity and dissimilarity indicators, and analogical proportions in lattice structures. Lastly, the volume reports case studies and discussions about the use of similarity judgments and the process of analogy making at work in IQ tests, creativity and other cognitive tasks. This volume gathers fully revised and expanded versions of papers presented at an international workshop, as well as invited contributions. All chapters have benefited from a thorough peer-review process.
This book introduces Local Binary Patterns (LBP), arguably one of the most powerful texture descriptors, and LBP variants. This volume provides the latest reviews of the literature and a presentation of some of the best LBP variants by researchers at the forefront of texture analysis research and research on LBP descriptors and variants. The value of LBP variants is illustrated with reported experiments using many databases representing a diversity of computer vision applications in medicine, biometrics, and other areas. There is also a chapter that provides an excellent theoretical foundation for texture analysis and LBP in particular. A special section focuses on LBP and LBP variants in the area of face recognition, including thermal face recognition. This book will be of value to anyone already in the field as well as to those interested in learning more about this powerful family of texture descriptors.
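For readers new to the descriptor, the sketch below computes the basic 3x3 LBP code in Python using plain NumPy; it covers only the core operator, not the uniform, rotation-invariant or multi-scale variants the book discusses, and the toy image patch is an assumption for illustration.

```python
# A minimal sketch of the basic 3x3 Local Binary Pattern operator: each
# pixel is compared with its 8 neighbours and the resulting bits form an
# 8-bit code. The histogram of these codes over a region is the texture
# descriptor. This is only the core idea, not the book's LBP variants.
import numpy as np

def lbp_3x3(image):
    """Return the 8-bit LBP code image for the interior pixels."""
    img = image.astype(np.int32)
    center = img[1:-1, 1:-1]
    # neighbour offsets in clockwise order, each contributing one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy: img.shape[0] - 1 + dy,
                        1 + dx: img.shape[1] - 1 + dx]
        codes |= ((neighbour >= center).astype(np.int32) << bit)
    return codes

# Toy 4x4 "texture" patch; real applications histogram the codes per region.
patch = np.array([[10, 20, 30, 40],
                  [15, 25, 35, 45],
                  [12, 22, 32, 42],
                  [11, 21, 31, 41]])
print(lbp_3x3(patch))
```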
As virtual reality expands from the imaginary worlds of science fiction and pervades every corner of everyday life, it is becoming increasingly important for students and professionals alike to understand the diverse aspects of this technology. This book aims to provide a comprehensive guide to the theoretical and practical elements of virtual reality, from the mathematical and technological foundations of virtual worlds to the human factors and the applications that enrich our lives: in the fields of medicine, entertainment, education and others. After providing a brief introduction to the topic, the book describes the kinematic and dynamic mathematical models of virtual worlds. It explores the many ways a computer can track and interpret human movement, then progresses through the modalities that make up a virtual world: visual, acoustic and haptic. It explores the interaction between the actual and virtual environments, as well as design principles of the latter. The book closes with an examination of different applications, focusing on augmented reality as a special case. Though the content is primarily VR-related, it is also relevant for many other fields.
This book introduces approaches that have the potential to transform the daily practice of psychiatrists and psychologists. This includes the asynchronous communication between mental health care providers and clients as well as the automation of assessment and therapy. Speech and language are particularly interesting from the viewpoint of psychological assessment. For instance, depression may change the characteristics of voice in individuals and these changes can be detected by a special form of speech analysis. Computational screening methods that utilize speech and language can detect subtle changes and alert clinicians as well as individuals and caregivers. The use of online technologies in mental health, however, poses ethical problems that will occupy concerned individuals, governments and the wider public for some time. Assuming that these ethical problems can be solved, it should be possible to diagnose and treat mental health disorders online (excluding the use of medication).
This book, written by experts from universities and major industrial research laboratories, is devoted to the very hot topic of cognitive radio and networking for cooperative coexistence of heterogeneous wireless networks. Selected highly relevant advanced research is presented on spectrum sensing and progress toward the realization of accurate radio environment mapping, biomimetic learning for self-organizing networks, security threats (with a special focus on primary user emulation attack), and cognition as a tool for green next-generation networks. The research activities covered include work undertaken within the framework of the European COST Action IC0902, which is geared towards the definition of a European platform for cognitive radio and networks. Communications engineers, R&D engineers, researchers, and students will all benefit from this complete reference on recent advances in wireless communications and the design and implementation of cognitive radio systems and networks.
This book is aimed at presenting concepts, methods and algorithms able to cope with undersampled and limited data. One such trend that has recently gained popularity and to some extent revolutionised signal processing is compressed sensing. Compressed sensing builds upon the observation that many signals in nature are nearly sparse (or compressible, as they are normally referred to) in some domain, and consequently they can be reconstructed to high accuracy from far fewer observations than traditionally held to be necessary. Apart from compressed sensing, this book covers other related approaches. Each methodology has its own formalities for dealing with such problems. As an example, in the Bayesian approach, sparseness-promoting priors such as Laplace and Cauchy are normally used for penalising improbable model variables, thus promoting low-complexity solutions. Compressed sensing techniques and homotopy-type solutions, such as the LASSO, utilise l1-norm penalties to obtain sparse solutions using fewer observations than conventionally needed. The book emphasizes the role of sparsity as a machinery for promoting low-complexity representations, and likewise its connections to variable selection and dimensionality reduction in various engineering problems. This book is intended for researchers, academics and practitioners with an interest in various aspects and applications of sparse signal processing.
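The following Python sketch illustrates the compressed-sensing recovery described above, using scikit-learn's LASSO solver on a synthetic sparse signal; the problem sizes and regularisation strength are assumptions chosen purely for illustration, not values from the book.

```python
# A minimal sketch of the compressed-sensing idea: a signal that is sparse
# in some domain can be recovered from far fewer random measurements than
# its length by solving an l1-penalised least-squares (LASSO) problem.
# Dimensions and alpha are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, m, k = 256, 64, 8                            # signal length, measurements, nonzeros

x_true = np.zeros(n)                            # k-sparse ground-truth signal
support = rng.choice(n, k, replace=False)
x_true[support] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)    # random sensing matrix
y = A @ x_true                                  # m << n observations

lasso = Lasso(alpha=1e-3, max_iter=50_000, fit_intercept=False)
x_hat = lasso.fit(A, y).coef_

print("relative error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```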
You may like...
If Anyone Builds It, Everyone Dies - The…
Eliezer Yudkowsky, Nate Soares
Paperback
The Coming Wave - AI, Power and Our…
Mustafa Suleyman, Michael Bhaskar
Paperback
Digital Libraries - Integrating Content…
Mark V Dahl, Kyle Banerjee, …
Paperback
R1,150
Discovery Miles 11 500
Algorithms, Collusion and Competition…
Steven Van Uytsel, Salil K. Mehra, …
Hardcover
R3,597
Discovery Miles 35 970
Handbook of Artificial Intelligence in…
Benedict du Boulay, Antonija Mitrovic, …
Hardcover
R8,636
Discovery Miles 86 360