This book describes recent radiotherapy technologies, including tools for measuring target position during radiotherapy and tracking-based delivery systems. It presents a customized prediction of respiratory motion with clustering from multiple patient interactions. The proposed method contributes to improved patient treatment by taking the breathing pattern into account for accurate dose calculation in radiotherapy systems. Real-time tumor tracking, where the prediction of irregularities becomes relevant, has yet to be clinically established. Statistical quantitative modeling for irregular breathing classification, in which commercial respiration traces are retrospectively categorized into several classes based on breathing patterns, is discussed as well. The proposed statistical classification may provide clinical advantages for adjusting the dose rate before and during external beam radiotherapy, minimizing the safety margin. In the first chapter following the Introduction, we review three approaches to predicting respiratory motion: model-based methods, model-free heuristic learning algorithms, and hybrid methods. In the following chapter, we present a phantom study (prediction of human motion with distributed body sensors) using a Polhemus Liberty AC magnetic tracker. Next we describe respiratory motion estimation with a hybrid implementation of the extended Kalman filter. The given method assigns the recurrent neural network the role of the predictor and the extended Kalman filter the role of the corrector. After that, we present customized prediction of respiratory motion with clustering from multiple patient interactions. For the customized prediction, we construct the clustering based on the breathing patterns of multiple patients, using feature selection metrics composed of a variety of breathing features. We have evaluated the new algorithm by comparing the prediction overshoot and the tracking estimation value.
The experimental results of 448 patients' breathing patterns validate the proposed irregular breathing classifier in the last chapter.
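The predictor/corrector pairing described above (a recurrent neural network predicts, an extended Kalman filter corrects) can be illustrated with a minimal, self-contained sketch. Everything here is an illustrative assumption rather than the book's implementation: a simple linear extrapolator stands in for the recurrent-network predictor, and a scalar Kalman update plays the corrector on a synthetic sinusoidal "breathing" trace.

```python
import math

def predict_correct(measurements, process_var=1e-3, meas_var=0.05):
    """Toy 1-D predictor-corrector loop for a breathing trace.

    A linear extrapolator stands in for the RNN predictor; a scalar
    Kalman update plays the corrector. Illustrative sketch only.
    """
    est, var = measurements[0], 1.0
    prev = est
    outputs = []
    for z in measurements[1:]:
        # Predict: extrapolate the last trend (RNN stand-in).
        pred = est + (est - prev)
        prev = est
        var += process_var
        # Correct: blend the prediction with the new measurement.
        k = var / (var + meas_var)          # Kalman gain
        est = pred + k * (z - pred)
        var *= (1.0 - k)
        outputs.append(est)
    return outputs

# Synthetic sinusoidal "breathing" trace.
trace = [math.sin(0.2 * i) for i in range(50)]
smoothed = predict_correct(trace)
```

As the estimate variance shrinks, the gain drops and the filter leans increasingly on the predictor, which is exactly the division of labor the blurb describes.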
This book provides readers with a thorough and systematic introduction to hesitant fuzzy theory. It presents the most recent research results and advanced methods in the field, including hesitant fuzzy aggregation techniques, hesitant fuzzy preference relations, hesitant fuzzy measures, hesitant fuzzy clustering algorithms and hesitant fuzzy multi-attribute decision-making methods. Since their introduction by Torra and Narukawa in 2009, hesitant fuzzy sets have become more and more popular and have been used for a wide range of applications, from decision-making problems to cluster analysis, from medical diagnosis to personnel appraisal and information retrieval. This book offers a comprehensive report on the state of the art in hesitant fuzzy set theory and applications, aiming to become a reference guide for both researchers and practitioners in the area of fuzzy mathematics and other applied research fields (e.g. operations research, information science, management science and engineering) characterized by uncertain ("hesitant") information. Because of its clarity and self-contained explanations, the book can also be adopted as a textbook by graduate and advanced undergraduate students.
FSR, the International Conference on Field and Service Robotics, is the leading single-track conference on robotics for field and service applications. This book presents the results of FSR2012, the eighth Field and Service Robotics conference, which was originally planned for 2011 with the venue of Matsushima in the Tohoku region of Japan. However, on March 11, 2011, a magnitude M9.0 earthquake occurred off the Pacific coast of Tohoku, and the resulting tsunami caused a large-scale disaster; the conference was therefore postponed by one year, to July 2012. This earthquake raised issues concerning the contribution of field and service robotics technology to emergency scenarios, and a number of valuable lessons were learned from the operation of robots in the resulting, very real and challenging, disaster environments. Up-to-date studies on disaster response, relief and recovery were therefore featured at the conference. This book offers 43 papers on a broad range of topics including: Disaster Response, Service/Entertainment Robots, Inspection/Maintenance Robots, Mobile Robot Navigation, Agricultural Robots, Robots for Excavation, Planetary Exploration, Large Area Mapping, SLAM for Outdoor Robots, and Elemental Technology for Mobile Robots.
This book provides an alternative approach to studying the pre-kernel solution of transferable utility games, based on a generalized conjugation theory from convex analysis. Although the pre-kernel solution possesses an appealing axiomatic foundation that lets one consider it a standard of fairness, the pre-kernel and its related solutions are often regarded as obscure and too technically complex to be treated as a real alternative to the Shapley value. Comprehensible and efficient computability is widely regarded as a desirable feature qualifying a solution concept, apart from its axiomatic foundation as a standard of fairness. We review and then improve an approach to computing the pre-kernel of a cooperative game by the indirect function, which is known as the Fenchel-Moreau conjugation of the characteristic function. Extending the approach with the indirect function, we are able to characterize the pre-kernel of the grand coalition simply by the solution sets of a family of quadratic objective functions.
The methodology described in this book is the result of many years of research experience in the field of synthesizable VHDL design targeting FPGA-based platforms. VHDL was first conceived as a documentation language for ASIC designs. Afterwards, the language was used for the behavioral simulation of ASICs, and also as a design input for synthesis tools. VHDL is a rich language, but only a small subset of it can be used to write synthesizable code, from which a physical circuit can be obtained. VHDL books usually describe both the synthesis and the simulation aspects of the language, but this book guides the reader only through the features accepted by synthesis tools. The book introduces the subjects in a gradual and concise way, providing just enough information for readers to develop their own synthesizable digital systems in VHDL. The examples in the book were planned targeting an FPGA platform widely used around the world.
The intensity of global competition and ever-increasing economic uncertainties have led organizations to search for more efficient and effective ways to manage their business operations. Data envelopment analysis (DEA) has been widely used as a conceptually simple yet powerful tool for evaluating organizational productivity and performance. Fuzzy DEA (FDEA) is a promising extension of conventional DEA proposed for dealing with imprecise and ambiguous data in performance measurement problems. This book is the first volume in the literature to present the state-of-the-art developments and applications of FDEA. It is designed for students, educators, researchers, consultants and practicing managers in business, industry, and government with a basic understanding of DEA and fuzzy logic concepts.
This volume provides the audience with updated, in-depth and highly coherent material on the conceptually appealing and practically sound information technology of Computational Intelligence applied to the analysis, synthesis and evaluation of social networks. The volume involves studies devoted to key issues of social networks, including community structure detection in networks, online social networks, knowledge growth and evaluation, and diversity of collaboration mechanisms. The book engages a wealth of methods of Computational Intelligence along with well-known techniques of linear programming, Formal Concept Analysis, machine learning, and agent modeling. Human-centricity is of paramount relevance, and this facet manifests in many ways, including personalized semantics, trust metrics, and personal knowledge management, to highlight just a few of these aspects. The contributors to this volume report on various essential applications, including cyber-attack detection, building enterprise social networks, business intelligence and forming collaboration schemes. Given the subject area, this book is aimed at a broad audience of researchers and practitioners. Owing to the nature of the material covered and the way it is organized, the volume will appeal to well-established communities, including those active in various disciplines in which social networks, their analysis and optimization are of genuine relevance. Those involved in operations research, management, various branches of engineering, and economics will benefit from exposure to the subject matter.
Forecasting is a crucial function for companies in the fashion industry, but for many real-life forecasting applications in this industry, the data patterns are notoriously volatile and it is very difficult, if not impossible, to analytically learn the underlying patterns. As a result, many traditional methods (such as pure statistical models) fail to make a sound prediction. Over the past decade, advances in artificial intelligence and computing technologies have provided an alternative way of generating precise and accurate forecasting results for fashion businesses. Despite this being an important and timely topic, there is currently no comprehensive reference source that provides up-to-date theoretical and applied research findings on the subject of intelligent fashion forecasting systems. This three-part handbook fulfills this need, covering material ranging from introductory studies and technical reviews, through theoretical modeling research, to intelligent fashion forecasting applications and analysis. This book is suitable for academic researchers, graduate students, senior undergraduate students and practitioners who are interested in the latest research on fashion forecasting.
This book is concerned with recent advances in fitness landscapes. The concept of fitness landscapes originates in theoretical biology and refers to a framework for analysing and visualizing the relationships between genotypes, phenotypes and fitness. These relationships lie at the centre of attempts to mathematically describe evolutionary processes and evolutionary dynamics. The book addresses recent advances in the understanding of fitness landscapes in evolutionary biology and evolutionary computation. In the volume, experts in the field of fitness landscapes present these findings in an integrated way to make them accessible to a number of audiences: senior undergraduate and graduate students in computer science, theoretical biology, physics, applied mathematics and engineering, but also researchers looking for a reference and/or an entry point into using fitness landscapes for analysing algorithms. Practitioners wanting to employ fitness landscape techniques for evaluating bio- and nature-inspired computing algorithms will also find valuable material in the book. For teaching purposes, the book could also be used as a reference handbook.
This book, now in its third edition, addresses the main control aspects of underwater manipulation tasks. The mathematical model with significant impact on the control strategy is discussed. The problem of controlling a 6-degree-of-freedom autonomous underwater vehicle is investigated in depth, and a survey of fault detection/tolerance strategies for unmanned underwater vehicles is provided. Inverse kinematics, dynamic control and interaction control for underwater vehicle-manipulator systems are then discussed. The code used to generate most of the numerical simulations is made available and briefly discussed.
The aim of this book is to present the state-of-the-art theoretical and practical advances of swarm intelligence. It comprises seven chapters on current topics. Chapter 1 reviews Bacteria Foraging Optimization (BFO) techniques for both single- and multiple-criterion problems. Chapter 2 surveys swarm intelligence for multi- and many-objective optimization, along with a topical study on EEG signal analysis. Chapter 3 provides a comparative study of variants of MOPSO, backed by an extensive simulation study. Chapters 4 and 7 discuss intractable problems, such as subset and job scheduling problems, approached with different hybrid swarm intelligence techniques. Chapter 5 studies image enhancement by ant colony optimization. Finally, chapter 7 covers the aspect of uncertainty in data by hybrid PSO.
Propagation, which looks at spreading in complex networks, can be seen from many viewpoints: it may be undesirable or desirable, controllable or not, or the mechanisms generating it may themselves be the topic of interest; in the end, everything depends on the setting. This book covers leading research on a wide spectrum of propagation phenomena and the techniques currently used in their modelling, prediction, analysis and control. Fourteen papers range over topics including epidemic models, models for trust inference, coverage strategies for networks, vehicle flow propagation, bio-inspired routing algorithms, P2P botnet attacks and defences, fault propagation in gene-cellular networks, malware propagation in mobile networks, information propagation in crisis situations, financial contagion in interbank networks, and finally how to maximize the spread of influence in social networks. The compendium will be of interest to researchers and those working in social networking, communications and finance, and aims to provide a base point for further studies on current research. Above all, by bringing together research from such diverse fields, the book seeks to cross-pollinate ideas and give the reader a glimpse of the breadth of current research.
The contributions in this volume focus on the Bayesian interpretation of natural languages, which is widely used in artificial intelligence, cognitive science, and computational linguistics. This is the first volume to take up topics in Bayesian Natural Language Interpretation and make proposals based on information theory, probability theory, and related fields. The methodologies offered here extend to semantic and pragmatic analyses in computational natural language interpretation. Bayesian approaches to natural language semantics and pragmatics are based on methods from signal processing and on the causal Bayesian models pioneered especially by Pearl. In signal processing, the Bayesian method finds the most probable interpretation by finding the one that maximizes the product of the prior probability and the likelihood of the interpretation. It thus stresses the importance of a production model for interpretation, as in Grice's contributions to pragmatics or in interpretation by abduction.
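The maximization the blurb describes, picking the interpretation with the largest product of prior and likelihood, is easy to sketch. The function name and the toy probabilities for an ambiguous utterance below are illustrative assumptions, not drawn from the book.

```python
def map_interpretation(priors, likelihoods):
    """Return the interpretation maximizing prior * likelihood.

    priors: dict mapping interpretation -> P(interpretation)
    likelihoods: dict mapping interpretation -> P(utterance | interpretation)
    """
    return max(priors, key=lambda i: priors[i] * likelihoods[i])

# Hypothetical numbers for an ambiguous utterance: the literal reading
# is more common a priori, but the ironic reading explains the
# utterance far better.
priors = {"literal": 0.7, "ironic": 0.3}
likelihoods = {"literal": 0.2, "ironic": 0.9}
best = map_interpretation(priors, likelihoods)  # -> "ironic"
```

Here 0.3 x 0.9 = 0.27 beats 0.7 x 0.2 = 0.14, so the less frequent interpretation wins, which is exactly how the production model can override the prior.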
This book presents a comprehensive report on the evolution of Fuzzy Logic since its formulation in Lotfi Zadeh's seminal paper on "fuzzy sets," published in 1965. In addition, it features a stimulating sampling from the broad field of research and development inspired by Zadeh's paper. The chapters, written by pioneers and prominent scholars in the field, show how fuzzy sets have been successfully applied to artificial intelligence, control theory, inference, and reasoning. The book also reports on theoretical issues; features recent applications of Fuzzy Logic in the fields of neural networks, clustering, data mining and software testing; and highlights an important paradigm shift caused by Fuzzy Logic in the area of uncertainty management. Conceived by the editors as an academic celebration of the fiftieth anniversary of the 1965 paper, this work is a must-have for students and researchers wishing to get an inspiring picture of the potentialities, limitations, achievements and accomplishments of Fuzzy Logic-based systems.
Intelligent information and database systems are two closely related subfields of modern computer science which have been known for over thirty years. They focus on the integration of artificial intelligence and classic database technologies to create the class of next-generation information systems. The book focuses on new trends in intelligent information and database systems and discusses topics addressing the foundations and principles of data, information, and knowledge models; methodologies for intelligent information and database systems analysis, design, and implementation; and their validation, maintenance and evolution. The chapters cover a broad spectrum of research topics, discussed from both practical and theoretical points of view, such as: intelligent information retrieval, natural language processing, semantic web, social networks, machine learning, knowledge discovery, data mining, uncertainty management and reasoning under uncertainty, intelligent optimization techniques in information systems, security in database systems, and multimedia data analysis. Intelligent information systems and their applications in business, medicine and industry, database systems applications, and intelligent internet systems are also presented and discussed in the book. The book consists of 38 chapters based on original works presented during the 7th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2015), held on 23–25 March 2015 in Bali, Indonesia. The book is divided into six parts: Advanced Machine Learning and Data Mining, Intelligent Computational Methods in Information Systems, Semantic Web, Social Networks and Recommendation Systems, Cloud Computing and Intelligent Internet Systems, Knowledge and Language Processing, and Intelligent Information and Database Systems: Applications.
This research volume is a continuation of our previous volumes on intelligent machines. It is divided into three parts. Part I deals with big data and ontologies, including examples related to text mining, rule mining and ontologies. Part II is on knowledge-based systems, including context-centered systems, knowledge discovery, interoperability, consistency and systems of systems. The final part is on applications, involving prediction, decision optimization and assessment. This book is directed at researchers who wish to explore the field of knowledge engineering further.
3D GeoInfo aims to bring together international state-of-the-art research and facilitate the dialogue on emerging topics in the field of 3D geo-information. The conference offers an interdisciplinary forum in the fields of 3D data collection and modeling; reconstruction and methods for 3D representation; data management for the maintenance of 3D geo-information; and 3D data analysis and visualization. The book covers the best papers from 3D GeoInfo, held in Istanbul in November 2013.
Computational and mathematical models provide us with opportunities to investigate the complexities of real-world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameters within a bounded environment, allowing controllable experimentation that is not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines, and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window onto the research communities' novel endeavours, highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that readers will have stimulating experiences that lead them to pursue research in these directions.
This is a unique book addressing the integration of risk methodology from various fields. It will stimulate intellectual debate and communication across disciplines, promote better risk management practices and contribute to the development of risk management methodologies. Individual chapters explain fundamental risk models and measurement, and address risk and security issues from diverse areas such as finance and insurance, the health sciences, life sciences, engineering and information science. Integrated Risk Sciences is an emerging discipline that considers risks in different fields, aiming at a common language, and at sharing and improving methods developed in different fields. Readers should have a Bachelor's degree and have taken at least one basic university course in statistics and probability. The main goal of the book is to provide basic knowledge on risk and security in a common language; the authors have taken particular care to ensure that all content can readily be understood by doctoral students and researchers across disciplines. Each chapter provides simple case studies and examples, open research questions and discussion points, and a selected bibliography inviting readers to further study.
Research into social systems is challenging due to their complex nature. Traditional methods of analysis are often difficult to apply effectively as theories evolve over time. This can be due to a lack of appropriate data, or too much uncertainty. It can also be the result of problems that are not yet understood well enough in a general sense to be classified so that an appropriate solution can be quickly identified. Simulation is one tool that deals well with these challenges, fits in well with the deductive process, and is useful for testing theory. This field is still relatively new, and much of the work is necessarily innovative, although it builds upon a rich and varied foundation. There are a number of existing modelling paradigms being applied to complex social systems research. Additionally, new methods and measures are being devised through the process of conducting research. We expect that readers will enjoy this collection of high-quality research works from new and accomplished researchers.
The book is a collection of peer-reviewed scientific papers submitted by active researchers at the 37th National System Conference (NSC 2013). NSC is an annual event of the Systems Society of India (SSI), primarily oriented to strengthening the systems movement and its applications for the welfare of humanity. A galaxy of academicians, professionals, scientists, statesmen and researchers from different parts of the country and abroad are invited to attend the conference. The book presents research articles in the areas of systems modelling, complex network modelling, cyber security, sustainable systems design, health care systems, socio-economic systems, and clean and green technologies. The book can be used as a tool for further research.
This book provides readers with a snapshot of the state of the art in fuzzy logic. Throughout the chapters, key theories developed in the last fifty years as well as important applications to practical problems are presented and discussed from different perspectives, as the authors hail from different disciplines and therefore use fuzzy logic for different purposes. The book aims at showing how fuzzy logic has evolved since the first theory formulation by Lotfi A. Zadeh in his seminal paper on Fuzzy Sets in 1965. Fuzzy theories and implementations grew at an impressive speed and achieved significant results, especially on the applicative side. The study of fuzzy logic and its practice spread all over the world, from Europe to Asia, America and Oceania. The editors believe that, thanks to the drive of young researchers, fuzzy logic will be able to face the challenging goals posed by computing with words. New frontiers of knowledge are waiting to be explored. In order to motivate young people to engage in the future development of fuzzy logic, fuzzy methodologies, fuzzy applications, etc., the editors invited a team of internationally respected experts to write the present collection of papers, which shows the present and future potential of fuzzy logic from different disciplinary perspectives and personal standpoints.
This book is intended as an introduction to fuzzy algebraic hyperstructures. As the first in its genre, it includes a number of topics, most of which reflect the authors' past research and thus provide a starting point for future research directions. The book is organized in five chapters. The first chapter introduces readers to the basic notions of algebraic structures and hyperstructures. The second covers fuzzy sets, fuzzy groups and fuzzy polygroups. The following two chapters are concerned with the theory of fuzzy Hv-structures: while the third chapter presents the concept of fuzzy Hv-subgroups of Hv-groups, the fourth covers the theory of fuzzy Hv-ideals of Hv-rings. The final chapter discusses several connections between hypergroups and fuzzy sets, and includes a study on the association between hypergroupoids and fuzzy sets endowed with two membership functions. In addition to providing a reference guide for researchers, the book is also intended as a textbook for undergraduate and graduate students.
This book collects ECM research from the academic discipline of Information Systems and related fields to support academics and practitioners who are interested in understanding the design, use and impact of ECM systems. It also provides a valuable resource for students and lecturers in the field. “Enterprise content management in Information Systems research – Foundations, methods and cases” consolidates our current knowledge on how today’s organizations can manage their digital information assets. The business challenges related to organizational information management include reducing search times, maintaining information quality, and complying with reporting obligations and standards. Many of these challenges are well-known in information management, but because of the vast quantities of information being generated today, they are more difficult to deal with than ever. Many companies use the term “enterprise content management” (ECM) to refer to the management of all forms of information, especially unstructured information. While ECM systems promise to increase and maintain information quality, to streamline content-related business processes, and to track the lifecycle of information, their implementation poses several questions and challenges: Which content objects should be put under the control of the ECM system? Which processes are affected by the implementation? How should outdated technology be replaced? Research is challenged to support practitioners in answering these questions.
Sparse grids have gained increasing interest in recent years for the numerical treatment of high-dimensional problems. Whereas classical numerical discretization schemes fail in more than three or four dimensions, sparse grids make it possible to overcome the “curse” of dimensionality to some degree, extending the number of dimensions that can be dealt with. This volume of LNCSE collects the papers from the proceedings of the second workshop on sparse grids and applications, demonstrating once again the importance of this numerical discretization scheme. The selected articles present recent advances on the numerical analysis of sparse grids as well as efficient data structures, and the range of applications extends to uncertainty quantification settings and clustering, to name but a few examples.
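The "curse" of dimensionality mentioned above can be made concrete by counting grid points. The sketch below compares a full tensor-product grid with the standard regular sparse grid construction, both without boundary points; the function names and the level/dimension choices are illustrative assumptions for this example, not taken from the volume.

```python
from itertools import product
from math import prod

def full_grid_points(level, dim):
    """Full tensor grid without boundary: (2^level - 1) points per axis."""
    return (2**level - 1) ** dim

def sparse_grid_points(level, dim):
    """Regular sparse grid without boundary points.

    Keep only the tensor subgrids whose level multi-index l
    (each l_i >= 1) satisfies |l|_1 <= level + dim - 1; the
    subgrid for l contributes prod(2^(l_i - 1)) new points.
    """
    total = 0
    for l in product(range(1, level + 1), repeat=dim):
        if sum(l) <= level + dim - 1:
            total += prod(2**(li - 1) for li in l)
    return total

# At a fixed level, the full grid grows exponentially with the
# dimension, while the sparse grid grows far more slowly.
full = [full_grid_points(5, d) for d in (1, 2, 3, 4)]
sparse = [sparse_grid_points(5, d) for d in (1, 2, 3, 4)]
```

In one dimension the two counts coincide (there is nothing to truncate), but already in four dimensions the full grid has close to a million points while the sparse grid keeps only a small fraction of them.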