This book provides a valuable combination of relevant research works on developing a smart city ecosystem from the artificial intelligence (AI) and Internet of Things (IoT) perspective. The technical research works presented here focus on a number of aspects of smart cities: smart mobility, smart living, smart environment, smart citizens, smart government, and smart waste management systems, as well as related technologies and concepts. This edited book offers critical insight into the key underlying research themes within smart cities, highlighting the limitations of current developments and potential future directions.
This book compares the performance of various evolutionary computation (EC) techniques when they are faced with complex optimization problems extracted from different engineering domains. Particularly focusing on recently developed algorithms, it is designed so that each chapter can be read independently. Several comparisons among EC techniques have been reported in the literature; however, they all suffer from one limitation: their conclusions are based on the performance of popular evolutionary approaches over a set of synthetic functions with exact solutions and well-known behaviors, without considering the application context or including recent developments. In each chapter, a complex engineering optimization problem is posed, and a particular EC technique is then presented as the best choice according to its search characteristics. Lastly, a set of experiments is conducted to compare its performance to that of other popular EC methods.
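For orientation, here is a minimal sketch of one widely used EC technique, differential evolution (DE/rand/1/bin), applied to a synthetic sphere function with a known optimum; this is exactly the kind of benchmark whose limitations the book points out, which makes it a convenient baseline. The hyperparameters are illustrative assumptions, not the book's settings.

```python
import numpy as np

def sphere(x):
    """Synthetic benchmark objective with known optimum f(0) = 0."""
    return float(np.sum(x ** 2))

def differential_evolution(f, dim=10, pop_size=30, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin loop; hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    fitness = np.array([f(ind) for ind in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than i.
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            # Binomial crossover, forcing at least one mutant gene.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection: keep the trial if it is no worse.
            f_trial = f(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

x_best, f_best = differential_evolution(sphere)
print(f"best objective value: {f_best:.6f}")  # close to 0
```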
The book presents a comprehensive treatment of a novel design theory that fosters the innovative thinking and creativity essential for addressing wicked problems. Wicked problems are ill-defined, ambiguous in both aims and solutions, and complex, with interconnected and intertwined (coupled) factors. While ubiquitous and difficult, wicked problems share characteristics common to science and design in three regards: agent finitude, system complexity, and problem normativity. These fundamental attributes allow a core cognitive process common to design and science to be identified and, as a result, a strategic problem-solving conception of methodology to be formulated. The theory opens new opportunities for synergetic cross-disciplinary research and practice by incorporating the essence of Extenics into axiomatic design. Innovative thinking is enabled by exploring Extenics for problem reframing, paradigm shifts, and abductive reasoning, and by engaging axiomatic design in the co-evolution (iteration) of the need and a viable design concept. The theory is unique in that it is a framework for quantifying the imprecise and vague design information available during the conceptual design stage as mathematical expressions and algorithms early in the design effort, enabling the objective evaluation and emergence of an optimal design concept from among a multitude of viable ones. The book is conceived for students and real-world practitioners in engineering, the natural and social sciences, business, and fine arts who seek to develop powerful design thinking for solving problems in a creative and innovative way.
The objective of the book is to provide materials to demonstrate the development of TOPSIS and to serve as a handbook. It contains the basic process of TOPSIS, numerous variant processes, property explanations, theoretical developments, and illustrative examples with real-world cases. Possible readers would be graduate students, researchers, analysts, and professionals who are interested in TOPSIS, a distance-based algorithm, and who would like to compare TOPSIS with other MCDM methods. The book serves as a research reference as well as a self-learning book with step-by-step illustrations for the MCDM community.
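To make the basic process concrete, here is a minimal sketch of the classical TOPSIS steps (vector normalization, weighting, ideal and anti-ideal points, ranking by relative closeness). The decision matrix and weights below are illustrative assumptions; the book's variant processes refine these steps.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Basic TOPSIS ranking of alternatives (rows) against criteria (columns).

    benefit[j] is True when larger values of criterion j are better.
    """
    X = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # 1. Vector-normalize each criterion column, then apply the weights.
    V = w * X / np.linalg.norm(X, axis=0)
    # 2. Ideal and anti-ideal points, respecting each criterion's direction.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 3. Euclidean distances to both reference points.
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    # 4. Relative closeness: 1 is best, 0 is worst.
    return d_neg / (d_pos + d_neg)

# Three alternatives, two benefit criteria and one cost criterion (toy data).
scores = topsis([[250, 7, 5], [200, 6, 3], [300, 8, 6]],
                weights=[0.4, 0.4, 0.2],
                benefit=[True, True, False])
print(scores.argsort()[::-1])  # indices of alternatives, best first
```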
Fault-Tolerant Parallel Computation presents recent advances in algorithmic ways of introducing fault-tolerance in multiprocessors under the constraint of preserving efficiency. The difficulty in combining fault-tolerance and efficiency is that the two have conflicting means: fault-tolerance is achieved by introducing redundancy, while efficiency is achieved by removing redundancy. This monograph demonstrates how, in certain models of parallel computation, it is possible to combine efficiency and fault-tolerance, and shows how efficient algorithms can be developed without concern for fault-tolerance and then correctly and efficiently executed on parallel machines whose processors are subject to arbitrary dynamic fail-stop errors. The efficient algorithmic approaches to multiprocessor fault-tolerance presented here contribute towards bridging the gap between abstract models of parallel computation and realizable parallel architectures. The monograph synthesizes work that was presented at recent symposia and published in refereed journals by the authors and other leading researchers, and is the first text to take the reader on a grand tour of this new field, summarizing major results and identifying hard open problems. It will be of interest to academic and industrial researchers and graduate students working in the areas of fault-tolerance, algorithms, and parallel computation, and may also be used as a text in a graduate course on parallel algorithmic techniques and fault-tolerance.
Takes an interdisciplinary approach to contribute to the ongoing development of human-AI interaction. Current debate on and development of AI is algorithm-driven and technically oriented rather than human-centered. At present, there is no systematic interdisciplinary discussion that effectively deals with the issues and challenges arising from AI. This book offers a critical analysis of the logic and social implications of algorithmic processes. Drawing on the results of scientific research, it helps readers understand the relationship between algorithms and humans and allows AI designers to assess the quality of meaningful interactions with AI systems.
This volume is intended to be used as a textbook for a special topic course in computer science. It addresses contemporary research topics of interest such as intelligent control, genetic algorithms, neural networks, optimization techniques, expert systems, fractals, and computer vision. The work incorporates many new research ideas, and focuses on the role of continuous mathematics. Audience: This book will be valuable to graduate students interested in theoretical computer topics, algorithms, expert systems, neural networks, and software engineering.
In this book the author introduces a novel approach to securing exam systems. He provides an in-depth understanding, useful for studying the security of exams and similar systems, such as public tenders, personnel selections, project reviews, and conference management systems. After a short chapter that explains the context and objectives of the book, in Chap. 2 the author introduces terminology for exams and the foundations required to formulate their security requirements. He describes the tasks that occur during an exam, taking account of the levels of detail and abstraction of an exam specification and the threats that arise out of the different exam roles. He also presents a taxonomy that classifies exams by types and categories. Chapter 3 contains formal definitions of the authentication, privacy, and verifiability requirements for exams, a framework based on the applied pi-calculus for the specification of authentication and privacy, and a more abstract approach based on set theory that enables the specification of verifiability. Chapter 4 describes the Huszti-Petho protocol in detail and proposes a security enhancement. In Chap. 5 the author details Remark!, a protocol for Internet-based exams, discussing its cryptographic building blocks and some security considerations. Chapter 6 focuses on WATA, a family of exams that employ computer assistance while retaining face-to-face testing. The chapter also introduces formal definitions of accountability requirements and details the analysis of a WATA protocol against such definitions. In Chaps. 4, 5, and 6 the author uses the cryptographic protocol verifier ProVerif for the formal analyses. Finally, the author outlines future work in Chap. 7. The book is valuable for researchers and graduate students in the area of information security, in particular those working on exams or protocols.
This book provides tools and algorithms for solving a wide class of optimization tasks by learning from their repetitions. A unified framework is provided for learning algorithms based on the stochastic gradient (a gold standard in learning), including random simultaneous perturbations and the response surface methodology. Original algorithms include model-free learning of short decision sequences, as well as of long sequences relying on model-supported gradient estimation. Learning is based on whole sequences of process observations, which are either vectors or images. The methodology is applicable to repetitive processes, covering a wide range from (additive) manufacturing to decision making for mitigating COVID-19 waves. A distinctive feature of the algorithms is learning between repetitions, an idea that extends the paradigms of iterative learning and run-to-run control. The main ideas can be extended to other decision learning tasks not covered in this book. The text is written in a comprehensible way, with an emphasis on a user-friendly presentation of the algorithms, explanations of them, and recommendations on how to select them. The book is expected to be of interest to researchers, Ph.D. students, and graduate students in computer science and engineering, operations research, and decision making, and to those working on iterative learning control.
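The "random simultaneous perturbations" mentioned above are the idea behind SPSA-style gradient estimators. Below is a minimal generic sketch, not one of the book's algorithms: it estimates a full gradient from just two loss evaluations per iteration, whatever the parameter dimension. The gain schedules and the toy quadratic loss are illustrative assumptions.

```python
import numpy as np

def spsa_minimize(loss, theta0, iterations=200, a=0.1, c=0.1, seed=0):
    """Minimal simultaneous-perturbation stochastic approximation (SPSA).

    Each iteration estimates the gradient from only two loss evaluations,
    regardless of the dimension of theta.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, iterations + 1):
        ak = a / k            # decaying step size (illustrative schedule)
        ck = c / k ** 0.25    # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher directions
        y_plus = loss(theta + ck * delta)
        y_minus = loss(theta - ck * delta)
        g_hat = (y_plus - y_minus) / (2 * ck) * (1.0 / delta)  # gradient estimate
        theta -= ak * g_hat
    return theta

# Toy repetitive process: quadratic loss around an unknown set point.
target = np.array([1.0, -2.0, 0.5])
theta = spsa_minimize(lambda t: np.sum((t - target) ** 2), np.zeros(3))
print(theta)  # should approach `target`
```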
Machine learning (ML) has become a commonplace element in our everyday lives and a standard tool for many fields of science and engineering. To make optimal use of ML, it is essential to understand its underlying principles. This book approaches ML as the computational implementation of the scientific principle. This principle consists of continuously adapting a model of a given data-generating phenomenon by minimizing some form of loss incurred by its predictions. The book trains readers to break down various ML applications and methods in terms of data, model, and loss, thus helping them to choose from the vast range of ready-made ML methods. The book's three-component approach to ML provides uniform coverage of a wide range of concepts and techniques. As a case in point, techniques for regularization, privacy preservation, and explainability amount to specific design choices for the model, data, and loss of an ML method.
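As a concrete instance of the three-component view, here is a minimal sketch, not from the book, that separates a tiny regression task into data, model, and loss, with learning as loss minimization by gradient descent. The numbers and learning rate are illustrative assumptions.

```python
import numpy as np

# Data: observed feature-label pairs from some data-generating phenomenon.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

# Model: a hypothesis space of linear predictors h(x) = w * x + b.
def predict(w, b, x):
    return w * x + b

# Loss: average squared error of the model's predictions on the data.
def loss(w, b):
    return np.mean((predict(w, b, x) - y) ** 2)

# Learning: adapt the model to minimize the loss (gradient descent).
w, b = 0.0, 0.0
for _ in range(2000):
    err = predict(w, b, x) - y
    w -= 0.01 * np.mean(2 * err * x)  # d(loss)/dw
    b -= 0.01 * np.mean(2 * err)      # d(loss)/db
print(w, b)  # fitted slope and intercept
```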
The Complexity Theory Companion is an accessible, algorithmically oriented, research-centered, up-to-date guide to some of the most interesting techniques of complexity theory. The book's thesis is that simple algorithms are at the heart of complexity theory. From the tree-pruning and interval-pruning algorithms that shape the first chapter to the query simulation procedures that dominate the last chapter, the central proof methods of the book are algorithmic. To more clearly highlight the role of algorithmic techniques in complexity theory, the book is organized by technique rather than by topic, unlike other texts on complexity. Each chapter focuses on one technique: what it is, and what results and applications it yields. This textbook was developed at the University of Rochester in courses given to graduate students and advanced undergraduates. Researchers will also find this book a valuable reference thanks to the comprehensive bibliography of close to five hundred entries, the thirty-five-page subject index, and the appendices giving overviews of complexity classes and reductions.
Aspects of robust statistics are important in many areas. Based on the International Conference on Robust Statistics 2001 (ICORS 2001) in Vorau, Austria, this volume discusses future directions of the discipline, bringing together leading scientists, experienced researchers, and practitioners, as well as younger researchers. The papers cover a multitude of different aspects of robust statistics. For instance, the fundamental problem of data summary (weights of evidence) is considered and its robustness properties are studied. Further theoretical subjects include robust methods for skewness, time series, longitudinal data, multivariate methods, and tests. Some papers deal with computational aspects and algorithms. Finally, aspects of application and programming tools complete the volume.
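For readers new to the area, the following is a minimal illustrative sketch, not taken from the volume, of what robustness means in practice: the median absolute deviation (MAD) is a scale estimate that barely moves when gross outliers are added, while the standard deviation is dominated by them.

```python
import numpy as np

def mad_scale(x):
    """Median absolute deviation, scaled by 1.4826 so it is consistent
    with the standard deviation under a normal distribution."""
    x = np.asarray(x, dtype=float)
    return 1.4826 * np.median(np.abs(x - np.median(x)))

clean = np.random.default_rng(0).normal(0.0, 1.0, 100)
dirty = np.append(clean, [50.0, -40.0])   # two gross outliers
print(np.std(dirty), mad_scale(dirty))    # std blows up; MAD barely moves
```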
Access to large data sets has led to a paradigm shift in the tourism research landscape. Big data is enabling a new form of knowledge gain, while at the same time shaking the epistemological foundations and requiring new methods and analysis approaches. It allows for interdisciplinary cooperation between computer sciences and social and economic sciences, and complements the traditional research approaches. This book provides a broad basis for the practical application of data science approaches such as machine learning, text mining, social network analysis, and many more, which are essential for interdisciplinary tourism research. Each method is presented in principle and viewed analytically; its advantages and disadvantages are weighed up, and typical fields of application are presented. The correct methodological application is presented in a "how-to" style, together with code examples, making the book accessible to a wide readership of researchers, practitioners, and students entering the field.
"The book is a very well-structured introduction to data science, not only in tourism, and its methodological foundations, accompanied by well-chosen practical cases. It underlines an important insight: data are only representations of reality; you need methodological skills and domain background to derive knowledge from them." - Hannes Werthner, Vienna University of Technology
"Roman Egger has accomplished a difficult but necessary task: make clear how data science can practically support and foster travel and tourism research and applications. The book offers a well-taught collection of chapters giving a comprehensive and deep account of AI and data science for tourism." - Francesco Ricci, Free University of Bozen-Bolzano
"This well-structured and easy-to-read book provides a comprehensive overview of data science in tourism. It contributes largely to the methodological repository beyond traditional methods." - Rob Law, University of Macau
This open access book gives an overview of cutting-edge work on a new paradigm called the "sublinear computation paradigm," which was proposed in the large multiyear academic research project "Foundations of Innovative Algorithms for Big Data." That project ran in Japan from October 2014 to March 2020. To handle the unprecedented explosion of big data sets in research, industry, and other areas of society, there is an urgent need to develop novel methods and approaches for big data analysis. To meet this need, innovative changes in algorithm theory for big data are being pursued. For example, polynomial-time algorithms have thus far been regarded as "fast," but if a quadratic-time algorithm is applied to a petabyte-scale or larger big data set, problems are encountered in terms of computational resources or running time. To deal with this critical computational and algorithmic bottleneck, linear, sublinear, and constant-time algorithms are required. The sublinear computation paradigm is proposed here in order to support innovation in the big data era. A foundation of innovative algorithms has been created by developing computational procedures, data structures, and modelling techniques for big data. The project is organized into three teams that focus on sublinear algorithms, sublinear data structures, and sublinear modelling. The work has provided high-level academic research results of strong computational and algorithmic interest, which are presented in this book. The book consists of five parts: Part I, which consists of a single chapter on the concept of the sublinear computation paradigm; Parts II, III, and IV, which review results on sublinear algorithms, sublinear data structures, and sublinear modelling, respectively; and Part V, which presents application results. The information presented here will inspire researchers who work in the field of modern algorithms.
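To make the idea concrete, here is a minimal illustrative sketch, not from the book, of an estimator whose cost is independent of the input size: it approximates the fraction of items satisfying a predicate by random sampling, so for a fixed accuracy it runs in constant time even on very large inputs, in the spirit of the sublinear computation paradigm.

```python
import random

def approx_fraction(data, predicate, sample_size=1000, seed=0):
    """Estimate the fraction of items satisfying `predicate` by sampling.

    Runtime depends on `sample_size`, not on len(data), so the estimate
    costs the same on a thousand items or a billion.
    """
    rng = random.Random(seed)
    hits = sum(predicate(data[rng.randrange(len(data))])
               for _ in range(sample_size))
    return hits / sample_size

data = list(range(10_000_000))
print(approx_fraction(data, lambda v: v % 3 == 0))  # close to 1/3
```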
The massive volume of data generated in modern applications can overwhelm our ability to conveniently transmit, store, and index it. For many scenarios, building a compact summary of a dataset that is vastly smaller enables flexibility and efficiency in a range of queries over the data, in exchange for some approximation. This comprehensive introduction to data summarization, aimed at practitioners and students, showcases the algorithms, their behavior, and the mathematical underpinnings of their operation. The coverage starts with simple sums and approximate counts, building to more advanced probabilistic structures such as the Bloom Filter, distinct value summaries, sketches, and quantile summaries. Summaries are described for specific types of data, such as geometric data, graphs, and vectors and matrices. The authors offer detailed descriptions of and pseudocode for key algorithms that have been incorporated in systems from companies such as Google, Apple, Microsoft, Netflix and Twitter.
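As a taste of the probabilistic structures covered, here is a minimal, simplified Bloom filter sketch: a compact set summary with no false negatives and a tunable false-positive rate. Production designs pack bits and derive the size and hash count from a target error rate; the per-hash salting scheme below is an illustrative assumption.

```python
import hashlib

class BloomFilter:
    """Compact set summary: membership queries may yield false positives,
    but never false negatives."""

    def __init__(self, num_bits=1024, num_hashes=4):
        self.m = num_bits
        self.k = num_hashes
        self.bits = bytearray(num_bits)  # one byte per bit, for clarity

    def _positions(self, item):
        # Derive k bit positions by salting a cryptographic hash.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def might_contain(self, item):
        # True may be a false positive; False is always correct.
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("alice")
print(bf.might_contain("alice"))  # True
print(bf.might_contain("bob"))    # almost certainly False
```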
This book highlights evolutionary algorithm techniques for a variety of medical conditions and introduces medical applications of evolutionary computation for real-time diagnosis. Evolutionary Intelligence for Healthcare Applications presents how evolutionary intelligence can be used in smart healthcare systems involving big data analytics, mobile health, personalized medicine, and clinical trial data management. It focuses on emerging concepts and approaches, highlighting the evolutionary algorithm techniques used for early disease diagnosis, prediction, and prognosis. The book also presents ethical issues and challenges that can occur within the healthcare system. Researchers, healthcare professionals, data scientists, systems engineers, students, programmers, clinicians, and policymakers will find this book of interest.
This contributed volume offers practical solutions and design-, modeling-, and implementation-related insights that address current research problems in memristors, memristive devices, and memristor computing. The book examines the related challenges and proposes solutions for the future of memristor computing. State-of-the-art research on memristor modeling, memristive interconnections, memory circuit architectures, software simulation tools, and applications of memristors in computing is presented. Utilising contributions from numerous experts in the field, written in clear language, and illustrated throughout, this book is a comprehensive reference work. Memristor Computing Systems explains memristors and memristive devices in an accessible way for graduate students and researchers with a basic knowledge of electrical and control systems engineering, while also prompting further research by more experienced academics.
This book includes high-quality papers presented at the Second International Conference on Data Science and Management (ICDSM 2021), organized by the Gandhi Institute for Education and Technology, Bhubaneswar, from 19 to 20 February 2021. It features research in which data science is used to facilitate the decision-making process in various application areas, and also covers a wide range of learning methods and their applications in a number of learning problems. The empirical studies, theoretical analyses and comparisons to psychological phenomena described contribute to the development of products to meet market demands.
This book discusses the effective use of modern ICT solutions for business needs, including the efficient use of IT resources, decision support systems, business intelligence, data mining and advanced data processing algorithms, as well as the processing of large datasets (including data from social networks such as Twitter and Facebook). The ability to generate, record and process qualitative and quantitative data, including in the areas of big data, the Internet of Things (IoT) and cloud computing, offers a real prospect of significant improvements for business, as well as for the operation of a company within Industry 4.0. The book presents new ideas, approaches, solutions and algorithms in the areas of knowledge representation, management and processing, quantitative and qualitative data processing (including sentiment analysis), problems of simulation performance, and the use of advanced signal processing to increase the speed of computation. The solutions presented are also aimed at the effective use of business process model and notation (BPMN), business process semantization, and investment project portfolio selection. It is a valuable resource for researchers, data analysts, entrepreneurs and IT professionals alike, and the research findings presented make it possible to reduce costs, increase the accuracy of investment, optimize resources and streamline operations and marketing.
This book focuses on artificial intelligence in the fields of digital signal processing and wireless communication. The implementation of machine learning and deep learning in audio, image, and video processing is presented; adaptive signal processing and biomedical signal processing are also explored through DL algorithms, as are 5G and green communication. Finally, metaheuristic algorithms for related mathematical problems are explored.
This book is a groundbreaking resource that covers both the algorithms and the technologies of interactive videos. It presents recent research and application work for building and browsing interactive digital videos. The book deals mainly with low-level semi-automatic and fully automatic processing of video content for intelligent human-computer interaction. There is a special focus on eye-tracking methods.
Machine Vision Algorithms in Java provides a comprehensive introduction to the algorithms and techniques associated with machine vision systems. The Java programming language is also introduced, with particular reference to its imaging capabilities. The book contains explanations of key machine vision techniques and algorithms, along with the associated Java source code. Special features include:
- A complete, self-contained treatment of the topics and techniques essential to the understanding and implementation of machine vision.
- An introduction to object-oriented programming and to the Java programming language, with particular reference to its imaging capabilities.
- Java source code for a wide range of practical image processing and analysis functions.
- The opportunity to download a fully functional Java-based visual programming environment for machine vision, available via the WWW. This contains over 200 image processing, manipulation and analysis functions and will enable users to implement many of the ideas covered in this book.
- Details relating to the design of a Java-based visual programming environment for machine vision.
- An introduction to the Java 2D imaging and Java Advanced Imaging (JAI) APIs.
- A wide range of illustrative examples.
- Practical treatment of the subject matter.
This book is aimed at senior undergraduate and postgraduate students in engineering and computer science, as well as practitioners in machine vision who may wish to update or expand their knowledge of the subject. The techniques and algorithms of machine vision are expounded in a way that will be understood not only by specialists but also by those who are less familiar with the topic.
Credit Data and Scoring: The First Triumph of Big Data and Big Algorithms illuminates the often-hidden practice of predicting an individual's economic responsibility. Written by a leading practitioner, it examines the international implications of US leadership in credit scoring and what other countries have learned from it in building their own systems. Through its comprehensive contemporary perspective, the book also explores how algorithms and big data are driving the future of credit scoring. By revealing a new big picture and data comparisons, it delivers useful insights into legal and regulatory issues and into data manipulation.
This book develops survey data analysis tools in Python to create and analyze cross-tab tables and data visuals, weight data, perform hypothesis tests, and handle special survey questions such as check-all-that-apply. In addition, the basics of Bayesian data analysis and its Python implementation are presented. Since surveys are widely used as the primary method to collect data, and ultimately information, on the attitudes, interests, and opinions of customers and constituents, these tools are vital for private- and public-sector policy decisions. As a compact volume, this book uses case studies to illustrate methods of analysis essential for those who work with survey data in either sector. It focuses on two overarching objectives: demonstrating how to extract actionable, insightful, and useful information from survey data, and introducing Python and Pandas for analyzing survey data.
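Here is a minimal sketch of the kind of weighted cross-tab analysis the book covers, using pandas. The data, column names, and survey weights below are illustrative assumptions, not the book's case studies.

```python
import pandas as pd

# Toy survey responses; a real study would load these from a file.
df = pd.DataFrame({
    "region":  ["North", "South", "North", "South", "North", "South"],
    "opinion": ["Favor", "Oppose", "Favor", "Favor", "Oppose", "Oppose"],
    "weight":  [1.2, 0.8, 1.0, 1.1, 0.9, 1.0],  # survey weights
})

# Weighted cross-tab: sum of weights per (region, opinion) cell.
tab = pd.crosstab(df["region"], df["opinion"],
                  values=df["weight"], aggfunc="sum")

# Row-normalize so the weighted shares are comparable across regions.
shares = tab.div(tab.sum(axis=1), axis=0)
print(shares)
```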