These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. In this year's edition, the topics covered include many of the most important issues and research questions in the field, such as: opportune application domains for GP-based methods, game playing and co-evolutionary search, symbolic regression and efficient learning strategies, encodings and representations for GP, schema theorems, and new selection mechanisms. The volume includes several chapters on best practices and lessons learned from hands-on experience. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
This book:
- Covers three important aspects of smart cities, i.e., healthcare, smart communication and information, and smart transportation technologies
- Discusses various security aspects of medical documents and data-preserving mechanisms
- Provides better solutions using IoT techniques for healthcare, transportation, and communication systems
- Includes implementation examples, various datasets, experimental results, and simulation procedures
- Offers solutions for various disease prediction systems with intelligent techniques
This timely text/reference explores the business and technical issues involved in the management of information systems in the era of big data and beyond. Topics and features:
- Presents review questions and discussion topics in each chapter for classroom group work and individual research assignments
- Discusses the potential use of a variety of big data tools and techniques in a business environment, explaining how these can fit within an information systems strategy
- Reviews existing theories and practices in information systems, and explores their continued relevance in the era of big data
- Describes the key technologies involved in information systems in general and big data in particular, placing these technologies in a historical context
- Suggests areas for further research in this fast-moving domain
- Equips readers with an understanding of the important aspects of a data scientist's job
- Provides hands-on experience to further assist in the understanding of the technologies involved
This book introduces a new scheduler that fairly and efficiently distributes system resources among many users with varying usage patterns who compete for them in large shared computing environments. The Rawlsian Fair scheduler developed for this effort is shown to boost performance while reducing delay in high performance computing workloads of certain types, including the four types examined in this book:
i. Class A - similar but complementary workloads
ii. Class B - similar but steady vs. intermittent workloads
iii. Class C - large vs. small workloads
iv. Class D - large vs. noise-like workloads
This new scheduler achieves short-term fairness on the small timescales that demand rapid response to varying workloads and usage profiles. The Rawlsian Fair scheduler is shown to consistently benefit Class C and D workloads, while it benefits Class A and B workloads only where they become disproportionate as the number of users increases. A simulation framework, dSim, simulates the new Rawlsian Fair scheduling mechanism; dSim helps the Rawlsian Fair scheduler achieve instantaneous fairness in High Performance Computing environments, effective utilization of computing resources, and user satisfaction.
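The maximin ("Rawlsian") fairness criterion the book builds on is closely related to classical max-min fair allocation. As a rough illustration of that underlying idea (not the book's scheduler; the function and variable names below are invented), the following sketch allocates a fixed capacity among users by progressive filling, maximizing the smallest allocation:

```python
# Illustrative sketch: max-min fair allocation by progressive filling.
# This shows the generic maximin (Rawlsian) idea only; it is not the
# book's Rawlsian Fair scheduler, and all names here are hypothetical.

def max_min_fair(capacity: float, demands: list[float]) -> list[float]:
    """Allocate `capacity` so that the smallest allocation is maximized.

    Users whose demand fits under the equal share keep their demand;
    leftover capacity is split evenly among still-unsatisfied users.
    """
    n = len(demands)
    alloc = [0.0] * n
    remaining = capacity
    unsatisfied = list(range(n))
    while unsatisfied and remaining > 1e-12:
        share = remaining / len(unsatisfied)
        # Users whose residual demand fits within the current equal share.
        capped = [i for i in unsatisfied if demands[i] - alloc[i] <= share]
        if not capped:
            # Nobody can be fully satisfied: split the remainder evenly.
            for i in unsatisfied:
                alloc[i] += share
            remaining = 0.0
            break
        for i in capped:
            remaining -= demands[i] - alloc[i]
            alloc[i] = demands[i]
        unsatisfied = [i for i in unsatisfied if i not in capped]
    return alloc

if __name__ == "__main__":
    print(max_min_fair(10, [2.0, 2.6, 4.0]))  # all demands fit: [2.0, 2.6, 4.0]
    print(max_min_fair(6, [2.0, 2.6, 4.0]))   # scarce capacity: [2.0, 2.0, 2.0]
```

Progressive filling satisfies light users first and splits the remainder evenly among heavier users, which is exactly the allocation that maximizes the minimum share under demand caps.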
This book provides a comprehensive overview of how the course, content and outcome of policymaking are affected by big data. It scrutinises the notion that big and open data make policymaking a more rational process, in which policy makers are able to predict, assess and evaluate societal problems. It also examines how policy makers deal with big data, the problems and limitations they face, and how big data shapes policymaking on the ground. The book considers big data from various perspectives: not just the political, but also the technological, legal, institutional and ethical dimensions. The potential of big data use in the public sector is also assessed, as well as the risks and dangers this might pose. Through several extended case studies, it demonstrates the dynamics of big data and public policy. Offering a holistic approach to the study of big data, this book will appeal to students and scholars of public policy, public administration and data science, as well as those interested in governance and politics.
This textbook introduces basic and advanced embedded system topics through Arm Cortex-M microcontrollers, covering programmable microcontroller usage from basic to advanced concepts using the STMicroelectronics Discovery development board. Designed for use in upper-level undergraduate and graduate courses on microcontrollers, microprocessor systems, and embedded systems, the book explores fundamental and advanced topics, real-time operating systems via FreeRTOS and Mbed OS, and then offers a solid grounding in digital signal processing, digital control, and digital image processing concepts, with emphasis placed on the usage of a microcontroller for these advanced topics. The book uses the C language, "the" programming language for microcontrollers, the C++ language, and MicroPython, which allows Python language usage on a microcontroller. Sample code and course slides are available for readers and instructors, and a solutions manual is available to instructors. The book will also be an ideal reference for practicing engineers and electronics hobbyists who wish to become familiar with basic and advanced microcontroller concepts.
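For a flavour of the MicroPython route mentioned above, here is a minimal blink sketch. It is an illustration only: it assumes MicroPython's stm32 port (which exposes the pyb module), and the LED index depends on the particular Discovery board.

```python
# Hypothetical MicroPython blink sketch for an STM32 board.
# Assumes the stm32 port's `pyb` module; LED numbering is board-specific.
import pyb

led = pyb.LED(1)      # first user LED on the board
while True:
    led.toggle()      # invert the LED state
    pyb.delay(500)    # wait 500 ms, giving a 1 Hz blink
```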
This book provides in-depth and wide-ranging analyses of the emergence, and subsequent ubiquity, of algorithms in diverse realms of social life. The plurality of Algorithmic Cultures emphasizes: 1) algorithms' increasing importance in the formation of new epistemic and organizational paradigms; and 2) the multifaceted analyses of algorithms across an increasing number of research fields. The authors in this volume address the complex interrelations between social groups and algorithms in the construction of meaning and social interaction. The contributors highlight the performative dimensions of algorithms by exposing the dynamic processes through which algorithms, themselves the product of a specific approach to the world, frame reality, while at the same time organizing how people think about society. With contributions from leading experts in Media Studies, Social Studies of Science and Technology, and Cultural and Media Sociology from Canada, France, Germany, the UK and the USA, this volume presents cutting-edge empirical and conceptual research that includes case studies on social media platforms, gaming, financial trading and mobile security infrastructures.
The authors describe systematic methods for uncovering scientific laws a priori, on the basis of intuition, or "Gedanken Experiments". Mathematical expressions of scientific laws are, by convention, constrained by the rule that their form must be invariant with changes of the units of their variables. This constraint makes it possible to narrow down the possible forms of the laws. It is closely related to, but different from, dimensional analysis. It is a mathematical book, largely based on solving functional equations. In fact, one chapter is an introduction to the theory of functional equations.
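The flavour of that constraint can be seen in a standard dimensional-analysis example (a classical argument that the book relates to, but distinguishes from, its own method): requiring a law's form to survive changes of units sharply narrows the candidate forms.

```latex
% Classical unit-invariance (dimensional analysis) example, in the spirit
% of, but simpler than, the book's method.
Suppose the period $t$ of a pendulum depends on its length $L$, the
gravitational acceleration $g$, and the bob's mass $m$:
\[
  t = f(L, g, m).
\]
Requiring the form of $f$ to be invariant under rescalings of the units
of length, time, and mass forces $m$ to drop out (nothing else carries
mass to cancel it) and leaves the unique combination with the dimension
of time:
\[
  t = C \sqrt{L / g},
\]
with $C$ a dimensionless constant ($C = 2\pi$ for small oscillations).
```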
This book highlights selected papers from the 4th ICSA-Canada Chapter Symposium, as well as invited articles from established researchers in the areas of statistics and data science. It covers a variety of topics, including methodology development in data science, such as methodology in the analysis of high dimensional data, feature screening in ultra-high dimensional data and natural language ranking; statistical analysis challenges in sampling, multivariate survival models and contaminated data, as well as applications of statistical methods. With this book, readers can make use of frontier research methods to tackle their problems in research, education, training and consultation.
The book provides a comprehensive introduction and a novel mathematical foundation of the field of information geometry, with complete proofs and detailed background material on measure theory, Riemannian geometry and Banach space theory. Parametrised measure models are defined as fundamental geometric objects, which can be either finite or infinite dimensional. Based on these models, canonical tensor fields are introduced and further studied, including the Fisher metric and the Amari-Chentsov tensor, and embeddings of statistical manifolds are investigated. This novel foundation then leads to application highlights, such as generalizations and extensions of the classical uniqueness result of Chentsov or the Cramér-Rao inequality. Additionally, several new application fields of information geometry are highlighted, for instance hierarchical and graphical models, complexity theory, population genetics, or Markov Chain Monte Carlo. The book will be of interest to mathematicians who are interested in geometry, information theory, or the foundations of statistics, to statisticians as well as to scientists interested in the mathematical foundations of complex systems.
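For orientation, the two canonical tensor fields named above have standard coordinate expressions on a parametrised model (these are textbook definitions, not quoted from this book):

```latex
% Standard coordinate expressions on a parametrised model $p(x;\xi)$,
% with $\ell(x;\xi) = \log p(x;\xi)$ and $\partial_i = \partial/\partial\xi^i$.
The Fisher metric and the Amari--Chentsov tensor are
\[
  g_{ij}(\xi) = \mathbb{E}_p\!\left[ \partial_i \ell \, \partial_j \ell \right],
  \qquad
  T_{ijk}(\xi) = \mathbb{E}_p\!\left[ \partial_i \ell \, \partial_j \ell \, \partial_k \ell \right],
\]
and the Cram\'er--Rao inequality bounds any unbiased estimator
$\hat{\xi}$ from below by the inverse Fisher metric:
$\mathrm{Cov}(\hat{\xi}) \succeq g(\xi)^{-1}$.
```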
This book is based on deep learning approaches used for the diagnosis of neurological disorders, covering the basics of deep learning algorithms, using diagrams, data tables, and practical examples, for the diagnosis of neurodegenerative and neurodevelopmental disorders. It includes applications of feed-forward neural networks, deep generative models, convolutional neural networks, graph convolutional networks, and recurrent neural networks in the diagnosis of neurological disorders. Data pre-processing, including scaling, correction, trimming, and normalization, is also covered. The book:
- Offers a detailed description of the deep learning approaches used for the diagnosis of neurological disorders
- Demonstrates concepts of deep learning algorithms using diagrams, data tables, and examples for the diagnosis of neurodegenerative, neurodevelopmental, and psychiatric disorders
- Helps build, train, and deploy different types of deep architectures for diagnosis
- Explores data pre-processing techniques involved in diagnosis
- Includes real-time case studies and examples
This book is aimed at graduate students and researchers in biomedical imaging and machine learning.
This book offers an accessible guide to ubiquitous computing, with an emphasis on pervasive networking. It addresses various technical obstacles, such as connectivity, levels of service, performance, reliability and fairness. The focus is on describing currently available off-the-shelf technologies, novel algorithms and techniques in areas such as: underwater sensor networks, ant colony-based routing, heterogeneous networks, agent-based distributed networks, cognitive radio networks, real-time WSN applications, machine translation, intelligent computing and ontology-based bit masking. By introducing the core topics and exploring assistive pervasive systems that draw on pervasive networking, the book provides readers with a robust foundation of knowledge on this growing field of research. Written in a straightforward style, the book is also accessible to a broad audience of researchers and designers who are interested in exploring pervasive computing further.
This book discusses applications of blockchain in the healthcare sector. The security of confidential and sensitive data is of utmost importance in the healthcare industry, and the introduction of blockchain methods in an effective manner will bring secure transactions to a peer-to-peer network. The book also covers gaps in the currently available literature on use cases of Distributed Ledger Technology (DLT) in healthcare. The information and applications discussed in the book are immensely helpful for researchers, database professionals, and practitioners. The book also discusses protocols, standards, and government regulations, which are very useful for policymakers.
The text covers recent advances in artificial intelligence, smart computing, and their applications in augmenting medical and health care systems. It will serve as an ideal reference text for graduate students and academic researchers in diverse engineering fields including electrical, electronics and communication, computer, and biomedical. The book:
- Presents the architecture, characteristics, and applications of artificial intelligence and smart computing in health care systems
- Highlights privacy issues faced in health care and health informatics using artificial intelligence and smart computing technologies
- Discusses nature-inspired computing algorithms, such as genetic algorithms, particle swarm optimization algorithms, and common scrambling algorithms, for studying brain-computer interfaces (see the sketch after this list)
- Covers graph neural network applications in the medical domain
- Provides insights into state-of-the-art artificial intelligence and smart computing enabling and emerging technologies
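One of the nature-inspired methods named above, particle swarm optimization, is easy to sketch. The following is a generic, hedged illustration (standard PSO with conventional parameter defaults; nothing here is taken from the book):

```python
# Minimal particle swarm optimization (PSO) sketch: minimize f over R^d.
# Illustrative only; parameter values are conventional defaults.
import random

def pso(f, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

if __name__ == "__main__":
    # Minimize the sphere function; the optimum is 0 at the origin.
    best, val = pso(lambda x: sum(t * t for t in x), dim=3, bounds=(-5, 5))
    print(best, val)
```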
This book:
- Highlights the contributions of different optimization techniques and decision analytics (predictive, prescriptive, and descriptive) to multi-criteria decision making
- Helps develop intelligent machines that provide solutions to real-world problems which are not modelled, or are too difficult to model mathematically, in hospital management systems
- Discusses machine learning-based analytics, such as GAN networks, autoencoders, computational imaging, and quantum computing, rigorously applied to smart cloud computing
- Explores evolutionary algorithms that demonstrate their ability as robust approaches to cope with the fundamental steps of the image processing, image analysis, and computer vision pipeline (e.g., restoration, segmentation, registration, classification, reconstruction, or tracking)
- Creates a bridge between Industrial Engineering concepts and Computational Intelligence for addressing complex and convoluted hospital management problems
This book contains extended and revised versions of the best papers presented at the 28th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2020, held in Salt Lake City, UT, USA, in October 2020 (the conference was held virtually). The 16 full papers included in this volume were carefully reviewed and selected from the 38 papers (out of 74 submissions) presented at the conference. The papers discuss the latest academic and industrial results and developments as well as future trends in the field of System-on-Chip (SoC) design, considering the challenges of nano-scale, state-of-the-art and emerging manufacturing technologies. In particular they address cutting-edge research fields like low-power design of RF, analog and mixed-signal circuits, EDA tools for the synthesis and verification of heterogeneous SoCs, accelerators for cryptography and deep learning, on-chip interconnection systems, reliability and testing, and integration of 3D-ICs.
With the advent of approximation algorithms for NP-hard combinatorial optimization problems, several techniques from exact optimization such as the primal-dual method have proven their staying power and versatility. This book describes a simple and powerful method that is iterative in essence, and similarly useful in a variety of settings for exact and approximate optimization. The authors highlight the commonality and uses of this method to prove a variety of classical polyhedral results on matchings, trees, matroids, and flows. The presentation style is elementary enough to be accessible to anyone with exposure to basic linear algebra and graph theory, making the book suitable for introductory courses in combinatorial optimization at the upper undergraduate and beginning graduate levels. Discussions of advanced applications illustrate the method's potential for future research in approximation algorithms.
This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements, and showing the pros and cons of the proposed solutions. It covers architectures exploiting parallelism by extending a single-processor environment (such as VLIW, SIMD, and TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability.
This comprehensive compendium provides a rigorous framework to tackle the daunting challenges of designing correct and efficient algorithms. It gives a uniform approach to the design, analysis, optimization, and verification of algorithms. The volume also provides essential tools to understand algorithms and their associated data structures. This useful reference text describes a way of thinking that eases the task of proving algorithm correctness. Working through a proof of correctness reveals an algorithm's subtleties in a way that a typical description does not. Algorithm analysis is presented using careful definitions that make the analyses mathematically rigorous.
Revealing the flaws in human decision making, this book explores how AI can be used to optimise decisions for improved business outcomes and efficiency, as well as looking ahead to the significant contributions Decision Intelligence (DI) can make to society and the ethical challenges it may raise. Offering an impressive framework of DI, from the theories and concepts used to design autonomous intelligent agents to the technologies that power DI systems and the ways in which companies use decision-making building blocks to build DI solutions that enable businesses to democratise AI, this book provides a systematic approach to AI and human involvement. Replete with case studies on DI application, as well as wider discussions on the social implications of the technology, this book appeals to both students of AI and data solutions and businesses considering DI adoption.
Scan 2000, the GAMM-IMACS International Symposium on Scientific Computing, Computer Arithmetic, and Validated Numerics, and Interval 2000, the International Conference on Interval Methods in Science and Engineering, were jointly held in Karlsruhe, September 19-22, 2000. The joint conference continued the series of 7 previous Scan symposia under the joint sponsorship of GAMM and IMACS. These conferences have traditionally covered the numerical and algorithmic aspects of scientific computing, with a strong emphasis on validation and verification of computed results as well as on arithmetic, programming, and algorithmic tools for this purpose. The conference further continued the series of 4 former Interval conferences focusing on interval methods and their application in science and engineering. The objectives are to propagate current applications and research as well as to promote a greater understanding and increased awareness of the subject matters. The symposium was held in Karlsruhe, the European cradle of interval arithmetic and self-validating numerics, and attracted 193 researchers from 33 countries; 12 invited and 153 contributed talks were given. But not only was the quantity overwhelming: we were also deeply impressed by the emerging maturity of our discipline. There were many talks discussing a wide variety of serious applications stretching all parts of mathematical modelling. New efficient, publicly available or even commercial tools were proposed or presented, and the foundations of the theory of intervals and reliable computations were considerably strengthened.
Introduction to Data Compression, Fifth Edition, builds on the success of what is widely considered the best introduction and reference text on the art and science of data compression. Data compression techniques and technology are ever-evolving with new applications in image, speech, text, audio and video. This new edition includes all the latest developments in the field. Khalid Sayood provides an extensive introduction to the theory underlying today's compression techniques, with detailed instruction for their applications, using several examples to explain the concepts. Encompassing the entire field of data compression, the book includes lossless and lossy compression, Huffman coding, arithmetic coding, dictionary techniques, context-based compression, and scalar and vector quantization. The book provides a comprehensive working knowledge of data compression, giving the reader the tools to develop a complete and concise compression package.
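As a small taste of the lossless techniques listed above, here is a hedged sketch of Huffman coding in Python (an illustration of the classical algorithm, not code from the book; the names are invented):

```python
# Minimal Huffman coding sketch: build a prefix-free code in which
# frequent symbols get shorter codewords. Illustrative only.
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict[str, str]:
    freq = Counter(text)
    # Heap entries: (frequency, tie-breaker, {symbol: partial codeword}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate single-symbol input
        (_, _, codes), = heap
        return {s: "0" for s in codes}
    tick = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)     # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tick, merged))
        tick += 1
    return heap[0][2]

if __name__ == "__main__":
    codes = huffman_codes("abracadabra")
    encoded = "".join(codes[ch] for ch in "abracadabra")
    print(codes)                             # codeword table (ties may vary)
    print(len(encoded), "bits vs", 8 * len("abracadabra"), "bits uncompressed")
```

Because the two least-frequent subtrees are merged repeatedly, frequent symbols end up near the root of the code tree and receive short codewords, which is what produces the compression.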
Digital Intermediation offers a new framework for understanding content creation and distribution across automated media platforms - a new mediatisation process. The book draws on empirical and theoretical research to carefully identify and describe a number of unseen digital infrastructures that contribute to a predictive media production process through technologies, institutions and automation. Field data is drawn from several international sites, including Los Angeles, San Francisco, Portland, London, Amsterdam, Munich, Berlin, Hamburg, Sydney and Cartagena. By highlighting an increasingly automated content production and distribution process, the book responds to a number of regulatory debates on the societal impact of social media platforms. It highlights emerging areas of key importance that shape the production and distribution of social media content, including micro-platformization and digital-first personalities. The book explains how technologies, institutions and automation are used within agencies to increase exposure for the talent they manage, while providing inside access to the processes and requirements of producers who create content for platform algorithms. Finally, it outlines user agency as a strategy for those who seek diversity in the information they access on automated social media content distribution platforms. The findings in this book provide key recommendations for policymakers working within digital media platforms, and will be invaluable reading for students and academics interested in automated media environments.
Data-driven methods have long been used in Automatic Speech Recognition (ASR) and Text-To-Speech (TTS) synthesis and have more recently been introduced for dialogue management, spoken language understanding, and natural language generation. Machine learning is now present "end-to-end" in Spoken Dialogue Systems (SDS). However, these techniques require data collection and annotation campaigns, which can be time-consuming and expensive, as well as dataset expansion by simulation. In this book, we provide an overview of the current state of the field and of recent advances, with a specific focus on adaptivity.
You may like...
- Comprehensive Metaheuristics… by S. Ali Mirjalili, Amir Hossein Gandomi (Paperback) - R3,956, Discovery Miles 39 560
- Computational Methods and Algorithms for… by Kwok Tai Chui, Miltiadis D Lytras (Hardcover) - R6,044, Discovery Miles 60 440