Discusses concepts such as Basic Programming Principles, OOP Principles, Database Programming, GUI Programming, Network Programming, Data Analytics and Visualization, Statistical Analysis, Virtual Reality, Web Development, Machine Learning, and Deep Learning. Provides the code and the output for all the concepts discussed. Includes a case study at the end of each chapter.
The book describes state-of-the-art advances in simulators and emulators for quantum computing. It introduces the main concepts of quantum computing, defining qubits, explaining the parallelism behind quantum computation, describing the measurement of quantum states, and explaining entanglement, state collapse, and cloning. The book reviews the concept of quantum unitary, binary, and ternary quantum operators, as well as the computation implied by each operator. It details the architecture of the quantum processor, which is validated via simulated execution of selected quantum instructions.
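The superposition and measurement concepts summarized above can be illustrated with a minimal sketch (not taken from the book, and independent of any simulator it describes): a qubit as a pair of complex amplitudes, a Hadamard gate, and the Born rule for measurement probabilities.

```python
import math

# A qubit as two complex amplitudes (a|0> + b|1>); start in the state |0>
a, b = complex(1), complex(0)

# Apply a Hadamard gate: H maps |0> to the equal superposition (|0> + |1>)/sqrt(2)
s = 1 / math.sqrt(2)
a, b = s * (a + b), s * (a - b)

# Born rule: the probability of each measurement outcome is the squared amplitude
p0, p1 = abs(a) ** 2, abs(b) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

After the Hadamard gate, both outcomes are equally likely, which is the parallelism the blurb alludes to: the state carries both basis values at once until measurement collapses it.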
This book gathers outstanding research papers presented at the 5th International Joint Conference on Advances in Computational Intelligence (IJCACI 2021), held online during October 23-24, 2021. IJCACI 2021 was jointly organized by Jahangirnagar University (JU), Bangladesh, and South Asian University (SAU), India. The book presents novel contributions in areas of computational intelligence and serves as reference material for advanced research. The topics covered are collective intelligence, soft computing, optimization, cloud computing, machine learning, intelligent software, robotics, data science, data security, big data analytics, and signal and natural language processing.
This book presents a comprehensive study of the tools and techniques available for performing network forensics. It also reviews various aspects of network forensics, together with related technologies and their limitations. This helps security practitioners and researchers better understand the problem, the current solution space, and the future research scope for detecting and investigating network intrusions efficiently. Forensic computing is rapidly gaining importance, since the amount of crime involving digital systems is steadily increasing. Furthermore, the area is still underdeveloped and poses many technical and legal challenges. The rapid development of the Internet over the past decade has facilitated an increase in online attacks. Many factors encourage attackers to act without fear: the speed with which an attack can be carried out, the anonymity the medium provides, the fact that digital information can be stolen without actually removing it, the increased availability of potential victims, and the global impact of the attacks. Forensic analysis is performed at two levels: computer forensics and network forensics. Computer forensics deals with the collection and analysis of data from computer systems, networks, communication streams, and storage media in a manner admissible in a court of law. Network forensics deals with the capture, recording, and analysis of network events in order to discover evidential information about the source of security attacks. Network forensics is not another term for network security; it is an extended phase of network security, in which the data for forensic analysis are collected from security products such as firewalls and intrusion detection systems, and the results of this data analysis are used to investigate the attacks.
Network forensics generally refers to the collection and analysis of network data such as network traffic, firewall logs, and IDS logs. Technically, it is a branch of the already-existing and expanding field of digital forensics. Network forensics has been defined as "the use of scientifically proven techniques to collect, fuse, identify, examine, correlate, analyze, and document digital evidence from multiple, actively processing and transmitting digital sources for the purpose of uncovering facts related to the planned intent, or measured success, of unauthorized activities meant to disrupt, corrupt, and/or compromise system components, as well as providing information to assist in response to or recovery from these activities." Network forensics plays a significant role in the security of today's organizations. On the one hand, it helps uncover the details of external attacks, ensuring that similar future attacks are thwarted. Additionally, network forensics is essential for investigating insider abuse, the second costliest type of attack within organizations. Finally, law enforcement requires network forensics for crimes in which a computer or digital system is either the target of a crime or a tool used to carry one out. Network security protects a system against attack, while network forensics focuses on recording evidence of the attack. Network security products are generalized and look for possibly harmful behavior; this monitoring is a continuous process, performed around the clock. Network forensics, by contrast, involves post-mortem investigation of an attack and is initiated after crime notification. Many tools assist in capturing data transferred over networks so that an attack, or the malicious intent behind an intrusion, can be investigated, and various network forensic frameworks have been proposed in the literature.
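The log analysis described above can be sketched in a few lines. The log format, addresses, and threshold below are invented for illustration; real firewall and IDS formats vary by product.

```python
from collections import Counter

# Toy firewall log lines: timestamp, action, source IP, destination port
log = [
    "2021-10-23T10:00:01 DENY 203.0.113.7 22",
    "2021-10-23T10:00:02 DENY 203.0.113.7 23",
    "2021-10-23T10:00:03 DENY 203.0.113.7 80",
    "2021-10-23T10:00:04 ALLOW 198.51.100.2 443",
]

# Count denied connection attempts per source address
denied = Counter(line.split()[2] for line in log if " DENY " in line)

# Flag sources with repeated denials as candidates for forensic investigation
suspects = [ip for ip, n in denied.items() if n >= 3]
print(suspects)  # ['203.0.113.7']
```

This is the post-mortem pattern the blurb contrasts with continuous security monitoring: the evidence (the log) already exists, and the analysis reconstructs what happened after the fact.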
Highlights the importance and applications of Swarm Intelligence and Machine Learning in the healthcare industry. Elaborates on Swarm Intelligence and Machine Learning for cancer detection. Focuses on applying Swarm Intelligence and Machine Learning for heart disease detection and diagnosis. Explores the concepts of machine learning and swarm intelligence techniques, along with recent research developments in the healthcare sector. Investigates how healthcare companies can leverage the tapestry of big data to discover new business value. Provides a strong foundation for diabetic retinopathy detection using swarm and evolutionary algorithms.
The text provides readers with a comprehensive study of information security and management systems, audit planning and preparation, audit techniques and evidence collection, the international information security standard ISO 27001, and asset management. It further discusses important topics such as security mechanisms, security standards, audit principles, audit competence and evaluation methods, and the principles of asset management. It will serve as an ideal reference text for senior undergraduate and graduate students and researchers in fields including electrical engineering, electronics and communications engineering, computer engineering, and information technology. The book explores information security concepts and applications from an organizational perspective and explains the process of audit planning and preparation. It further demonstrates audit techniques and evidence collection for writing important documentation following the ISO 27001 standard. The book: Elaborates on the application of confidentiality, integrity, and availability (CIA) in audit planning and preparation. Covers topics such as managing business assets, agreements on how to deal with business assets, and media handling. Demonstrates audit techniques and evidence collection for writing important documentation following the ISO 27001 standard. Explains how an organization's assets are managed through asset management and access control policies. Presents seven case studies.
Reviews the literature of the Moth-Flame Optimization algorithm; Provides an in-depth analysis of the equations, mathematical models, and mechanisms of the Moth-Flame Optimization algorithm; Proposes different variants of the Moth-Flame Optimization algorithm to solve binary, multi-objective, noisy, dynamic, and combinatorial optimization problems; Demonstrates how to design, develop, and test different hybrids of the Moth-Flame Optimization algorithm; Introduces several application areas of the Moth-Flame Optimization algorithm, focusing on sustainability.
This book addresses the tiny machine learning (TinyML) software and hardware synergy for edge intelligence applications. It presents on-device learning techniques covering model-level neural network design, algorithm-level training optimization, and hardware-level instruction acceleration. Analyzing the limitations of conventional in-cloud computing reveals that on-device learning is a promising research direction for meeting the requirements of edge intelligence applications. In cutting-edge TinyML research, implementing a high-efficiency learning framework and enabling system-level acceleration are among the most fundamental issues. This book presents a comprehensive discussion of the latest research progress and provides system-level insights on designing TinyML frameworks, including neural network design, training algorithm optimization, and domain-specific hardware acceleration. It identifies the main challenges of deploying TinyML tasks in the real world and guides researchers toward deploying a reliable learning system. This book will be of interest to students and scholars in the field of edge intelligence, especially those with professional Edge AI skills. It will also serve as an excellent guide for researchers implementing high-performance TinyML systems.
This timely text/reference explores the business and technical issues involved in the management of information systems in the era of big data and beyond. Topics and features: presents review questions and discussion topics in each chapter for classroom group work and individual research assignments; discusses the potential use of a variety of big data tools and techniques in a business environment, explaining how these can fit within an information systems strategy; reviews existing theories and practices in information systems, and explores their continued relevance in the era of big data; describes the key technologies involved in information systems in general and big data in particular, placing these technologies in a historical context; suggests areas for further research in this fast-moving domain; equips readers with an understanding of the important aspects of a data scientist's job; provides hands-on experience to further assist in the understanding of the technologies involved.
This compendium contains 10 chapters written by world-renowned researchers with expertise in semantic computing, genome sequence analysis, biomolecular interaction, time-series microarray analysis, and machine learning algorithms. The salient feature of this book is that it highlights eight types of computational techniques to tackle different biomedical applications. These techniques include unsupervised learning algorithms, principal component analysis, fuzzy integral, graph-based ensemble clustering methods, semantic analysis, the interolog approach, molecular simulations, and enzyme kinetics. This unique volume will be a useful reference and an inspirational read for advanced undergraduate and graduate students, computer scientists, computational biologists, and bioinformatics and biomedical professionals.
This book is designed as a reference and presents a systematic approach to analyzing evolutionary and nature-inspired population-based search algorithms. Beginning with an introduction to optimization methods and algorithms and to various enzymes, the book moves on to provide a unified framework of process optimization for enzymes with various algorithms. It presents current research on applications of machine learning and discusses optimization techniques for solving real-life problems. It compiles the different machine learning models used to optimize process parameters for the production of industrially important enzymes. The production and optimization of various enzymes produced by different microorganisms are elaborated in the book. It discusses optimization methods that help minimize the error in developing patterns and classifications, which further helps improve prediction and decision-making. It covers the best-performing methods and approaches for optimizing sustainable enzyme production with AI integration in a real-time environment. Featuring valuable insights, the book helps readers explore new avenues leading toward multidisciplinary research discussions. The book is aimed primarily at advanced undergraduates and graduates studying machine learning, data science, and industrial biotechnology. Researchers and professionals will also find this book useful.
This book offers an accessible guide to ubiquitous computing, with an emphasis on pervasive networking. It addresses various technical obstacles, such as connectivity, levels of service, performance, reliability and fairness. The focus is on describing currently available off-the-shelf technologies, novel algorithms and techniques in areas such as: underwater sensor networks, ant colony based routing, heterogeneous networks, agent based distributed networks, cognitive radio networks, real-time WSN applications, machine translation, intelligent computing and ontology based bit masking. By introducing the core topics and exploring assistive pervasive systems that draw on pervasive networking, the book provides readers with a robust foundation of knowledge on this growing field of research. Written in a straightforward style, the book is also accessible to a broad audience of researchers and designers who are interested in exploring pervasive computing further.
Introduces the GUHA method of mechanizing hypothesis formation as a data mining tool. Presents examples of data mining with enhanced association rules, histograms, contingency tables and action rules. Provides examples of data mining for exception rules and examples of subgroups discovery. Outlines possibilities of GUHA in business intelligence and big data. Overviews related theoretical results and challenges related to mechanizing hypothesis formation.
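The enhanced association rules mentioned above build on the basic notions of support and confidence, which a short sketch with invented toy transactions can make concrete (this example is ours, not from the book, and uses classical association rules rather than GUHA's generalized quantifiers).

```python
# Toy transactions; support and confidence are the core quantities that
# association rules (and GUHA's enhanced rules) are built from
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Rule {bread} -> {milk}: confidence = support(antecedent | consequent) / support(antecedent)
confidence = support({"bread", "milk"}) / support({"bread"})
print(round(confidence, 2))  # 0.67
```

GUHA's contribution, as the blurb indicates, is to mechanize the search for such hypotheses and to generalize the quantifiers beyond plain confidence.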
This comprehensive reference text discusses nature-inspired algorithms and their applications. It presents a methodology for writing new algorithms, with MATLAB programs and instructions for better understanding of the concepts. It covers well-known algorithms, including evolutionary algorithms, the genetic algorithm, particle swarm optimization, and differential evolution, as well as recent approaches such as grey wolf optimization, and also discusses deterministic algorithms, randomized algorithms, and the big bang-big crunch (BB-BC) algorithm. A separate chapter discusses test case generation using techniques such as particle swarm optimization, the genetic algorithm, and the differential evolution algorithm. The book: Discusses in detail various nature-inspired algorithms and their applications. Provides MATLAB programs for the corresponding algorithms. Presents a methodology for writing new algorithms. Examines well-known algorithms such as the genetic algorithm, particle swarm optimization, and differential evolution, as well as recent approaches such as grey wolf optimization. Provides conceptual links between the algorithms and theoretical concepts. Discussing nature-inspired algorithms and their applications in a single volume, the text will be useful as a reference for graduate students in the fields of electrical engineering, electronics engineering, computer science, and engineering.
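As a rough illustration of the particle swarm optimization covered above (the book's own programs are in MATLAB; this Python sketch and its parameter choices are ours, not the book's):

```python
import random

# Minimal particle swarm optimization (PSO) minimizing f(x) = x^2 in one
# dimension; swarm size, inertia w, and pulls c1/c2 are illustrative choices
random.seed(0)

def f(x):
    return x * x

n = 20
pos = [random.uniform(-10, 10) for _ in range(n)]
vel = [0.0] * n
pbest = pos[:]                # each particle's best position so far
gbest = min(pbest, key=f)     # best position found by the whole swarm

w, c1, c2 = 0.7, 1.5, 1.5
for _ in range(100):
    for i in range(n):
        r1, r2 = random.random(), random.random()
        # Velocity blends inertia, pull toward the particle's own best,
        # and pull toward the swarm's best
        vel[i] = (w * vel[i]
                  + c1 * r1 * (pbest[i] - pos[i])
                  + c2 * r2 * (gbest - pos[i]))
        pos[i] += vel[i]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i]
    gbest = min(pbest, key=f)

print(gbest)  # settles near the true minimum at x = 0
```

The same loop structure carries over to the other population-based methods the book covers; what changes is the update rule that moves candidates through the search space.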
This book highlights various evolutionary algorithm techniques for various medical conditions and introduces medical applications of evolutionary computation for real-time diagnosis. Evolutionary Intelligence for Healthcare Applications presents how evolutionary intelligence can be used in smart healthcare systems involving big data analytics, mobile health, personalized medicine, and clinical trial data management. It focuses on emerging concepts and approaches and highlights various evolutionary algorithm techniques used for early disease diagnosis, prediction, and prognosis for medical conditions. The book also presents ethical issues and challenges that can occur within the healthcare system. Researchers, healthcare professionals, data scientists, systems engineers, students, programmers, clinicians, and policymakers will find this book of interest.
A successor to the first and second editions, this updated and revised book is a leading companion guide for students and engineers alike, specifically software engineers who design algorithms. While succinct, this edition is mathematically rigorous, covering the foundations for both computer scientists and mathematicians interested in the algorithmic foundations of computer science. Besides expositions on traditional algorithms such as greedy, dynamic programming, and divide and conquer, the book explores two classes of algorithms that are often overlooked in introductory textbooks: randomized and online algorithms, with emphasis placed on the algorithm itself. The book also covers algorithms in linear algebra and the foundations of computation. The coverage of randomized and online algorithms is timely: the former have become ubiquitous due to the emergence of cryptography, while the latter are essential in numerous fields as diverse as operating systems and stock market prediction. While relatively short, to ensure the essentiality of content, the book places a strong focus on self-containment, introducing the ideas of pre/post-conditions and loop invariants to readers of all backgrounds, as well as all the necessary mathematical foundations. The programming exercises in Python are available on the companion website (www.msoltys.com/book).
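The pre/post-conditions and loop invariants mentioned above can be illustrated with a small example (ours, not taken from the book): an integer square root computed by a loop whose invariant justifies the post-condition.

```python
def isqrt(n: int) -> int:
    """Integer square root by linear search, annotated with its contract.

    Pre-condition:  n >= 0
    Post-condition: r * r <= n < (r + 1) * (r + 1)
    """
    assert n >= 0                            # pre-condition
    r = 0
    while (r + 1) * (r + 1) <= n:
        # Loop invariant: r * r <= n holds before and after every iteration
        r += 1
    assert r * r <= n < (r + 1) * (r + 1)    # post-condition
    return r

print(isqrt(10))  # 3
```

When the loop exits, the invariant (r * r <= n) combines with the negated guard ((r + 1)^2 > n) to give exactly the post-condition, which is the style of reasoning the book teaches.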
With the rapid penetration of technology into varied application domains, existing cities are becoming more seamlessly connected. Cities become smart by embedding ICT in the classical city infrastructure for its management. According to a McKinsey report, about 68% of the world's population will migrate toward urban settlements in the near future. This migration is largely driven by the improved Quality of Life (QoL) and livelihood in urban settlements. In light of urbanization, climate change, democratic flaws, and rising urban welfare expenditures, smart cities have emerged as an important approach to society's future development. Smart cities achieve enhanced QoL by giving people smart information regarding healthcare, transportation, smart parking, smart traffic structures, smart homes, smart agronomy, community security, etc. Typically, data in smart cities are sensed by sensor devices and provided to end users for further use. These sensitive data are transferred over the internet, creating greater opportunities for adversaries to breach them. With privacy and security as the areas of prime focus, this book covers the most prominent security vulnerabilities associated with varied application areas such as healthcare, manufacturing, transportation, education, and agriculture. Furthermore, the massive amount of data generated by ubiquitous sensors placed across smart cities needs to be handled in an effective, efficient, secure, and privacy-preserving manner. Since a typical smart city ecosystem is data-driven, it is imperative to manage this data optimally. Enabling technologies such as the Internet of Things (IoT), Natural Language Processing (NLP), blockchain technology, deep learning, machine learning, computer vision, big data analytics, next-generation networks, and Software Defined Networks (SDN) provide exemplary benefits if they are integrated into the classical city ecosystem effectively.
The application of Artificial Intelligence (AI) is expanding across many domains in the smart city, such as infrastructure, transportation, environmental protection, power and energy, privacy and security, governance, data management, healthcare, and more. AI has the potential to improve human health, prosperity, and happiness by reducing our reliance on manual labor and accelerating our progress in the sciences and technologies. NLP is an extensive domain of AI and is used in collaboration with machine learning and deep learning algorithms for clinical informatics and data processing. In modern smart cities, blockchain provides a complete framework that controls the city operations and ensures that they are managed as effectively as possible. Besides having an impact on our daily lives, it also facilitates many areas of city management.
Introduction to Quantum Natural Language Processing. Overview of Leadership and AI. The Age of Quantum Superiority. Challenges To Today's Leadership. AI-induced Strategic Implementation and Organizational Performance.
Classifies the optimization problems of the ports into five scheduling decisions. For each decision, it supplies an overview, formulates each of the decisions as constraint satisfaction and optimization problems, and then covers possible solutions, implementation, and performance. Part One explores the various optimization problems in modern container terminals, while Part Two details advanced algorithms for the minimum cost flow (MCF) problem and for the scheduling problem of AGVs in ports. A complete package that can help readers address the scheduling problems of AGVs in ports.
Due to the efficacy and optimization potential of genetic and evolutionary algorithms, they are used in learning and modeling, especially with the advent of big-data-related problems. This book presents the algorithms and strategies specifically associated with pertinent issues in the materials science domain. It discusses procedures for evolutionary multi-objective optimization of objective functions created through these procedures and introduces available codes. Recent applications ranging from primary metal production to materials design are covered. It also describes hybrid modeling strategies and other common modeling and simulation strategies, such as molecular dynamics and cellular automata. Features: Focuses on data-driven evolutionary modeling and optimization, including evolutionary deep learning. Includes details on both the algorithms and their applications in materials science and technology. Discusses hybrid data-driven modeling that couples evolutionary algorithms with generic computing strategies. Thoroughly discusses applications of the pertinent strategies in metallurgy and materials. Provides an overview of the major single- and multi-objective evolutionary algorithms. This book is aimed at researchers, professionals, and graduate students in materials science, data-driven engineering, metallurgical engineering, computational materials science, structural materials, and functional materials.
Motivated by a variational model concerning the depth of the objects in a picture and the problem of hidden and illusory contours, this book investigates one of the central problems of computer vision: the topological and algorithmic reconstruction of a smooth three dimensional scene starting from the visible part of an apparent contour. The authors focus their attention on the manipulation of apparent contours using a finite set of elementary moves, which correspond to diffeomorphic deformations of three dimensional scenes. A large part of the book is devoted to the algorithmic part, with implementations, experiments, and computed examples. The book is intended also as a user's guide to the software code appcontour, written for the manipulation of apparent contours and their invariants. This book is addressed to theoretical and applied scientists working in the field of mathematical models of image segmentation.
The Fourier transform is one of the most fundamental tools for computing the frequency representation of signals. It plays a central role in signal processing, communications, audio and video compression, medical imaging, genomics, astronomy, and many other areas. Because of its widespread use, fast algorithms for computing the Fourier transform can benefit a large number of applications. The fastest such algorithm is the Fast Fourier Transform (FFT), which runs in near-linear time, making it an indispensable tool for many applications. Today, however, the runtime of the FFT is no longer fast enough, especially for big data problems where each dataset can be a few terabytes. Hence, faster algorithms that run in sublinear time, i.e., that do not even sample all the data points, have become necessary. This book addresses this problem by developing Sparse Fourier Transform algorithms and building practical systems that use these algorithms to solve key problems in six different applications: wireless networks, mobile systems, computer graphics, medical imaging, biochemistry, and digital circuits. This is a revised version of the thesis that won the 2016 ACM Doctoral Dissertation Award.
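The sparsity that makes sublinear algorithms possible can be seen in a tiny example (ours, not from the book): a single-tone signal has only two large Fourier coefficients, which a sparse FT algorithm aims to locate without reading all the samples.

```python
import cmath
import math

# A length-64 signal that is sparse in frequency: a single cosine at k = 5
n = 64
x = [math.cos(2 * math.pi * 5 * t / n) for t in range(n)]

# Naive O(n^2) DFT; the FFT computes the same coefficients in O(n log n)
X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
     for k in range(n)]

# Only two coefficients are large (k = 5 and its mirror k = n - 5 = 59);
# a sparse FT algorithm exploits this to run in sublinear time
large = [k for k, c in enumerate(X) if abs(c) > n / 4]
print(large)  # [5, 59]
```

When the spectrum is dominated by a handful of such peaks, recovering just those peaks can be far cheaper than computing all n coefficients, which is the premise of the Sparse Fourier Transform.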
This book discusses an important area of numerical optimization: the interior-point method. The topic has been popular since the 1980s, when it gradually became clear that simplex algorithms could not be shown to converge in polynomial time, while many interior-point algorithms could be proved to do so. However, for a long time there was a noticeable gap between the theoretical polynomial bounds of interior-point algorithms and their practical efficiency. Strategies that were important to computational efficiency became barriers in the proof of good polynomial bounds: the more such strategies an algorithm used, the worse its polynomial bound became. To further exacerbate the problem, Mehrotra's predictor-corrector (MPC) algorithm, until recently the most popular and efficient interior-point algorithm, uses all the good strategies yet lacks a convergence proof; MPC therefore does not have polynomiality, the same critical issue the simplex method has. This book discusses recent developments that resolve the dilemma. It has three major parts. The first, comprising Chapters 1 through 4, presents some of the most important algorithms from the development of the interior-point method around the 1990s, most of them widely known; its main purpose is to explain the dilemma described above by analyzing these algorithms' polynomial bounds and summarizing the computational experience associated with them. The second part, comprising Chapters 5 through 8, describes how to resolve the dilemma step by step using arc-search techniques; at the end of this part, a very efficient algorithm with the lowest polynomial bound is presented. The last part, comprising Chapters 9 through 12, extends arc-search techniques to more general problems, such as convex quadratic programming, the linear complementarity problem, and semidefinite programming.
SCAN 2000, the GAMM-IMACS International Symposium on Scientific Computing, Computer Arithmetic, and Validated Numerics, and Interval 2000, the International Conference on Interval Methods in Science and Engineering, were jointly held in Karlsruhe, September 19-22, 2000. The joint conference continued the series of seven previous SCAN symposia under the joint sponsorship of GAMM and IMACS. These conferences have traditionally covered the numerical and algorithmic aspects of scientific computing, with a strong emphasis on the validation and verification of computed results, as well as on arithmetic, programming, and algorithmic tools for this purpose. The conference also continued the series of four former Interval conferences focusing on interval methods and their application in science and engineering. The objectives were to propagate current applications and research and to promote greater understanding and increased awareness of the subject matter. The symposium was held in Karlsruhe, the European cradle of interval arithmetic and self-validating numerics, and attracted 193 researchers from 33 countries; 12 invited and 153 contributed talks were given. Not only was the quantity overwhelming, but we were also deeply impressed by the emerging maturity of our discipline. There were many talks discussing a wide variety of serious applications stretching across all parts of mathematical modelling. New, efficient, publicly available or even commercial tools were proposed or presented, and the foundations of the theory of intervals and reliable computations were considerably strengthened.
Data driven methods have long been used in Automatic Speech Recognition (ASR) and Text-To-Speech (TTS) synthesis and have more recently been introduced for dialogue management, spoken language understanding, and Natural Language Generation. Machine learning is now present "end-to-end" in Spoken Dialogue Systems (SDS). However, these techniques require data collection and annotation campaigns, which can be time-consuming and expensive, as well as dataset expansion by simulation. In this book, we provide an overview of the current state of the field and of recent advances, with a specific focus on adaptivity.
You may like...
Time in Quantum Mechanics, by Gonzalo Muga, R. Sala Mayato, … (Hardcover)
Chaos in Classical and Quantum Mechanics, by Martin C. Gutzwiller (Hardcover, R4,679)