This book explains the most prominent and some promising new general techniques that combine metaheuristics with other optimization methods. A first introductory chapter reviews the basic principles of local search and prominent metaheuristics, as well as tree search, dynamic programming, mixed integer linear programming, and constraint programming for combinatorial optimization. The chapters that follow present five generally applicable hybridization strategies, with exemplary case studies on selected problems: incomplete solution representations and decoders; problem instance reduction; large neighborhood search; parallel non-independent construction of solutions within metaheuristics; and hybridization based on complete solution archives. The authors are among the leading researchers in the hybridization of metaheuristics with other techniques for optimization, and their work reflects the broad shift to problem-oriented rather than algorithm-oriented approaches, enabling faster and more effective implementation in real-life applications. This hybridization is not restricted to different variants of metaheuristics but includes, for example, the combination of mathematical programming, dynamic programming, or constraint programming with metaheuristics, reflecting cross-fertilization in fields such as optimization, algorithmics, mathematical modeling, operations research, statistics, and simulation. The book is a valuable introduction and reference for researchers and graduate students in these domains.
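To make one of those strategies concrete, here is a minimal large neighborhood search (LNS) sketch on a toy 0/1 knapsack instance; the destroy and repair operators, the instance data, and the parameters are illustrative assumptions, not material from the book.

import random

# Toy 0/1 knapsack instance (illustrative data, not from the book).
values  = [10, 13, 7, 8, 12, 5, 9]
weights = [ 3,  4, 2, 3,  5, 1, 4]
CAPACITY = 10

def total(sol, arr):
    return sum(a for s, a in zip(sol, arr) if s)

def greedy_repair(sol):
    # Re-insert removed items greedily by value/weight ratio while feasible.
    order = sorted(range(len(sol)), key=lambda i: values[i] / weights[i], reverse=True)
    for i in order:
        if not sol[i] and total(sol, weights) + weights[i] <= CAPACITY:
            sol[i] = 1
    return sol

def lns(iterations=200, destroy_size=3, seed=0):
    rng = random.Random(seed)
    best = greedy_repair([0] * len(values))
    for _ in range(iterations):
        cand = best[:]
        for i in rng.sample(range(len(cand)), destroy_size):
            cand[i] = 0                      # "destroy": drop a few items
        cand = greedy_repair(cand)           # "repair": rebuild greedily
        if total(cand, values) > total(best, values):
            best = cand
    return best, total(best, values)

print(lns())

The destroy step frees a larger part of the solution than a classical local-search move would, which is what lets the (here trivial) repair heuristic escape local optima.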
Conceptual modeling has always been one of the main issues in information systems engineering as it aims to describe the general knowledge of the system at an abstract level that facilitates user understanding and software development. This collection of selected papers provides a comprehensive and extremely readable overview of what conceptual modeling is and perspectives on making it more and more relevant in our society. It covers topics like modeling the human genome, blockchain technology, model-driven software development, data integration, and wiki-like repositories and demonstrates the general applicability of conceptual modeling to various problems in diverse domains. Overall, this book is a source of inspiration for everybody in academia working on the vision of creating a strong, fruitful and creative community of conceptual modelers. With this book the editors and authors want to honor Prof. Antoni Olivé for his enormous and ongoing contributions to the conceptual modeling discipline. It was presented to him on the occasion of his keynote at ER 2017 in Valencia, a conference that he has contributed to and supported for over 20 years. Thank you very much to Antoni for so many years of cooperation and friendship.
The Turing/von Neumann model of computing is dominant today but is by no means the only one. This textbook explores an important subset of alternatives, including those such as quantum and neuromorphic, which receive daily news attention. The models are organized into distinct groups. After a review of the Turing/von Neumann model to set the stage, the author discusses those that have their roots in the Turing/von Neumann model but perform potentially large numbers of computations in parallel; models that do away with the preplanned nature of the classical model and compute from just a statement of the problem; others that are simply mathematically different, such as probabilistic and reversible computation; models based on physical phenomena such as neurons; and finally those that leverage unique physical phenomena directly, such as quantum, optical, and DNA-based computing. Suggested readings provide a jumping-off point for deeper learning. A supplemental website contains chapters that did not make it into the book, as well as exercises, projects, and additional resources that will be useful for more in-depth investigations. The Zen of Exotic Computing is intended for computer science students interested in understanding alternative models of computing. It will also be of interest to researchers and practitioners interested in emerging technology such as quantum computing, machine learning, and AI.
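One of the mathematically different models mentioned above, probabilistic computation, is easy to taste in miniature: the classic Monte Carlo estimate of pi below is our own illustration, not an example from the book.

import random

# Probabilistic computation in miniature: estimate pi by random sampling.
# The answer is only correct in expectation, with accuracy growing with
# the sample count; that trade-off is the hallmark of this model.
def estimate_pi(samples=1_000_000, seed=0):
    rng = random.Random(seed)
    inside = sum(1 for _ in range(samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4 * inside / samples   # quarter-circle area over unit-square area

print(estimate_pi())   # approaches 3.14159... as samples grow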
Enterprise information systems touch every process of an organization, as new functionalities are added every day to existing and upcoming solutions. "Social, Managerial, and Organizational Dimensions of Enterprise Information Systems" discusses the technological developments, main issues, challenges, opportunities, and trends impacting every part of small to medium sized enterprises. A leading resource for academicians, managers, and researchers, this advanced publication provides an integrated and progressive view into the benefits and applications of enterprise information systems.
This book systematically examines and quantifies industrial problems by assessing the complexity and safety of large systems. It includes chapters on system performance management, software reliability assessment, testing, quality management, analysis using soft computing techniques, management analytics, and business analytics, with a clear focus on exploring real-world business issues. Through contributions from researchers working in the area of performance, management, and business analytics, it explores the development of new methods and approaches to improve business by gaining knowledge from bulk data. With system performance analytics, companies can now drive performance and deliver actionable insights at every level and for every role, using key indicators, mobile-enabled scorecards, time series-based analyses with charts, and dashboards. In the current dynamic environment, a viable tool known as multi-criteria decision analysis (MCDA) is increasingly being adopted to deal with complex business decisions. MCDA is an important decision support tool for analyzing goals and providing optimal solutions and alternatives. It comprises several distinct techniques, which are implemented by specialized decision-making packages. This book addresses a number of important MCDA methods, such as DEMATEL, TOPSIS, AHP, MAUT, and Intuitionistic Fuzzy MCDM, which make it possible to derive maximum utility in the area of analytics. As such, it is a valuable resource for researchers and academicians, as well as practitioners and business experts.
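To give a flavor of one of the MCDA methods named above, here is a minimal TOPSIS sketch; the decision matrix, weights, and criterion directions are made-up example data, not taken from the book.

import numpy as np

# Illustrative decision matrix: 4 alternatives x 3 criteria (made-up data).
X = np.array([[250., 16., 12.],
              [200., 16.,  8.],
              [300., 32., 16.],
              [275., 32.,  8.]])
weights = np.array([0.3, 0.4, 0.3])
benefit = np.array([False, True, True])   # criterion 0 is a cost, the rest benefits

# 1. Vector-normalize each column, then apply the weights.
V = weights * X / np.linalg.norm(X, axis=0)

# 2. Ideal and anti-ideal points, respecting each criterion's direction.
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

# 3. Closeness coefficient: distance to anti-ideal over total distance.
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)

print("ranking (best first):", np.argsort(-closeness))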
This book contains all refereed papers accepted at the fourth Asia-Pacific edition and the twelfth edition of the CSD&M conference, which were merged this year and took place in Beijing, People's Republic of China, in 2021. Mastering complex systems requires an integrated understanding of industrial practices as well as sophisticated theoretical techniques and tools. This explains the creation of an annual forum, alternating between Europe and Asia, dedicated to academic researchers and industrial actors working on complex industrial systems architecting, modeling, and engineering. These proceedings cover the most recent trends in the emerging field of complex systems, from both an academic and a professional perspective. A special focus was put this year on "Digital Transformation in Complex Systems Engineering". The CSD&M series of conferences is organized under the guidance of CESAM Community, managed by CESAMES. CESAM Community aims to organize the sharing of good practices in systems architecting and model-based systems engineering (MBSE) and to certify levels of knowledge and proficiency in this field through the CESAM certification. The CESAM systems architecting and MBSE certification is currently the most widely disseminated professional certification in the world in this domain: it has been operationally deployed on more than 1,000 real complex system development projects, and around 10,000 engineers have been trained on the CESAM framework internationally.
Transactions are a concept related to the logical database as seen from the perspective of database application programmers: a transaction is a sequence of database actions that is to be executed as an atomic unit of work. The processing of transactions on databases is a well-established area with many of its foundations having already been laid in the late 1970s and early 1980s. The unique feature of this textbook is that it bridges the gap between the theory of transactions on the logical database and the implementation of the related actions on the underlying physical database. The authors relate the logical database, which is composed of a dynamically changing set of data items with unique keys, and the underlying physical database with a set of fixed-size data and index pages on disk. Their treatment of transaction processing builds on the "do-redo-undo" recovery paradigm, and all methods and algorithms presented are carefully designed to be compatible with this paradigm as well as with write-ahead logging, steal-and-no-force buffering, and fine-grained concurrency control. Chapters 1 to 6 address the basics needed to fully appreciate transaction processing on a centralized database system within the context of this transaction model, covering topics like ACID properties, database integrity, buffering, rollbacks, isolation, and the interplay of logical locks and physical latches. Chapters 7 and 8 present advanced features including deadlock-free algorithms for reading, inserting and deleting tuples, while the remaining chapters cover additional advanced topics extending the preceding foundational chapters, including multi-granular locking, bulk actions, versioning, distributed updates, and write-intensive transactions. This book is primarily intended as a text for advanced undergraduate or graduate courses on database management in general or transaction processing in particular.
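As a toy illustration of the atomicity and rollback just described (our own sketch, not the book's algorithms, which add write-ahead logging, redo records, and locking), a transaction can undo its writes from a log of before-images:

# Minimal undo-based rollback sketch (illustrative only).
class Transaction:
    def __init__(self, db):
        self.db = db          # the "logical database": key -> value
        self.undo_log = []    # before-images, so writes can be undone

    def write(self, key, value):
        self.undo_log.append((key, self.db.get(key)))  # log old value first
        self.db[key] = value

    def rollback(self):
        # Undo in reverse order, restoring each before-image.
        for key, old in reversed(self.undo_log):
            if old is None:
                self.db.pop(key, None)   # key did not exist before
            else:
                self.db[key] = old
        self.undo_log.clear()

db = {"x": 1}
t = Transaction(db)
t.write("x", 2)
t.write("y", 3)
t.rollback()
print(db)   # {'x': 1} -- all-or-nothing: both writes are undone together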
This book introduces a novel framework for accurately modeling the errors in nanoscale CMOS technology and developing a smooth tool flow at high-level design abstractions to estimate and mitigate the effects of errors. The book presents novel techniques for high-level fault simulation and reliability estimation as well as architecture-level and system-level fault tolerant designs. It also presents a survey of state-of-the-art problems and solutions, offering insights into reliability issues in digital design and their cross-layer countermeasures.
The second volume in a series focusing on advances in computational biology. This volume discusses topics such as the statistical analysis of protein sequences, progress in large-scale sequence analysis, and the architecture of loops in proteins.
This book explores the future of cyber technologies and cyber operations, which will influence advances in social media, cyber security, cyber physical systems, ethics, law, media, economics, infrastructure, military operations and other elements of societal interaction in the upcoming decades. It provides a review of future disruptive technologies and innovations in cyber security. It also serves as a resource for wargame planning and provides a strategic vision of the future direction of cyber operations, informing military strategists about the future of cyber warfare. Written by leading experts in the field, chapters explore how future technical innovations will vastly increase the interconnectivity of our physical and social systems and the growing need for resiliency in this vast and dynamic cyber infrastructure. The future of social media, autonomy, stateless finance, quantum information systems, the internet of things, the dark web, space satellite operations, and global network connectivity is explored, along with the transformation of the legal and ethical considerations that surround them. The international challenges of cyber alliances, capabilities, and interoperability are confronted with the growing need for new laws, international oversight, and regulation that informs cybersecurity studies. The authors take a multi-disciplinary scope arranged in a big-picture framework, allowing both deep exploration of important topics and a high-level understanding of the subject. Evolution of Cyber Technologies and Operations to 2035 is an excellent reference for professionals and researchers working in security, government, the military, economics, law, and related fields. Students will also find this book useful as a reference guide or secondary textbook.
Tools of data comparison and analysis are critical in the field of archaeology, and the integration of technological advancements such as geographic information systems, intelligent systems, and virtual reality reconstructions with the teaching of archaeology is crucial to the effective utilization of resources in the field. "E-Learning Methodologies and Computer Applications in Archaeology" presents innovative instructional approaches for archaeological e-learning based on networked technologies, providing researchers, scholars, and professionals a comprehensive global perspective on the resources, development, application, and implications of information communication technology in multimedia-based educational products and services in archaeology.
This book discusses recent developments in semigroup theory and its applications in areas such as operator algebras, operator approximations and category theory. All contributing authors are eminent researchers in their respective fields, from across the world. Their papers, presented at the 2014 International Conference on Semigroups, Algebras and Operator Theory in Cochin, India, focus on recent developments in semigroup theory and operator algebras. They highlight current research activities on the structure theory of semigroups as well as the role of semigroup theoretic approaches to other areas such as rings and algebras. The deliberations and discussions at the conference point to future research directions in these areas. This book presents 16 unpublished, high-quality and peer-reviewed research papers on areas such as structure theory of semigroups, decidability vs. undecidability of word problems, regular von Neumann algebras, operator theory and operator approximations. Interested researchers will find several avenues for exploring the connections between semigroup theory and the theory of operator algebras.
This book presents two practical physical attacks. It shows how attackers can reveal the secret key of symmetric as well as asymmetric cryptographic algorithms based on these attacks, and presents countermeasures on the software and the hardware level that can help to prevent them in the future. Though their theory has been known for several years, neither attack had previously been successfully implemented in practice, so they have generally not been considered a serious threat. In short, their physical attack complexity has been overestimated and the implied security threat has been underestimated. First, the book introduces the photonic side channel, which offers not only temporal resolution but also the highest possible spatial resolution. Due to the high cost of its initial implementation, it has not been taken seriously. The work shows both simple and differential photonic side channel analyses. Then, it presents a fault attack against pairing-based cryptography. Due to the need for at least two independent precise faults in a single pairing computation, it has not been taken seriously either. Based on these two attacks, the book demonstrates that the assessment of physical attack complexity is error-prone, and as such cryptography should not rely on it. Cryptographic technologies have to be protected against all physical attacks, whether they have already been successfully implemented or not. The development of countermeasures does not require the successful execution of an attack but can already be carried out as soon as the principle of a side channel or a fault attack is sufficiently understood.
This volume is the first ever collection devoted to the field of proof-theoretic semantics. Contributions address topics including the systematics of introduction and elimination rules and proofs of normalization, the categorial characterization of deductions, the relation between Heyting's and Gentzen's approaches to meaning, knowability paradoxes, proof-theoretic foundations of set theory, Dummett's justification of logical laws, Kreisel's theory of constructions, paradoxical reasoning, and the defence of model theory. The field of proof-theoretic semantics has existed for almost 50 years, but the term itself was proposed by Schroeder-Heister in the 1980s. Proof-theoretic semantics explains the meaning of linguistic expressions in general and of logical constants in particular in terms of the notion of proof. This volume emerges from presentations at the Second International Conference on Proof-Theoretic Semantics in Tübingen in 2013, where contributing authors were asked to provide a self-contained description and analysis of a significant research question in this area. The contributions are representative of the field and should be of interest to logicians, philosophers, and mathematicians alike.
Python Programming Professional Made Easy, 2nd Edition! Sam Key is back at it again with his upgraded version of Python. Going from beginner to professional? Want to skip the learning curve? Need the jargon removed so you can understand it in your own terms? From various programming languages to statements and basic operators, everything you need to know about functions and flow control! Don't waste any time, jump on board with Python! Start programming right now!
"Google Earth Forensics" is the first book to explain how to use Google Earth in digital forensic investigations. This book teaches you how to leverage Google's free tool to craft compelling location-based evidence for use in investigations and in the courtroom. It shows how to extract location-based data that can be used to display evidence in compelling audiovisual manners that explain and inform the data in contextual, meaningful, and easy-to-understand ways. As mobile computing devices become more and more prevalent and powerful, they are becoming more and more useful in the field of law enforcement investigations and forensics. Of all the widely used mobile applications, none have more potential for helping solve crimes than those with geo-location tools. Written for investigators and forensic practitioners, "Google
Earth Forensics" is written by an investigator and trainer with
more than 13 years of experience in law enforcement who will show
you how to use this valuable tool anywhere at the crime scene, in
the lab, or in the courtroom.
This book provides developers, engineers, researchers and students with detailed knowledge about the High Efficiency Video Coding (HEVC) standard. HEVC is the successor to the widely successful H.264/AVC video compression standard, and it provides around twice as much compression as H.264/AVC for the same level of quality. The applications for HEVC will not only cover the space of the well-known current uses and capabilities of digital video; they will also include the deployment of new services and the delivery of enhanced video quality, such as ultra-high-definition television (UHDTV) and video with higher dynamic range, a wider range of representable color, and greater representation precision than what is typically found today. HEVC is the next major generation of video coding design: a flexible, reliable and robust solution that will support the next decade of video applications and ease the burden of video on world-wide network traffic. This book provides a detailed explanation of the various parts of the standard, insight into how it was developed, and in-depth discussion of algorithms and architectures for its implementation.
With the growing popularity of "big data", the potential value of personal data has attracted more and more attention. Applications built on personal data can create tremendous social and economic benefits. Meanwhile, they bring serious threats to individual privacy. The extensive collection, analysis and transaction of personal data make it difficult for an individual to keep their privacy safe. People now show more concern about privacy than ever before. How to strike a balance between the exploitation of personal information and the protection of individual privacy has become an urgent issue. In this book, the authors use methodologies from economics, especially game theory, to investigate solutions to this balance issue. They investigate the strategies of stakeholders involved in the use of personal data, and try to find the equilibrium. The book proposes a user-role based methodology to investigate the privacy issues in data mining, identifying four different types of users, i.e. four user roles, involved in data mining applications. For each user role, the authors discuss its privacy concerns and the strategies that it can adopt to solve the privacy problems. The book also proposes a simple game model to analyze the interactions among data provider, data collector and data miner. By solving the equilibria of the proposed game, readers can get useful guidance on how to deal with the trade-off between privacy and data utility. Moreover, to elaborate the analysis of the data collector's strategies, the authors propose a contract model and a multi-armed bandit model, respectively. The authors discuss how the owners of data (e.g. an individual or a data miner) deal with the trade-off between privacy and utility in data mining. Specifically, they study users' strategies in collaborative filtering based recommendation systems and distributed classification systems. They build game models to formulate the interactions among data owners, and propose learning algorithms to find the equilibria.
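To give a flavor of the multi-armed bandit model mentioned above, here is a generic epsilon-greedy learner on invented reward data; it illustrates the explore/exploit trade-off in general, not the authors' specific model of the data collector.

import random

def epsilon_greedy(true_means, steps=5000, eps=0.1, seed=0):
    # Generic epsilon-greedy bandit (illustrative reward model: Gaussian arms).
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms        # pulls per arm
    estimates = [0.0] * n_arms   # running mean reward per arm
    for _ in range(steps):
        if rng.random() < eps:   # explore: try a random arm
            arm = rng.randrange(n_arms)
        else:                    # exploit: use the current best estimate
            arm = max(range(n_arms), key=lambda a: estimates[a])
        reward = rng.gauss(true_means[arm], 1.0)
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates, counts

est, cnt = epsilon_greedy([0.2, 0.5, 0.9])
print(est, cnt)   # the 0.9 arm should dominate the pull counts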
This book teaches algebra and geometry. The authors dedicate chapters to the key issues of matrices, linear equations, matrix algorithms, vector spaces, lines, planes, second-order curves, and elliptic curves. The text is supported throughout with problems, and the authors include Python source code in the book. The book is suitable for advanced undergraduate and graduate students in computer science.
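In the same spirit as the book's pairing of mathematics with Python (our own example, not code from the book), a small linear system can be solved by Gaussian elimination:

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting
    (illustrative sketch; assumes A is square and nonsingular)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix [A | b]
    for col in range(n):
        # Pivot: swap in the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

print(solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))   # [0.8, 1.4]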
This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of parallel iterative linear system solvers with emphasis on scalable preconditioners, (b) parallel schemes for obtaining a few of the extreme eigenpairs or those contained in a given interval in the spectrum of a standard or generalized symmetric eigenvalue problem, and (c) parallel methods for computing a few of the extreme singular triplets. Part IV focuses on the development of parallel algorithms for matrix functions and special characteristics such as the matrix pseudospectrum and the determinant. The book also reviews the theoretical and practical background necessary when designing these algorithms and includes an extensive bibliography that will be useful to researchers and students alike. The book brings together many existing algorithms for the fundamental matrix computations that have a proven track record of efficient implementation in terms of data locality and data transfer on state-of-the-art systems, as well as several algorithms that are presented for the first time, focusing on the opportunities for parallelism and algorithm robustness.
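As a minimal taste of the row-block parallelism such algorithms exploit (a toy sketch using Python's multiprocessing, not one of the book's algorithms), a matrix-vector product can be split across workers:

import numpy as np
from multiprocessing import Pool

def block_matvec(args):
    A_block, x = args
    return A_block @ x          # each worker multiplies its own row block

if __name__ == "__main__":
    # Split A into row blocks and multiply them in parallel (toy data).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((1000, 1000))
    x = rng.standard_normal(1000)
    blocks = np.array_split(A, 4, axis=0)
    with Pool(4) as pool:
        parts = pool.map(block_matvec, [(blk, x) for blk in blocks])
    y = np.concatenate(parts)
    print(np.allclose(y, A @ x))   # True: same result as the serial product

In practice the data-transfer cost of shipping each block to a worker dominates at this scale, which is exactly the data locality concern the book emphasizes.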
Cyber-physical systems (CPS) can be defined as systems in which physical objects are represented in the digital world and integrated with computation, storage, and communication capabilities and are connected to each other in a network. The goal in using CPS is to integrate the dynamics of the physical processes with those of the software and networking, providing abstractions and modelling, design, and analysis techniques for the integrated whole. The notion of CPS is linked to concepts of robotics and sensor networks, with intelligent systems from computational intelligence leading the way. Recent advances in science and engineering improve the link between computational and physical elements by means of intelligent systems, increasing the adaptability, autonomy, efficiency, functionality, reliability, safety, and usability of cyber-physical systems. The potential of cyber-physical systems will spread in several directions, including but not limited to intervention, precision manufacturing, operations in dangerous or inaccessible environments, coordination, efficiency, Maintenance 4.0, and augmentation of human capabilities. Design, Applications, and Maintenance of Cyber-Physical Systems gives insights about CPS as tools for integrating the dynamics of the physical processes with those of software and networking, providing abstractions and modelling, design, and analysis techniques for their smart manufacturing interoperation. The book will have an impact upon research on robotics, mechatronics, integrated intelligent multibody systems, Industry 4.0, production systems management and maintenance, decision support systems, and Maintenance 4.0. The chapters discuss not only the technologies involved in CPS but also offer insights into how they are used in various industries. This book is ideal for engineers, practitioners, researchers, academicians, and students who are interested in a deeper understanding of cyber-physical systems (CPS), their design, application, and maintenance, with a special focus on modern technologies in Industry 4.0 and Maintenance 4.0.
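The integration of physical dynamics with software can be caricatured in a few lines: the toy proportional control loop below (our illustration, with an invented first-order plant model) senses a simulated temperature and actuates a correction at each step.

def run_loop(setpoint=22.0, temp=15.0, gain=0.5, steps=20):
    # "Cyber" side: sense, compute an error, actuate; "physical" side:
    # the plant drifts back toward 15.0 while the heater pushes it up.
    for step in range(steps):
        error = setpoint - temp                 # sensed deviation
        heater = gain * error                   # control action
        temp += heater - 0.1 * (temp - 15.0)    # invented plant dynamics
        print(f"step {step:2d}: temp = {temp:.2f}")
    # Note: a pure proportional controller settles slightly below the
    # setpoint (steady-state offset), one reason real controllers add
    # integral action.

run_loop()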
This book presents the latest developments regarding a detailed mobile agent-enabled anomaly detection and verification system for resource constrained sensor networks; a number of algorithms on multi-aspect anomaly detection in sensor networks; several algorithms on mobile agent transmission optimization in resource constrained sensor networks; an algorithm on mobile agent-enabled in situ verification of anomalous sensor nodes; a detailed Petri Net-based formal modeling and analysis of the proposed system; and an algorithm on fuzzy logic-based cross-layer anomaly detection and mobile agent transmission optimization. As such, it offers a comprehensive text for interested readers from academia and industry alike.
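The system itself is beyond a short snippet, but one ingredient, flagging a sensor reading that deviates sharply from recent history, can be sketched (our toy z-score illustration, not one of the book's algorithms):

import statistics

def anomalous(readings, new_value, z_threshold=3.0):
    """Flag a reading whose z-score against recent history exceeds a
    threshold (toy illustration only)."""
    mean = statistics.fmean(readings)
    std = statistics.pstdev(readings)
    if std == 0:
        return new_value != mean
    return abs(new_value - mean) / std > z_threshold

history = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2]
print(anomalous(history, 20.4))   # False: within normal variation
print(anomalous(history, 35.0))   # True: likely anomalous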