The book discusses intelligent system design using soft computing and related techniques, and their interdisciplinary applications. It also focuses on recent trends in using soft computing as a versatile tool for designing a host of decision support systems.
This unique text/reference presents a thorough introduction to the field of structural pattern recognition, with a particular focus on graph edit distance (GED). The book also provides a detailed review of a diverse selection of novel methods related to GED, and concludes by suggesting possible avenues for future research. Topics and features: formally introduces the concept of GED, and highlights the basic properties of this graph matching paradigm; describes a reformulation of GED to a quadratic assignment problem; illustrates how the quadratic assignment problem of GED can be reduced to a linear sum assignment problem; reviews strategies for reducing both the overestimation of the true edit distance and the matching time in the approximation framework; examines the improvement demonstrated by the described algorithmic framework with respect to the distance accuracy and the matching time; includes appendices listing the datasets employed for the experimental evaluations discussed in the book.
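The reduction from the quadratic assignment formulation of GED to a linear sum assignment problem can be sketched with SciPy's assignment solver. The padding construction and the toy cost values below are illustrative assumptions, not the book's exact algorithmic framework.

```python
# A minimal sketch of the bipartite-approximation idea: node-to-node edit costs
# are collected in a padded cost matrix (substitutions, deletions, insertions),
# and a linear sum assignment solver picks the cheapest node mapping. Summing
# the chosen entries gives an upper-bound estimate of the true edit distance.
import numpy as np
from scipy.optimize import linear_sum_assignment

def approximate_ged(sub_costs, del_costs, ins_costs):
    """sub_costs[i, j]: cost of substituting node i of G1 by node j of G2.
    del_costs[i]: cost of deleting node i; ins_costs[j]: cost of inserting node j."""
    n, m = sub_costs.shape
    big = 1e9                                   # forbid impossible pairings
    cost = np.zeros((n + m, n + m))
    cost[:n, :m] = sub_costs
    cost[:n, m:] = big
    cost[n:, :m] = big
    np.fill_diagonal(cost[:n, m:], del_costs)   # node deletions
    np.fill_diagonal(cost[n:, :m], ins_costs)   # node insertions
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].sum()

# Toy example: graphs with 3 and 2 nodes, unit deletion/insertion costs.
sub = np.array([[0.0, 2.0], [1.0, 0.5], [3.0, 1.0]])
print(approximate_ged(sub, del_costs=np.ones(3), ins_costs=np.ones(2)))
```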
This introduction to cryptography employs a programming-oriented approach to study the most important cryptographic schemes in current use and the main cryptanalytic attacks against them. Discussion of the theoretical aspects, emphasizing precise security definitions based on methodological tools such as complexity and randomness, and of the mathematical aspects, with emphasis on number-theoretic algorithms and their applications to cryptography and cryptanalysis, is integrated with the programming approach, thus providing implementations of the algorithms and schemes as well as examples of realistic size. A distinctive feature of the author's approach is the use of Maple as a programming environment in which not just the cryptographic primitives but also the most important cryptographic schemes are implemented following the recommendations of standards bodies such as NIST, with many of the known cryptanalytic attacks implemented as well. The purpose of the Maple implementations is to let the reader experiment and learn, and for this reason the author includes numerous examples. The book discusses important recent subjects such as homomorphic encryption, identity-based cryptography and elliptic curve cryptography. The algorithms and schemes which are treated in detail and implemented in Maple include AES and modes of operation, CMAC, GCM/GMAC, SHA-256, HMAC, RSA, Rabin, Elgamal, Paillier, Cocks IBE, DSA and ECDSA. In addition, some recently introduced schemes enjoying strong security properties, such as RSA-OAEP, Rabin-SAEP, Cramer-Shoup, and PSS, are also discussed and implemented. On the cryptanalysis side, Maple implementations and examples are used to discuss many important algorithms, including birthday and man-in-the-middle attacks, integer factorization algorithms such as Pollard's rho and the quadratic sieve, and discrete log algorithms such as baby-step giant-step, Pollard's rho, Pohlig-Hellman and the index calculus method. This textbook is suitable for advanced undergraduate and graduate students of computer science, engineering and mathematics, satisfying the requirements of various types of courses: a basic introductory course; a theoretically oriented course whose focus is on the precise definition of security concepts and on cryptographic schemes with reductionist security proofs; a practice-oriented course requiring little mathematical background and with an emphasis on applications; or a mathematically advanced course addressed to students with a stronger mathematical background. The main prerequisite is a basic knowledge of linear algebra and elementary calculus, and while some knowledge of probability and abstract algebra would be helpful, it is not essential because the book includes the necessary background from these subjects and, furthermore, explores the number-theoretic material in detail. The book is also a comprehensive reference and is suitable for self-study by practitioners and programmers.
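The book's implementations are in Maple; as a hedged illustration in Python of one of the discrete-log algorithms named above, here is a bare-bones baby-step giant-step. The toy prime and exponent are invented for the example.

```python
# Baby-step giant-step: find x with g**x = h (mod p) in roughly sqrt(p)
# group operations instead of brute force. Requires Python 3.8+ for pow(g, -m, p).
from math import isqrt

def baby_step_giant_step(g, h, p):
    """Return x such that pow(g, x, p) == h, or None if no solution exists."""
    m = isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j for j < m
    factor = pow(g, -m, p)                       # modular inverse of g^m
    gamma = h
    for i in range(m):                           # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * factor) % p
    return None

# Toy example in the multiplicative group modulo the prime 1019.
p, g, x = 1019, 2, 500
assert baby_step_giant_step(g, pow(g, x, p), p) == x
```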
In this book, the following three approaches to data analysis are presented:
- Test Theory, founded by Sergei V. Yablonskii (1924-1998); the first publications appeared in 1955 and 1958,
- Rough Sets, founded by Zdzisław I. Pawlak (1926-2006); the first publications appeared in 1981 and 1982,
- Logical Analysis of Data, founded by Peter L. Hammer (1936-2006); the first publications appeared in 1986 and 1988.
These three approaches have much in common, but researchers active in one of these areas often have limited knowledge of the results and methods developed in the other two. On the other hand, each of the approaches shows some originality, and we believe that the exchange of knowledge can stimulate further development of each of them. This can lead to new theoretical results and real-life applications and, in particular, new results based on a combination of these three data analysis approaches can be expected.
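As a hedged illustration of the rough-set half of this picture, the toy sketch below (not code from the book; the attribute names and data are invented) computes the lower and upper approximations of a target set from the indiscernibility relation induced by attribute values.

```python
# Rough-set approximations: a target set X is bracketed by the union of
# indiscernibility classes fully contained in X (lower approximation) and the
# union of classes that merely intersect X (upper approximation).
from collections import defaultdict

def approximations(objects, attrs, target):
    """objects: dict name -> dict of attribute values; target: set of names."""
    classes = defaultdict(set)
    for name, values in objects.items():
        classes[tuple(values[a] for a in attrs)].add(name)
    lower, upper = set(), set()
    for block in classes.values():
        if block <= target:
            lower |= block
        if block & target:
            upper |= block
    return lower, upper

patients = {
    "p1": {"fever": "yes", "cough": "yes"},
    "p2": {"fever": "yes", "cough": "yes"},
    "p3": {"fever": "no",  "cough": "yes"},
    "p4": {"fever": "no",  "cough": "no"},
}
flu = {"p1", "p3"}                       # p1 and p2 are indiscernible, so flu is "rough"
print(approximations(patients, ["fever", "cough"], flu))
# lower = {'p3'}, upper = {'p1', 'p2', 'p3'}
```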
Evolutionary algorithms (EAs) are now a mature family of problem-solving heuristics that has found its way into many important real-life problems and into leading-edge scientific research. Spatially structured EAs have different properties from standard, freely mixing EAs. By virtue of the structured disposition of the population members, they bring about new dynamical features that can be harnessed to solve difficult problems faster and more efficiently. This book describes the state of the art in spatially structured EAs by using graph concepts as a unifying theme. The models, their analysis, and their empirical behavior are presented in detail. Moreover, there is new material on non-standard networked population structures such as small-world networks. The book should be of interest to advanced undergraduate and graduate students working in evolutionary computation, machine learning, and optimization. It should also be useful to researchers and professionals working in fields where the topological structure of populations and their evolution play a role.
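A hedged sketch of the basic idea of a spatially structured EA follows: individuals sit on a ring and recombine only with their immediate neighbours, so good genes spread gradually instead of mixing globally. The ring topology, operators and constants are illustrative assumptions, not models from the book.

```python
# A toy cellular EA on a ring neighbourhood, minimising a numeric objective.
import random

def cellular_ea(objective, dim=10, pop_size=20, generations=200):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for i in range(pop_size):
            # Neighbourhood on the ring: the individual and its two neighbours.
            hood = [pop[(i - 1) % pop_size], pop[i], pop[(i + 1) % pop_size]]
            p1, p2 = sorted(hood, key=objective)[:2]           # local selection
            child = [(a + b) / 2 + random.gauss(0, 0.1)        # crossover + mutation
                     for a, b in zip(p1, p2)]
            new_pop.append(min(pop[i], child, key=objective))  # local replacement
        pop = new_pop
    return min(pop, key=objective)

# Minimise the sphere function as a toy objective.
best = cellular_ea(lambda x: sum(v * v for v in x))
print(sum(v * v for v in best))
```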
This book combines the advantages of high-dimensional data visualization and machine learning in the context of identifying complex n-D data patterns. It vastly expands the class of reversible lossless 2-D and 3-D visualization methods, which preserve the n-D information. This class of visual representations, called the General Lines Coordinates (GLCs), is accompanied by a set of algorithms for n-D data classification, clustering, dimension reduction, and Pareto optimization. The mathematical and theoretical analyses and methodology of GLC are included, and the usefulness of this new approach is demonstrated in multiple case studies. These include the Challenger disaster, world hunger data, health monitoring, image processing, text classification, market forecasts for a currency exchange rate, computer-aided medical diagnostics, and others. As such, the book offers a unique resource for students, researchers, and practitioners in the emerging field of Data Science.
One of the world's leading problems in the field of national security is the protection of borders and borderlands. This book addresses multiple issues concerning advanced, innovative methods for the multi-level control of both unmanned ground vehicles (UGVs) and aerial drones (UAVs). Combined with innovative algorithms, these vehicles become autonomous objects capable of patrolling chosen borderland areas by themselves and automatically informing the system operator about the potential location of a detected incident. This is achieved by using sophisticated methods for generating collision-free trajectories for these types of objects and by enabling the automatic integration of both ground and aerial unmanned vehicles. The topics covered in this book also include the presentation of complete information and communication technology (ICT) systems capable of the control, observation and detection of various types of incidents and threats. This book is a valuable source of information for constructors and developers of such solutions for uniformed services. Scientists and researchers involved in computer vision, image processing, data fusion, control algorithms or ICT can find many valuable suggestions and solutions. Multiple challenges for such systems are also presented.
This book provides the basic theory, techniques, and algorithms of modern cryptography that are applicable to network and cyberspace security. It consists of nine main chapters: Chapter 1 provides the basic concepts and ideas of cyberspace and cyberspace security; Chapters 2 and 3 provide an introduction to mathematical and computational preliminaries, respectively; Chapter 4 discusses the basic ideas and systems of secret-key cryptography, whereas Chapters 5, 6, and 7 discuss the basic ideas and systems of public-key cryptography based on integer factorization, discrete logarithms, and elliptic curves, respectively. Quantum-safe cryptography is presented in Chapter 8, and offensive cryptography, particularly cryptovirology, is covered in Chapter 9. This book can be used as a secondary text for final-year undergraduate students and first-year postgraduate students for courses in Computer, Network, and Cyberspace Security. Researchers and practitioners working in cyberspace security and network security will also find this book useful as a reference.
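As a hedged illustration of the integer-factorization setting behind the public-key material mentioned above, here is a bare-bones Pollard's rho factoriser; it is shown purely for orientation and is not taken from the book.

```python
# Pollard's rho: find a non-trivial factor of a composite n by iterating a
# pseudorandom map modulo n and detecting a cycle with Floyd's tortoise/hare.
from math import gcd
import random

def pollard_rho(n):
    """Return a non-trivial factor of the composite number n."""
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        f = lambda x: (x * x + c) % n
        x = y = 2
        d = 1
        while d == 1:
            x = f(x)              # tortoise
            y = f(f(y))           # hare
            d = gcd(abs(x - y), n)
        if d != n:                # otherwise retry with a different constant c
            return d

print(pollard_rho(8051))          # 8051 = 83 * 97, a classic toy example
```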
"Fixed-Point Algorithms for Inverse Problems in Science and Engineering" presents some of the most recent work from top-notch researchers studying projection and other first-order fixed-point algorithms in several areas of mathematics and the applied sciences. The material presented provides a survey of the state-of-the-art theory and practice in fixed-point algorithms, identifying emerging problems driven by applications, and discussing new approaches for solving these problems. This book incorporates diverse perspectives from broad-ranging areas of research including, variational analysis, numerical linear algebra, biotechnology, materials science, computational solid-state physics, and chemistry. Topics presented include: Theory of Fixed-point algorithms: convex analysis, convex optimization, subdifferential calculus, nonsmooth analysis, proximal point methods, projection methods, resolvent and related fixed-point theoretic methods, and monotone operator theory. Numerical analysis of fixed-point algorithms: choice of step lengths, of weights, of blocks for block-iterative and parallel methods, and of relaxation parameters; regularization of ill-posed problems; numerical comparison of various methods. Areas of Applications: engineering (image and signal reconstruction and decompression problems), computer tomography and radiation treatment planning (convex feasibility problems), astronomy (adaptive optics), crystallography (molecular structure reconstruction), computational chemistry (molecular structure simulation) and other areas. Because of the variety of applications presented, this book can easily serve as a basis for new and innovated research and collaboration.
This book constitutes the thoroughly refereed post conference proceedings of the 6th IFIP WG 9.2, 9.6/11.7, 11.4, 11.6/PrimeLife International Summer School, held in Helsingborg, Sweden, in August 2010. The 27 revised papers were carefully selected from numerous submissions during two rounds of reviewing. They are organized in topical sections on terminology, privacy metrics, ethical, social, and legal aspects, data protection and identity management, eID cards and eID interoperability, emerging technologies, privacy for eGovernment and AAL applications, social networks and privacy, privacy policies, and usable privacy.
This book contains extended and revised versions of the best papers presented at the 18th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2010, held in Madrid, Spain, in September 2010. The 14 papers included in the book were carefully reviewed and selected from the 52 full papers presented at the conference. The papers cover a wide range of topics reflecting excellence in VLSI technology and advanced research. They address the current trend toward increasing chip integration and technology process advancements, which bring about stimulating new challenges at both the physical and system-design levels, as well as in the test of these systems.
This book presents the theory of continuum mechanics for mechanical, thermodynamical, and electrodynamical systems. It shows how to obtain the governing equations and applies them in numerical simulations of realistic problems. It uses only open-source codes developed under the FEniCS project and includes codes for 20 engineering applications from mechanics, fluid dynamics, applied thermodynamics, and electromagnetism. Moreover, it derives and utilizes the constitutive equations, including coupling terms, which make it possible to compute multiphysics problems by incorporating interactions between the primitive variables, namely motion, temperature, and electromagnetic fields. An engineering system is described by the primitive variables satisfying field equations that are partial differential equations in space and time. The field equations are mostly coupled and nonlinear, in other words, difficult to solve. In order to solve the coupled, nonlinear system of partial differential equations, the book uses a novel collection of open-source packages developed under the FEniCS project. All primitive variables are solved at once in a fully coupled fashion, using the finite difference method in time and the finite element method in space.
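To give a flavour of the workflow that the book scales up to coupled multiphysics, here is the standard Poisson demo for the legacy FEniCS (dolfin) Python interface. It is a hedged minimal example, not one of the book's 20 applications, and it assumes the legacy `fenics` package is installed.

```python
# Poisson problem -laplace(u) = f on the unit square with a manufactured
# solution: the weak form is stated almost verbatim and FEniCS assembles and
# solves the finite element system.
from fenics import *

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "P", 1)

# Dirichlet boundary data matching the exact solution 1 + x^2 + 2*y^2.
u_D = Expression("1 + x[0]*x[0] + 2*x[1]*x[1]", degree=2)
bc = DirichletBC(V, u_D, "on_boundary")

u, v = TrialFunction(V), TestFunction(V)
f = Constant(-6.0)
a = dot(grad(u), grad(v)) * dx          # bilinear form
L = f * v * dx                          # linear form

u_h = Function(V)
solve(a == L, u_h, bc)
print(errornorm(u_D, u_h, "L2"))        # small discretisation error
```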
The information infrastructure---comprising computers, embedded devices, networks and software systems---is vital to day-to-day operations in every sector: information and telecommunications, banking and finance, energy, chemicals and hazardous materials, agriculture, food, water, public health, emergency services, transportation, postal and shipping, government and defense. Global business and industry, governments, indeed society itself, cannot function effectively if major components of the critical information infrastructure are degraded, disabled or destroyed. Critical Infrastructure Protection V describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. Areas of coverage include: Themes and Issues, Control Systems Security, Infrastructure Security, and Infrastructure Modeling and Simulation. This book is the 5th volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of 14 edited papers from the 5th Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection, held at Dartmouth College, Hanover, New Hampshire, USA in the spring of 2011. Critical Infrastructure Protection V is an important resource for researchers, faculty members and graduate students, as well as for policy makers, practitioners and other individuals with interests in homeland security. Jonathan Butts is an Assistant Professor of Computer Science at the Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, USA. Sujeet Shenoi is the F.P. Walter Professor of Computer Science at the University of Tulsa, Tulsa, Oklahoma, USA.
The Workshop on the Economics of Information Security (WEIS) was established in 2002 to bring together computer scientists and economists to understand and improve the poor state of information security practice. WEIS was born out of the realization that security often fails for non-technical reasons: rather than technology alone, the incentives of both defender and attacker must be considered. Earlier workshops have answered questions ranging from finding optimal levels of security investment to understanding why privacy has been eroded. In the process, WEIS has attracted participation from diverse fields such as law, management and psychology. WEIS has now established itself as the leading forum for interdisciplinary scholarship on information security. The eighth installment of the conference returned to the United Kingdom, hosted by University College London on June 24-25, 2009. Approximately 100 researchers, practitioners and government officials from across the globe convened in London to hear presentations from authors of 21 peer-reviewed papers, in addition to a panel and keynote lectures from Hal Varian (Google), Bruce Schneier (BT Counterpane), Martin Sadler (HP Labs), and Robert Coles (Merrill Lynch). Angela Sasse and David Pym chaired the conference, while Christos Ioannidis and Tyler Moore chaired the program committee.
This book acquaints readers with recent developments in dynamical systems theory and its applications, with a strong focus on the control and estimation of nonlinear systems. Several algorithms are proposed and worked out for a set of model systems, in particular so-called input-affine or bilinear systems, which can serve to approximate a wide class of nonlinear control systems. These can either take the form of state space models or be represented by an input-output equation. The approach taken here further highlights the role of modern mathematical and conceptual tools, including differential algebraic theory, observer design for nonlinear systems and generalized canonical forms.
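As a hedged illustration of the observer-design theme, the sketch below implements a plain Luenberger observer for a linear discrete-time system, the linear special case of the nonlinear estimation problems the book treats; the matrices and gain are invented for the example.

```python
# A copy of the plant model is driven by the output error so that the state
# estimate converges to the true state (error dynamics governed by A - L*C).
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discrete-time double integrator
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])               # only position is measured
L = np.array([[0.5], [0.6]])             # hand-picked gain; eig(A - L C) = {0.8, 0.7}

x = np.array([[1.0], [0.0]])             # true state
x_hat = np.zeros((2, 1))                 # wrong initial estimate
for k in range(100):
    u = np.array([[0.1]])
    y = C @ x
    # Observer: simulate the model and correct with the measured output error.
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)
    x = A @ x + B @ u

print(np.linalg.norm(x - x_hat))         # estimation error is now tiny
```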
Throughout time, scientists have looked to nature in order to understand and model solutions for complex real-world problems. In particular, the study of self-organizing entities, such as social insect populations, presents a new opportunity within the field of artificial intelligence. Emerging Research on Swarm Intelligence and Algorithm Optimization discusses current research analyzing how the collective behavior of decentralized systems in the natural world can be applied to intelligent system design. Discussing the application of swarm principles, optimization techniques, and key algorithms being used in the field, this publication serves as an essential reference for academicians, upper-level students, IT developers, and IT theorists.
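One canonical swarm algorithm in this area is particle swarm optimization; the bare-bones sketch below uses common textbook constants, which are assumptions of this illustration rather than values taken from the publication.

```python
# Particle swarm optimisation: particles are pulled towards their personal
# best and the swarm's global best, with inertia damping the velocity.
import random

def pso(objective, dim=5, swarm=30, iters=300, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=objective)
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

# Minimise the sphere function; the swarm collapses towards the origin.
print(pso(lambda x: sum(v * v for v in x)))
```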
Evolutionary Algorithms, in particular Evolution Strategies, Genetic Algorithms, and Evolutionary Programming, have found wide acceptance as robust optimization algorithms in the last ten years. Compared with their broad adoption and the resulting practical success in different scientific fields, the theory has not progressed as much. This monograph provides the framework and the first steps toward the theoretical analysis of Evolution Strategies (ES). The main emphasis is on understanding the functioning of these probabilistic optimization algorithms in real-valued search spaces by investigating the dynamical properties of some well-established ES algorithms. The book introduces the basic concepts of this analysis, such as progress rate, quality gain, and self-adaptation response, and describes how to calculate these quantities. Based on the analysis, functioning principles are derived, aiming at a qualitative understanding of why and how ES algorithms work.
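The simplest member of this algorithm family, a (1+1)-ES with the classical 1/5th success rule for step-size adaptation, is sketched below; the constants are common textbook choices and are assumptions of this illustration, not values from the book.

```python
# (1+1)-ES: mutate the single parent with Gaussian noise, keep the better of
# parent and child, and adapt the step size so that roughly 1/5 of mutations succeed.
import random

def one_plus_one_es(objective, dim=10, sigma=1.0, generations=2000):
    parent = [random.uniform(-5, 5) for _ in range(dim)]
    f_parent = objective(parent)
    successes, window = 0, 50
    for g in range(1, generations + 1):
        child = [x + sigma * random.gauss(0, 1) for x in parent]
        f_child = objective(child)
        if f_child <= f_parent:                      # (1+1) selection
            parent, f_parent = child, f_child
            successes += 1
        if g % window == 0:                          # 1/5th success rule
            sigma *= 1.5 if successes / window > 0.2 else 1 / 1.5
            successes = 0
    return parent, f_parent

best, value = one_plus_one_es(lambda x: sum(v * v for v in x))
print(value)                                         # close to zero on the sphere model
```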
This book describes a novel methodology for studying algorithmic skills, intended as cognitive activities related to rule-based symbolic transformation, and argues that some human computational abilities may be interpreted and analyzed as genuine examples of extended cognition. It shows that the performance of these abilities relies not only on innate neurocognitive systems or language-related skills, but also on external tools and general agent-environment interactions. Further, it asserts that a low-level analysis, based on a set of core neurocognitive systems linking numbers and language, is not sufficient to explain some specific forms of high-level numerical skills, like those involved in algorithm execution. To this end, it reports on the design of a cognitive architecture for modeling all the relevant features involved in the execution of algorithmic strategies, including external tools, such as paper and pencils. The first part of the book discusses the philosophical premises for endorsing and justifying a position in philosophy of mind that links a modified form of computationalism with some recent theoretical and scientific developments, like those introduced by the so-called dynamical approach to cognition. The second part is dedicated to the description of a Turing-machine-inspired cognitive architecture, expressly designed to formalize all kinds of algorithmic strategies.
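The rule-based symbolic transformation discussed here can be made concrete with a toy Turing machine; the sketch below is a hypothetical illustration only, not the book's cognitive architecture, and the increment rules are invented for the example.

```python
# A minimal Turing machine that increments a binary number written on its tape.
def run_tm(tape, rules, state="start", blank="_", steps=1000):
    tape = dict(enumerate(tape))
    head = max(tape)                        # start at the rightmost symbol
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, blank) for i in cells).strip(blank)

# Increment rules: flip trailing 1s to 0 moving left, then write the carry.
rules = {
    ("start", "1"): ("start", "0", "L"),
    ("start", "0"): ("halt",  "1", "L"),
    ("start", "_"): ("halt",  "1", "L"),
}
print(run_tm("1011", rules))                # -> "1100"
```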
This book contains extended and revised versions of the best papers that were presented during the 16th edition of the IFIP/IEEE WG10.5 International Conference on Very Large Scale Integration, a global System-on-a-Chip Design & CAD conference. The 16th conference was held at the Grand Hotel of Rhodes Island, Greece (October 13-15, 2008). Previous conferences have taken place in Edinburgh, Trondheim, Vancouver, Munich, Grenoble, Tokyo, Gramado, Lisbon, Montpellier, Darmstadt, Perth, Nice and Atlanta. VLSI-SoC 2008 was the 16th in a series of international conferences sponsored by IFIP TC 10 Working Group 10.5 and IEEE CEDA that explores the state of the art and the new developments in the field of VLSI systems and their designs. The purpose of the conference was to provide a forum to exchange ideas and to present industrial and research results in the fields of VLSI/ULSI systems, embedded systems and microelectronic design and test.
This book gives an overview of constraint satisfaction problems (CSPs), adapts related search algorithms and consistency algorithms for applications to multi-agent systems, and consolidates recent research devoted to cooperation in such systems. The techniques introduced are applied to various problems in multi-agent systems. Among the new approaches is a hybrid-type algorithm for weak-commitment search combining backtracking and iterative improvement; also, an extension of the basic CSP formalization, called partial CSP, is introduced in order to handle over-constrained CSPs. The book is written for advanced students and professionals interested in multi-agent systems or, more generally, in distributed artificial intelligence and constraint satisfaction. Researchers active in the area will appreciate this book as a valuable source of reference.
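The centralised machinery that such work builds on can be sketched with plain chronological backtracking on a toy map-colouring CSP; the distributed and weak-commitment algorithms discussed in the book are more involved and are not shown here.

```python
# Backtracking search over variables with finite domains and binary constraints.
def backtrack(variables, domains, constraints, assignment=None):
    assignment = assignment or {}
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        consistent = all(ok(value, assignment[other])
                         for (v, other), ok in constraints.items()
                         if v == var and other in assignment)
        if consistent:
            result = backtrack(variables, domains, constraints,
                               {**assignment, var: value})
            if result is not None:
                return result
    return None                      # dead end: undo and try another value above

# Toy map colouring: three mutually adjacent regions, three colours.
variables = ["A", "B", "C"]
domains = {v: ["red", "green", "blue"] for v in variables}
neq = lambda a, b: a != b
constraints = {("A", "B"): neq, ("B", "A"): neq,
               ("B", "C"): neq, ("C", "B"): neq,
               ("A", "C"): neq, ("C", "A"): neq}
print(backtrack(variables, domains, constraints))
```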
This edited book first consolidates the results of the EU-funded EDISON project (Education for Data Intensive Science to Open New science frontiers), which developed training material and information to assist educators, trainers, employers, and research infrastructure managers in identifying, recruiting and inspiring the data science professionals of the future. It then deepens the presentation of the information and knowledge gained to allow for easier assimilation by the reader. The contributed chapters are presented in sequence, each chapter picking up from the end point of the previous one. After the initial book and project overview, the chapters present the relevant data science competencies and body of knowledge, the model curriculum required to teach the required foundations, profiles of professionals in this domain, and use cases and applications. The text is supported with appendices on related process models. The book can be used to develop new courses in data science, evaluate existing modules and courses, draft job descriptions, and plan and design efficient data-intensive research teams across scientific disciplines.
Cyberspace security is a critical subject of our times. On the one hand, the development of the Internet, mobile communications, distributed computing, computer software and databases storing essential enterprise information has facilitated business and personal communication between individuals. On the other hand, it has created many opportunities for abuse, fraud and expensive damage. This book is a selection of the best papers presented at the NATO Advanced Research Workshop dealing with the subject of Cyberspace Security and Defense. The level of the individual contributions in the volume is advanced and suitable for senior and graduate students, researchers and technologists who wish to get a feeling for the state of the art in several sub-disciplines of cyberspace security. Several papers provide a broad-brush description of national security issues and brief summaries of technology states. These papers can be read and appreciated by technically enlightened managers and executives who want to understand security issues and approaches to technical solutions. The important question of our times is no longer "Should we do something to enhance the security of our digital assets?" but rather "How do we do it?"
New Approaches to Circle Packing into the Square is devoted to the most recent results on the densest packing of equal circles in a square. In the last few decades, many articles have considered this question, which has been an object of interest since it is a hard challenge both in discrete geometry and in mathematical programming. The authors have studied this geometrical optimization problem for a long time, and they developed several new algorithms to solve it. The book completely covers the investigations on this topic.
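Packing n equal circles in the unit square is equivalent to spreading n points so that their minimum pairwise distance is as large as possible. The hedged numerical sketch below attacks this with a generic local optimiser from random starts; it is only a rough illustration, not the rigorous interval-arithmetic and branch-and-bound machinery the authors developed.

```python
# Maximise the minimum pairwise distance of n points in the unit square.
# The objective is non-smooth, so L-BFGS-B with numerical gradients gives
# only approximate local optima; multiple restarts keep the best result.
import numpy as np
from scipy.optimize import minimize

def min_pairwise_distance(flat_points):
    pts = flat_points.reshape(-1, 2)
    diffs = pts[:, None, :] - pts[None, :, :]
    d = np.sqrt((diffs ** 2).sum(-1))
    return d[np.triu_indices(len(pts), k=1)].min()

def pack(n, restarts=20, seed=0):
    rng = np.random.default_rng(seed)
    best, best_val = None, -np.inf
    for _ in range(restarts):
        x0 = rng.random(2 * n)
        res = minimize(lambda x: -min_pairwise_distance(x), x0,
                       bounds=[(0.0, 1.0)] * (2 * n), method="L-BFGS-B")
        if -res.fun > best_val:
            best, best_val = res.x.reshape(-1, 2), -res.fun
    return best, best_val

pts, dist = pack(4)
print(dist)   # for n = 4 the optimum puts the points at the corners (distance 1)
```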
This book explores inductive inference using the minimum message length (MML) principle, a Bayesian method which is a realisation of Ockham's Razor based on information theory. Accompanied by a library of software, the book can assist an applications programmer, student or researcher in the fields of data analysis and machine learning to write computer programs based upon this principle. MML inference has been around for 50 years and yet only one highly technical book has been written about the subject. The majority of research in the field has been backed by specialised one-off programs, but this book includes a library of general MML-based software, in Java. The Java source code is available under the GNU GPL open-source license. The software library is documented using Javadoc, which produces extensive cross-referenced HTML manual pages. Every probability distribution and statistical model that is described in the book is implemented and documented in the software library. The library may contain a component that directly solves a reader's inference problem, or contain components that can be put together to solve the problem, or provide a standard interface under which a new component can be written to solve the problem. This book will be of interest to application developers in the fields of machine learning and statistics as well as academics, postdocs, programmers and data scientists. It could also be used by third-year or fourth-year undergraduate or postgraduate students.
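The flavour of two-part message reasoning can be shown with a crude code-length heuristic: a hypothesis is preferred when the length of "hypothesis plus data encoded given the hypothesis" is shortest. This toy is a hedged illustration only; the models, the parameter-coding cost and the data are assumptions, and it is not one of the MML estimators implemented in the book's Java library.

```python
# Compare a "fair coin" model with a "biased coin" model by total code length.
from math import log2

def code_length_fair(bits):
    return 1 + len(bits)                          # 1 bit to name the model, 1 bit/symbol

def code_length_biased(bits, grid=100):
    ones = sum(bits)
    p = max(min(ones / len(bits), 1 - 1e-9), 1e-9)    # crude estimate of the bias
    model_cost = 1 + log2(grid)                   # name the model + state p to ~1/grid
    data_cost = -sum(log2(p) if b else log2(1 - p) for b in bits)
    return model_cost + data_cost

data = [1] * 70 + [0] * 30                        # a clearly biased toy sequence
print(code_length_fair(data), code_length_biased(data))   # the biased model wins
```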