This book presents a comprehensive review of key distributed graph algorithms for computer network applications, with a particular emphasis on practical implementation. Topics and features: introduces a range of fundamental graph algorithms, covering spanning trees, graph traversal algorithms, routing algorithms, and self-stabilization; reviews graph-theoretical distributed approximation algorithms with applications in ad hoc wireless networks; describes in detail the implementation of each algorithm, with extensive use of supporting examples, and discusses their concrete network applications; examines key graph-theoretical algorithm concepts, such as dominating sets, and parameters for mobility and energy levels of nodes in wireless ad hoc networks, and provides a contemporary survey of each topic; presents a simple simulator, developed to run distributed algorithms; provides practical exercises at the end of each chapter.
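As a flavour of the style of algorithm described in this blurb, here is a minimal sketch (not taken from the book or its simulator) of a flooding-based spanning-tree construction, simulated synchronously in Python; the example network, node labels, and message handling are illustrative assumptions:

```python
from collections import deque

def flooding_spanning_tree(adjacency, root):
    """Build a spanning tree by flooding from the root.

    Each node, on first receiving a 'probe' message, records the sender
    as its parent and forwards the probe to its other neighbours. This
    synchronously mimics the classic asynchronous flooding algorithm.
    """
    parent = {root: None}
    frontier = deque([root])
    while frontier:
        node = frontier.popleft()
        for neighbour in adjacency[node]:
            if neighbour not in parent:      # first probe received
                parent[neighbour] = node     # sender becomes the parent
                frontier.append(neighbour)
    return parent

# Example: a small network given as an adjacency list.
network = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 4], 4: [2, 3]}
print(flooding_spanning_tree(network, root=1))
# e.g. {1: None, 2: 1, 3: 1, 4: 2}
```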
Identity-based encryption (IBE) is a type of public-key encryption that has been intensely researched over the past decade. Identity-Based Encryption summarizes the available research on IBE and the main ideas that will enable readers to pursue further work in this area. The book also covers a brief background on elliptic curves and pairings, security against chosen-ciphertext attacks, standards, and more. Advanced-level students in computer science and mathematics who specialize in cryptology, and the general community of researchers in the area of cryptology and data security, will find Identity-Based Encryption a useful book. Practitioners and engineers who work with real-world IBE schemes and need a proper understanding of the basic IBE techniques will also find this book a valuable asset.
The world of the 21st century is, more than ever, global and impersonal. Criminal and terrorist threats, both physical and on the Internet, increase by the day. The demand for better methods of identification and access control is growing, not only in companies and organisations but also in the world at large. At the same time, such security measures have to be balanced with means for protecting the privacy of users. Identity management is put under pressure due to the growing number of fraudsters who want to hide their true identity. This challenges the information security research community to focus on interdisciplinary and holistic approaches while retaining the benefits of previous research efforts. In this context, the IFIP Working Group 11.6 on Identity Management was founded in August 2006. The intention of the Working Group is to offer a broad forum for the exchange of knowledge and for the tracking and discussion of issues and new developments. In this, we take an interdisciplinary approach. Scientists as well as practitioners, from government and business, who are involved in the field of identity management are welcome to participate. The IDMAN 2007 Conference on Policies and Research in Identity Management was the very first conference organized by this Working Group. We aim to organize conferences bi-annually. The IDMAN 2007 conference was centered on the theme of national identity management or, in other words, identity management in the public sector.
Data management is the process of planning, coordinating and controlling data resources. Increasingly, applications need to store and search large amounts of data. Data management has been continuously challenged by demands from various areas and applications and has evolved in parallel with advances in hardware and computing techniques. This volume focuses on recent advances in the field and is composed of five parts and a total of eighteen chapters. The first part of the book contains five contributions in the area of information retrieval and Web intelligence: a novel approach to solving the index selection problem, integrated retrieval from the Web of documents and data, bipolarity in database querying, deriving data summarization through ontologies, and granular computing for Web intelligence. The second part of the book contains four contributions in the area of knowledge discovery. The third part contains three contributions in the areas of information integration and data security. The remaining two parts of the book contain six contributions in the areas of intelligent agents and applications of data management in the medical domain.
Intelligent information and database systems are two closely related and well-established subfields of modern computer science. They focus on the integration of artificial intelligence and classic database technologies in order to create the class of next-generation information systems. The major target of this new generation of systems is to provide end-users with intelligent behavior: simple and/or advanced learning, problem solving, uncertain and certain reasoning, self-organization, cooperation, etc. Such intelligent abilities are implemented in classic information systems to make them autonomous and user oriented, in particular when advanced problems of multimedia information and knowledge discovery, access, retrieval and manipulation are to be solved in the context of large, distributed and heterogeneous environments. It means that intelligent knowledge-based information and database systems are used to solve basic problems of large collections management, carry out knowledge discovery from large data collections, reason about information under uncertain conditions, support users in their formulation of complex queries, etc. Topics discussed in this volume include but are not limited to the foundations and principles of data, information, and knowledge models, methodologies for intelligent information and database systems analysis, design, implementation, validation, maintenance and evolution.
This book focuses on recent research in modern optimization and its implications in control and data analysis. It is a collection of papers from the conference "Optimization and Its Applications in Control and Data Science", dedicated to Professor Boris T. Polyak, which was held in Moscow, Russia, on May 13-15, 2015. The book reflects developments in theory and applications rooted in Professor Polyak's fundamental contributions to constrained and unconstrained optimization, differentiable and nonsmooth functions, control theory and approximation. Each paper focuses on techniques for solving complex optimization problems in different application areas and on recent developments in optimization theory and methods. Open problems in optimization, game theory and control theory are included in this collection, which will interest engineers and researchers working with efficient algorithms and software for solving optimization problems in market and data analysis. Theoreticians in operations research, applied mathematics, algorithm design, artificial intelligence, machine learning, and software engineering will find this book useful, and graduate students will find the state-of-the-art research valuable.
This book presents advances in alternative swarm development that have proved to be effective in several complex problems. Swarm intelligence (SI) is a problem-solving methodology that results from the cooperation between a set of agents with similar characteristics. The study of biological entities, such as animals and insects, manifesting social behavior has resulted in several computational models of swarm intelligence. While there are numerous books addressing the most widely known swarm methods, namely ant colony algorithms and particle swarm optimization, those discussing new alternative approaches are rare. The focus on developments based on the simple modification of popular swarm methods overlooks the opportunity to discover new techniques and procedures that can be useful in solving problems formulated by the academic and industrial communities. Presenting various novel swarm methods and their practical applications, the book helps researchers, lecturers, engineers and practitioners solve their own optimization problems.
From the reviews of the 1st edition: "This book provides a comprehensive and detailed account of different topics in algorithmic 3-dimensional topology, culminating with the recognition procedure for Haken manifolds and including the up-to-date results in computer enumeration of 3-manifolds. Originating from lecture notes of various courses given by the author over a decade, the book is intended to combine the pedagogical approach of a graduate textbook (without exercises) with the completeness and reliability of a research monograph... All the material, with few exceptions, is presented from the peculiar point of view of special polyhedra and special spines of 3-manifolds. This choice contributes to keep the level of the exposition really elementary. In conclusion, the reviewer subscribes to the quotation from the back cover: "the book fills a gap in the existing literature and will become a standard reference for algorithmic 3-dimensional topology both for graduate students and researchers." (Zentralblatt für Mathematik, 2004) For this 2nd edition, new results, new proofs, and commentaries for a better orientation of the reader have been added. In particular, Chapter 7 includes several new sections concerning applications of the computer program "3-Manifold Recognizer".
Information security and copyright protection are more important today than ever before. Digital watermarking is one of the most widely used techniques in the area of information security. This book introduces a number of digital watermarking techniques and is divided into four parts. The first part introduces the importance of watermarking techniques and intelligent technology. The second part presents a number of watermarking techniques. The third part covers hybrid watermarking techniques, and the final part presents conclusions. This book is directed to students, professors, researchers and application engineers who are interested in the area of information security.
The present book is the result of a three-year research project which investigated the creative act of composing by means of algorithmic composition. Central to the investigation are the compositional strategies of 12 composers, which were documented through a dialogic and cyclic process of modelling and evaluating musical materials. The aesthetic premises and compositional approaches configure a rich spectrum of diverse positions, which is reflected also in the kinds of approaches and methods used. These approaches and methods include the generation and evaluation of chord sequences using genetic algorithms, the application of morphing strategies to research harmonic transformations, an automatic classification of personal preferences via machine learning, and an application of mathematical music theory to the analysis and resynthesis of musical material. The second part of the book features contributions by Sandeep Bhagwati, William Brooks, David Cope, Darla Crispin, Nicolas Donin, and Guerino Mazzola. These authors variously consider the project from different perspectives, offer independent approaches, or provide more general reflections from their respective research fields.
In recent years Genetic Algorithms (GA) and Artificial Neural Networks (ANN) have progressively increased in importance amongst the techniques routinely used in chemometrics. This book contains contributions from experts in the field and is divided into two sections (GA and ANN). In each part, tutorial chapters are included in which the theoretical bases of each technique are expertly (but simply) described. These are followed by application chapters in which special emphasis is given to the advantages of applying GA or ANN to that specific problem, compared to classical techniques, and to the risks connected with their misuse.
This book contains the collection of papers presented at the conference of the International Federation for Information Processing Working Group 8.2 "Information and Organizations." The conference took place during June 21-24, 2009 at the Universidade do Minho in Guimaraes, Portugal. The conference, entitled "CreativeSME - The Role of IS in Leveraging the Intelligence and Creativity of SMEs", attracted high-quality submissions from across the world. Each paper was reviewed by at least two reviewers in a double-blind review process. In addition to the 19 papers presented at the conference, there were five panels and four workshops, which covered a range of issues relevant to SMEs, creativity and information systems. We would like to express our appreciation for the efforts of our two invited keynote speakers, Michael Dowling of the University of Regensburg, Germany, and Carlos Zorrinho, Portuguese coordinator of the Lisbon Strategy and the Technological Plan. The following organizations supported the conference through financial or other contributions, and we would like to thank them for their engagement.
This book offers a coherent and comprehensive approach to feature subset selection in the scope of classification problems, explaining the foundations, real application problems and the challenges of feature selection for high-dimensional data. The authors first focus on the analysis and synthesis of feature selection algorithms, presenting a comprehensive review of basic concepts and experimental results of the most well-known algorithms. They then address different real scenarios with high-dimensional data, showing the use of feature selection algorithms in different contexts with different requirements and information: microarray data, intrusion detection, tear film lipid layer classification and cost-based features. The book then delves into the big-dimension scenario, paying attention to important problems in high-dimensional spaces, such as scalability, distributed processing and real-time processing, which open up new and interesting challenges for researchers. The book is useful for practitioners, researchers and graduate students in the areas of machine learning and data mining.
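As a rough illustration of what feature subset selection does in practice (a generic filter-style method, not one of the algorithms analysed in the book), the following sketch ranks the features of a synthetic high-dimensional dataset by mutual information with the class label and keeps the top ten; the dataset sizes and the scikit-learn usage are illustrative assumptions:

```python
# Generic filter-style feature selection: score each feature by its mutual
# information with the class label and keep the k best. Requires scikit-learn.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Synthetic "high-dimensional" data: 200 samples, 500 features,
# only 10 of which are informative.
X, y = make_classification(n_samples=200, n_features=500,
                           n_informative=10, random_state=0)

selector = SelectKBest(score_func=mutual_info_classif, k=10)
X_reduced = selector.fit_transform(X, y)

print(X_reduced.shape)                     # (200, 10)
print(selector.get_support(indices=True))  # indices of the retained features
```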
Calculus has been used in solving many scientific and engineering problems. For optimization problems, however, the differential calculus technique sometimes has a drawback when the objective function is step-wise, discontinuous, or multi-modal, or when decision variables are discrete rather than continuous. Thus, researchers have recently turned their interest to metaheuristic algorithms inspired by natural phenomena such as evolution, animal behavior, or metallic annealing. This book focuses on a music-inspired metaheuristic algorithm, harmony search. Interestingly, there exists an analogy between music and optimization: each musical instrument corresponds to a decision variable; a musical note corresponds to a variable value; and a harmony corresponds to a solution vector. Just as musicians in jazz improvisation play notes randomly or based on experience in order to find a fantastic harmony, variables in the harmony search algorithm take random values or previously memorized good values in order to find the optimal solution.
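The analogy above translates fairly directly into code. The sketch below is a minimal, generic harmony search loop; the parameter names, default values, and the sphere-function example are illustrative assumptions, not the book's reference implementation:

```python
import random

def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3,
                   bw=0.05, iterations=2000):
    """Minimal harmony search sketch.

    objective : function mapping a list of variable values to a cost
    bounds    : list of (low, high) pairs, one per decision variable
    hms       : harmony memory size
    hmcr      : harmony memory considering rate
    par       : pitch adjusting rate
    bw        : pitch-adjustment bandwidth
    """
    # Harmony memory: each "harmony" is a candidate solution vector.
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    costs = [objective(h) for h in memory]

    for _ in range(iterations):
        new = []
        for i, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:
                # Reuse a remembered value (the musicians' "experience")...
                value = random.choice(memory)[i]
                if random.random() < par:
                    # ...and optionally adjust its pitch slightly.
                    value = min(hi, max(lo, value + random.uniform(-bw, bw)))
            else:
                # Otherwise improvise a completely random note.
                value = random.uniform(lo, hi)
            new.append(value)
        new_cost = objective(new)
        worst = max(range(hms), key=lambda k: costs[k])
        if new_cost < costs[worst]:
            memory[worst], costs[worst] = new, new_cost

    best = min(range(hms), key=lambda k: costs[k])
    return memory[best], costs[best]

# Example: minimise the sphere function in two dimensions.
print(harmony_search(lambda v: sum(x * x for x in v), [(-5, 5), (-5, 5)]))
```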
The book discusses intelligent system design using soft computing and similar systems and their interdisciplinary applications. It also focuses on the recent trends to use soft computing as a versatile tool for designing a host of decision support systems.
This book presents an overview of the differential evolution algorithm. In the last few years the evolutionary computation domain has developed rapidly, and differential evolution is one of the representatives of this domain. It is a recently invented evolutionary algorithm that is gaining more and more popularity. Originally proposed for continuous unconstrained optimization, it was later extended both to mixed optimization and to handling nonlinear constraints. Later on, new strategies, the tuning and adaptation of control parameters, and ways of hybridization were elaborated. Attempts at theoretical analysis were accomplished as well. Moreover, the algorithm has a huge number of practical applications in different areas of science and industry.
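For readers unfamiliar with the algorithm, here is a minimal sketch of the widely known DE/rand/1/bin variant; the parameter values and the Rosenbrock test function are illustrative assumptions, and this is not necessarily the exact formulation covered in the book:

```python
import random

def differential_evolution(objective, bounds, pop_size=30, f=0.8,
                           cr=0.9, generations=200):
    """Minimal sketch of the classic DE/rand/1/bin scheme."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [objective(ind) for ind in pop]

    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct random individuals.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + f * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover with the current target vector.
            j_rand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < cr or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            # Clamp to the bounds, then apply greedy selection.
            trial = [min(hi, max(lo, v)) for v, (lo, hi) in zip(trial, bounds)]
            cost = objective(trial)
            if cost <= costs[i]:
                pop[i], costs[i] = trial, cost

    best = min(range(pop_size), key=lambda k: costs[k])
    return pop[best], costs[best]

# Example: minimise the Rosenbrock function in two dimensions.
rosenbrock = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
print(differential_evolution(rosenbrock, [(-2, 2), (-2, 2)]))
```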
This unique text/reference presents a thorough introduction to the field of structural pattern recognition, with a particular focus on graph edit distance (GED). The book also provides a detailed review of a diverse selection of novel methods related to GED, and concludes by suggesting possible avenues for future research. Topics and features: formally introduces the concept of GED, and highlights the basic properties of this graph matching paradigm; describes a reformulation of GED to a quadratic assignment problem; illustrates how the quadratic assignment problem of GED can be reduced to a linear sum assignment problem; reviews strategies for reducing both the overestimation of the true edit distance and the matching time in the approximation framework; examines the improvement demonstrated by the described algorithmic framework with respect to the distance accuracy and the matching time; includes appendices listing the datasets employed for the experimental evaluations discussed in the book.
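The reduction from a quadratic to a linear sum assignment problem mentioned above can be illustrated with a toy approximation: build a cost matrix of node substitutions, deletions and insertions and solve it with a Hungarian-style solver. The degree-difference cost model below is purely illustrative and is not the cost model defined in the book:

```python
# Toy bipartite approximation of graph edit distance via a linear sum
# assignment problem. Requires numpy and scipy.
import numpy as np
from scipy.optimize import linear_sum_assignment

def approx_ged(degrees_g1, degrees_g2, node_cost=1.0, forbidden=1e9):
    n, m = len(degrees_g1), len(degrees_g2)
    cost = np.full((n + m, n + m), forbidden)
    # Substitution cost: absolute difference of node degrees (toy model).
    cost[:n, :m] = np.abs(np.subtract.outer(degrees_g1, degrees_g2))
    np.fill_diagonal(cost[:n, m:], node_cost)   # deletions from graph 1
    np.fill_diagonal(cost[n:, :m], node_cost)   # insertions into graph 2
    cost[n:, m:] = 0.0                          # dummy-to-dummy is free
    rows, cols = linear_sum_assignment(cost)    # Hungarian-style LSAP solver
    return cost[rows, cols].sum()

# Example: compare two small graphs by their node degree sequences only.
print(approx_ged([2, 2, 1], [3, 1, 1, 1]))
```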
In this book, the following three approaches to data analysis are presented: Test Theory, founded by Sergei V. Yablonskii (1924-1998), with first publications in 1955 and 1958; Rough Sets, founded by Zdzisław I. Pawlak (1926-2006), with first publications in 1981 and 1982; and Logical Analysis of Data, founded by Peter L. Hammer (1936-2006), with first publications in 1986 and 1988. These three approaches have much in common, but researchers active in one of these areas often have limited knowledge of the results and methods developed in the other two. On the other hand, each of the approaches shows some originality, and we believe that the exchange of knowledge can stimulate the further development of each of them. This can lead to new theoretical results and real-life applications; in particular, new results based on a combination of these three data analysis approaches can be expected.
Evolutionary algorithms (EAs) are now a mature family of problem-solving heuristics that has found its way into many important real-life problems and into leading-edge scientific research. Spatially structured EAs have different properties than standard, mixing EAs. By virtue of the structured disposition of the population members they bring about new dynamical features that can be harnessed to solve difficult problems faster and more efficiently. This book describes the state of the art in spatially structured EAs by using graph concepts as a unifying theme. The models, their analysis, and their empirical behavior are presented in detail. Moreover, there is new material on non-standard networked population structures such as small-world networks. The book should be of interest to advanced undergraduate and graduate students working in evolutionary computation, machine learning, and optimization. It should also be useful to researchers and professionals working in fields where the topological structures of populations and their evolution play a role.
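To make the idea of a spatially structured population concrete, here is a minimal sketch of a cellular EA on a ring, where each individual recombines only with its two lattice neighbours rather than with the whole population; the ring topology, operators, and test function are illustrative assumptions, not the book's models:

```python
import random

def cellular_ea(objective, bounds, ring_size=30, generations=200, sigma=0.1):
    """Minimal cellular EA sketch: mating is restricted to ring neighbours."""
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(ring_size)]
    fit = [objective(ind) for ind in pop]

    for _ in range(generations):
        new_pop = []
        for i in range(ring_size):
            # Local neighbourhood on the ring: left and right neighbours only.
            mate = random.choice([pop[(i - 1) % ring_size], pop[(i + 1) % ring_size]])
            # Uniform crossover with the chosen neighbour, plus Gaussian mutation.
            child = [random.choice(genes) + random.gauss(0, sigma)
                     for genes in zip(pop[i], mate)]
            child = [min(hi, max(lo, g)) for g, (lo, hi) in zip(child, bounds)]
            # Local replacement: the child replaces the cell only if it is better.
            new_pop.append(child if objective(child) < fit[i] else pop[i])
        pop = new_pop
        fit = [objective(ind) for ind in pop]

    best = min(range(ring_size), key=lambda k: fit[k])
    return pop[best], fit[best]

# Example: minimise the sphere function in three dimensions.
print(cellular_ea(lambda v: sum(x * x for x in v), [(-5, 5)] * 3))
```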
This book provides the basic theory, techniques, and algorithms of modern cryptography that are applicable to network and cyberspace security. It consists of the following nine main chapters: Chapter 1 provides the basic concepts and ideas of cyberspace and cyberspace security; Chapters 2 and 3 provide an introduction to mathematical and computational preliminaries, respectively. Chapter 4 discusses the basic ideas and systems of secret-key cryptography, whereas Chapters 5, 6, and 7 discuss the basic ideas and systems of public-key cryptography based on integer factorization, discrete logarithms, and elliptic curves, respectively. Quantum-safe cryptography is presented in Chapter 8, and offensive cryptography, particularly cryptovirology, is covered in Chapter 9. This book can be used as a secondary text for final-year undergraduate students and first-year postgraduate students for courses in Computer, Network, and Cyberspace Security. Researchers and practitioners working in cyberspace security and network security will also find this book useful as a reference.
This book contains extended and revised versions of the best papers presented at the 18th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2010, held in Madrid, Spain, in September 2010. The 14 papers included in the book were carefully reviewed and selected from the 52 full papers presented at the conference. The papers cover a wide variety of topics in VLSI technology and advanced research. They address the current trend toward increasing chip integration and technology process advancements, which bring about stimulating new challenges both at the physical and system-design levels, as well as in the test of these systems.
One of the world's leading problems in the field of national security is the protection of borders and borderlands. This book addresses multiple issues concerning advanced, innovative methods of multi-level control of both unmanned ground vehicles (UGVs) and aerial drones (UAVs). Combined with innovative algorithms, these vehicles become autonomous objects capable of patrolling chosen borderland areas by themselves and of automatically informing the system operator about the potential place of detection of a specific incident. This is achieved by using sophisticated methods of generating collision-free trajectories for these types of objects and by enabling automatic integration of both ground and aerial unmanned vehicles. The topics included in this book also cover the presentation of complete information and communication technology (ICT) systems capable of control, observation and detection of various types of incidents and threats. This book is a valuable source of information for constructors and developers of such solutions for uniformed services. Scientists and researchers involved in computer vision, image processing, data fusion, control algorithms or ICT can find many valuable suggestions and solutions. Multiple challenges for such systems are also presented.
Computer science is the science of the future, and already underlies every facet of business and technology, and much of our everyday lives. In addition, it will play a crucial role in the science of the 21st century, which will be dominated by biology and biochemistry, similar to the role of mathematics in the physical sciences of the 20th century. In this award-winning best-seller, the author and his co-author focus on the fundamentals of computer science, which revolve around the notion of the "algorithm." They discuss the design of algorithms, their efficiency and correctness, the inherent limitations of algorithms and computation, quantum algorithms, concurrency, large systems and artificial intelligence. Throughout, the authors, in their own words, stress the 'fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'. This version of the book is published to celebrate 25 years since its first edition, and in honor of the Alan M. Turing Centennial year. Turing was a true pioneer of computer science, whose work forms the underlying basis of much of this book.
This book constitutes the thoroughly refereed post-conference proceedings of the 6th IFIP WG 9.2, 9.6/11.7, 11.4, 11.6/PrimeLife International Summer School, held in Helsingborg, Sweden, in August 2010. The 27 revised papers were carefully selected from numerous submissions during two rounds of reviewing. They are organized in topical sections on terminology, privacy metrics, ethical, social, and legal aspects, data protection and identity management, eID cards and eID interoperability, emerging technologies, privacy for eGovernment and AAL applications, social networks and privacy, privacy policies, and usable privacy.
"Fixed-Point Algorithms for Inverse Problems in Science and Engineering" presents some of the most recent work from top-notch researchers studying projection and other first-order fixed-point algorithms in several areas of mathematics and the applied sciences. The material presented provides a survey of the state-of-the-art theory and practice in fixed-point algorithms, identifying emerging problems driven by applications, and discussing new approaches for solving these problems. This book incorporates diverse perspectives from broad-ranging areas of research including, variational analysis, numerical linear algebra, biotechnology, materials science, computational solid-state physics, and chemistry. Topics presented include: Theory of Fixed-point algorithms: convex analysis, convex optimization, subdifferential calculus, nonsmooth analysis, proximal point methods, projection methods, resolvent and related fixed-point theoretic methods, and monotone operator theory. Numerical analysis of fixed-point algorithms: choice of step lengths, of weights, of blocks for block-iterative and parallel methods, and of relaxation parameters; regularization of ill-posed problems; numerical comparison of various methods. Areas of Applications: engineering (image and signal reconstruction and decompression problems), computer tomography and radiation treatment planning (convex feasibility problems), astronomy (adaptive optics), crystallography (molecular structure reconstruction), computational chemistry (molecular structure simulation) and other areas. Because of the variety of applications presented, this book can easily serve as a basis for new and innovated research and collaboration. |