This book presents the theory of continuum mechanics for mechanical, thermodynamical, and electrodynamical systems. It shows how to obtain the governing equations and applies them in realistic computations. It uses only open-source codes developed under the FEniCS project and includes codes for 20 engineering applications from mechanics, fluid dynamics, applied thermodynamics, and electromagnetism. Moreover, it derives and utilizes the constitutive equations, including coupling terms, which make it possible to compute multiphysics problems by incorporating interactions between the primitive variables, namely motion, temperature, and electromagnetic fields. An engineering system is described by the primitive variables satisfying field equations that are partial differential equations in space and time. The field equations are mostly coupled and nonlinear, in other words, difficult to solve. In order to solve the coupled, nonlinear system of partial differential equations, the book uses a novel collection of open-source packages developed under the FEniCS project. All primitive variables are solved at once in a fully coupled fashion, using the finite difference method in time and the finite element method in space.
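For a flavor of this solution strategy, the following minimal sketch (not code from the book) solves the transient heat equation with a backward-Euler finite difference in time and P1 finite elements in space, using the legacy FEniCS Python API; the mesh resolution, time step, initial condition, and boundary values are illustrative assumptions.

```python
# A minimal sketch, assuming the legacy FEniCS ("fenics") Python API:
# backward Euler in time, P1 finite elements in space, heat equation.
from fenics import (UnitSquareMesh, FunctionSpace, TrialFunction,
                    TestFunction, Function, DirichletBC, Constant,
                    interpolate, Expression, dot, grad, dx, solve)

mesh = UnitSquareMesh(16, 16)
V = FunctionSpace(mesh, "P", 1)

u = TrialFunction(V)
v = TestFunction(V)
# Illustrative initial temperature field: a Gaussian bump.
u_n = interpolate(Expression("exp(-50*(pow(x[0]-0.5,2)+pow(x[1]-0.5,2)))",
                             degree=2), V)
bc = DirichletBC(V, Constant(0.0), "on_boundary")

dt = 0.01
# Backward Euler: (u - u_n)/dt = laplace(u), written in weak form.
a = u*v*dx + dt*dot(grad(u), grad(v))*dx
L = u_n*v*dx

u_sol = Function(V)
for step in range(10):          # march 10 time steps
    solve(a == L, u_sol, bc)
    u_n.assign(u_sol)
```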
The information infrastructure---comprising computers, embedded devices, networks and software systems---is vital to day-to-day operations in every sector: information and telecommunications, banking and finance, energy, chemicals and hazardous materials, agriculture, food, water, public health, emergency services, transportation, postal and shipping, government and defense. Global business and industry, governments, indeed society itself, cannot function effectively if major components of the critical information infrastructure are degraded, disabled or destroyed. Critical Infrastructure Protection V describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. Areas of coverage include: Themes and Issues, Control Systems Security, Infrastructure Security, and Infrastructure Modeling and Simulation. This book is the 5th volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of 14 edited papers from the 5th Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection, held at Dartmouth College, Hanover, New Hampshire, USA in the spring of 2011. Critical Infrastructure Protection V is an important resource for researchers, faculty members and graduate students, as well as for policy makers, practitioners and other individuals with interests in homeland security. Jonathan Butts is an Assistant Professor of Computer Science at the Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, USA. Sujeet Shenoi is the F.P. Walter Professor of Computer Science at the University of Tulsa, Tulsa, Oklahoma, USA.
The 20th century saw tremendous achievements and progress in science and
The Workshop on the Economics of Information Security was established in 2002 to bring together computer scientists and economists to understand and improve the poor state of information security practice. WEIS was borne out of a realization that security often fails for non-technical reasons; rather, the incentives of both defender and attacker must be considered. Earlier workshops have answered questions ranging from finding optimal levels of security investment to understanding why privacy has been eroded. In the process, WEIS has attracted participation from such diverse fields as law, management and psychology. WEIS has now established itself as the leading forum for interdisciplinary scholarship on information security. The eighth installment of the conference returned to the United Kingdom, hosted by University College London on June 24-25, 2009. Approximately 100 researchers, practitioners and government officials from across the globe convened in London to hear presentations from authors of 21 peer-reviewed papers, in addition to a panel and keynote lectures from Hal Varian (Google), Bruce Schneier (BT Counterpane), Martin Sadler (HP Labs), and Robert Coles (Merrill Lynch). Angela Sasse and David Pym chaired the conference, while Christos Ioannidis and Tyler Moore chaired the program committee.
This book contains extended and revised versions of the best papers that were presented during the 16th edition of the IFIP/IEEE WG10.5 International Conference on Very Large Scale Integration, a global System-on-a-Chip Design & CAD conference. The 16th conference was held at the Grand Hotel of Rhodes Island, Greece (October 13-15, 2008). Previous conferences have taken place in Edinburgh, Trondheim, Vancouver, Munich, Grenoble, Tokyo, Gramado, Lisbon, Montpellier, Darmstadt, Perth, Nice and Atlanta. VLSI-SoC 2008 was the 16th in a series of international conferences sponsored by IFIP TC 10 Working Group 10.5 and IEEE CEDA that explores the state of the art and the new developments in the field of VLSI systems and their designs. The purpose of the conference was to provide a forum to exchange ideas and to present industrial and research results in the fields of VLSI/ULSI systems, embedded systems and microelectronic design and test.
Evolutionary Algorithms, in particular Evolution Strategies, Genetic Algorithms, and Evolutionary Programming, have found wide acceptance as robust optimization algorithms over the last ten years. Compared with their broad adoption and the resulting practical success in different scientific fields, the theory has not progressed as much. This monograph provides the framework and the first steps toward the theoretical analysis of Evolution Strategies (ES). The main emphasis is on understanding the functioning of these probabilistic optimization algorithms in real-valued search spaces by investigating the dynamical properties of some well-established ES algorithms. The book introduces the basic concepts of this analysis, such as progress rate, quality gain, and self-adaptation response, and describes how to calculate these quantities. Based on the analysis, functioning principles are derived, aiming at a qualitative understanding of why and how ES algorithms work.
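As a small illustration of the class of algorithms analyzed (not the book's derivations), here is a (1+1)-Evolution Strategy with the classic 1/5th-success-rule step-size adaptation minimizing the sphere function; all parameter values are conventional textbook choices.

```python
# A minimal (1+1)-ES sketch with 1/5th-success-rule step-size control,
# minimizing the sphere function in a real-valued search space.
import random

def sphere(x):
    return sum(xi * xi for xi in x)

n = 10
parent = [random.uniform(-5, 5) for _ in range(n)]
sigma = 1.0                      # mutation strength (step size)
successes, window = 0, 20

for gen in range(1, 2001):
    child = [xi + sigma * random.gauss(0, 1) for xi in parent]
    if sphere(child) <= sphere(parent):   # offspring replaces parent
        parent = child
        successes += 1
    if gen % window == 0:                 # adapt sigma every 20 trials
        rate = successes / window
        sigma *= 1.22 if rate > 0.2 else 0.82
        successes = 0

print(sphere(parent))
```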
This book gives an overview of constraint satisfaction problems (CSPs), adapts related search algorithms and consistency algorithms for application to multi-agent systems, and consolidates recent research devoted to cooperation in such systems. The techniques introduced are applied to various problems in multi-agent systems. Among the new approaches is a hybrid-type algorithm for weak-commitment search, combining backtracking and iterative improvement; in addition, an extension of the basic CSP formalization called partial CSP is introduced in order to handle over-constrained CSPs. The book is written for advanced students and professionals interested in multi-agent systems or, more generally, in distributed artificial intelligence and constraint satisfaction. Researchers active in the area will appreciate this book as a valuable source of reference.
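For a flavor of the iterative-improvement ingredient, here is a minimal min-conflicts sketch on the n-queens problem; this is a standard illustration only, not the weak-commitment search algorithm developed in the book.

```python
# A minimal min-conflicts sketch for n-queens: one queen per column,
# repeatedly move a conflicted queen to its least-conflicted row.
import random

def conflicts(cols, col, row):
    return sum(1 for c2, r2 in enumerate(cols)
               if c2 != col and (r2 == row or abs(r2 - row) == abs(c2 - col)))

def min_conflicts(n=8, max_steps=10000):
    cols = [random.randrange(n) for _ in range(n)]   # random initial placement
    for _ in range(max_steps):
        bad = [c for c in range(n) if conflicts(cols, c, cols[c]) > 0]
        if not bad:
            return cols                              # solution found
        c = random.choice(bad)
        # move the queen in column c to a row minimizing conflicts
        cols[c] = min(range(n), key=lambda r: conflicts(cols, c, r))
    return None

print(min_conflicts())
```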
Throughout time, scientists have looked to nature in order to understand and model solutions for complex real-world problems. In particular, the study of self-organizing entities, such as social insect populations, presents a new opportunity within the field of artificial intelligence. Emerging Research on Swarm Intelligence and Algorithm Optimization discusses current research analyzing how the collective behavior of decentralized systems in the natural world can be applied to intelligent system design. Discussing the application of swarm principles, optimization techniques, and key algorithms being used in the field, this publication serves as an essential reference for academicians, upper-level students, IT developers, and IT theorists.
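As a concrete example of a swarm-inspired optimizer of the kind surveyed here, the following is a minimal particle swarm optimization sketch; the objective function and coefficients are common textbook choices, not taken from this publication.

```python
# A minimal particle swarm optimization (PSO) sketch on the sphere
# function; standard inertia/attraction coefficients.
import random

def f(x):                         # objective: sphere function
    return sum(xi * xi for xi in x)

n, dim = 20, 5
pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
vel = [[0.0] * dim for _ in range(n)]
pbest = [p[:] for p in pos]       # each particle's best position
gbest = min(pbest, key=f)         # swarm's best position

w, c1, c2 = 0.7, 1.5, 1.5         # inertia, cognitive, social weights
for _ in range(200):
    for i in range(n):
        for d in range(dim):
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=f)

print(f(gbest))
```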
This book describes a novel methodology for studying algorithmic skills, understood as cognitive activities related to rule-based symbolic transformation, and argues that some human computational abilities may be interpreted and analyzed as genuine examples of extended cognition. It shows that the performance of these abilities relies not only on innate neurocognitive systems or language-related skills, but also on external tools and general agent-environment interactions. Further, it asserts that a low-level analysis, based on a set of core neurocognitive systems linking numbers and language, is not sufficient to explain some specific forms of high-level numerical skills, like those involved in algorithm execution. To this end, it reports on the design of a cognitive architecture for modeling all the relevant features involved in the execution of algorithmic strategies, including external tools such as paper and pencil. The first part of the book discusses the philosophical premises for endorsing and justifying a position in philosophy of mind that links a modified form of computationalism with some recent theoretical and scientific developments, like those introduced by the so-called dynamical approach to cognition. The second part is dedicated to the description of a Turing-machine-inspired cognitive architecture, expressly designed to formalize all kinds of algorithmic strategies.
The book presents a unified treatment of integer programming and network models, with topics ranging from exact and heuristic algorithms to network flows, traveling salesman tours, and traffic assignment problems. While the emphasis of the book is on models and applications, the most important methods and algorithms are described in detail and illustrated by numerical examples. The formulations and the discussion of a large variety of models provide insight into their structures, which allows the user to better evaluate the solutions to the problems.
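As a toy illustration of one model family the book treats, the following sketch solves a tiny traveling salesman instance exactly by exhaustive enumeration; the distance matrix is invented for illustration, and realistic instances require the integer-programming and heuristic methods the book describes.

```python
# A minimal exact TSP sketch: enumerate all tours of a 4-city instance
# (fixing city 0 as the start) and pick the shortest.
from itertools import permutations

dist = [[0, 2, 9, 10],
        [1, 0, 6, 4],
        [15, 7, 0, 8],
        [6, 3, 12, 0]]

def tour_length(tour):
    # sum of leg lengths, returning to the start city at the end
    return sum(dist[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

cities = range(1, len(dist))
best = min(([0] + list(p) for p in permutations(cities)), key=tour_length)
print(best, tour_length(best))
```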
Cyberspace security is a critical subject of our times. On the one hand, the development of the Internet, mobile communications, distributed computing, and computer software and databases storing essential enterprise information has facilitated business and personal communication between individual people. On the other hand, it has created many opportunities for abuse, fraud and expensive damage. This book is a selection of the best papers presented at the NATO Advanced Research Workshop dealing with the subject of Cyberspace Security and Defense. The level of the individual contributions in the volume is advanced and suitable for senior and graduate students, researchers and technologists who wish to get some feeling of the state of the art in several sub-disciplines of cyberspace security. Several papers provide a broad-brush description of national security issues and brief summaries of technology states. These papers can be read and appreciated by technically enlightened managers and executives who want to understand security issues and approaches to technical solutions. An important question of our times is not "Should we do something to enhance the security of our digital assets?" but rather "How do we do it?"
This book bridges the widening gap between two crucial constituents of computational intelligence: the rapidly advancing technologies of machine learning in the digital information age, and the relatively slow-moving field of general-purpose search and optimization algorithms. With this in mind, the book serves to offer a data-driven view of optimization, through the framework of memetic computation (MC). The authors provide a summary of the complete timeline of research activities in MC, from the initiation of memes as local search heuristics hybridized with evolutionary algorithms to their modern interpretation as computationally encoded building blocks of problem-solving knowledge that can be learned from one task and adaptively transmitted to another. In the light of recent research advances, the authors emphasize the further development of MC as a simultaneous problem learning and optimization paradigm with the potential to showcase human-like problem-solving prowess; that is, by equipping optimization engines to acquire increasing levels of intelligence over time through embedded memes learned independently or via interactions. In other words, the adaptive utilization of available knowledge memes makes it possible for optimization engines to tailor custom search behaviors on the fly - thereby paving the way to general-purpose problem-solving ability (or artificial general intelligence). In this regard, the book explores some of the latest concepts from the optimization literature, including the sequential transfer of knowledge across problems, multitasking, and large-scale (high dimensional) search, systematically discussing associated algorithmic developments that align with the general theme of memetics. The presented ideas are intended to be accessible to a wide audience of scientific researchers, engineers, students, and optimization practitioners who are familiar with the commonly used terminologies of evolutionary computation. A full appreciation of the mathematical formalizations and algorithmic contributions requires an elementary background in probability, statistics, and the concepts of machine learning. A prior knowledge of surrogate-assisted/Bayesian optimization techniques is useful, but not essential.
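For readers new to the starting point of this timeline, here is a minimal memetic-algorithm sketch in the classic sense of a genetic algorithm hybridized with a local search "meme"; the OneMax problem and all parameters are illustrative assumptions, not the authors' algorithms.

```python
# A minimal memetic algorithm: steady-state GA whose offspring are
# refined by a greedy local search "meme", on the OneMax problem.
import random

def fitness(bits):                 # OneMax: maximize the number of 1s
    return sum(bits)

def local_search(bits):            # meme: flip the first 0 bit, if any
    for i in range(len(bits)):
        if bits[i] == 0:
            return bits[:i] + [1] + bits[i+1:]
    return bits

n, pop_size = 30, 20
pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]

for _ in range(100):
    a, b = random.sample(pop, 2)
    cut = random.randrange(1, n)                   # one-point crossover
    child = a[:cut] + b[cut:]
    if random.random() < 0.2:                      # bit-flip mutation
        j = random.randrange(n)
        child[j] ^= 1
    child = local_search(child)                    # memetic refinement
    worst = min(range(pop_size), key=lambda i: fitness(pop[i]))
    if fitness(child) > fitness(pop[worst]):       # replace the worst
        pop[worst] = child

print(max(fitness(p) for p in pop))
```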
New Approaches to Circle Packing into the Square is devoted to the most recent results on the densest packing of equal circles in a square. In the last few decades, many articles have considered this question, which has been an object of interest since it is a hard challenge both in discrete geometry and in mathematical programming. The authors have studied this geometrical optimization problem for a long time, and they developed several new algorithms to solve it. The book completely covers the investigations on this topic.
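To make the underlying optimization problem concrete (this is not one of the authors' algorithms), the following sketch spreads n points in the unit square by random-restart hill climbing so as to maximize their minimum pairwise distance m, from which the equivalent packed-circle radius is r = m / (2(m + 1)).

```python
# A minimal sketch: maximize the minimum pairwise distance of n points
# in the unit square via random-restart hill climbing, then convert to
# the equivalent equal-circle radius r = m / (2*(m + 1)).
import random, math

def min_dist(pts):
    return min(math.dist(p, q) for i, p in enumerate(pts)
               for q in pts[i + 1:])

def pack(n=8, restarts=20, steps=3000):
    best = None
    for _ in range(restarts):
        pts = [(random.random(), random.random()) for _ in range(n)]
        step = 0.1
        for _ in range(steps):
            i = random.randrange(n)
            cand = pts[:]          # perturb one point, clamped to the square
            cand[i] = (min(1, max(0, pts[i][0] + random.uniform(-step, step))),
                       min(1, max(0, pts[i][1] + random.uniform(-step, step))))
            if min_dist(cand) > min_dist(pts):
                pts = cand
            step *= 0.999          # slowly shrink the move size
        if best is None or min_dist(pts) > min_dist(best):
            best = pts
    m = min_dist(best)
    return best, m / (2 * (m + 1))

pts, r = pack()
print(r)
```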
This book brings together research on numerical methods adapted for Graphics Processing Units (GPUs). It explains recent efforts to adapt classic numerical methods, including the solution of linear equations and the FFT, for massively parallel GPU architectures. This volume consolidates recent research and adaptations, covering widely used methods that are at the core of many scientific and engineering computations. Each chapter is written by authors working on a specific group of methods; these leading experts provide mathematical background, parallel algorithms and implementation details leading to reusable, adaptable and scalable code fragments. The book also serves as a GPU implementation manual for many numerical algorithms, sharing GPU tips that can increase application efficiency. The valuable insights into parallelization strategies for GPUs are supplemented by ready-to-use code fragments. Numerical Computations with GPUs targets professionals and researchers working in high performance computing and GPU programming. Advanced-level students focused on computer science and mathematics will also find this book useful as a secondary textbook or reference.
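As a minimal taste of moving a classic numerical kernel to a GPU, the sketch below runs an FFT with CuPy, a NumPy-compatible GPU array library; this is an illustrative assumption rather than code from the book, and it requires a CUDA-capable GPU with the cupy package installed.

```python
# A minimal GPU-FFT sketch with CuPy (assumes a CUDA-capable GPU).
import numpy as np
import cupy as cp

x_cpu = np.random.rand(1 << 20).astype(np.float32)

x_gpu = cp.asarray(x_cpu)        # host -> device transfer
y_gpu = cp.fft.rfft(x_gpu)       # FFT executes on the GPU
y_cpu = cp.asnumpy(y_gpu)        # device -> host transfer

# Verify against the CPU reference implementation.
np.testing.assert_allclose(y_cpu, np.fft.rfft(x_cpu), rtol=1e-3, atol=1e-3)
```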
This unique text/reference reviews algorithms for the exact or approximate solution of shortest-path problems, with a specific focus on a class of algorithms called rubberband algorithms. Discussing each concept and algorithm in depth, the book includes mathematical proofs for many of the given statements. Topics and features: provides theoretical and programming exercises at the end of each chapter; presents a thorough introduction to shortest paths in Euclidean geometry, and the class of algorithms called rubberband algorithms; discusses algorithms for calculating exact or approximate ESPs in the plane; examines the shortest paths on 3D surfaces, in simple polyhedrons and in cube-curves; describes the application of rubberband algorithms for solving art gallery problems, including the safari, zookeeper, watchman, and touring polygons route problems; includes lists of symbols and abbreviations, in addition to other appendices.
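To convey the core rubberband idea in a simplified setting (not the book's full algorithms), the sketch below shortens a path that must visit a sequence of vertical segments in order, repeatedly moving each visit point to the locally optimal position on its segment, like a rubber band contracting; the segment data is invented for illustration.

```python
# A minimal rubberband-style sketch: shortest path from s to t that
# visits a sequence of vertical segments in order.
import math

def shorten(a, seg, b, iters=60):
    """Ternary-search the point on segment (x, y0..y1) minimizing
    dist(a, p) + dist(p, b); the objective is convex in y."""
    x, lo, hi = seg
    f = lambda y: math.dist(a, (x, y)) + math.dist((x, y), b)
    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (x, (lo + hi) / 2)

s, t = (0.0, 0.0), (4.0, 0.0)
segs = [(1.0, -2.0, 2.0), (2.0, 0.5, 3.0), (3.0, -1.5, 1.0)]  # (x, y0, y1)
pts = [(x, (y0 + y1) / 2) for x, y0, y1 in segs]   # initial guesses

for _ in range(100):                # rubberband iterations until stable
    for i in range(len(pts)):
        prev = s if i == 0 else pts[i - 1]
        nxt = t if i == len(pts) - 1 else pts[i + 1]
        pts[i] = shorten(prev, segs[i], nxt)

path = [s] + pts + [t]
print(sum(math.dist(a, b) for a, b in zip(path, path[1:])))
```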
The latest edition of a classic text on concurrency and distributed programming - from a winner of the ACM/SIGCSE Award for Outstanding Contribution to Computer Science Education.
A modern information retrieval system must have the capability to find, organize and present very different manifestations of information - such as text, pictures, videos or database records - any of which may be of relevance to the user. However, the concept of relevance, while seemingly intuitive, is actually hard to define, and it's even harder to model in a formal way. Lavrenko does not attempt to bring forth a new definition of relevance, nor provide arguments as to why any particular definition might be theoretically superior or more complete. Instead, he takes a widely accepted, albeit somewhat conservative definition, makes several assumptions, and from them develops a new probabilistic model that explicitly captures that notion of relevance. With this book, he makes two major contributions to the field of information retrieval: first, a new way to look at topical relevance, complementing the two dominant models, i.e., the classical probabilistic model and the language modeling approach, and which explicitly combines documents, queries, and relevance in a single formalism; second, a new method for modeling exchangeable sequences of discrete random variables which does not make any structural assumptions about the data and which can also handle rare events. Thus his book is of major interest to researchers and graduate students in information retrieval who specialize in relevance modeling, ranking algorithms, and language modeling.
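For contrast with the relevance-based view, here is a minimal sketch of the language-modeling approach mentioned above: documents are ranked by query likelihood under a Dirichlet-smoothed unigram model. The toy corpus, query, and smoothing parameter are illustrative assumptions.

```python
# A minimal query-likelihood ranking sketch with Dirichlet smoothing.
import math
from collections import Counter

docs = ["relevance models for information retrieval",
        "probabilistic ranking of documents",
        "language models smooth rare events in retrieval"]
query = "relevance in retrieval".split()   # all terms occur in the corpus

coll = Counter(w for d in docs for w in d.split())   # collection counts
coll_len = sum(coll.values())
mu = 10.0                                  # Dirichlet smoothing parameter

def score(doc):
    tf, dlen = Counter(doc.split()), len(doc.split())
    # log P(query | doc) with Dirichlet-smoothed term probabilities
    return sum(math.log((tf[w] + mu * coll[w] / coll_len) / (dlen + mu))
               for w in query)

for d in sorted(docs, key=score, reverse=True):
    print(round(score(d), 3), d)
```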
In recent years, IT application scenarios have evolved in very innovative ways. Highly distributed networks have now become a common platform for large-scale distributed programming, high-bandwidth communications are inexpensive and widespread, and most of our work tools are equipped with processors enabling us to perform a multitude of tasks. In addition, mobile computing (referring specifically to wireless devices and, more broadly, to dynamically configured systems) has made it possible to exploit interaction in novel ways.
- Algorithms, Complexity and Models of Computation;
This book introduces wireless personal communications from the point of view of wireless communication system researchers. Existing sources on wireless communications put more emphasis on simulation and on the fundamental principles of how to build a study model. This volume aims to pass on to readers as much knowledge as is essential for completing the model building of wireless communications, focusing on wireless personal area networks (WPANs). The book is the first of its kind to give step-by-step details on how to build a WPAN simulation model, and the many study models presented help readers form a clear picture of the whole wireless simulation model. It is also the first treatise on wireless communication to give a comprehensive introduction to data-length complexity, the computational complexity of the processed data, and error control schemes. The volume is useful for academic and technical staff in the fields of telecommunications and wireless communications, as it presents many scenarios for enhancing weak error control performance and for reducing the complexity of wireless data and image transmission. Many examples are given to help readers understand the material covered in the book, and additional resources such as MATLAB codes for some of the examples are also provided.
This book contains a collection of survey papers in the areas of algorithms, languages and complexity, the three areas in which Professor Ronald V. Book has made significant contributions. As a former student and a co-author who have been influenced by him directly, we would like to dedicate this book to Professor Ronald V. Book to honor and celebrate his sixtieth birthday. Professor Book initiated his brilliant academic career in 1958, graduating from Grinnell College with a Bachelor of Arts degree. He obtained a Master of Arts in Teaching degree in 1960 and a Master of Arts degree in 1964, both from Wesleyan University, and a Doctor of Philosophy degree from Harvard University in 1969, under the guidance of Professor Sheila A. Greibach. Professor Book's research in discrete mathematics and theoretical computer science is reflected in more than 150 scientific publications. These works have made a strong impact on the development of several areas of theoretical computer science. A more detailed summary of his scientific research appears in this volume separately.
Robust Technology with Analysis of Interference in Signal Processing discusses, for the first time, the theoretical fundamentals and algorithms for the analysis of noise as an information carrier. On this basis, a robust technology for processing noisy signals is developed. This technology can be applied to solving problems of control, identification, diagnostics, and pattern recognition in petrochemistry, energetics, geophysics, medicine, physics, aviation, and other sciences and industries. The text explores the emerging possibility of forecasting failures in various objects, based on the fact that failures follow hidden microchanges revealed via interference estimates. This monograph is of interest to students, postgraduates, engineers, scientific associates and others who are concerned with processing measurement information on computers.
This book is an up-to-date self-contained compendium of the research carried out by the authors on model-based diagnosis of a class of discrete-event systems called active systems. After defining the diagnosis problem, the book copes with a variety of reasoning mechanisms that generate the diagnosis, possibly within a monitoring setting. The book is structured into twelve chapters, each of which has its own introduction and concludes with bibliographic notes and itemized summaries. Concepts and techniques are presented with the help of numerous examples, figures, and tables, and when appropriate these concepts are formalized into propositions and theorems, while detailed algorithms are expressed in pseudocode. This work is primarily intended for researchers, professionals, and graduate students in the fields of artificial intelligence and control theory.
Privacy requirements have an increasing impact on the realization of modern applications. Commercial and legal regulations demand that privacy guarantees be provided whenever sensitive information is stored, processed, or communicated to external parties. Current approaches encrypt sensitive data, thus reducing query execution efficiency and preventing selective information release. Preserving Privacy in Data Outsourcing presents a comprehensive approach for protecting highly sensitive information when it is stored on systems that are not under the data owner's control. The approach illustrated combines access control and encryption, enforcing access control via structured encryption. This solution, coupled with efficient algorithms for key derivation and distribution, provides efficient and secure authorization management on outsourced data, allowing the data owner to outsource not only the data but the security policy itself. To reduce the amount of data to be encrypted, the book also investigates data fragmentation as a complementary means of protecting the privacy of data associations: associations broken by fragmentation are visible only to users authorized (by knowing the proper keys) to join the fragments. Finally, the book investigates the problem of executing queries over data distributed across different servers, where execution must be controlled so that sensitive information and sensitive associations are visible only to authorized parties. Case studies are provided throughout the book. Privacy, data mining, data protection, data outsourcing, electronic commerce, and machine learning professionals and others working in these related fields will find this book a valuable asset, as will primary associations such as ACM, IEEE and Management Science. This book is also suitable for advanced-level students and researchers concentrating on computer science, as a secondary text or reference book.
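As a minimal sketch of the kind of key-derivation machinery such schemes build on (an illustrative assumption, not the book's exact construction), the following derives per-fragment keys from a single master key, so the owner hands each user only the keys for the fragments that user may access.

```python
# A minimal hash-based key-derivation sketch: per-fragment keys are
# derived from a master key via HMAC over a public label.
import hashlib, hmac

MASTER_KEY = b"owner-master-secret"        # hypothetical master key

def derive_key(label: str) -> bytes:
    """Derive a fragment key from the master key and a public label."""
    return hmac.new(MASTER_KEY, label.encode(), hashlib.sha256).digest()

k_f1 = derive_key("fragment-1")            # key protecting fragment 1
k_f2 = derive_key("fragment-2")            # key protecting fragment 2
# A user authorized for fragment 1 only ever receives k_f1; without
# k_f2, the association split across the two fragments stays hidden.
print(k_f1.hex()[:16], k_f2.hex()[:16])
```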
This book provides an extensive review of three interrelated issues: land fragmentation, land consolidation, and land reallocation, and it presents in detail the theoretical background, design, development and application of a prototype integrated planning and decision support system for land consolidation. The system integrates geographic information systems (GIS) and artificial intelligence techniques, including expert systems (ES) and genetic algorithms (GAs), with multi-criteria decision methods (MCDM), both multi-attribute (MADM) and multi-objective (MODM). The system is based on four modules for measuring land fragmentation; automatically generating alternative land redistribution plans; evaluating those plans; and automatically designing the land partitioning plan. The presented research provides a new scientific framework for land-consolidation planning both in terms of theory and practice, by presenting new findings and by developing better tools and methods embedded in an integrated GIS environment. It also makes a valuable contribution to the fields of GIS and spatial planning, as it provides new methods and ideas that could be applied to improve the former for the benefit of the latter in the context of planning support systems. Since the 1960s, ambitious research activities have pursued IT support for the complex and time-consuming redistribution processes within land consolidation, without any practically relevant results until now. This scientific work is likely to close that gap. This distinguished publication is highly recommended to land consolidation planning experts, researchers and academics alike. Prof. Dr.-Ing. Joachim Thomas, Münster, Germany; Prof. Michael Batty, University College London
You may like...
- Comprehensive Metaheuristics… - S. Ali Mirjalili, Amir Hossein Gandomi (Paperback, R3,956)
- Test Generation of Crosstalk Delay… - S. Jayanthy, M.C. Bhuvaneswari (Hardcover, R3,785)
- Cohesive Subgraph Computation over Large… - Lijun Chang, Lu Qin (Hardcover, R1,408)