Cyberspace security is a critical subject of our times. On the one hand, the development of the Internet, mobile communications, distributed computing, and the software and databases storing essential enterprise information has facilitated business and personal communication between individuals. On the other hand, it has created many opportunities for abuse, fraud and expensive damage. This book is a selection of the best papers presented at the NATO Advanced Research Workshop on the subject of cyberspace security and defense. The individual contributions are advanced and suitable for senior and graduate students, researchers and technologists who wish to get a feeling for the state of the art in several sub-disciplines of cyberspace security. Several papers provide a broad-brush description of national security issues and brief summaries of the state of the technology; these can be read and appreciated by technically minded managers and executives who want to understand security issues and approaches to technical solutions. The important question of our times is not "Should we do something to enhance the security of our digital assets?" but rather "How do we do it?"
This monograph addresses advances in representation learning, a cutting-edge research area of machine learning. Representation learning refers to modern data transformation techniques that convert data of different modalities and complexity, including texts, graphs, and relations, into compact tabular representations, which effectively capture their semantic properties and relations. The monograph focuses on (i) propositionalization approaches, established in relational learning and inductive logic programming, and (ii) embedding approaches, which have gained popularity with recent advances in deep learning. The authors establish a unifying perspective on representation learning techniques developed in these various areas of modern data science, enabling the reader to understand the common underlying principles and to gain insight using selected examples and sample Python code. The monograph should be of interest to a wide audience, ranging from data scientists, machine learning researchers and students to developers, software engineers and industrial researchers interested in hands-on AI solutions.
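To make the core idea concrete, here is a minimal sketch (our own toy illustration, not code from the monograph) of one classical way to turn texts into a compact tabular representation: build a term-document count matrix and compress it with a truncated SVD, the LSA-style embedding that predates deep approaches.

# Toy sketch: text -> compact vector representation via truncated SVD.
# Illustrative only; the document set and dimension k=2 are hypothetical.
import numpy as np

docs = ["graphs and relations", "texts and relations", "deep learning embeddings"]
vocab = sorted({w for d in docs for w in d.split()})

# Term-document count matrix: one row per word, one column per document.
counts = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# Truncated SVD: keep k latent dimensions as a dense document embedding.
k = 2
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
doc_embeddings = (np.diag(s[:k]) @ Vt[:k]).T   # shape: (num_docs, k)

print(doc_embeddings)  # each row is a compact vector for one document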
New Approaches to Circle Packing into the Square is devoted to the most recent results on the densest packing of equal circles in a square. In recent decades, many articles have addressed this question, which has attracted sustained interest because it poses a hard challenge both in discrete geometry and in mathematical programming. The authors have studied this geometrical optimization problem for a long time and have developed several new algorithms to solve it. The book gives complete coverage of the investigations on this topic.
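The optimization problem behind the book has a classical equivalent formulation, sketched here in our own notation (not quoted from the book): packing n equal circles of maximal radius r_n in the unit square is equivalent to spreading n points in the unit square so that their minimum pairwise distance m_n is as large as possible.

\[
  m_n \;=\; \max_{p_1,\dots,p_n \in [0,1]^2} \; \min_{i \neq j} \lVert p_i - p_j \rVert_2 ,
  \qquad
  r_n \;=\; \frac{m_n}{2\,(m_n + 1)} .
\]

The relation follows because circle centres must keep distance r_n from the boundary, so they live in an inner square of side 1 - 2r_n with pairwise distances at least 2r_n; rescaling that inner square to the unit square gives m_n = 2r_n / (1 - 2r_n).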
The book presents a unified treatment of integer programming and network models, with topics ranging from exact and heuristic algorithms to network flows, traveling salesman tours, and traffic assignment problems. While the emphasis of the book is on models and applications, the most important methods and algorithms are described in detail and illustrated by numerical examples. The formulations and the discussion of a large variety of models provide insight into their structures, allowing the user to better evaluate solutions to the problems.
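As a hedged point of reference for one model the blurb names, the traveling salesman problem, here is a brute-force exact solver on a hypothetical 4-city instance. Exhaustive search is only viable for tiny instances, which is precisely why the integer-programming formulations and heuristics such a book develops matter.

# Brute-force TSP on a tiny symmetric distance matrix (illustrative data).
from itertools import permutations

dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tour_length(tour):
    # Total length of the closed tour that returns to its starting city.
    return sum(dist[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

# Fix city 0 as the start to avoid counting rotations of the same tour.
best = min((list((0,) + p) for p in permutations(range(1, 4))), key=tour_length)
print(best, tour_length(best))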
Bioinformatics is growing by leaps and bounds; theories, algorithms and statistical techniques are constantly evolving. Nevertheless, a core body of algorithmic ideas has emerged, and researchers are beginning to adopt a "problem solving" approach to bioinformatics, wherein they use solutions to well-abstracted problems as building blocks for solving larger-scope problems. "Problem Solving Handbook for Computational Biology and Bioinformatics" is an edited volume with contributions from world-renowned leaders in the field. This comprehensive handbook, with its problem-solving emphasis, covers all relevant areas of computational biology and bioinformatics. Web resources and related themes are highlighted at every opportunity in this central, easy-to-read reference. Designed for advanced-level students, researchers and professors in computer science and bioengineering as a reference or secondary text, the handbook is also suitable for professionals working in industry.
This book introduces wireless personal communications from the point of view of wireless communication system researchers. Existing sources on wireless communications place more emphasis on simulation and on the fundamental principles of building a study model. In this volume, the aim is to pass on to readers as much knowledge as is essential for building simulation models of wireless communications, focusing on wireless personal area networks (WPANs). This book is the first of its kind to give step-by-step details on how to build a WPAN simulation model, and the many study models presented help readers form a clear picture of the complete wireless simulation model. The book is also the first treatise on wireless communication to give a comprehensive introduction to data-length complexity, the computational complexity of the processed data, and error control schemes. This volume is useful for academic and technical staff in the fields of telecommunications and wireless communications, as it presents many scenarios for enhancing weak error control performance and for reducing the complexity of wireless data and image transmission. Many examples are given to help readers understand the material covered in the book, and additional resources, such as MATLAB code for some of the examples, are also provided.
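The book's own examples use MATLAB; purely as a toy stand-in for the family of error control schemes it analyses, here is the simplest such scheme, a 3x repetition code with majority-vote decoding (our illustrative sketch, not the book's WPAN schemes).

# Repetition code: each bit sent three times, decoded by majority vote.
import random

random.seed(1)

def encode(bits):
    return [b for b in bits for _ in range(3)]        # repeat each bit 3 times

def channel(bits, p=0.1):
    return [b ^ (random.random() < p) for b in bits]  # flip each bit with prob. p

def decode(coded):
    # Majority vote over each group of three received bits.
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
received = channel(encode(msg))
print(msg == decode(received))  # True unless some triple suffered two or more flips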
This unique text/reference reviews algorithms for the exact or approximate solution of shortest-path problems, with a specific focus on a class of algorithms called rubberband algorithms. Discussing each concept and algorithm in depth, the book includes mathematical proofs for many of the given statements. Topics and features: provides theoretical and programming exercises at the end of each chapter; presents a thorough introduction to shortest paths in Euclidean geometry, and the class of algorithms called rubberband algorithms; discusses algorithms for calculating exact or approximate ESPs in the plane; examines the shortest paths on 3D surfaces, in simple polyhedrons and in cube-curves; describes the application of rubberband algorithms for solving art gallery problems, including the safari, zookeeper, watchman, and touring polygons route problems; includes lists of symbols and abbreviations, in addition to other appendices.
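The rubberband algorithms the book covers are specialised geometric methods; as a generic point of comparison only, here is the textbook Dijkstra algorithm for shortest paths on a weighted graph (a standard method, not the book's own, on a hypothetical example graph).

# Dijkstra's shortest-path algorithm with a binary heap.
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbour, weight), ...]}; returns shortest distances.
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1.0), ("c", 4.0)], "b": [("c", 2.0)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0.0, 'b': 1.0, 'c': 3.0}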
Universally acclaimed as the book on garbage collection, this is a complete and up-to-date revision of the 2012 Garbage Collection Handbook. It offers thorough coverage of parallel, concurrent and real-time garbage collection algorithms, including C4, Garbage First, LXR, Shenandoah, Transactional Sapphire and ZGC, as well as garbage collection on the GPU. It gives clear explanations of the trickier aspects of garbage collection, including the interface to the run-time system, the handling of finalisation and weak references, and support for dynamic languages, and adds new chapters on energy-aware garbage collection and on persistence and garbage collection. The e-book includes more than 40,000 hyperlinks to algorithms, figures, glossary entries, indexed items, original research papers and much more, and the book is backed by a comprehensive online database of over 3,400 garbage collection-related publications.
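For readers new to the topic, here is a toy mark-sweep collector, a hedged sketch of the classic algorithm such a handbook covers in depth; real collectors (parallel, concurrent, generational) are vastly more sophisticated.

# Toy mark-sweep garbage collection over explicit object references.
class Obj:
    def __init__(self, name):
        self.name = name
        self.refs = []       # outgoing references to other heap objects
        self.marked = False

def mark(roots):
    # Mark phase: flag everything reachable from the roots.
    stack = list(roots)
    while stack:
        o = stack.pop()
        if not o.marked:
            o.marked = True
            stack.extend(o.refs)

def sweep(heap):
    # Sweep phase: keep marked objects (clearing their marks), reclaim the rest.
    survivors = [o for o in heap if o.marked]
    for o in survivors:
        o.marked = False
    return survivors

a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(b)             # a -> b; c is unreachable, i.e. garbage
heap = [a, b, c]
mark([a])                    # a is the only root
heap = sweep(heap)
print([o.name for o in heap])  # ['a', 'b']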
Privacy requirements have an increasing impact on the realization of modern applications. Commercial and legal regulations demand that privacy guarantees be provided whenever sensitive information is stored, processed, or communicated to external parties. Current approaches encrypt sensitive data, thus reducing query execution efficiency and preventing selective information release. Preserving Privacy in Data Outsourcing presents a comprehensive approach for protecting highly sensitive information when it is stored on systems that are not under the data owner's control. The approach illustrated combines access control and encryption, enforcing access control via structured encryption. This solution, coupled with efficient algorithms for key derivation and distribution, provides efficient and secure authorization management on outsourced data, allowing the data owner to outsource not only the data but the security policy itself. To reduce the amount of data to be encrypted, the book also investigates data fragmentation as a complementary means of protecting the privacy of data associations: associations broken by fragmentation are visible only to users authorized (by knowing the proper key) to join the fragments. Finally, the book investigates the problem of executing queries over data distributed across different servers, where execution must be controlled so that sensitive information and sensitive associations are visible only to authorized parties. Case studies are provided throughout the book. Professionals working in privacy, data mining, data protection, data outsourcing, electronic commerce and machine learning will find this book a valuable asset, as will members of professional associations such as the ACM and IEEE. The book is also suitable as a secondary text or reference for advanced-level students and researchers in computer science.
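A minimal sketch of the general idea behind hierarchical key derivation (our own assumptions, not the book's exact construction): each child key is a one-way hash of its parent key plus a public label, so a user holding a key can derive the keys of all descendants but never those of an ancestor.

# Hierarchical key derivation via a one-way hash: child = H(parent || label).
import hashlib

def derive(parent_key: bytes, label: str) -> bytes:
    return hashlib.sha256(parent_key + label.encode()).digest()

root = b"\x00" * 32                  # hypothetical key held only by the data owner
k_medical = derive(root, "medical")  # grants access to the "medical" subtree
k_mri = derive(k_medical, "mri")     # derivable by anyone holding k_medical

print(k_mri.hex()[:16])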
Covering all the essential components of Unix/Linux, including process management, concurrent programming, timer and time service, file systems and network programming, this textbook emphasizes programming practice in the Unix/Linux environment. Systems Programming in Unix/Linux is intended as a textbook for systems programming courses in technically-oriented Computer Science/Engineering curricula that emphasize both theory and programming practice. The book contains many detailed working example programs with complete source code, and is also suitable for self-study by advanced programmers and computer enthusiasts. Systems programming is an indispensable part of Computer Science/Engineering education. For readers who have taken an introductory programming course, this book furthers their knowledge by detailing how dynamic data structures are used in practice, with programming exercises and programming projects on such topics as C structures, pointers, linked lists and trees. It provides a wide range of knowledge about computer system software and advanced programming skills, allowing readers to interface with the operating-system kernel, make efficient use of system resources and develop application software. It also prepares readers with the background needed to pursue advanced studies in Computer Science/Engineering, such as operating systems, embedded systems, database systems, data mining, artificial intelligence, computer networks, network security, and distributed and parallel computing.
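The book's examples are written in C; purely as a hedged illustration of the same Unix process-management pattern it teaches, here is fork/exec/wait expressed through Python's os module (Unix-only; a sketch, not the book's code).

# The classic Unix process pattern: fork a child, exec a program, wait for it.
import os

pid = os.fork()                      # duplicate the current process
if pid == 0:
    # Child: replace this process image with a new program found via PATH.
    os.execvp("echo", ["echo", "hello from the child"])
else:
    # Parent: block until the child terminates, then report its exit status.
    _, status = os.waitpid(pid, 0)
    print("child exited with status", os.WEXITSTATUS(status))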
A modern information retrieval system must have the capability to find, organize and present very different manifestations of information - such as text, pictures, videos or database records - any of which may be of relevance to the user. However, the concept of relevance, while seemingly intuitive, is actually hard to define, and it's even harder to model in a formal way. Lavrenko does not attempt to bring forth a new definition of relevance, nor provide arguments as to why any particular definition might be theoretically superior or more complete. Instead, he takes a widely accepted, albeit somewhat conservative definition, makes several assumptions, and from them develops a new probabilistic model that explicitly captures that notion of relevance. With this book, he makes two major contributions to the field of information retrieval: first, a new way to look at topical relevance, complementing the two dominant models, i.e., the classical probabilistic model and the language modeling approach, and which explicitly combines documents, queries, and relevance in a single formalism; second, a new method for modeling exchangeable sequences of discrete random variables which does not make any structural assumptions about the data and which can also handle rare events. Thus his book is of major interest to researchers and graduate students in information retrieval who specialize in relevance modeling, ranking algorithms, and language modeling.
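For orientation, here is a small sketch of the query-likelihood language-modelling approach that the blurb names as one of the two dominant models (a standard formulation with Jelinek-Mercer smoothing, not Lavrenko's relevance model itself; the corpus is hypothetical).

# Query-likelihood retrieval: score a document by the smoothed probability
# its language model assigns to the query terms.
from collections import Counter

docs = {"d1": "apple pie recipe".split(), "d2": "apple phone review".split()}
collection = [w for d in docs.values() for w in d]
coll_tf = Counter(collection)

def score(query, doc, lam=0.5):
    tf, n = Counter(doc), len(doc)
    p = 1.0
    for w in query.split():
        p_doc = tf[w] / n
        p_coll = coll_tf[w] / len(collection)
        p *= lam * p_doc + (1 - lam) * p_coll   # smoothed P(w | doc)
    return p

for name, d in docs.items():
    print(name, score("apple recipe", d))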
This volume contains a collection of research and survey papers written by some of the most eminent mathematicians in the international community and is dedicated to Helmut Maier, whose own research has been groundbreaking and deeply influential to the field. Specific emphasis is given to topics regarding exponential and trigonometric sums and their behavior in short intervals, anatomy of integers and cyclotomic polynomials, small gaps in sequences of sifted prime numbers, oscillation theorems for primes in arithmetic progressions, inequalities related to the distribution of primes in short intervals, the Möbius function, Euler's totient function, the Riemann zeta function and the Riemann Hypothesis. Graduate students, research mathematicians, as well as computer scientists and engineers who are interested in pure and interdisciplinary research, will find this volume a useful resource. Contributors to this volume: Bill Allombert, Levent Alpoge, Nadine Amersi, Yuri Bilu, Régis de la Bretèche, Christian Elsholtz, John B. Friedlander, Kevin Ford, Daniel A. Goldston, Steven M. Gonek, Andrew Granville, Adam J. Harper, Glyn Harman, D. R. Heath-Brown, Aleksandar Ivić, Geoffrey Iyer, Jerzy Kaczorowski, Daniel M. Kane, Sergei Konyagin, Dimitris Koukoulopoulos, Michel L. Lapidus, Oleg Lazarev, Andrew H. Ledoan, Robert J. Lemke Oliver, Florian Luca, James Maynard, Steven J. Miller, Hugh L. Montgomery, Melvyn B. Nathanson, Ashkan Nikeghbali, Alberto Perelli, Amalia Pizarro-Madariaga, János Pintz, Paul Pollack, Carl Pomerance, Michael Th. Rassias, Maksym Radziwiłł, Joël Rivat, András Sárközy, Jeffrey Shallit, Terence Tao, Gérald Tenenbaum, László Tóth, Tamar Ziegler, Liyang Zhang.
This book gathers selected papers presented at the International Conference on Advancements in Computing and Management (ICACM 2019). Discussing current research in the field of artificial intelligence and machine learning, cloud computing, recent trends in security, natural language processing and machine translation, parallel and distributed algorithms, as well as pattern recognition and analysis, it is a valuable resource for academics, practitioners in industry and decision-makers.
In recent years, IT application scenarios have evolved in very innovative ways. Highly distributed networks have now become a common platform for large-scale distributed programming, high-bandwidth communications are inexpensive and widespread, and most of our work tools are equipped with processors enabling us to perform a multitude of tasks. In addition, mobile computing (referring specifically to wireless devices and, more broadly, to dynamically configured systems) has made it possible to exploit interaction in novel ways. Among the topics covered are algorithms, complexity and models of computation.
This book contains a collection of survey papers in the areas of algorithms, languages and complexity, the three areas in which Professor Ronald V. Book has made significant contributions. As a former student and a co-author who have been influenced by him directly, we would like to dedicate this book to Professor Ronald V. Book to honor and celebrate his sixtieth birthday. Professor Book began his brilliant academic career in 1958, graduating from Grinnell College with a Bachelor of Arts degree. He obtained a Master of Arts in Teaching degree in 1960 and a Master of Arts degree in 1964, both from Wesleyan University, and a Doctor of Philosophy degree from Harvard University in 1969, under the guidance of Professor Sheila A. Greibach. Professor Book's research in discrete mathematics and theoretical computer science is reflected in more than 150 scientific publications, works that have made a strong impact on the development of several areas of theoretical computer science. A more detailed summary of his scientific research appears in this volume separately.
This book provides an extensive review of three interrelated issues: land fragmentation, land consolidation, and land reallocation, and it presents in detail the theoretical background, design, development and application of a prototype integrated planning and decision support system for land consolidation. The system integrates geographic information systems (GIS) and artificial intelligence techniques including expert systems (ES) and genetic algorithms (GAs) with multi-criteria decision methods (MCDM), both multi-attribute (MADM) and multi-objective (MODM). The system is based on four modules for measuring land fragmentation; automatically generating alternative land redistribution plans; evaluating those plans; and automatically designing the land partitioning plan. The presented research provides a new scientific framework for land-consolidation planning both in terms of theory and practice, by presenting new findings and by developing better tools and methods embedded in an integrated GIS environment. It also makes a valuable contribution to the fields of GIS and spatial planning, as it provides new methods and ideas that could be applied to improve the former for the benefit of the latter in the context of planning support systems. From the 1960s onward, ambitious research activities sought to provide IT support for the complex and time-consuming redistribution processes within land consolidation, without any practically relevant results until now; this scientific work is likely to close that gap. This distinguished publication is highly recommended to land consolidation planning experts, researchers and academics alike. Prof. Dr.-Ing. Joachim Thomas, Münster, Germany; Prof. Michael Batty, University College London
Robust Technology with Analysis of Interference in Signal Processing discusses, for the first time, the theoretical fundamentals and algorithms for analyzing noise as an information carrier, on the basis of which a robust technology for processing noisy signals is developed. This technology can be applied to problems of control, identification, diagnostics, and pattern recognition in petrochemistry, power engineering, geophysics, medicine, physics, aviation, and other sciences and industries. The text explores the emerging possibility of forecasting failures in various objects, exploiting the fact that failures follow hidden microchanges revealed via interference estimates. This monograph is of interest to students, postgraduates, engineers, scientific associates and others concerned with processing measurement information on computers.
These proceedings contain the papers of IFIP/SEC 2010. It was a special honour and privilege to chair the Program Committee and prepare the proceedings for this conference, the 25th in a series of well-established international conferences on security and privacy organized annually by Technical Committee 11 (TC-11) of IFIP. Moreover, in 2010 it was part of the IFIP World Computer Congress 2010, celebrating both the Golden Jubilee of IFIP (founded in 1960) and the Silver Jubilee of the SEC conference, in the exciting city of Brisbane, Australia, during September 20-23. The call for papers went out with the challenging motto of "Security & Privacy - Silver Linings in the Cloud", building a bridge between the long-standing issues of security and privacy and the most recent developments in information and communication technology. It attracted 102 submissions, all of which were evaluated on the basis of their significance, novelty, and technical quality by at least five members of the Program Committee. The Program Committee meeting was held electronically over a period of a week. Of the papers submitted, 25 were selected for presentation at the conference; the acceptance rate was therefore as low as 24.5%, making SEC 2010 a highly competitive forum. One of those 25 submissions could unfortunately not be included in the proceedings, as none of its authors registered in time to present the paper at the conference.
Speech Dereverberation gathers together an overview, a mathematical formulation of the problem, and the state-of-the-art solutions for dereverberation. It presents current approaches to the problem of reverberation, provides a review of topics in room acoustics, and describes performance measures for dereverberation. The algorithms are then explained with mathematical analysis and examples that enable the reader to see the strengths and weaknesses of the various techniques, as well as giving an understanding of the questions still to be addressed. Techniques rooted in speech enhancement are included, in addition to a treatment of multichannel blind acoustic system identification and inversion. The TRINICON framework is shown, in the context of dereverberation, to be a generalization of the signal processing for a range of analysis and enhancement techniques. Speech Dereverberation is suitable for students at masters and doctoral level, as well as established researchers.
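The signal model underlying the whole field, shown here as a hedged toy sketch (our own synthetic impulse response, not one of the book's algorithms): a reverberant signal is the clean signal convolved with a room impulse response, y[n] = sum_k h[k] x[n-k].

# Toy reverberation model and its inverse for a synthetic impulse response.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200)           # stand-in for a clean speech frame

# Crude room model: exponentially decaying impulse response h[k] = 0.6**k.
h = 0.6 ** np.arange(8)
y = np.convolve(x, h)                  # reverberant observation

# Because this h is a truncated geometric series, the short FIR filter
# [1, -0.6] approximately inverts it (exact up to the truncation tail).
x_hat = np.convolve(y, [1.0, -0.6])[:len(x)]
print(np.max(np.abs(x_hat - x)))       # small residual from truncating h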
Evolutionary Algorithms for Embedded System Design describes how Evolutionary Algorithm (EA) concepts can be applied to circuit and system design - an area where time-to-market demands are critical. EAs create an interesting alternative to other approaches since they can be scaled with the problem size and can be easily run on parallel computer systems. This book presents several successful EA techniques and shows how they can be applied at different levels of the design process. Starting on a high-level abstraction, where software components are dominant, several optimization steps are demonstrated, including DSP code optimization and test generation. Throughout the book, EAs are tested on real-world applications and on large problem instances. For each application the main criteria for the successful application in the corresponding domain are discussed. In addition, contributions from leading international researchers provide the reader with a variety of perspectives, including a special focus on the combination of EAs with problem specific heuristics. Evolutionary Algorithms for Embedded System Design is an excellent reference for both practitioners working in the area of circuit and system design and for researchers in the field of evolutionary concepts.
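To make the EA loop the book builds on concrete, here is a hedged, minimal genetic algorithm on the toy "onemax" problem (maximise the number of 1-bits): selection, crossover, mutation. This is a generic sketch, not one of the book's DSP or test-generation applications; all parameters are illustrative.

# Minimal genetic algorithm: tournament selection, one-point crossover,
# per-bit mutation, applied to the onemax toy problem.
import random

random.seed(0)
N, BITS, GENS = 20, 16, 60

def fitness(ind):
    return sum(ind)

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(N)]
for _ in range(GENS):
    nxt = []
    while len(nxt) < N:
        p1 = max(random.sample(pop, 3), key=fitness)   # tournament selection
        p2 = max(random.sample(pop, 3), key=fitness)
        cut = random.randrange(1, BITS)                # one-point crossover
        child = p1[:cut] + p2[cut:]
        for i in range(BITS):                          # bit-flip mutation
            if random.random() < 1.0 / BITS:
                child[i] ^= 1
        nxt.append(child)
    pop = nxt

print(max(map(fitness, pop)), "of", BITS)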
Designing Sorting Networks: A New Paradigm provides an in-depth guide to maximizing the efficiency of sorting networks, and uses 0/1 cases, partially ordered sets and Hasse diagrams to closely analyze their behavior in an easy, intuitive manner. This book also outlines new ideas and techniques for designing faster sorting networks using Sortnet, and illustrates how these techniques were used to design faster 12-key and 18-key sorting networks through a series of case studies. Finally, it examines and explains the mysterious behavior exhibited by the fastest-known 9-step 16-key network. Designing Sorting Networks: A New Paradigm is intended for advanced-level students, researchers and practitioners as a reference book. Academics in the fields of computer science, engineering and mathematics will also find this book invaluable.
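The 0/1 cases the blurb mentions reflect the 0/1 principle: a comparator network sorts every input if and only if it sorts every binary input. A hedged toy check on the classic 5-comparator network for 4 keys (illustrative, not taken from the book):

# Verify a 4-key sorting network on all binary inputs via the 0/1 principle.
from itertools import product

# Each pair (i, j) is a compare-exchange placing the min at i, the max at j.
network_4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

def apply_network(network, values):
    v = list(values)
    for i, j in network:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

# Exhaustively test all 2**4 binary inputs (sufficient by the 0/1 principle).
assert all(apply_network(network_4, bits) == sorted(bits)
           for bits in product([0, 1], repeat=4))
print("network sorts all inputs")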
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics in this volume include: evolutionary constraints, relaxation of selection mechanisms, diversity preservation strategies, flexing fitness evaluation, evolution in dynamic environments, multi-objective and multi-modal selection, foundations of evolvability, evolvable and adaptive evolutionary operators, foundation of injecting expert knowledge in evolutionary search, analysis of problem difficulty and required GP algorithm complexity, foundations in running GP on the cloud - communication, cooperation, flexible implementation, and ensemble methods. Additional focal points for GP symbolic regression are: (1) The need to guarantee convergence to solutions in the function discovery mode; (2) Issues on model validation; (3) The need for model analysis workflows for insight generation based on generated GP solutions - model exploration, visualization, variable selection, dimensionality analysis; (4) Issues in combining different types of data. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
This book contains extended and revised versions of the best papers that were presented during the fifteenth edition of the IFIP/IEEE WG10.5 International Conference on Very Large Scale Integration, a global System-on-a-Chip Design & CAD conference, held at the Georgia Institute of Technology, Atlanta, USA (October 15-17, 2007). Previous conferences have taken place in Edinburgh, Trondheim, Vancouver, Munich, Grenoble, Tokyo, Gramado, Lisbon, Montpellier, Darmstadt, Perth and Nice. The purpose of this conference, sponsored by IFIP TC 10 Working Group 10.5 and by the IEEE Council on Electronic Design Automation (CEDA), is to provide a forum to exchange ideas and show industrial and academic research results in the field of microelectronics design. The current trend toward increasing chip integration and technology process advancements brings about stimulating new challenges both at the physical and system-design levels, as well as in the testing of these systems. VLSI-SoC conferences aim to address these exciting new issues.