This book constitutes the refereed proceedings of the 22nd International Symposium on String Processing and Information Retrieval, SPIRE 2015, held in London, UK, in September 2015. The 28 full and 6 short papers included in this volume were carefully reviewed and selected from 90 submissions. The papers cover research in all aspects of string processing, information retrieval, computational biology, pattern matching, semi-structured data, and related applications.
The explosion of information technology has led to substantial growth of web-accessible linguistic data in terms of quantity, diversity and complexity. These resources become even more useful when interlinked with each other to generate network effects. The general trend of providing data online is thus accompanied by newly developing methodologies to interconnect linguistic data and metadata. This includes linguistic data collections, general-purpose knowledge bases (e.g., DBpedia, a machine-readable edition of Wikipedia), and repositories with specific information about languages, linguistic categories and phenomena. The Linked Data paradigm provides a framework for interoperability and access management, and thereby allows information from such a diverse set of resources to be integrated. The contributions assembled in this volume illustrate the breadth of applications of the Linked Data paradigm for representative types of language resources. They cover lexical-semantic resources, annotated corpora, typological databases as well as terminology and metadata repositories. The book includes representative applications from diverse fields, ranging from academic linguistics (e.g., typology and corpus linguistics) through applied linguistics (e.g., lexicography and translation studies) to technical applications (in computational linguistics, Natural Language Processing and information technology). This volume accompanies the Workshop on Linked Data in Linguistics 2012 (LDL-2012) in Frankfurt/M., Germany, organized by the Open Linguistics Working Group (OWLG) of the Open Knowledge Foundation (OKFN). It assembles contributions of the workshop participants and, beyond this, summarizes initial steps in the formation of a Linked Open Data cloud of linguistic resources, the Linguistic Linked Open Data cloud (LLOD).
This volume constitutes the proceedings of the 9th International Conference on Hybrid Artificial Intelligent Systems, HAIS 2014, held in Salamanca, Spain, in June 2014. The 61 papers published in this volume were carefully reviewed and selected from 199 submissions. They are organized in topical sessions on HAIS applications; data mining and knowledge discovery; video and image analysis; bio-inspired models and evolutionary computation; learning algorithms; hybrid intelligent systems for data mining and applications; and classification and cluster analysis.
This book constitutes the thoroughly refereed post-conference proceedings of the workshops of the 19th International Conference on Parallel Computing, Euro-Par 2013, held in Aachen, Germany, in August 2013. The 99 papers presented were carefully reviewed and selected from 145 submissions. The papers come from thirteen workshops. Seven had been co-located with Euro-Par in previous years: BigDataCloud (Second Workshop on Big Data Management in Clouds), HeteroPar (11th Workshop on Algorithms, Models and Tools for Parallel Computing on Heterogeneous Platforms), HiBB (Fourth Workshop on High Performance Bioinformatics and Biomedicine), OMHI (Second Workshop on On-chip Memory Hierarchies and Interconnects), PROPER (Sixth Workshop on Productivity and Performance), Resilience (Sixth Workshop on Resiliency in High Performance Computing with Clusters, Clouds, and Grids), and UCHPC (Sixth Workshop on UnConventional High Performance Computing). Six were newcomers: DIHC (First Workshop on Dependability and Interoperability in Heterogeneous Clouds), FedICI (First Workshop on Federative and Interoperable Cloud Infrastructures), LSDVE (First Workshop on Large Scale Distributed Virtual Environments on Clouds and P2P), MHPC (Workshop on Middleware for HPC and Big Data Systems), PADABS (First Workshop on Parallel and Distributed Agent Based Simulations), and ROME (First Workshop on Runtime and Operating Systems for the Many-core Era). All these workshops focus on the promotion and advancement of all aspects of parallel and distributed computing.
The two main themes of this book, logic and complexity, are both essential for understanding the main problems about the foundations of mathematics. Logical Foundations of Mathematics and Computational Complexity covers a broad spectrum of results in logic and set theory that are relevant to the foundations, as well as the results in computational complexity and the interdisciplinary area of proof complexity. The author presents his ideas on how these areas are connected, what the most fundamental problems are, and how they should be approached. In particular, he argues that complexity is as important for foundations as are the more traditional concepts of computability and provability. Emphasis is on explaining the essence of concepts and the ideas of proofs, rather than presenting precise formal statements and full proofs. Each section starts with concepts and results easily explained, and gradually proceeds to more difficult ones. The notes after each section present some formal definitions, theorems and proofs. Logical Foundations of Mathematics and Computational Complexity is aimed at graduate students of all fields of mathematics who are interested in logic, complexity and foundations. It will also be of interest to physicists and philosophers who are curious to learn the basics of logic and complexity theory.
This volume contains the post-proceedings of the 9th Doctoral Workshop on Mathematical and Engineering Methods in Computer Science, MEMICS 2014, held in Telč, Czech Republic, in October 2014. The 13 thoroughly revised papers were carefully selected out of 28 submissions and are presented together with 4 invited papers. The topics covered by the papers include: algorithms, logic, and games; high performance computing; computer aided analysis, verification, and testing; hardware design and diagnostics; computer graphics and image processing; and artificial intelligence and natural language processing.
Beginning Oracle SQL is your introduction to the interactive query tools and specific dialect of SQL used with Oracle Database. These tools include SQL*Plus and SQL Developer. SQL*Plus is the one tool any Oracle developer or database administrator can always count on, and it is widely used in creating scripts to automate routine tasks. SQL Developer is a powerful, graphical environment for developing and debugging queries. Oracle's dialect is possibly the most valuable form of SQL from a career standpoint. Oracle's database engine is widely used in corporate environments worldwide. It is also found in many government applications. Oracle SQL implements many features not found in competing products. No developer or DBA working with Oracle can afford to be without knowledge of these features and how they work, because of the performance and expressiveness they bring to the table. Written in an easygoing and example-based style, Beginning Oracle SQL is the book that will get you started down the path to successfully writing SQL statements and getting results from Oracle Database. It takes an example-based approach, with clear and authoritative explanations; introduces both SQL and the query tools used to execute SQL statements; and shows how to create tables, populate them with data, and then query that data to generate business results, as in the sketch below.
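The create/populate/query cycle the book teaches is the same in any SQL dialect. Here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for an Oracle connection; the employees table and its columns are hypothetical examples, not taken from the book:

```python
import sqlite3

# An in-memory database stands in for Oracle here.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Create a table.
cur.execute(
    "CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary REAL)"
)

# Populate it with data.
cur.executemany(
    "INSERT INTO employees (name, salary) VALUES (?, ?)",
    [("Alice", 90000.0), ("Bob", 75000.0), ("Carol", 110000.0)],
)
conn.commit()

# Query the data to produce a business result: the average salary.
cur.execute("SELECT AVG(salary) FROM employees")
print(cur.fetchone()[0])  # 91666.66...
```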
This book constitutes the proceedings of the 5th International Meeting on Algebraic and Algorithmic Aspects of Differential and Integral Operators, AADIOS 2012, held at the Applications of Computer Algebra Conference in Sofia, Bulgaria, on June 25-28, 2012. The total of 9 papers presented in this volume consists of 2 invited papers and 7 regular papers which were carefully reviewed and selected from 13 submissions. The topics of interest are: symbolic computation for operator algebras, factorization of differential/integral operators, linear boundary problems and Green's operators, initial value problems for differential equations, symbolic integration and differential Galois theory, symbolic operator calculi, algorithmic D-module theory, Rota-Baxter algebra, differential algebra, as well as discrete analogs and software aspects of the above.
The transformation towards EPCglobal networks requires technical equipment for capturing event data and IT systems to store and exchange them with supply chain participants. For the first time, supply chain participants thus need to handle the automatic exchange of event data with business partners. Data protection of sensitive business secrets is therefore the major aspect that needs to be clarified before companies will start to adopt EPCglobal networks. This book contributes to this goal as follows: it defines the design of transparent real-time security extensions for EPCglobal networks based on in-memory technology. To this end, it defines authentication protocols for devices with low computational resources, such as passive RFID tags, and evaluates their applicability. Furthermore, it outlines all steps for implementing history-based access control for EPCglobal software components, which enables continuous control of access based on real-time analysis of the complete query history and fine-grained filtering of event data. The applicability of these innovative data protection mechanisms is underlined by their exemplary integration in the FOSSTRAK architecture.
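The general idea of history-based access control can be illustrated generically (a toy sketch of the principle, not the book's EPCglobal design): each authorization decision considers the requester's accumulated query history rather than the current request alone, for example by capping how many distinct items one partner may trace. All names below are hypothetical:

```python
from collections import defaultdict

class HistoryBasedAccessControl:
    """Toy history-based access control: each decision looks at the
    requester's full query history, not just the current request."""

    def __init__(self, max_distinct_items=100):
        self.max_distinct_items = max_distinct_items
        self.history = defaultdict(set)  # partner -> set of queried item IDs

    def authorize(self, partner, item_id):
        seen = self.history[partner]
        # Deny once the accumulated history would exceed the allowed
        # tracing scope, even though each single query looks harmless.
        if item_id not in seen and len(seen) >= self.max_distinct_items:
            return False
        seen.add(item_id)
        return True

acl = HistoryBasedAccessControl(max_distinct_items=2)
print(acl.authorize("partner-a", "epc-1"))  # True
print(acl.authorize("partner-a", "epc-2"))  # True
print(acl.authorize("partner-a", "epc-3"))  # False: history limit reached
```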
The importance of benchmarking in the service sector is well recognized, as it helps in continuous improvement of products and work processes. Through benchmarking, companies have strived to implement best practices in order to remain competitive in the product market in which they operate. However, studies on benchmarking, particularly in the software development sector, have neglected to use multiple variables and therefore have not been comprehensive. Information Theory and Best Practices in the IT Industry fills this void by examining benchmarking in the business of software development and studying how it is affected by development process, application type, hardware platforms used, and many other variables. The book begins by examining practices of benchmarking productivity and critically appraises them. Next it identifies the different variables which affect productivity and those that affect quality, developing useful equations that explain their relationships. Finally, these equations and findings are applied to case studies. Utilizing this book, practitioners can decide what emphasis to attach to different variables in their own companies while seeking to optimize productivity and defect density.
This book constitutes the refereed proceedings of the 11th Latin American Symposium on Theoretical Informatics, LATIN 2014, held in Montevideo, Uruguay, in March/April 2014. The 65 papers presented together with 5 abstracts were carefully reviewed and selected from 192 submissions. The papers address a variety of topics in theoretical computer science, with a certain focus on complexity, computational geometry, graph drawing, automata, computability, algorithms on graphs, random structures, analytic and enumerative combinatorics, approximation algorithms, analysis of algorithms, computational algebra, applications to bioinformatics, budget problems, and algorithms and data structures.
This book constitutes the refereed proceedings of the 11th International Conference on Economics of Grids, Clouds, Systems, and Services, GECON 2014, held in Cardiff, UK, in September 2014. The 8 revised full papers and 7 papers-in-progress presented were carefully reviewed and selected from 24 submissions. The presentation sessions are: Cloud Adoption; Work in Progress on Market Dynamics; Cost Optimization; Work in Progress on Pricing, Contracts and Service Selection; and Economic Aspects of Quality of Service.
Semidefinite programs constitute one of the largest classes of optimization problems that can be solved with reasonable efficiency - both in theory and practice. They play a key role in a variety of research areas, such as combinatorial optimization, approximation algorithms, computational complexity, graph theory, geometry, real algebraic geometry and quantum computing. This book is an introduction to selected aspects of semidefinite programming and its use in approximation algorithms. It covers the basics but also a significant amount of recent and more advanced material. There are many computational problems, such as MAXCUT, for which one cannot reasonably expect to obtain an exact solution efficiently, and in such cases one has to settle for approximate solutions. For MAXCUT and its relatives, exciting recent results suggest that semidefinite programming is probably the ultimate tool. Indeed, assuming the Unique Games Conjecture, a plausible but as yet unproven hypothesis, it was shown that for these problems, known algorithms based on semidefinite programming deliver the best possible approximation ratios among all polynomial-time algorithms. This book follows the "semidefinite side" of these developments, presenting some of the main ideas behind approximation algorithms based on semidefinite programming. It develops the basic theory of semidefinite programming, presents one of the known efficient algorithms in detail, and describes the principles of some others. It also includes applications, focusing on approximation algorithms.
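To make the MAXCUT example concrete, here is the standard Goemans-Williamson semidefinite relaxation (textbook material, not this book's specific notation): each vertex's ±1 label is relaxed to a unit vector, and rounding with a random hyperplane recovers a cut whose expected weight is at least roughly 0.878 times the optimum.

```latex
\max \ \frac{1}{2} \sum_{\{i,j\} \in E} w_{ij}\left(1 - \langle v_i, v_j \rangle\right)
\qquad \text{subject to } \|v_i\| = 1 \ \text{for all } i \in V
```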
Fine pitch high lead count integrated circuit packages represent a dramatic change from the conventional methods of assembling electronic components to a printed interconnect circuit board. To some, these FPTpackages appear to bean extension of the assembly technology called surface mount or SMT. Many of us who have spent a significant amount of time developing the process and design techniques for these fine pitchpackages haveconcluded that these techniquesgobeyondthose commonly useed for SMT. In 1987 the presentauthor, convincedofthe uniqueness ofthe assembly and design demands ofthese packages, chaired ajoint committee where the members agreed to use fine pitch technology (FPT) as the defining term for these demands. The committee was unique in several ways, one being that it was the first time three U. S. standards organizations, the IPC (Lincolnwood, IL), theEIA(Washington, D. C. ),and theASTM (Philadelphia),cametogether tocreate standards before a technology was in high demand. The term fine pitch technology and its acronym FPT have since become widely accepted in the electronics industry. The knowledge of the terms and demands of FPT currently exceed the usage of FPT packaged components, but this is changing rapidly because of the size, performance, and cost savings of FPT. I have resisted several past invitations to write other technical texts. However, I feel there are important advantages and significant difficulties to be encountered with FPT.
Image registration is required whenever images need to be compared, merged or integrated after they have been taken at different times, from different viewpoints, and/or by different sensors. Registration, also known as alignment, fusion, or warping, is the process of transforming data into a common reference frame. This book provides an overview of state-of-the-art registration techniques from theory to practice, plus numerous exercises designed to enhance readers' understanding of the principles and mechanisms of the described techniques. It also provides, via a supplementary Web page, free access to FAIR.m, a package based on the MATLAB® software environment, which enables readers to experiment with the proposed algorithms and explore the presented examples in much more depth. Written from an interdisciplinary point of view, this book will appeal to mathematicians, medical imaging professionals, and computer scientists and engineers.
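As a flavor of what "transforming data into a common reference frame" means in the simplest rigid, point-based setting (a toy instance, not the variational methods of FAIR.m): given matched point sets, the best-fit rotation and translation can be computed with the Kabsch algorithm.

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch: find rotation R and translation t minimizing ||R @ src + t - dst||."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)  # centroids
    H = (src - c_src).T @ (dst - c_dst)                # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflections
    R = Vt.T @ np.diag([1, d]) @ U.T                   # 2-D case
    t = c_dst - R @ c_src
    return R, t

# Toy data: dst is src rotated by 30 degrees and shifted.
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = src @ R_true.T + np.array([2.0, 3.0])

R, t = rigid_register(src, dst)
print(np.allclose(src @ R.T + t, dst))  # True: both clouds now share one frame
```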
This book constitutes the thoroughly refereed post-conference proceedings of the 1st International Conference on Swarm Intelligence Based Optimization, ICSIBO 2014, held in Mulhouse, France, in May 2014. The 20 full papers presented were carefully reviewed and selected from 48 submissions. The topics presented and discussed at the conference focus on the theoretical progress of swarm intelligence metaheuristics and their applications, including: theoretical advances of swarm intelligence metaheuristics; combinatorial, discrete, binary, constrained, multi-objective, multi-modal, dynamic, noisy, and large-scale optimization; artificial immune systems, particle swarms, ant colony, bacterial foraging, artificial bees, and firefly algorithms; hybridization of algorithms; parallel/distributed computing, machine learning, data mining, data clustering, decision making and multi-agent systems based on swarm intelligence principles; and adaptation and application of swarm intelligence principles to real-world problems in various domains.
Scientific Workflow has seen massive growth in recent years as science becomes increasingly reliant on the analysis of massive data sets and the use of distributed resources. The workflow programming paradigm is seen as a means of managing the complexity in defining the analysis, executing the necessary computations on distributed resources, collecting information about the analysis results, and providing means to record and reproduce the scientific analysis. Workflows for e-Science presents an overview of the current state of the art in the field. It brings together research from many of the leading computer scientists in the workflow area and provides real-world examples from domain scientists actively involved in e-Science. The computer science topics addressed in the book provide a broad overview of active research focusing on the areas of workflow representations and process models, component and service-based workflows, standardization efforts, workflow frameworks and tools, and problem solving environments and portals. The topics covered represent a broad range of scientific workflow research and will be of interest to a wide range of computer science researchers, domain scientists interested in applying workflow technologies in their work, and engineers wanting to develop workflow systems and tools. As such Workflows for e-Science is an invaluable resource for potential or existing users of workflow technologies and a benchmark for developers and researchers. Ian Taylor is Lecturer in Computer Science at Cardiff University, and coordinator of Triana activities at Cardiff. He is the author of "From P2P to Web Services and Grids," also published by Springer. Ewa Deelman is a Research Assistant Professor at the USC Computer Science Department and a Research Team Leader at the Center for Grid Technologies at the USC Information Sciences Institute. Dennis Gannon is a professor of Computer Science in the School of Informatics at Indiana University. He is also Science Director for the Indiana Pervasive Technology Labs. Dr. Shields is a research associate at Cardiff and one of two lead developers for the Triana project.
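At its core, the paradigm the book surveys models an analysis as a directed acyclic graph of tasks executed in dependency order. A minimal local sketch follows (the tasks are hypothetical; real systems such as Triana add distribution, provenance tracking, and fault tolerance):

```python
from graphlib import TopologicalSorter

# Hypothetical analysis: fetch two data sets, clean both, then merge.
def fetch_a(): return [3, 1, 2]
def fetch_b(): return [6, 5, 4]

tasks = {
    "fetch_a": (fetch_a, []),
    "fetch_b": (fetch_b, []),
    "clean_a": (lambda xs: sorted(xs), ["fetch_a"]),
    "clean_b": (lambda xs: sorted(xs), ["fetch_b"]),
    "merge":   (lambda a, b: a + b, ["clean_a", "clean_b"]),
}

# Execute tasks in an order that respects the dependency DAG.
results = {}
order = TopologicalSorter({k: set(deps) for k, (_, deps) in tasks.items()})
for name in order.static_order():
    fn, deps = tasks[name]
    results[name] = fn(*(results[d] for d in deps))

print(results["merge"])  # [1, 2, 3, 4, 5, 6]
```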
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics in this volume include: evolutionary constraints, relaxation of selection mechanisms, diversity preservation strategies, flexing fitness evaluation, evolution in dynamic environments, multi-objective and multi-modal selection, foundations of evolvability, evolvable and adaptive evolutionary operators, foundation of injecting expert knowledge in evolutionary search, analysis of problem difficulty and required GP algorithm complexity, foundations in running GP on the cloud - communication, cooperation, flexible implementation, and ensemble methods. Additional focal points for GP symbolic regression are: (1) The need to guarantee convergence to solutions in the function discovery mode; (2) Issues on model validation; (3) The need for model analysis workflows for insight generation based on generated GP solutions - model exploration, visualization, variable selection, dimensionality analysis; (4) Issues in combining different types of data. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
This volume constitutes the refereed proceedings of the 10th International Conference on Energy Minimization Methods in Computer Vision and Pattern Recognition, EMMCVPR 2015, held in Hong Kong, China, in January 2015. The 36 revised full papers were carefully reviewed and selected from 45 submissions. The papers are organized in topical sections on discrete and continuous optimization; image restoration and inpainting; segmentation; PDE and variational methods; motion, tracking and multiview reconstruction; statistical methods and learning; and medical image analysis.
This book constitutes the proceedings of the 22nd International Symposium on Graph Drawing, GD 2014, held in Würzburg, Germany, in September 2014. The 41 full papers presented in this volume were carefully reviewed and selected from 72 submissions. The back matter of the book also contains 2-page poster papers presented at the conference. The contributions are organized in topical sections named: planar subgraphs; simultaneous embeddings; applications; contact representations; k-planar graphs; crossing minimization; level drawings; theory; fixed edge directions; drawing under constraints; clustered planarity; and greedy graphs.
This book constitutes the thoroughly refereed post-conference proceedings of the 5th International Workshop on Constructive Side-Channel Analysis and Secure Design, COSADE 2014, held in Paris, France, in April 2014. The 20 revised full papers presented together with two invited talks were carefully selected from 51 submissions. They span results in cryptographic engineering from concepts to artifacts, from software to hardware, and from attack to countermeasure.
Certification and Security in Inter-Organizational E-Services presents the proceedings of CSES 2004 - the 2nd International Workshop on Certification and Security in Inter-Organizational E-Services held within IFIP WCC in August 2004 in Toulouse, France. Certification and security share a common technological basis in the reliable and efficient monitoring of executed and running processes; they likewise depend on the same fundamental organizational and economic principles. As the range of services managed and accessed through communication networks grows throughout society, and given the legal value that is often attached to data treated or exchanged, it is critical to be able to certify the network transactions and ensure that the integrity of the involved computer-based systems is maintained. This collection of papers documents several important developments, and offers real-life application experiences, research results and methodological proposals of direct interest to systems experts and users in governmental, industrial and academic communities.
The Euclidean shortest path (ESP) problem asks the question: what is the path of minimum length connecting two points in a 2- or 3-dimensional space? Variants of this industrially-significant computational geometry problem also require the path to pass through specified areas and avoid defined obstacles. This unique text/reference reviews algorithms for the exact or approximate solution of shortest-path problems, with a specific focus on a class of algorithms called rubberband algorithms. Discussing each concept and algorithm in depth, the book includes mathematical proofs for many of the given statements. Suitable for a second- or third-year university algorithms course, the text enables readers to understand not only the algorithms and their pseudocodes, but also the correctness proofs, the analysis of time complexities, and other related topics. Topics and features: provides theoretical and programming exercises at the end of each chapter; presents a thorough introduction to shortest paths in Euclidean geometry, and the class of algorithms called rubberband algorithms; discusses algorithms for calculating exact or approximate ESPs in the plane; examines the shortest paths on 3D surfaces, in simple polyhedrons and in cube-curves; describes the application of rubberband algorithms for solving art gallery problems, including the safari, zookeeper, watchman, and touring polygons route problems; includes lists of symbols and abbreviations, in addition to other appendices. This hands-on guide will be of interest to undergraduate students in computer science, IT, mathematics, and engineering. Programmers, mathematicians, and engineers dealing with shortest-path problems in practical applications will also find the book a useful resource.
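For intuition about the rubberband idea, here is a toy sketch of the general principle, not one of the book's exact algorithms: a path that must visit one point on each of a sequence of segments can be shortened by repeatedly sliding each visit point along its segment to minimize the local detour, until the total length converges.

```python
import math

def lerp(a, b, t):
    return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def rubberband(start, end, segments, iters=100):
    """Shorten a path from start to end that visits one point per segment."""
    ts = [0.5] * len(segments)  # ts[i] parameterizes the point on segments[i]
    for _ in range(iters):
        for i, (a, b) in enumerate(segments):
            prev = start if i == 0 else lerp(*segments[i - 1], ts[i - 1])
            nxt = end if i == len(segments) - 1 else lerp(*segments[i + 1], ts[i + 1])
            f = lambda t: dist(prev, lerp(a, b, t)) + dist(lerp(a, b, t), nxt)
            lo, hi = 0.0, 1.0
            for _ in range(60):  # ternary search: f is convex on [0, 1]
                m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
                if f(m1) < f(m2):
                    hi = m2
                else:
                    lo = m1
            ts[i] = (lo + hi) / 2
    return [lerp(a, b, t) for (a, b), t in zip(segments, ts)]

# Path from (0, 1) to (4, -1) crossing two vertical "gate" segments.
gates = [((1.0, -1.0), (1.0, 1.0)), ((3.0, -1.0), (3.0, 1.0))]
print(rubberband((0.0, 1.0), (4.0, -1.0), gates))
# converges toward (1.0, 0.5) and (3.0, -0.5): the straight-line crossings
```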
Designing Sorting Networks: A New Paradigm provides an in-depth guide to maximizing the efficiency of sorting networks, and uses 0/1 cases, partially ordered sets and Hasse diagrams to closely analyze their behavior in an easy, intuitive manner.
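The "0/1 cases" refer to the classical zero-one principle: a comparator network sorts every input sequence if and only if it sorts every sequence of 0s and 1s, which makes exhaustive verification of small networks cheap. A minimal sketch follows; the 4-input network is a standard textbook example, not one taken from the book:

```python
from itertools import product

def sorts_all(n, comparators):
    """Zero-one principle: a network sorts all inputs iff it sorts all 2^n 0/1 inputs."""
    for bits in product([0, 1], repeat=n):
        v = list(bits)
        for i, j in comparators:  # each comparator orders one pair of wires
            if v[i] > v[j]:
                v[i], v[j] = v[j], v[i]
        if any(v[k] > v[k + 1] for k in range(n - 1)):
            return False
    return True

# A well-known 5-comparator sorting network on 4 wires.
net = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]
print(sorts_all(4, net))       # True
print(sorts_all(4, net[:-1]))  # False: dropping one comparator breaks it
```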
This book constitutes the proceedings of the 21st International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2015, which took place in London, UK, in April 2015, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2015. The 45 papers included in this volume, consisting of 27 research papers, 2 case-study papers, 7 regular tool papers and 9 tool demonstration papers, were carefully reviewed and selected from 164 submissions. In addition, the book contains one invited contribution. The papers are organized in topical sections on hybrid systems; program analysis; verification and abstraction; tool demonstrations; stochastic models; SAT and SMT; partial order reduction, bisimulation, and fairness; competition on software verification; parameter synthesis; program synthesis; program and runtime verification; temporal logic and automata; and model checking.
You may like...
Gaming and Technology Addiction… (Information Resources Management Association, Hardcover): R7,716 (Discovery Miles 77 160)
Visual and Linguistic Representations of… (Maria Pia Pozzato, Hardcover)
The Cartography of Chinese Syntax - The… (Wei-Tien Dylan Tsai, Hardcover): R3,565 (Discovery Miles 35 650)
Directed Algebraic Topology and… (Lisbeth Fajstrup, Eric Goubault, …, Hardcover): R3,273 (Discovery Miles 32 730)