This book constitutes the thoroughly refereed post-conference proceedings of the International Dagstuhl-Seminar on Efficient Algorithms for Global Optimization Methods in Computer Vision, held in Dagstuhl Castle, Germany, in November 2011. The 8 revised full papers presented were carefully reviewed and selected from the 12 lectures given at the seminar. The seminar focused on the entire algorithmic development pipeline for global optimization problems in computer vision: modelling, mathematical analysis, numerical solvers and parallelization. In particular, the goal of the seminar was to bring together researchers from all four fields to analyze and discuss the connections between the different stages of the algorithmic design pipeline.
This volume constitutes the refereed proceedings of the 16th International Workshop on Combinatorial Image Analysis, IWCIA 2014, held in Brno, Czech Republic, in May 2014. The 20 revised full papers and 3 invited papers presented were carefully reviewed and selected from numerous submissions. The topics covered include discrete geometry and topology in imaging science; new results in image representation, segmentation, grouping, and reconstruction; and medical image processing.
This monograph presents examples of best practices when combining bioinspired algorithms with parallel architectures. The book includes recent work by leading researchers in the field and offers a map with the main paths already explored and new ways towards the future. Parallel Architectures and Bioinspired Algorithms will be of value to both specialists in Bioinspired Algorithms, Parallel and Distributed Computing, as well as computer science students trying to understand the present and the future of Parallel Architectures and Bioinspired Algorithms.
This book constitutes the proceedings of the 9th International Computer Science Symposium in Russia, CSR 2014, held in Moscow, Russia, in June 2014. The 27 full papers presented in this volume were carefully reviewed and selected from 76 submissions. In addition the book contains 4 invited lectures. The scope of the proposed topics is quite broad and covers a wide range of areas in theoretical computer science and its applications.
The new multimedia standards (for example, MPEG-21) facilitate the seamless integration of multiple modalities into interoperable multimedia frameworks, transforming the way people work and interact with multimedia data. These key technologies and multimedia solutions interact and collaborate with each other in increasingly effective ways, contributing to the multimedia revolution and having a significant impact across a wide spectrum of consumer, business, healthcare, education, and governmental domains. This book aims to provide complete coverage of the areas outlined and to bring together researchers from academia and industry, as well as practitioners, to share ideas, challenges, and solutions relating to the multifaceted aspects of this field.
Transactions are a concept related to the logical database as seen from the perspective of database application programmers: a transaction is a sequence of database actions that is to be executed as an atomic unit of work. The processing of transactions on databases is a well-established area with many of its foundations having already been laid in the late 1970s and early 1980s. The unique feature of this textbook is that it bridges the gap between the theory of transactions on the logical database and the implementation of the related actions on the underlying physical database. The authors relate the logical database, which is composed of a dynamically changing set of data items with unique keys, and the underlying physical database with a set of fixed-size data and index pages on disk. Their treatment of transaction processing builds on the "do-redo-undo" recovery paradigm, and all methods and algorithms presented are carefully designed to be compatible with this paradigm as well as with write-ahead logging, steal-and-no-force buffering, and fine-grained concurrency control. Chapters 1 to 6 address the basics needed to fully appreciate transaction processing on a centralized database system within the context of the authors' transaction model, covering topics like ACID properties, database integrity, buffering, rollbacks, isolation, and the interplay of logical locks and physical latches. Chapters 7 and 8 present advanced features including deadlock-free algorithms for reading, inserting and deleting tuples, while the remaining chapters cover additional advanced topics extending the preceding foundational chapters, including multi-granular locking, bulk actions, versioning, distributed updates, and write-intensive transactions. This book is primarily intended as a text for advanced undergraduate or graduate courses on database management in general or transaction processing in particular.
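The blurb's core idea, a transaction as an atomic unit of work that can be undone, can be sketched in a few lines. This is a minimal illustration only, not the book's algorithms: it models the logical database as a Python dict and keeps an undo log of before-images so that a rollback restores every updated item, in the spirit of the "undo" half of the do-redo-undo paradigm.

```python
# Minimal sketch (illustrative, not the authors' method): a transaction over
# an in-memory "logical database" (a dict of key -> value) with an undo log
# of before-images, so all of its updates can be rolled back atomically.

class Transaction:
    def __init__(self, db):
        self.db = db          # the logical database
        self.undo_log = []    # (key, before-image) for each update

    def update(self, key, value):
        # Log the before-image first, so the action can later be undone.
        self.undo_log.append((key, self.db.get(key)))
        self.db[key] = value

    def rollback(self):
        # Undo in reverse order, restoring each before-image.
        for key, old in reversed(self.undo_log):
            if old is None:
                self.db.pop(key, None)   # the key did not exist before
            else:
                self.db[key] = old
        self.undo_log.clear()

    def commit(self):
        self.undo_log.clear()  # the updates become permanent

db = {"x": 1}
t = Transaction(db)
t.update("x", 2)
t.update("y", 7)
t.rollback()   # atomic: both updates are undone together, db is {"x": 1} again
```

A real system, as the book describes, would couple this undo information with write-ahead logging and a redo pass so that atomicity also survives crashes, not just explicit rollbacks.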
This book is the outcome of the Dagstuhl Seminar 13201 on Information Visualization - Towards Multivariate Network Visualization, held in Dagstuhl Castle, Germany in May 2013. The goal of this Dagstuhl Seminar was to bring together theoreticians and practitioners from Information Visualization, HCI and Graph Drawing with a special focus on multivariate network visualization, i.e., on graphs where the nodes and/or edges have additional (multidimensional) attributes. The integration of multivariate data into complex networks and their visual analysis is one of the big challenges not only in visualization, but also in many application areas. Thus, in order to support discussions related to the visualization of real-world data, the organizers also invited researchers from selected application areas, especially bioinformatics, social sciences and software engineering. The unique "Dagstuhl climate" ensured an open and undisturbed atmosphere to discuss the state of the art, new directions and open challenges of multivariate network visualization.
Librarians have been providing support to researchers for many years, typically with a focus on responding to researchers' needs for access to the existing literature. However, librarians' skills and expertise make them uniquely suited to provide a wide range of assistance to researchers across the entire research process, from conception of the research question to archiving of collected data at the project's conclusion. In response to increasingly stringent demands on researchers to share their data, and as computationally intensive and primarily data-driven scientific methods begin to take the place of traditional lab-based research, the "research informationist" has emerged as a new information profession. With a background in library and information sciences, as well as expertise in best practices for data management, grant funder policies, and informatics tools, the research informationist is capable of implementing a full suite of research support services. This book will discuss how the research informationist role has developed out of the previously established clinical informationist model and how it expands on the model of embedded librarianship. The book will also examine core competencies for the successful research informationist and the training and preparation necessary for students in library and information sciences programs, as well as currently practicing librarians. Finally, this book will consider how research informationists can form collaborative partnerships with research teams and build their services outside the walls of the library, citing practical examples of the types of support research informationists can offer.
The importance of benchmarking in the service sector is well recognized, as it helps in continuous improvement of products and work processes. Through benchmarking, companies have strived to implement best practices in order to remain competitive in the product-market in which they operate. However, studies on benchmarking, particularly in the software development sector, have neglected the use of multiple variables and therefore have not been as comprehensive. Information Theory and Best Practices in the IT Industry fills this void by examining benchmarking in the business of software development and studying how it is affected by development process, application type, hardware platforms used, and many other variables. The book begins by examining practices for benchmarking productivity and critically appraises them. Next it identifies the different variables that affect productivity and those that affect quality, developing useful equations that explain their relationships. Finally, these equations and findings are applied to case studies. Using this book, practitioners can decide what emphasis to attach to different variables in their own companies while seeking to optimize productivity and defect density.
The vast area of Scientific Computing, which is concerned with the computer-aided simulation of various processes in engineering, natural, economical, or social sciences, now enjoys rapid progress owing to the development of new efficient symbolic, numeric, and symbolic/numeric algorithms. It has long been recognized worldwide that the mathematical term algorithm takes its origin from the Latin word algoritmi, which is in turn a Latin transliteration of the Arab name "Al Khoresmi" of the Khoresmian mathematician Moukhammad Khoresmi, who lived in the Khoresm khanate during the years 780-850. The Khoresm khanate took significant parts of the territories of present-day Turkmenistan and Uzbekistan. Such towns of the Khoresm khanate as Bukhara and Marakanda (the present-day Samarkand) were centers of mathematical science and astronomy. The great Khoresmian mathematician M. Khoresmi introduced the Indian decimal positional system into everyday life; this system is based on the familiar digits 1, 2, 3, 4, 5, 6, 7, 8, 9, 0. M. Khoresmi presented arithmetic in the decimal positional calculus (prior to him, the Indian positional system was the subject only of jokes and witty disputes). Khoresmi's Book of Addition and Subtraction by Indian Method (Arithmetic) differs little from present-day arithmetic. This book was translated into Latin in 1150; the last reprint was produced in Rome in 1957.
This book constitutes the refereed proceedings of the 25th Annual Symposium on Combinatorial Pattern Matching, CPM 2014, held in Moscow, Russia, in June 2014. The 28 revised full papers presented together with 5 invited talks were carefully reviewed and selected from 54 submissions. The papers address issues of searching and matching strings and more complicated patterns such as trees; regular expressions; graphs; point sets; and arrays. The goal is to derive combinatorial properties of such structures and to exploit these properties in order to achieve superior performance for the corresponding computational problems. The meeting also deals with problems in computational biology; data compression and data mining; coding; information retrieval; natural language processing; and pattern recognition.
The present book is the result of a three year research project which investigated the creative act of composing by means of algorithmic composition. Central to the investigation are the compositional strategies of 12 composers, which were documented through a dialogic and cyclic process of modelling and evaluating musical materials. The aesthetic premises and compositional approaches configure a rich spectrum of diverse positions, which is reflected also in the kinds of approaches and methods used. These approaches and methods include the generation and evaluation of chord sequences using genetic algorithms, the application of morphing strategies to research harmonic transformations, an automatic classification of personal preferences via machine learning, and an application of mathematical music theory to the analysis and resynthesis of musical material. The second part of the book features contributions by Sandeep Bhagwati, William Brooks, David Cope, Darla Crispin, Nicolas Donin, and Guerino Mazzola. These authors variously consider the project from different perspectives, offer independent approaches, or provide more general reflections from their respective research fields.
Articles in this book examine various materials and how to determine directly the limit state of a structure, in the sense of limit analysis and shakedown analysis. Apart from classical applications in mechanical and civil engineering contexts, the book reports on the emerging field of material design beyond the elastic limit, which has further industrial design and technological applications. Readers will discover that "Direct Methods" and the techniques presented here can in fact be used to numerically estimate the strength of structured materials such as composites or nano-materials, which represent fruitful fields of future applications. Leading researchers outline the latest computational tools and optimization techniques and explore the possibility of obtaining information on the limit state of a structure whose post-elastic loading path and constitutive behavior are not well defined or well known. Readers will discover how Direct Methods allow rapid and direct access to requested information in mathematically constructive manners without cumbersome step-by-step computation. Both researchers already interested or involved in the field and practical engineers who want to have a panorama of modern methods for structural safety assessment will find this book valuable. It provides the reader with the latest developments and a significant amount of references on the topic.
This timely text/reference explores the business and technical issues involved in the management of information systems in the era of big data and beyond. Topics and features: presents review questions and discussion topics in each chapter for classroom group work and individual research assignments; discusses the potential use of a variety of big data tools and techniques in a business environment, explaining how these can fit within an information systems strategy; reviews existing theories and practices in information systems, and explores their continued relevance in the era of big data; describes the key technologies involved in information systems in general and big data in particular, placing these technologies in an historic context; suggests areas for further research in this fast moving domain; equips readers with an understanding of the important aspects of a data scientist's job; provides hands-on experience to further assist in the understanding of the technologies involved.
This book constitutes the refereed proceedings of the Second IFIP TC 5/8 International Conference on Information and Communication Technology, ICT-EurAsia 2014, with the collocation of AsiaARES 2014 as a special track on Availability, Reliability and Security, held in Bali, Indonesia, in April 2014. The 70 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers have been organized in the following topical sections: applied modeling and simulation; mobile computing; advanced urban-scale ICT applications; semantic web and knowledge management; cloud computing; image processing; software engineering; collaboration technologies and systems; e-learning; data warehousing and data mining; e-government and e-health; biometric and bioinformatics systems; network security; dependable systems and applications; privacy and trust management; cryptography; and multimedia security.
This book constitutes the proceedings of the 33rd Annual International Conference on the Theory and Applications of Cryptographic Techniques, EUROCRYPT 2014, held in Copenhagen, Denmark, in May 2014. The 38 full papers included in this volume were carefully reviewed and selected from 197 submissions. They deal with public key cryptanalysis, identity-based encryption, key derivation and quantum computing, secret-key analysis and implementations, obfuscation and multilinear maps, authenticated encryption, symmetric encryption, multi-party encryption, side-channel attacks, signatures and public-key encryption, functional encryption, foundations and multi-party computation.
This two-volume-set (LNCS 8384 and 8385) constitutes the refereed proceedings of the 10th International Conference of Parallel Processing and Applied Mathematics, PPAM 2013, held in Warsaw, Poland, in September 2013. The 143 revised full papers presented in both volumes were carefully reviewed and selected from numerous submissions. The papers cover important fields of parallel/distributed/cloud computing and applied mathematics, such as numerical algorithms and parallel scientific computing; parallel non-numerical algorithms; tools and environments for parallel/distributed/cloud computing; applications of parallel computing; applied mathematics, evolutionary computing and metaheuristics.
This book is designed both for FPGA users interested in developing new, specific components - generally for reducing execution times - and for IP core designers interested in extending their catalog of specific components. The main focus is circuit synthesis, and the discussion shows, for example, how a given algorithm executing some complex function can be translated to a synthesizable circuit description, as well as the best choices the designer can make to reduce the circuit cost, latency, or power consumption. This is not a book on algorithms. It is a book that shows how to translate an algorithm efficiently to a circuit, using techniques such as parallelism, pipelining, loop unrolling, and others. Numerous examples of FPGA implementation are described throughout this book, and the circuits are modeled in VHDL. Complete and synthesizable source files are available for download.
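Loop unrolling, one of the transformations the blurb mentions, can be illustrated at the source level independently of VHDL. The following sketch (in Python, purely for illustration; the book's examples are circuits, not software) shows a rolled dot-product loop next to the same computation unrolled by a factor of 4 - in hardware, the four independent multiply-adds of the unrolled body are what a synthesis tool can map onto parallel circuit resources.

```python
# Illustrative sketch only: loop unrolling as a source-level transformation.

def dot_rolled(a, b):
    # One multiply-add per iteration: minimal resources, maximal latency.
    acc = 0
    for i in range(len(a)):
        acc += a[i] * b[i]
    return acc

def dot_unrolled4(a, b):
    # Four independent multiply-adds per iteration; the four accumulators
    # have no data dependence on each other, so in a circuit they could
    # run in parallel. Assumes len(a) is a multiple of 4 for simplicity.
    acc0 = acc1 = acc2 = acc3 = 0
    for i in range(0, len(a), 4):
        acc0 += a[i]     * b[i]
        acc1 += a[i + 1] * b[i + 1]
        acc2 += a[i + 2] * b[i + 2]
        acc3 += a[i + 3] * b[i + 3]
    return acc0 + acc1 + acc2 + acc3

a = [1, 2, 3, 4, 5, 6, 7, 8]
b = [8, 7, 6, 5, 4, 3, 2, 1]
assert dot_rolled(a, b) == dot_unrolled4(a, b)  # same result, restructured loop
```

The trade-off this exposes - more functional units for fewer iterations - is exactly the cost/latency choice the book discusses for circuit designers.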
This book constitutes the thoroughly refereed post-conference proceedings of the workshops of the 19th International Conference on Parallel Computing, Euro-Par 2013, held in Aachen, Germany, in August 2013. The 99 papers presented were carefully reviewed and selected from 145 submissions. The papers come from seven workshops that had been co-located with Euro-Par in previous years:
- BigDataCloud (Second Workshop on Big Data Management in Clouds)
- HeteroPar (11th Workshop on Algorithms, Models and Tools for Parallel Computing on Heterogeneous Platforms)
- HiBB (Fourth Workshop on High Performance Bioinformatics and Biomedicine)
- OMHI (Second Workshop on On-chip Memory Hierarchies and Interconnects)
- PROPER (Sixth Workshop on Productivity and Performance)
- Resilience (Sixth Workshop on Resiliency in High Performance Computing with Clusters, Clouds, and Grids)
- UCHPC (Sixth Workshop on UnConventional High Performance Computing)
as well as six newcomers:
- DIHC (First Workshop on Dependability and Interoperability in Heterogeneous Clouds)
- FedICI (First Workshop on Federative and Interoperable Cloud Infrastructures)
- LSDVE (First Workshop on Large Scale Distributed Virtual Environments on Clouds and P2P)
- MHPC (Workshop on Middleware for HPC and Big Data Systems)
- PADABS (First Workshop on Parallel and Distributed Agent Based Simulations)
- ROME (First Workshop on Runtime and Operating Systems for the Many-core Era)
All these workshops focus on the promotion and advancement of all aspects of parallel and distributed computing.
The comprehension of a traffic situation plays a major role in driving a vehicle. Interpretable information forms a basis for future projection, decision making and action performing, such as navigating, maneuvering and driving control. Michael Huelsen provides an ontology-based generic traffic situation description capable of supplying various advanced driver assistance systems with relevant information about the current traffic situation of a vehicle and its environment. These systems are enabled to perform reasonable actions and approach visionary goals such as injury- and accident-free driving, substantial assistance in arbitrary situations, and even autonomous driving.
This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications with the emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, bat algorithm, discrete cuckoo search, firefly algorithm, particle swarm optimization, and harmony search as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference for graduates, lecturers, engineers and researchers in computer science, evolutionary computing, artificial intelligence, machine learning, computational intelligence, data mining, engineering optimization and designs.
Scientific workflow has seen massive growth in recent years as science becomes increasingly reliant on the analysis of massive data sets and the use of distributed resources. The workflow programming paradigm is seen as a means of managing the complexity in defining the analysis, executing the necessary computations on distributed resources, collecting information about the analysis results, and providing means to record and reproduce the scientific analysis. Workflows for e-Science presents an overview of the current state of the art in the field. It brings together research from many of the leading computer scientists in the workflow area and provides real-world examples from domain scientists actively involved in e-Science. The computer science topics addressed in the book provide a broad overview of active research, focusing on the areas of workflow representations and process models, component- and service-based workflows, standardization efforts, workflow frameworks and tools, and problem solving environments and portals. The topics covered represent a broad range of scientific workflow and will be of interest to a wide range of computer science researchers, domain scientists interested in applying workflow technologies in their work, and engineers wanting to develop workflow systems and tools. As such, Workflows for e-Science is an invaluable resource for potential or existing users of workflow technologies and a benchmark for developers and researchers. Ian Taylor is a Lecturer in Computer Science at Cardiff University, and coordinator of Triana activities at Cardiff. He is the author of "From P2P to Web Services and Grids," also published by Springer. Ewa Deelman is a Research Assistant Professor at the USC Computer Science Department and a Research Team Leader at the Center for Grid Technologies at the USC Information Sciences Institute. Dennis Gannon is a professor of Computer Science in the School of Informatics at Indiana University. He is also Science Director for the Indiana Pervasive Technology Labs. Dr. Shields is a research associate at Cardiff and one of the two lead developers for the Triana project.
This is the first book devoted to the systematic study of sparse graphs and sparse finite structures. Although the notion of sparsity appears in various contexts and is a typical example of a hard-to-define notion, the authors devised a unifying classification of general classes of structures. This approach is very robust and has many remarkable properties. For example, the classification is expressible in many different ways involving most extremal combinatorial invariants. This study of sparse structures has found applications in such diverse areas as algorithmic graph theory, complexity of algorithms, property testing, descriptive complexity and mathematical logic (homomorphism preservation, fixed parameter tractability and constraint satisfaction problems). It should be stressed that despite its generality this approach leads to linear (and nearly linear) algorithms. Jaroslav Nesetril is a professor at Charles University, Prague; Patrice Ossona de Mendez is a CNRS researcher at EHESS, Paris. This book is related to the material presented by the first author at ICM 2010.
Grids, P2P and Services Computing, the 12th volume of the CoreGRID series, is based on the CoreGRID ERCIM Working Group Workshop on Grids, P2P and Service Computing, held in conjunction with Euro-Par 2009 on August 24, 2009, in Delft, The Netherlands. This edited volume, with contributions from well-established researchers worldwide, focuses on solving research challenges for Grid and P2P technologies. Topics of interest include: Service Level Agreements, Data & Knowledge Management, Scheduling, Trust and Security, Network Monitoring and more. Grids are a crucial enabling technology for scientific and industrial development. This book also includes new challenges related to service-oriented infrastructures. Grids, P2P and Services Computing is designed for a professional audience of researchers and practitioners within the Grid community and industry. This volume is also suitable for advanced-level students in computer science.
This book constitutes the thoroughly refereed post-conference proceedings of the 6th Conference on Theory of Quantum Computation, Communication, and Cryptography, TQC 2011, held in Madrid, Spain, in May 2011. The 14 revised papers presented were carefully selected from numerous submissions. The papers present new and original research and cover a large range of topics in quantum computation, communication and cryptography, a new and interdisciplinary field at the intersection of computer science, information theory and quantum mechanics.