This book constitutes the proceedings of the 22nd International Symposium on Graph Drawing, GD 2014, held in Würzburg, Germany, in September 2014. The 41 full papers presented in this volume were carefully reviewed and selected from 72 submissions. The back matter of the book also contains 2-page poster papers presented at the conference. The contributions are organized in topical sections named: planar subgraphs; simultaneous embeddings; applications; contact representations; k-planar graphs; crossing minimization; level drawings; theory; fixed edge directions; drawing under constraints; clustered planarity; and greedy graphs.
This volume constitutes the refereed proceedings of the 10th International Conference on Energy Minimization Methods in Computer Vision and Pattern Recognition, EMMCVPR 2015, held in Hong Kong, China, in January 2015. The 36 revised full papers were carefully reviewed and selected from 45 submissions. The papers are organized in topical sections on discrete and continuous optimization; image restoration and inpainting; segmentation; PDE and variational methods; motion, tracking and multiview reconstruction; statistical methods and learning; and medical image analysis.
The two volumes LNCS 8805 and 8806 constitute the thoroughly refereed post-conference proceedings of 18 workshops held at the 20th International Conference on Parallel Computing, Euro-Par 2014, in Porto, Portugal, in August 2014. The 100 revised full papers presented were carefully reviewed and selected from 173 submissions. The volumes include papers from the following workshops: APCI&E (First Workshop on Applications of Parallel Computation in Industry and Engineering) - BigDataCloud (Third Workshop on Big Data Management in Clouds) - DIHC (Second Workshop on Dependability and Interoperability in Heterogeneous Clouds) - FedICI (Second Workshop on Federative and Interoperable Cloud Infrastructures) - HeteroPar (12th International Workshop on Algorithms, Models and Tools for Parallel Computing on Heterogeneous Platforms) - HiBB (5th Workshop on High Performance Bioinformatics and Biomedicine) - LSDVE (Second Workshop on Large Scale Distributed Virtual Environments on Clouds and P2P) - MuCoCoS (7th International Workshop on Multi-/Many-core Computing Systems) - OMHI (Third Workshop on On-chip Memory Hierarchies and Interconnects) - PADABS (Second Workshop on Parallel and Distributed Agent-Based Simulations) - PROPER (7th Workshop on Productivity and Performance) - Resilience (7th Workshop on Resiliency in High Performance Computing with Clusters, Clouds, and Grids) - REPPAR (First International Workshop on Reproducibility in Parallel Computing) - ROME (Second Workshop on Runtime and Operating Systems for the Many Core Era) - SPPEXA (Workshop on Software for Exascale Computing) - TASUS (First Workshop on Techniques and Applications for Sustainable Ultrascale Computing Systems) - UCHPC (7th Workshop on UnConventional High Performance Computing) and VHPC (9th Workshop on Virtualization in High-Performance Cloud Computing).
This book constitutes the proceedings of the 5th International Meeting on Algebraic and Algorithmic Aspects of Differential and Integral Operators, AADIOS 2012, held at the Applications of Computer Algebra Conference in Sofia, Bulgaria, on June 25-28, 2012. The total of 9 papers presented in this volume consists of 2 invited papers and 7 regular papers which were carefully reviewed and selected from 13 submissions. The topics of interest are: symbolic computation for operator algebras, factorization of differential/integral operators, linear boundary problems and Green's operators, initial value problems for differential equations, symbolic integration and differential Galois theory, symbolic operator calculi, algorithmic D-module theory, Rota-Baxter algebra, differential algebra, as well as discrete analogs and software aspects of the above.
Designing Sorting Networks: A New Paradigm provides an in-depth guide to maximizing the efficiency of sorting networks, and uses 0/1 cases, partially ordered sets and Hasse diagrams to closely analyze their behavior in an easy, intuitive manner.
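The 0/1 cases mentioned here rest on the classic 0-1 principle: a comparator network sorts every input if and only if it sorts every input drawn from {0, 1}. The book's own constructions are not reproduced here; as a minimal sketch of the principle, the following Python snippet verifies a well-known 4-input network by checking all 2^4 binary inputs rather than all 4! permutations.

```python
from itertools import product

# A classic comparator network on 4 wires: each pair (i, j) moves the
# smaller value to wire i and the larger to wire j.
NETWORK_4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

def apply_network(network, values):
    """Run the comparators over a mutable copy of the input."""
    v = list(values)
    for i, j in network:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

def sorts_all_01_inputs(network, n):
    """0-1 principle: the network sorts every length-n input iff it
    sorts all 2**n inputs drawn from {0, 1}."""
    return all(
        apply_network(network, bits) == sorted(bits)
        for bits in product((0, 1), repeat=n)
    )

print(sorts_all_01_inputs(NETWORK_4, 4))  # True: 16 cases instead of 24 permutations
```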
This book constitutes the proceedings of the 20th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2014, which took place in Grenoble, France, in April 2014, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2014. The 42 papers included in this volume (26 research papers, 3 case-study papers, 6 regular tool papers and 7 tool-demonstration papers) were carefully reviewed and selected from 161 submissions. In addition, the book contains one invited contribution. The papers are organized in topical sections named: decision procedures and their application in analysis; complexity and termination analysis; modeling and model checking discrete systems; timed and hybrid systems; monitoring, fault detection and identification; competition on software verification; specifying and checking linear time properties; synthesis and learning; quantum and probabilistic systems; as well as tool demonstrations and case studies.
This book constitutes the proceedings of the 21st International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2015, which took place in London, UK, in April 2015, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2015. The 45 papers included in this volume (27 research papers, 2 case-study papers, 7 regular tool papers and 9 tool-demonstration papers) were carefully reviewed and selected from 164 submissions. In addition, the book contains one invited contribution. The papers have been organized in topical sections on hybrid systems; program analysis; verification and abstraction; tool demonstrations; stochastic models; SAT and SMT; partial order reduction, bisimulation, and fairness; competition on software verification; parameter synthesis; program synthesis; program and runtime verification; temporal logic and automata; and model checking.
Among the group of physics honors students huddled in 1957 on a Colorado mountain watching Sputnik bisect the heavens, one young scientist was destined, three short years later, to become a key player in America's own top-secret spy satellite program. One of our era's most prolific mathematicians, Karl Gustafson was given just two weeks to write the first US spy satellite's software. The project would fundamentally alter America's Cold War strategy, and this autobiographical account of a remarkable academic life spent in the top flight tells this fascinating inside story for the first time. Gustafson takes you from his early pioneering work in computing, through fascinating encounters with Nobel laureates and Fields medalists, to his current observations on mathematics, science and life. He tells of brushes with death, being struck by lightning, and the beautiful women who have been a part of his journey.
Semidefinite programs constitute one of the largest classes of optimization problems that can be solved with reasonable efficiency - both in theory and practice. They play a key role in a variety of research areas, such as combinatorial optimization, approximation algorithms, computational complexity, graph theory, geometry, real algebraic geometry and quantum computing. This book is an introduction to selected aspects of semidefinite programming and its use in approximation algorithms. It covers the basics but also a significant amount of recent and more advanced material. There are many computational problems, such as MAXCUT, for which one cannot reasonably expect to obtain an exact solution efficiently, and in such cases, one has to settle for approximate solutions. For MAXCUT and its relatives, exciting recent results suggest that semidefinite programming is probably the ultimate tool. Indeed, assuming the Unique Games Conjecture, a plausible but as yet unproven hypothesis, it was shown that for these problems, known algorithms based on semidefinite programming deliver the best possible approximation ratios among all polynomial-time algorithms. This book follows the "semidefinite side" of these developments, presenting some of the main ideas behind approximation algorithms based on semidefinite programming. It develops the basic theory of semidefinite programming, presents one of the known efficient algorithms in detail, and describes the principles of some others. It also includes applications, focusing on approximation algorithms.
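Solving the semidefinite program itself needs a dedicated solver, which is beyond a short sketch; assuming the unit vectors from a solved MAXCUT SDP relaxation are already given, the following Python snippet illustrates the other half of the Goemans-Williamson approach alluded to above: rounding the vectors with a random hyperplane. An edge is cut with probability proportional to the angle between its endpoints' vectors, which is what yields the roughly 0.878 approximation guarantee in expectation. The toy embedding below is a hand-placed stand-in for solver output.

```python
import numpy as np

def hyperplane_round(vectors, edges, trials=1000, seed=0):
    """Goemans-Williamson rounding: given unit vectors v_i from the
    solved MAXCUT SDP relaxation, cut with a random hyperplane and
    keep the best of several trials."""
    rng = np.random.default_rng(seed)
    best_cut, best_side = -1, None
    for _ in range(trials):
        r = rng.standard_normal(vectors.shape[1])  # random hyperplane normal
        side = vectors @ r >= 0                    # half-space of each vertex
        cut = sum(side[u] != side[v] for u, v in edges)
        if cut > best_cut:
            best_cut, best_side = cut, side
    return best_cut, best_side

# Toy 5-cycle; for an odd cycle the SDP optimum spreads the vectors
# evenly on a circle, so these stand in for a real solver's output.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
angles = np.array([2 * np.pi * 2 * k / 5 for k in range(5)])
vectors = np.stack([np.cos(angles), np.sin(angles)], axis=1)
print(hyperplane_round(vectors, edges)[0])  # 4: the true maximum cut of C5
```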
Solder has given the designer of modern consumer, commercial, and military electronic systems a remarkable flexibility to interconnect electronic components. The properties of solder have facilitated broad assembly choices that have fueled creative applications to advance technology. Solder is the electrical and mechanical "glue" of electronic assemblies. This pervasive dependency on solder has stimulated new interest in applications as well as a more concerted effort to better understand materials properties. We need not look far to see solder being used to interconnect ever finer geometries. Assembly of micropassive discrete devices that are hardly visible to the unaided eye, of silicon chips directly to ceramic and plastic substrates, and of very fine peripheral leaded packages constitutes a few of solder's uses. There has been a marked increase in university research related to solder. New electronic packaging centers stimulate applications, and materials engineering and science departments have demonstrated a new vigor to improve both the materials and our understanding of them. Industrial research and development continues to stimulate new applications, and refreshing new packaging ideas are emerging. New handbooks have been published to help both the neophyte and seasoned packaging engineer.
Temporal Information Systems in Medicine introduces the engineering of information systems for medically-related problems and applications. The chapters are organized into four parts: fundamentals; temporal reasoning and maintenance in medicine; time in clinical tasks; and the display of time-oriented clinical information. The chapters are self-contained with pointers to other relevant chapters or sections in this book when necessary. Time is of central importance and is a key component of the engineering process for information systems. This book is designed as a secondary text or reference book for upper-undergraduate level students and graduate level students concentrating on computer science, biomedicine and engineering. Industry professionals and researchers working in health care management, information systems in medicine, medical informatics, database management and AI will also find this book a valuable asset.
The development of effective methods for the prediction of ontological annotations is an important goal in computational biology, yet evaluating their performance is difficult due to problems caused by the structure of biomedical ontologies and incomplete annotations of genes. This work proposes an information-theoretic framework to evaluate the performance of computational protein function prediction. A Bayesian network is used, structured according to the underlying ontology, to model the prior probability of a protein's function. The concepts of misinformation and remaining uncertainty, which can be seen as analogs of precision and recall, are then defined. Finally, semantic distance is proposed as a single statistic for ranking classification models. The approach is evaluated by analyzing three protein function predictors of gene ontology terms. The work addresses several weaknesses of current metrics, and provides valuable insights into the performance of protein function prediction tools.
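As a minimal sketch of the metrics this abstract describes, assuming hypothetical ontology terms and hand-picked information values (in the actual framework these weights are information-accretion values derived from the Bayesian network over the ontology), the following Python snippet computes remaining uncertainty, misinformation, and the semantic distance that combines them:

```python
# Hypothetical information-accretion values ia(v) = -log2 P(v | parents(v))
# for a toy ontology; in practice these come from annotation frequencies.
IA = {"binding": 1.0, "ion_binding": 2.5, "zinc_binding": 4.0, "catalysis": 1.5}

def ru_mi(true_terms, predicted_terms):
    """Remaining uncertainty (truth the prediction misses) and
    misinformation (predictions outside the truth), each weighted by
    information accretion. Term sets are assumed ancestor-closed."""
    ru = sum(IA[t] for t in true_terms - predicted_terms)
    mi = sum(IA[t] for t in predicted_terms - true_terms)
    return ru, mi

def semantic_distance(true_terms, predicted_terms, k=2):
    """Single ranking statistic combining the two error types."""
    ru, mi = ru_mi(true_terms, predicted_terms)
    return (ru**k + mi**k) ** (1.0 / k)

truth = {"binding", "ion_binding", "zinc_binding"}
pred = {"binding", "ion_binding", "catalysis"}  # shallow and partly wrong
print(ru_mi(truth, pred))              # (4.0, 1.5)
print(semantic_distance(truth, pred))  # ~4.27
```

Here the prediction misses the deep, informative term while adding a wrong one, so remaining uncertainty dominates the distance.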
The LNCS journal Transactions on Large-Scale Data- and Knowledge-Centered Systems focuses on data management, knowledge discovery, and knowledge processing, which are core and hot topics in computer science. Since the 1990s, the Internet has become the main driving force behind application development in all domains. An increase in the demand for resource sharing across different sites connected through networks has led to an evolution of data- and knowledge-management systems from centralized systems to decentralized systems enabling large-scale distributed applications providing high scalability. Current decentralized systems still focus on data and knowledge as their main resource. Feasibility of these systems relies basically on P2P (peer-to-peer) techniques and the support of agent systems with scaling and decentralized control. Synergy between grids, P2P systems, and agent technologies is the key to data- and knowledge-centered systems in large-scale environments. This, the 18th issue of Transactions on Large-Scale Data- and Knowledge-Centered Systems, contains extended and revised versions of seven papers presented at the 24th International Conference on Database and Expert Systems Applications, DEXA 2013, held in Prague, Czech Republic, in August 2013. Following the conference, and two further rounds of reviewing and selection, five extended papers and two invited keynote papers were chosen for inclusion in this special issue. The subject areas covered include argumentation, e-government, business processes, predictive traffic estimation, semantic model integration, top-k query processing, uncertainty handling, graph comparison, community detection, genetic programming, and web services.
This book constitutes the refereed proceedings of the 18th National Conference on Computer Engineering and Technology, NCCET 2014, held in Guiyang, China, during July/August 2014. The 18 papers presented were carefully reviewed and selected from 85 submissions. They are organized in topical sections on processor architecture; computer application and software optimization; technology on the horizon.
The Proceedings of SocProS 2014 serves as an academic bonanza for scientists and researchers working in the field of Soft Computing. This book contains theoretical as well as practical aspects using fuzzy logic, neural networks, evolutionary algorithms, swarm intelligence algorithms, etc., with many applications under the umbrella of 'Soft Computing'. The book is beneficial for young as well as experienced researchers dealing with complex and intricate real-world problems for which finding a solution by traditional methods is difficult. The different application areas covered in the Proceedings are: Image Processing, Cryptanalysis, Industrial Optimization, Supply Chain Management, Newly Proposed Nature Inspired Algorithms, Signal Processing, Problems related to Medical and Healthcare, Networking Optimization Problems, etc.
This book constitutes the thoroughly refereed post-conference proceedings of the 4th International ICST Conference on Sensor Systems and Software, S-Cube 2013, held in Lucca, Italy, in 2013. The 8 revised full papers and 2 invited papers presented cover contributions on different technologies for wireless sensor networks, including security protocols, middleware, analysis tools and frameworks.
This volume constitutes the refereed proceedings of the following 9 international workshops: OTM Academy, OTM Industry Case Studies Program, Cloud and Trusted Computing, C&TC, Enterprise Integration, Interoperability, and Networking, EI2N, Industrial and Business Applications of Semantic Web Technologies, INBAST, Information Systems in Distributed Environment, ISDE, Methods, Evaluation, Tools and Applications for the Creation and Consumption of Structured Data for the e-Society, META4eS, Mobile and Social Computing for collaborative interactions, MSC, and Ontology Content, OnToContent 2014. These workshops were held as associated events at OTM 2014, the federated conferences "On The Move Towards Meaningful Internet Systems and Ubiquitous Computing", in Amantea, Italy, in October 2014. The 56 full papers presented together with 8 short papers, 6 posters and 5 keynotes were carefully reviewed and selected from a total of 96 submissions. The focus of the workshops was on the following subjects: models for interoperable infrastructures, applications, privacy and access control, reliability and performance, cloud and configuration management, interoperability in (System-of-)Systems, distributed information systems applications, architecture and process in distributed information systems, distributed information system development and operational environment, ontology in use for eSociety, knowledge management and applications for eSociety, social networks and social services, social and mobile intelligence, and multimodal interaction and collaboration.
* Provides simple, conceptual descriptions of everyday technologies
* Includes clear examples and diagrams that demonstrate the principles and techniques, not just a "how-to" punch list
* Covers advanced topics for readers who want to dive into the deep end of the technology pool
* Avoids jargon; where terminology does appear, the text provides clear, concise definitions
Big Data Application Architecture Pattern Recipes provides an insight into heterogeneous infrastructures, databases, and visualization and analytics tools used for realizing the architectures of big data solutions. Its problem-solution approach helps in selecting the right architecture to solve the problem at hand. In the process of reading through these problems, you will learn to harness the power of new big data opportunities which various enterprises use to attain real-time profits. Big Data Application Architecture Pattern Recipes answers one of the most critical questions of this time: 'how do you select the best end-to-end architecture to solve your big data problem?'. The book deals with various mission-critical problems encountered by solution architects, consultants, and software architects while dealing with the myriad options available for implementing a typical solution, trying to extract insight from huge volumes of data in real time and across multiple relational and non-relational data types for clients from industries like retail, telecommunication, banking, and insurance. The patterns in this book provide the strong architectural foundation required to launch your next big data application. The architectures for realizing these opportunities are based on relatively less expensive and heterogeneous infrastructures compared to the traditional monolithic and hugely expensive options that exist currently. This book describes and evaluates the benefits of heterogeneity, which brings with it multiple options for solving the same problem, evaluation of trade-offs and validation of 'fitness-for-purpose' of the solution.
What you'll learn:
* Major considerations in building a big data solution
* Big data application architecture problems for specific industries
* What are the components one needs to build an end-to-end big data solution?
* Does one really need a real-time big data solution or an off-line analytics batch solution?
* What are the operations and support architectures for a big data solution?
* What are the scalability considerations and options for a Hadoop installation?
Who this book is for:
* CIOs, CTOs, enterprise architects, and software architects
* Consultants, solution architects, and information management (IM) analysts who want to architect a big data solution for their enterprise
Grids, P2P and Services Computing, the 12th volume of the CoreGRID series, is based on the CoreGRID ERCIM Working Group Workshop on Grids, P2P and Service Computing held in conjunction with Euro-Par 2009. The workshop took place on August 24, 2009 in Delft, The Netherlands. Grids, P2P and Services Computing, an edited volume contributed by well-established researchers worldwide, focuses on solving research challenges for Grid and P2P technologies. Topics of interest include: Service Level Agreement, Data & Knowledge Management, Scheduling, Trust and Security, Network Monitoring and more. Grids are a crucial enabling technology for scientific and industrial development. This book also includes new challenges related to service-oriented infrastructures. Grids, P2P and Services Computing is designed for a professional audience composed of researchers and practitioners within the Grid community and industry. This volume is also suitable for advanced-level students in computer science.
This book constitutes the proceedings of the 14th IMA International Conference on Cryptography and Coding, IMACC 2013, held in Oxford, UK, in December 2013. The 20 papers presented were carefully reviewed and selected for inclusion in this book. They are organized in topical sections named: bits and booleans; homomorphic encryption; codes and applications; cryptanalysis; protecting against leakage; hash functions; key issues and public key primitives.
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics include: modularity and scalability; evolvability; human-competitive results; the need for important high-impact GP-solvable problems; the risks of search stagnation and of cutting off paths to solutions; the need for novelty; and empowering GP search with expert knowledge. In addition, GP symbolic regression (SR) is thoroughly discussed, addressing such topics as guaranteed reproducibility of SR; validating SR results; measuring and controlling genotypic complexity; controlling phenotypic complexity; identifying, monitoring, and avoiding over-fitting; finding a comprehensive collection of SR benchmarks; and comparing SR to machine learning. This text is for all GP explorers. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
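None of the volume's GP systems are reproduced here; as a minimal, hypothetical illustration of one recurring SR theme above, monitoring over-fitting with a held-out split, the following Python sketch searches random polynomial models (standing in for evolved expression trees) and reports training versus validation error:

```python
import random

random.seed(1)
xs = [i / 10 for i in range(-20, 21)]
ys = [x**3 - 2 * x + random.gauss(0, 0.2) for x in xs]  # hidden target
pairs = list(zip(xs, ys))
train, valid = pairs[::2], pairs[1::2]                  # held-out split

def mse(coeffs, data):
    """Mean squared error of the polynomial sum(c_p * x**p)."""
    return sum((sum(c * x**p for p, c in enumerate(coeffs)) - y) ** 2
               for x, y in data) / len(data)

best = None
for _ in range(5000):                 # random search stands in for GP evolution
    degree = random.randint(1, 9)     # genotypic-complexity knob
    cand = [random.uniform(-3, 3) for _ in range(degree + 1)]
    score = mse(cand, train)
    if best is None or score < best[0]:
        best = (score, cand)

train_err, model = best
print(f"degree {len(model) - 1}: train MSE {train_err:.3f}, "
      f"validation MSE {mse(model, valid):.3f}")
# A validation error far above the training error flags over-fitting.
```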
In operations research and computer science it is common practice to evaluate the performance of optimization algorithms on the basis of computational results, and the experimental approach should follow accepted principles that guarantee the reliability and reproducibility of results. However, computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods. This book consists of methodological contributions on different scenarios of experimental analysis. The first part overviews the main issues in the experimental analysis of algorithms, and discusses the experimental cycle of algorithm development; the second part treats the characterization by means of statistical distributions of algorithm performance in terms of solution quality, runtime and other measures; and the third part collects advanced methods from experimental design for configuring and tuning algorithms on a specific class of instances with the goal of using the least amount of experimentation. The contributor list includes leading scientists in algorithm design, statistical design, optimization and heuristics, and most chapters provide theoretical background and are enriched with case studies. This book is written for researchers and practitioners in operations research and computer science who wish to improve the experimental assessment of optimization algorithms and, consequently, their design.
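The book's actual protocols are not reproduced here; as one minimal illustration of characterizing performance by distributions rather than single numbers, the following hypothetical Python sketch (simulated runtimes stand in for logged measurements) bootstraps a confidence interval for the difference of median runtimes between two solver configurations:

```python
import random
import statistics

# Hypothetical runtimes (seconds) of two solver configurations on the
# same instance set; in a real study these come from logged experiments.
random.seed(0)
config_a = [random.lognormvariate(0.0, 0.4) for _ in range(30)]
config_b = [random.lognormvariate(0.2, 0.4) for _ in range(30)]

def bootstrap_ci(sample_a, sample_b, reps=10000, alpha=0.05):
    """Percentile-bootstrap CI for the difference of median runtimes,
    a distribution-free summary suited to skewed runtime data."""
    diffs = []
    for _ in range(reps):
        ra = random.choices(sample_a, k=len(sample_a))
        rb = random.choices(sample_b, k=len(sample_b))
        diffs.append(statistics.median(rb) - statistics.median(ra))
    diffs.sort()
    return diffs[int(reps * alpha / 2)], diffs[int(reps * (1 - alpha / 2))]

lo, hi = bootstrap_ci(config_a, config_b)
print(f"95% CI for median runtime difference: [{lo:.3f}, {hi:.3f}]")
# An interval excluding 0 supports a genuine performance difference.
```

Reporting an interval over a distribution of resampled medians, rather than a single mean, is one way to make such comparisons robust to the heavy-tailed runtimes common in optimization experiments.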
This book constitutes the refereed proceedings of the 7th International Symposium on Engineering Secure Software and Systems, ESSoS 2015, held in Milan, Italy, in March 2015. The 11 full papers presented together with 5 short papers were carefully reviewed and selected from 41 submissions. The symposium features the following topics: formal methods; cloud passwords; machine learning; measurements; ontologies; and access control.
Dynamic logic (DL) has recently had a major impact on developments in several areas of modeling and algorithm design. The book discusses classical algorithms used for 30 to 50 years (where improvements are often measured by signal-to-clutter ratio), and also new areas which did not previously exist. These achievements were recognized by national and international awards. Emerging areas include cognitive, emotional, and intelligent systems, data mining, modeling of the mind, higher cognitive functions, the evolution of languages, and others. Classical areas include detection, recognition, tracking, fusion, prediction, inverse scattering, and financial prediction. All these classical areas are extended to use mixture models, which was previously considered unsolvable in most cases. Recent neuroimaging experiments proved that the brain-mind actually uses DL. "Emotional Cognitive Neural Algorithms with Engineering Applications" is written for professional scientists and engineers developing computer and information systems, for professors teaching modeling and algorithms, and for students working on Masters and Ph.D. degrees in these areas. The book will be of interest to psychologists and neuroscientists interested in mathematical models of the brain and mind as well.
You may like...
* Concept Parsing Algorithms (CPA) for… by Uri Shafrir, Masha Etkind (Hardcover) R3,276
* Comprehensive Metaheuristics… by S. Ali Mirjalili, Amir Hossein Gandomi (Paperback) R3,956
* Principles of Radio Navigation for… by Sauta O.I., Shatrakov A.Y., … (Hardcover) R2,653
* Computational Intelligence for Machine… by Rajshree Srivastava, Pradeep Kumar Mallick, … (Hardcover) R3,875