In this book, the following three approaches to data analysis are presented: Test Theory, founded by Sergei V. Yablonskii (1924-1998), with first publications in 1955 and 1958; Rough Sets, founded by Zdzislaw I. Pawlak (1926-2006), with first publications in 1981 and 1982; and Logical Analysis of Data, founded by Peter L. Hammer (1936-2006), with first publications in 1986 and 1988. These three approaches have much in common, but researchers active in one of these areas often have limited knowledge of the results and methods developed in the other two. On the other hand, each of the approaches shows some originality, and we believe that the exchange of knowledge can stimulate further development of each of them. This can lead to new theoretical results and real-life applications; in particular, new results based on a combination of these three data analysis approaches can be expected.
The two-volume set LNCS 8616 and LNCS 8617 constitutes the refereed proceedings of the 34th Annual International Cryptology Conference, CRYPTO 2014, held in Santa Barbara, CA, USA, in August 2014. The 60 revised full papers presented in LNCS 8616 and LNCS 8617 were carefully reviewed and selected from 227 submissions. The papers are organized in topical sections on symmetric encryption and PRFs; formal methods; hash functions; groups and maps; lattices; asymmetric encryption and signatures; side channels and leakage resilience; obfuscation; FHE; quantum cryptography; foundations of hardness; number-theoretic hardness; information-theoretic security; key exchange and secure communication; zero knowledge; composable security; secure computation - foundations; and secure computation - implementations.
This two-volume set LNCS 8634 and LNCS 8635 constitutes the refereed conference proceedings of the 39th International Symposium on Mathematical Foundations of Computer Science, MFCS 2014, held in Budapest, Hungary, in August 2014. The 95 revised full papers presented together with 6 invited talks were carefully selected from 270 submissions. The focus of the conference was on the following topics: Logic, Semantics, Automata, Theory of Programming, Algorithms, Complexity, Parallel and Distributed Computing, Quantum Computing, Automata, Grammars and Formal Languages, Combinatorics on Words, Trees and Games.
This book - in conjunction with the volumes LNCS 8588 and LNBI 8590 - constitutes the refereed proceedings of the 10th International Conference on Intelligent Computing, ICIC 2014, held in Taiyuan, China, in August 2014. The 85 papers of this volume were carefully reviewed and selected from numerous submissions. The papers are organized in topical sections such as soft computing; artificial bee colony algorithms; unsupervised learning; kernel methods and support vector machines; machine learning; fuzzy theory and algorithms; image processing; intelligent computing in computer vision; intelligent computing in communication networks; intelligent image/document retrieval; intelligent data analysis and prediction; intelligent agent and Web applications; intelligent fault diagnosis; knowledge representation/reasoning; knowledge discovery and data mining; natural language processing and computational linguistics; next-generation sequencing and metagenomics; intelligent computing in scheduling and engineering optimization; advanced modeling, control and optimization techniques for complex engineering systems; complex networks and their applications; time series forecasting and analysis using artificial neural networks; computer-human interaction using multiple visual cues and intelligent computing; and biometric systems and security for intelligent computing.
This book constitutes the thoroughly refereed post-conference proceedings of the Joint International Conference on Pervasive Computing and Web Society, ICPCA/SWS 2013, held in Viña del Mar, Chile, in December 2013. The 56 revised full papers presented together with 29 poster papers were carefully reviewed and selected from 156 submissions. The papers are organized in topical sections on infrastructure and devices; service and solution; data and knowledge; and community.
This two-volume set LNCS 9234 and LNCS 9235 constitutes the refereed conference proceedings of the 40th International Symposium on Mathematical Foundations of Computer Science, MFCS 2015, held in Milan, Italy, in August 2015. The 82 revised full papers presented together with 5 invited talks were carefully selected from 201 submissions. The papers feature high-quality research in all branches of theoretical computer science. They have been organized in the following topical main sections: logic, semantics, automata, and theory of programming (volume 1) and algorithms, complexity, and games (volume 2).
This book constitutes the thoroughly refereed post-conference proceedings of the 22nd International Workshop on Fast Software Encryption, held in Istanbul, Turkey, March 8-11, 2015. The 28 revised full papers presented were carefully reviewed and selected from 71 initial submissions. The papers are organized in topical sections on block cipher cryptanalysis; understanding attacks; implementation issues; more block cipher cryptanalysis; cryptanalysis of authenticated encryption schemes; proofs; design; lightweight; cryptanalysis of hash functions and stream ciphers; and mass surveillance.
Lars Dannecker developed a novel online forecasting process that significantly improves how forecasts are calculated. It increases forecasting efficiency and accuracy, and allows the process to adapt to different situations and applications. Improving forecasting efficiency is a key prerequisite for ensuring stable electricity grids in the face of an increasing share of renewable energy sources. It is also important for facilitating the move from static day-ahead electricity trading towards more dynamic real-time marketplaces. The online forecasting process is realized by a number of approaches on the logical as well as the physical layer that we introduce in the course of this book. Nominated for the Georg-Helm-Preis 2015 awarded by the Technische Universität Dresden.
Dynamic secrets are constantly generated and updated from messages exchanged between two communication users. When dynamic secrets are used as a complement to existing secure communication systems, a stolen key or password can be quickly and automatically reverted to its secret status without disrupting communication. "Dynamic Secrets in Communication Security" presents unique security properties and application studies for this technology. Password theft and key theft no longer pose serious security threats when parties frequently use dynamic secrets. This book also illustrates that a dynamic secret based security scheme guarantees impersonation attacks are detected even if an adversary steals a user's password or their key is lost. Practitioners and researchers working in network security or wireless communications will find this book a must-have reference. "Dynamic Secrets in Communication Security" is also a valuable secondary text for advanced-level students in computer science and electrical engineering.
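The core idea of a dynamic secret can be sketched in a few lines: both endpoints fold every exchanged message into a running shared secret, so a stolen snapshot of the key goes stale as soon as traffic continues. This is only an illustrative sketch; the hash-chain update rule and names below are assumptions of this example, not the actual scheme presented in the book.

```python
import hashlib

def update_secret(current_secret: bytes, message: bytes) -> bytes:
    """Fold a newly exchanged message into the running shared secret.

    Both endpoints apply the same update after every message, so the
    secret keeps evolving in lockstep on both sides.
    """
    return hashlib.sha256(current_secret + message).digest()

# Both parties start from the same initial secret ...
alice = bob = b"initial shared secret"

# ... and evolve it with each exchanged message.
for msg in [b"hello", b"syn/ack", b"payload-1"]:
    alice = update_secret(alice, msg)
    bob = update_secret(bob, msg)

assert alice == bob  # legitimate endpoints stay synchronized

# An adversary who stole the secret before the last message is now stale:
# their copy has not absorbed b"payload-1" and no longer matches.
stolen = update_secret(
    update_secret(b"initial shared secret", b"hello"), b"syn/ack"
)
assert stolen != alice
```

The design point the sketch illustrates is that secrecy is maintained by continuous evolution rather than by protecting a single long-lived key, which is why a one-time theft does not yield lasting access.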
Machine learning, one of the top emerging sciences, has an extremely broad range of applications. However, many books on the subject provide only a theoretical approach, making it difficult for a newcomer to grasp the subject material. This book provides a more practical approach by explaining the concepts of machine learning algorithms and describing the areas of application for each algorithm, using simple practical examples to demonstrate each algorithm and showing how the issues that arise with these algorithms are handled in practice.
This volume constitutes the proceedings of the 9th International Conference on Hybrid Artificial Intelligent Systems, HAIS 2014, held in Salamanca, Spain, in June 2014. The 61 papers published in this volume were carefully reviewed and selected from 199 submissions. They are organized in topical sessions on HAIS applications; data mining and knowledge discovery; video and image analysis; bio-inspired models and evolutionary computation; learning algorithms; hybrid intelligent systems for data mining and applications; and classification and cluster analysis.
This book constitutes the proceedings of the International Conference on the Integration of Artificial Intelligence (AI) and Operations Research (OR) Techniques in Constraint Programming, CPAIOR 2014, held in Cork, Ireland, in May 2014. The 33 papers presented in this volume were carefully reviewed and selected from 70 submissions. The papers focus on constraint programming and global constraints; scheduling; modelling; encodings and SAT; logistics; MIP; CSP and complexity; parallelism and search; and data mining and machine learning.
The two-volume set LNAI 8481 and LNAI 8482 constitutes the refereed conference proceedings of the 27th International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems, IEA/AIE 2014, held in Kaohsiung, Taiwan, in June 2014. The 106 papers selected for the proceedings were carefully reviewed and chosen from numerous submissions. The papers deal with a wide range of topics on applications of applied intelligent systems to solving real-life problems in all areas, including engineering, science, industry, automation and robotics, business and finance, medicine and biomedicine, bioinformatics, cyberspace and human-machine interaction.
This monograph presents examples of best practices when combining bioinspired algorithms with parallel architectures. The book includes recent work by leading researchers in the field and offers a map of the main paths already explored and new ways towards the future. Parallel Architectures and Bioinspired Algorithms will be of value both to specialists in bioinspired algorithms and parallel and distributed computing, and to computer science students trying to understand the present and the future of Parallel Architectures and Bioinspired Algorithms.
This book is the outcome of the Dagstuhl Seminar 13201 on Information Visualization - Towards Multivariate Network Visualization, held in Dagstuhl Castle, Germany, in May 2013. The goal of this Dagstuhl Seminar was to bring together theoreticians and practitioners from Information Visualization, HCI and Graph Drawing with a special focus on multivariate network visualization, i.e., on graphs where the nodes and/or edges have additional (multidimensional) attributes. The integration of multivariate data into complex networks and their visual analysis is one of the big challenges not only in visualization, but also in many application areas. Thus, in order to support discussions related to the visualization of real-world data, researchers from selected application areas, especially bioinformatics, the social sciences and software engineering, were also invited. The unique "Dagstuhl climate" ensured an open and undisturbed atmosphere in which to discuss the state of the art, new directions and open challenges of multivariate network visualization.
The importance of benchmarking in the service sector is well recognized, as it helps in continuous improvement in products and work processes. Through benchmarking, companies have strived to implement best practices in order to remain competitive in the product market in which they operate. However, studies on benchmarking, particularly in the software development sector, have neglected using multiple variables and therefore have not been as comprehensive. Information Theory and Best Practices in the IT Industry fills this void by examining benchmarking in the business of software development and studying how it is affected by development process, application type, hardware platforms used, and many other variables. The book begins by examining practices of benchmarking productivity and critically appraises them. Next, it identifies the different variables that affect productivity and quality, developing useful equations that explain their relationships. Finally, these equations and findings are applied to case studies. Utilizing this book, practitioners can decide what emphasis they should attach to different variables in their own companies, while seeking to optimize productivity and defect density.
The vast area of Scientific Computing, which is concerned with the computer-aided simulation of various processes in engineering, natural, economic, or social sciences, now enjoys rapid progress owing to the development of new efficient symbolic, numeric, and symbolic/numeric algorithms. It has long been recognized worldwide that the mathematical term algorithm takes its origin from the Latin word algoritmi, which is in turn a Latin transliteration of the Arabic name "Al Khoresmi" of the Khoresmian mathematician Moukhammad Khoresmi, who lived in the Khoresm khanate during the years 780-850. The Khoresm khanate took significant parts of the territories of present-day Turkmenistan and Uzbekistan. Such towns of the Khoresm khanate as Bukhara and Marakanda (the present-day Samarkand) were the centers of mathematical science and astronomy. The great Khoresmian mathematician M. Khoresmi introduced the Indian decimal positional system into everyday life; this system is based on using the familiar digits 1, 2, 3, 4, 5, 6, 7, 8, 9, 0. M. Khoresmi presented arithmetic in the decimal positional calculus (prior to him, the Indian positional system was the subject only of jokes and witty disputes). Khoresmi's Book of Addition and Subtraction by Indian Method (Arithmetic) differs little from present-day arithmetic. This book was translated into Latin in 1150; the last reprint was produced in Rome in 1957.
This book constitutes the refereed proceedings of the Second IFIP TC 5/8 International Conference on Information and Communication Technology, ICT-EurAsia 2014, with the collocation of AsiaARES 2014 as a special track on Availability, Reliability and Security, held in Bali, Indonesia, in April 2014. The 70 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers have been organized in the following topical sections: applied modeling and simulation; mobile computing; advanced urban-scale ICT applications; semantic web and knowledge management; cloud computing; image processing; software engineering; collaboration technologies and systems; e-learning; data warehousing and data mining; e-government and e-health; biometric and bioinformatics systems; network security; dependable systems and applications; privacy and trust management; cryptography; and multimedia security.
This book is designed both for FPGA users interested in developing new, specific components - generally for reducing execution times - and for IP core designers interested in extending their catalog of specific components. The main focus is circuit synthesis, and the discussion shows, for example, how a given algorithm executing some complex function can be translated into a synthesizable circuit description, as well as which are the best choices the designer can make to reduce the circuit cost, latency, or power consumption. This is not a book on algorithms. It is a book that shows how to translate an algorithm efficiently into a circuit, using techniques such as parallelism, pipelining, loop unrolling, and others. Numerous examples of FPGA implementation are described throughout this book, and the circuits are modeled in VHDL. Complete and synthesizable source files are available for download.
Scientific workflow has seen massive growth in recent years as science becomes increasingly reliant on the analysis of massive data sets and the use of distributed resources. The workflow programming paradigm is seen as a means of managing the complexity in defining the analysis, executing the necessary computations on distributed resources, collecting information about the analysis results, and providing means to record and reproduce the scientific analysis. Workflows for e-Science presents an overview of the current state of the art in the field. It brings together research from many of the leading computer scientists in the workflow area and provides real-world examples from domain scientists actively involved in e-Science. The computer science topics addressed in the book provide a broad overview of active research, focusing on the areas of workflow representations and process models, component and service-based workflows, standardization efforts, workflow frameworks and tools, and problem solving environments and portals. The topics covered represent a broad range of scientific workflow and will be of interest to a wide range of computer science researchers, domain scientists interested in applying workflow technologies in their work, and engineers wanting to develop workflow systems and tools. As such, Workflows for e-Science is an invaluable resource for potential or existing users of workflow technologies and a benchmark for developers and researchers. Ian Taylor is Lecturer in Computer Science at Cardiff University and coordinator of Triana activities at Cardiff. He is the author of "From P2P to Web Services and Grids," also published by Springer. Ewa Deelman is a Research Assistant Professor at the USC Computer Science Department and a Research Team Leader at the Center for Grid Technologies at the USC Information Sciences Institute. Dennis Gannon is a professor of Computer Science in the School of Informatics at Indiana University. He is also Science Director for the Indiana Pervasive Technology Labs. Dr. Shields is a research associate at Cardiff and one of two lead developers for the Triana project.
This volume contains the post-proceedings of the 9th Doctoral Workshop on Mathematical and Engineering Methods in Computer Science, MEMICS 2014, held in Telc, Czech Republic, in October 2014. The 13 thoroughly revised papers were carefully selected out of 28 submissions and are presented together with 4 invited papers. The topics covered by the papers include: algorithms, logic, and games; high performance computing; computer aided analysis, verification, and testing; hardware design and diagnostics; computer graphics and image processing; and artificial intelligence and natural language processing.
Beginning Oracle SQL is your introduction to the interactive query tools and specific dialect of SQL used with Oracle Database. These tools include SQL*Plus and SQL Developer. SQL*Plus is the one tool any Oracle developer or database administrator can always count on, and it is widely used in creating scripts to automate routine tasks. SQL Developer is a powerful, graphical environment for developing and debugging queries. Oracle's dialect is possibly the most valuable dialect of SQL from a career standpoint. Oracle's database engine is widely used in corporate environments worldwide. It is also found in many government applications. Oracle SQL implements many features not found in competing products. No developer or DBA working with Oracle can afford to be without knowledge of these features and how they work, because of the performance and expressiveness they bring to the table. Written in an easygoing and example-based style, Beginning Oracle SQL is the book that will get you started down the path to successfully writing SQL statements and getting results from Oracle Database. The book:
- Takes an example-based approach, with clear and authoritative explanations
- Introduces both SQL and the query tools used to execute SQL statements
- Shows how to create tables, populate them with data, and then query that data to generate business results
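The create/populate/query workflow the book teaches can be shown in miniature. Note the sketch below uses Python's built-in sqlite3 module as a stand-in for an Oracle connection, so the table and column names are illustrative assumptions and no Oracle-specific syntax (SQL*Plus commands, sequences, and so on) appears.

```python
import sqlite3

# An in-memory SQLite database stands in for an Oracle schema in this sketch.
conn = sqlite3.connect(":memory:")

# 1. Create a table.
conn.execute(
    "CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary REAL)"
)

# 2. Populate it with data (illustrative rows).
conn.executemany(
    "INSERT INTO employees (name, salary) VALUES (?, ?)",
    [("Ada", 95000.0), ("Grace", 105000.0), ("Edsger", 88000.0)],
)

# 3. Query it to generate a business result: the average salary.
(avg_salary,) = conn.execute("SELECT AVG(salary) FROM employees").fetchone()
print(f"average salary: {avg_salary:.2f}")  # average salary: 96000.00
```

The same three-step pattern carries over to Oracle; only the connection mechanism and some dialect-specific features differ.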
This book constitutes the refereed proceedings of the Third International Conference on Principles of Security and Trust, POST 2014, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2014, in Grenoble, France, in April 2014. The 15 papers presented in this volume were carefully reviewed and selected from 55 submissions. They are organized in topical sections named: analysis of cryptographic protocols; quantitative aspects of information flow; information flow control in programming languages; cryptography in implementations; and policies and attacks.
The explosion of information technology has led to substantial growth of web-accessible linguistic data in terms of quantity, diversity and complexity. These resources become even more useful when interlinked with each other to generate network effects. The general trend of providing data online is thus accompanied by newly developing methodologies to interconnect linguistic data and metadata. This includes linguistic data collections, general-purpose knowledge bases (e.g., DBpedia, a machine-readable edition of Wikipedia), and repositories with specific information about languages, linguistic categories and phenomena. The Linked Data paradigm provides a framework for interoperability and access management, and thereby makes it possible to integrate information from such a diverse set of resources. The contributions assembled in this volume illustrate the bandwidth of applications of the Linked Data paradigm for representative types of language resources. They cover lexical-semantic resources, annotated corpora, typological databases, as well as terminology and metadata repositories. The book includes representative applications from diverse fields, ranging from academic linguistics (e.g., typology and corpus linguistics) over applied linguistics (e.g., lexicography and translation studies) to technical applications (in computational linguistics, Natural Language Processing and information technology). This volume accompanies the Workshop on Linked Data in Linguistics 2012 (LDL-2012) in Frankfurt/M., Germany, organized by the Open Linguistics Working Group (OWLG) of the Open Knowledge Foundation (OKFN). It assembles contributions of the workshop participants and, beyond this, summarizes initial steps in the formation of a Linked Open Data cloud of linguistic resources, the Linguistic Linked Open Data cloud (LLOD).
This book constitutes the refereed proceedings of the 17th International Conference on Practice and Theory in Public-Key Cryptography, PKC 2014, held in Buenos Aires, Argentina, in March 2014. The 38 papers presented were carefully reviewed and selected from 145 submissions. The papers are organized in topical sections on chosen ciphertext security, re-encryption, verifiable outsourcing, cryptanalysis, identity and attribute-based encryption, enhanced encryption, signature schemes, related-key security, functional authentication, quantum impossibility, privacy, and protocols.