Books > Reference & Interdisciplinary > Communication studies > Coding theory & cryptology
The application of data warehousing and data mining techniques to computer security is an important emerging area, as information processing and internet accessibility costs decline and more and more organizations become vulnerable to cyber attacks. These security breaches include attacks on single computers, computer networks, wireless networks, databases, or authentication compromises. This book describes data warehousing and data mining techniques that can be used to detect attacks. It is designed to be a useful handbook for practitioners and researchers in industry, and is also suitable as a text for advanced-level students in computer science.
This book collects survey papers in the fields of entropy, search and complexity, summarizing the latest developments in their respective areas. More than half of the papers belong to search theory, which lies on the borderline of mathematics and computer science, information theory and combinatorics. The book will be useful to experienced researchers as well as young scientists and students, both in mathematics and computer science.
With the advances of the digital information revolution and the societal changes they have prompted, it has become critical to facilitate secure management of content usage and delivery across communication networks. Data hiding and digital watermarking are promising new technologies for multimedia information protection and rights management. Multimedia Data Hiding addresses the theory, methods, and design of multimedia data hiding and its application to multimedia rights management, information security, and communication. It offers theoretical and practical aspects, and both design and attack problems. Applications discussed include: annotation, tamper detection, copy/access control, fingerprinting, and ownership protection. Countermeasures for attacks on data hiding are discussed, and a chapter assesses attack problems on digital music protection under a unique competitive environment. Topics and features:
* Comprehensive and practical coverage of data hiding for various media types, including binary image, grayscale and color images and video, and audio
* Unique analysis of problems and solutions, such as data hiding in binary signature and generic binary documents, block concealment attacks, and attacks on audio watermarking
* Authoritative discussion and analysis of data hiding and effective countermeasures, supported by concrete application examples
* Accessible, well-organized progression from the fundamentals to specific approaches to various data-hiding problems
This work offers a state-of-the-art presentation covering theoretical, algorithmic, and design topics for digital content/data security protection, and rights management. It is an essential resource for multimedia security researchers and professionals in electrical engineering, computer science, IT, and digital rights management.
This book constitutes the thoroughly refereed post-proceedings of the 17th Annual International Workshop on Selected Areas in Cryptography, SAC 2010, held in Waterloo, Ontario, Canada in August 2010. The 24 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 90 submissions. The papers are organized in topical sections on hash functions, stream ciphers, efficient implementations, coding and combinatorics, block ciphers, side channel attacks, and mathematical aspects.
This book constitutes the refereed proceedings of the 5th International Conference on Information Theoretic Security, held in Amsterdam, The Netherlands, in May 2011.
Since the advent of optical communications, a great technological effort has been devoted to the exploitation of the huge bandwidth of optical fibers. Starting from a few Mb/s single channel systems, a fast and constant technological development has led to the actual 10 Gb/s per channel dense wavelength-division multiplexing (DWDM) systems, with dozens of channels on a single fiber. Transmitters and receivers are now ready for 40 Gb/s, whereas hundreds of channels can be simultaneously amplified by optical amplifiers. Nevertheless, despite such a pace in technological progress, optical communications are still in a primitive stage if compared, for instance, to radio communications: the widely spread on-off keying (OOK) modulation format is equivalent to the rough amplitude modulation (AM) format, whereas the DWDM technique is nothing more than the optical version of the frequency-division multiplexing (FDM) technique. Moreover, adaptive equalization, channel coding or maximum likelihood detection are still considered something "exotic" in the optical world. This is mainly due to the favourable characteristics of the fiber optic channel (large bandwidth, low attenuation, channel stability, ...), which so far allowed us to use very simple transmission and detection techniques.
Computer Security in the 21st Century shares some of the emerging important research trends reflected in recent advances in computer security, including: security protocol design, secure peer-to-peer and ad hoc networks, multimedia security, and intrusion detection, defense and measurement. Highlights include presentations of:
- fundamental new security concepts,
- cryptographic protocols and design,
- a new way of measuring network vulnerability: attack surfaces,
- network vulnerability and building impenetrable systems,
- multimedia content protection, including JPEG2000, a new standard for photographic images.
Researchers and computer security developers will find in this book interesting and useful insights into building computer systems that protect against computer worms, computer viruses, and other related concerns.
RFID Security: Techniques, Protocols and System-On-Chip Design is an edited book covering fundamentals, security theories and protocols, and hardware implementations for cryptography algorithms and security techniques in RFID. The volume is structured in three parts. Part 1 deals with RFID fundamentals, including system architectures and applications. Part 2 addresses RFID security protocols and techniques, with a comprehensive collection of recent state-of-the-art protocols and techniques to secure RFID against potential security flaws and attacks. Finally, the book discusses hardware implementation of security algorithms. This section deals with the hardware implementations of cryptography algorithms and protocols dedicated to RFID platforms and chips.
This book constitutes the refereed proceedings of the 30th Annual International Conference on the Theory and Applications of Cryptographic Techniques, EUROCRYPT 2011, held in Tallinn, Estonia, in May 2011. The 31 papers, presented together with 2 invited talks, were carefully reviewed and selected from 167 submissions. The papers are organized in topical sections on lattice-based cryptography, implementation and side channels, homomorphic cryptography, signature schemes, information-theoretic cryptography, symmetric key cryptography, attacks and algorithms, secure computation, composability, key dependent message security, and public key encryption.
Nowadays it is hard to find an electronic device which does not use codes: for example, we listen to music via heavily encoded audio CDs and we watch movies via encoded DVDs. There is at least one area where the use of encoding/decoding is not so developed yet: Flash non-volatile memories. Flash memory's high density, low power, cost effectiveness, and scalable design make it an ideal choice to fuel the explosion of multimedia products, like USB keys, MP3 players, digital cameras and solid-state disks. In ECC for Non-Volatile Memories the authors expose the basics of coding theory needed to understand the application to memories, as well as the relevant design topics, with reference to both NOR and NAND Flash architectures. A collection of software routines is also included for better understanding. The authors form a research group (now at Qimonda) which is a typical example of a fruitful collaboration between mathematicians and engineers.
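To give a flavor of the single-error-correcting codes such a book builds on, here is a minimal sketch of the classic Hamming(7,4) code in Python. This is an illustrative assumption, not material from the book: the function names and bit layout are our own.

```python
# Illustrative sketch (not from the book): Hamming(7,4), a classic
# single-error-correcting code of the kind used in Flash ECC.

def encode(d):
    """Encode 4 data bits into a 7-bit codeword (parity at positions 1, 2, 4)."""
    c = [0] * 8                      # index 0 unused; positions 1..7
    c[3], c[5], c[6], c[7] = d       # data bits
    c[1] = c[3] ^ c[5] ^ c[7]        # parity over positions with bit 0 set
    c[2] = c[3] ^ c[6] ^ c[7]        # parity over positions with bit 1 set
    c[4] = c[5] ^ c[6] ^ c[7]        # parity over positions with bit 2 set
    return c[1:]

def decode(c):
    """Decode a 7-bit codeword, correcting at most one flipped bit."""
    c = [0] + list(c)
    s = ((c[1] ^ c[3] ^ c[5] ^ c[7]) * 1 +
         (c[2] ^ c[3] ^ c[6] ^ c[7]) * 2 +
         (c[4] ^ c[5] ^ c[6] ^ c[7]) * 4)
    if s:
        c[s] ^= 1                    # syndrome is the error position directly
    return [c[3], c[5], c[6], c[7]]

word = [1, 0, 1, 1]
cw = encode(word)
cw[2] ^= 1                           # simulate a single bit error
assert decode(cw) == word            # the error is detected and corrected
```

The key property: the three parity checks together form a 3-bit syndrome that, when non-zero, names the erroneous bit position, so any single flip in the 7-bit word is correctable.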
Reflects recent developments in its emphasis on randomized and approximation algorithms and communication models
This book constitutes the thoroughly refereed proceedings of the 8th Theory of Cryptography Conference, TCC 2011, held in Providence, Rhode Island, USA, in March 2011. The 35 revised full papers are presented together with 2 invited talks and were carefully reviewed and selected from 108 submissions. The papers are organized in topical sections on hardness amplification, leakage resilience, tamper resilience, encryption, composable security, secure computation, privacy, coin tossing and pseudorandomness, and black-box constructions and separations.
This multi-authored textbook addresses graduate students with a background in physics, mathematics or computer science. No research experience is necessary. Consequently, rather than comprehensively reviewing the vast body of knowledge and literature gathered in the past twenty years, this book concentrates on a number of carefully selected aspects of quantum information theory and technology. Given the highly interdisciplinary nature of the subject, the multi-authored approach brings together different points of view from various renowned experts, providing a coherent picture of the subject matter. The book consists of ten chapters and includes examples, problems, and exercises. The first five chapters present the mathematical tools required for a full comprehension of various aspects of quantum mechanics, classical information, and coding theory. Chapter 6 deals with the manipulation and transmission of information in the quantum realm. Chapters 7 and 8 discuss experimental implementations of quantum information ideas using photons and atoms. Finally, chapters 9 and 10 address ground-breaking applications in cryptography and computation.
This book constitutes the revised selected papers of the 20th International Workshop on Combinatorial Algorithms, held in June/July 2009 in the castle of Hradec nad Moravici, Czech Republic. The 41 papers included in this volume together with 5 invited papers were carefully reviewed and selected from over 100 submissions. The topics dealt with are algorithms and data structures, applications, combinatorial enumeration, combinatorial optimization, complexity theory, computational biology, databases, decompositions and combinatorial designs, discrete and computational geometry, including graph drawing, and graph theory and combinatorics.
In 1974, the British government admitted that its WWII secret intelligence organization had read Germany's ciphers on a massive scale. The intelligence from these decrypts influenced the Atlantic, the Eastern Front and Normandy. Why did the Germans never realize the Allies had so thoroughly penetrated their communications? As German intelligence experts conducted numerous internal investigations that all certified their ciphers' security, the Allies continued to break more ciphers and plugged their own communication leaks. How were the Allies able to so thoroughly exploit Germany's secret messages? How did they keep their tremendous success a secret? What flaws in Germany's organization allowed this counterintelligence failure, and how can today's organizations learn to avoid similar disasters? This book, the first comparative study of WWII SIGINT (Signals Intelligence), analyzes the characteristics that allowed the Allies' SIGINT success and that fostered the German blindness to Enigma's compromise.
As future generation information technology (FGIT) becomes specialized and fragmented, it is easy to lose sight that many topics in FGIT have common threads and, because of this, advances in one discipline may be transmitted to others. Presentation of recent results obtained in different disciplines encourages this interchange for the advancement of FGIT as a whole. Of particular interest are hybrid solutions that combine ideas taken from multiple disciplines in order to achieve something more significant than the sum of the individual parts. Through such hybrid philosophy, a new principle can be discovered, which has the propensity to propagate throughout multifaceted disciplines. FGIT 2009 was the first mega-conference that attempted to follow the above idea of hybridization in FGIT in a form of multiple events related to particular disciplines of IT, conducted by separate scientific committees, but coordinated in order to expose the most important contributions. It included the following international conferences: Advanced Software Engineering and Its Applications (ASEA), Bio-Science and Bio-Technology (BSBT), Control and Automation (CA), Database Theory and Application (DTA), Disaster Recovery and Business Continuity (DRBC; published independently), Future Generation Communication and Networking (FGCN) that was combined with Advanced Communication and Networking (ACN), Grid and Distributed Computing (GDC), Multimedia, Computer Graphics and Broadcasting (MulGraB), Security Technology (SecTech), Signal Processing, Image Processing and Pattern Recognition (SIP), and u- and e-Service, Science and Technology (UNESST).
This book constitutes the thoroughly refereed post-conference proceedings of the Second International Conference on Information Theoretic Security, ICITS 2007, held in Madrid, Spain, in May 2007. The 13 revised full papers presented in this volume were carefully reviewed and selected from 26 submissions. The conference also featured an invited keynote speech and three invited talks. The topics covered are authentication, group cryptography, private and reliable message transmission, secret sharing, and applications of information theory.
Learning to program isn't just learning the details of a programming language: to become a good programmer you have to become expert at debugging, testing, writing clear code and generally unsticking yourself when you get stuck, while to do well in a programming course you have to learn to score highly in coursework and exams. Featuring tips, stories and explanations of key terms, this book teaches these skills explicitly. Examples in Python, Java and Haskell are included, helping you to gain transferable programming skills whichever language you are learning. Intended for students in Higher or Further Education studying early programming courses, it will help you succeed in, and get the most out of, your course, and support you in developing the software engineering habits that lead to good programs.
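As a hypothetical illustration (not taken from the book) of the testing habit such a course teaches, a few explicit test cases in Python can expose the classic off-by-one bug in a median function:

```python
# Illustrative sketch: small, explicit test cases make bugs (and fixes) visible.

def median(xs):
    """Return the median of a non-empty list of numbers."""
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    # A common beginner bug is forgetting the even-length case entirely;
    # writing the tests below first catches it immediately.
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

assert median([3, 1, 2]) == 2        # odd length: middle element
assert median([4, 1, 2, 3]) == 2.5   # even length: mean of the two middle elements
```

Running the asserts after every change is a lightweight stand-in for a test framework, and the same habit transfers directly to Java (JUnit) or Haskell (HUnit/QuickCheck).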
This book constitutes the thoroughly refereed post-conference proceedings of the 5th International ICST Conference, SecureComm 2009, held in September 2009 in Athens, Greece. The 19 revised full papers and 7 revised short papers were carefully reviewed and selected from 76 submissions. The papers cover various topics such as wireless network security, network intrusion detection, security and privacy for the general internet, malware and misbehavior, sensor networks, key management, credentials and authentications, as well as secure multicast and emerging technologies.
This book constitutes the refereed proceedings of the 12th International Conference on Information Security, ISC 2009, held in Pisa, Italy, September 7-9, 2009. The 29 revised full papers and 9 revised short papers presented were carefully reviewed and selected from 105 submissions. The papers are organized in topical sections on analysis techniques, hash functions, database security and biometrics, algebraic attacks and proxy re-encryption, distributed system security, identity management and authentication, applied cryptography, access control, MAC and nonces, and P2P and Web services.
This book constitutes the refereed proceedings of the 9th International Conference on Next Generation Teletraffic and Wired/Wireless Advanced Networking, NEW2AN 2009, held in conjunction with the Second Conference on Smart Spaces, ruSMART 2009 in St. Petersburg, Russia, in September 2009. The 32 revised full papers presented were carefully reviewed and selected from a total of 82 submissions. The NEW2AN papers are organized in topical sections on teletraffic issues; traffic measurements, modeling, and control; peer-to-peer systems; security issues; wireless networks: ad hoc and mesh; and wireless networks: capacity and mobility. The ruSMART papers start with an invited talk followed by 10 papers on smart spaces.
Biometrics is a rapidly evolving field with applications ranging from accessing one's computer to gaining entry into a country. The deployment of large-scale biometric systems in both commercial and government applications has increased public awareness of this technology. Recent years have seen significant growth in biometric research, resulting in the development of innovative sensors, new algorithms, enhanced test methodologies and novel applications. This book invites some of the prominent researchers in biometrics to contribute chapters describing the fundamentals as well as the latest innovations in their respective areas of expertise.
Privacy, Security and Trust within the Context of Pervasive Computing is an edited volume based on a workshop held at the Second International Conference on Pervasive Computing, April 18-23, 2004, in Vienna, Austria. The goal of the workshop was not to focus on specific, even novel mechanisms, but rather on the interfaces between mechanisms in different technical and social problem spaces. An investigation of the interfaces between the notions of context, privacy, security, and trust will result in a deeper understanding of the "atomic" problems, leading to a more complete understanding of the social and technical issues in pervasive computing.
The NordSec workshops were started in 1996 with the aim of bringing together researchers and practitioners within computer security in the Nordic countries, thereby establishing a forum for discussions and co-operation between universities, industry and computer societies. Since then, the workshop has developed into a fully fledged international information security conference, held in the Nordic countries on a round robin basis. The 14th Nordic Conference on Secure IT Systems was held in Oslo on 14-16 October 2009. Under the theme Identity and Privacy in the Internet Age, this year's conference explored policies, strategies and technologies for protecting identities and the growing flow of personal information passing through the Internet and mobile networks under an increasingly serious threat picture. Among the contemporary security issues discussed were security services modeling, Petri nets, attack graphs, electronic voting schemes, anonymous payment schemes, mobile ID-protocols, SIM cards, network embedded systems, trust, wireless sensor networks, privacy, privacy disclosure regulations, financial cryptography, PIN verification, temporal access control, random number generators, and more. As a precursor to the conference proper, the Nordic Security Day on Wednesday 14 October hosted talks by leading representatives from industry, academia and the government sector, and a press conference was given.
In recent years there has been growing scientific interest in the triangular relationship between knowledge, complexity and innovation systems. The concept of 'innovation systems' carries the idea that innovations do not originate as isolated discrete phenomena, but are generated through the interaction of a number of actors or agents. This set of actors and interactions possess certain specific characteristics that tend to remain over time. Such characteristics are also shared by national, regional, sectoral and technological interaction systems. They can all be represented as sets of institutional actors and interactions, whose ultimate goal is the production and diffusion of knowledge. The major theoretical and policy problem posed by these systems is that knowledge is generated not only by individuals and organisations, but also by the often complex pattern of interaction between them. To understand how organisations create new products, new production techniques and new organisational forms is important. An even more fundamental need is to understand how organisations create new knowledge if this knowledge creation lies in the mobilisation and conversion of tacit knowledge. Although much has been written about the importance of knowledge in management, little attention has been paid to how knowledge is created and how the knowledge creation process is managed. The third component of the research triangle concerns complexity.