This book constitutes the thoroughly refereed post-proceedings of the 17th Annual International Workshop on Selected Areas in Cryptography, SAC 2010, held in Waterloo, Ontario, Canada in August 2010. The 24 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 90 submissions. The papers are organized in topical sections on hash functions, stream ciphers, efficient implementations, coding and combinatorics, block ciphers, side channel attacks, and mathematical aspects.
This book constitutes the refereed proceedings of the 5th International Conference on Information Theoretic Security, held in Amsterdam, The Netherlands, in May 2011.
Since the advent of optical communications, a great technological effort has been devoted to exploiting the huge bandwidth of optical fibers. Starting from a few Mb/s single channel systems, fast and constant technological development has led to the current 10 Gb/s per channel dense wavelength division multiplexing (DWDM) systems, with dozens of channels on a single fiber. Transmitters and receivers are now ready for 40 Gb/s, whereas hundreds of channels can be simultaneously amplified by optical amplifiers. Nevertheless, despite such a pace of technological progress, optical communications are still at a primitive stage if compared, for instance, to radio communications: the widely used on-off keying (OOK) modulation format is equivalent to the rough amplitude modulation (AM) format, whereas the DWDM technique is nothing more than the optical version of the frequency division multiplexing (FDM) technique. Moreover, adaptive equalization, channel coding and maximum likelihood detection are still considered something "exotic" in the optical world. This is mainly due to the favourable characteristics of the fiber optic channel (large bandwidth, low attenuation, channel stability, ...), which have so far allowed the use of very simple transmission and detection techniques.
Computer Security in the 21st Century shares some of the important emerging research trends reflected in recent advances in computer security, including security protocol design, secure peer-to-peer and ad hoc networks, multimedia security, and intrusion detection, defense and measurement. Highlights include presentations of:
- fundamental new security,
- cryptographic protocols and design,
- a new way of measuring network vulnerability: attack surfaces,
- network vulnerability and building impenetrable systems,
- multimedia content protection, including a new standard for photographic images, JPEG2000.
Researchers and computer security developers will find in this book interesting and useful insights into building computer systems that protect against computer worms, computer viruses, and other related concerns.
RFID Security: Techniques, Protocols and System-On-Chip Design is an edited book covering fundamentals, security theories and protocols, and hardware implementations for cryptography algorithms and security techniques in RFID. The volume is structured in three parts. Part 1 deals with RFID fundamentals, including system architectures and applications. Part 2 addresses RFID security protocols and techniques, with a comprehensive collection of recent state-of-the-art protocols and techniques for securing RFID against potential security flaws and attacks. Finally, the book discusses hardware implementation of security algorithms. This section deals with the hardware implementations of cryptography algorithms and protocols dedicated to RFID platforms and chips.
This book constitutes the refereed proceedings of the 30th Annual International Conference on the Theory and Applications of Cryptographic Techniques, EUROCRYPT 2011, held in Tallinn, Estonia, in May 2011. The 31 papers, presented together with 2 invited talks, were carefully reviewed and selected from 167 submissions. The papers are organized in topical sections on lattice-based cryptography, implementation and side channels, homomorphic cryptography, signature schemes, information-theoretic cryptography, symmetric key cryptography, attacks and algorithms, secure computation, composability, key dependent message security, and public key encryption.
Nowadays it is hard to find an electronic device which does not use codes: for example, we listen to music via heavily encoded audio CDs and we watch movies via encoded DVDs. There is at least one area where the use of encoding/decoding is not yet so developed: Flash non-volatile memories. Flash memory's high density, low power, cost effectiveness, and scalable design make it an ideal choice to fuel the explosion of multimedia products, like USB keys, MP3 players, digital cameras and solid-state disks. In ECC for Non-Volatile Memories the authors expose the basics of coding theory needed to understand the application to memories, as well as the relevant design topics, with reference to both NOR and NAND Flash architectures. A collection of software routines is also included for better understanding. The authors form a research group (now at Qimonda) which is a typical example of a fruitful collaboration between mathematicians and engineers.
Reflects recent developments in its emphasis on randomized and approximation algorithms and communication models
This book constitutes the thoroughly refereed proceedings of the 8th Theory of Cryptography Conference, TCC 2011, held in Providence, Rhode Island, USA, in March 2011. The 35 revised full papers, presented together with 2 invited talks, were carefully reviewed and selected from 108 submissions. The papers are organized in topical sections on hardness amplification, leakage resilience, tamper resilience, encryption, composable security, secure computation, privacy, coin tossing and pseudorandomness, and black-box constructions and separations.
This multi-authored textbook addresses graduate students with a background in physics, mathematics or computer science. No research experience is necessary. Consequently, rather than comprehensively reviewing the vast body of knowledge and literature gathered in the past twenty years, this book concentrates on a number of carefully selected aspects of quantum information theory and technology. Given the highly interdisciplinary nature of the subject, the multi-authored approach brings together different points of view from various renowned experts, providing a coherent picture of the subject matter. The book consists of ten chapters and includes examples, problems, and exercises. The first five present the mathematical tools required for a full comprehension of various aspects of quantum mechanics, classical information, and coding theory. Chapter 6 deals with the manipulation and transmission of information in the quantum realm. Chapters 7 and 8 discuss experimental implementations of quantum information ideas using photons and atoms. Finally, chapters 9 and 10 address ground-breaking applications in cryptography and computation.
This book constitutes the revised selected papers of the 20th International Workshop on Combinatorial Algorithms, held in June/July 2009 in the castle of Hradec nad Moravici, Czech Republic. The 41 papers included in this volume together with 5 invited papers were carefully reviewed and selected from over 100 submissions. The topics dealt with are algorithms and data structures, applications, combinatorial enumeration, combinatorial optimization, complexity theory, computational biology, databases, decompositions and combinatorial designs, discrete and computational geometry, including graph drawing, and graph theory and combinatorics.
As future generation information technology (FGIT) becomes specialized and fragmented, it is easy to lose sight of the fact that many topics in FGIT have common threads and, because of this, advances in one discipline may be transmitted to others. Presentation of recent results obtained in different disciplines encourages this interchange for the advancement of FGIT as a whole. Of particular interest are hybrid solutions that combine ideas taken from multiple disciplines in order to achieve something more significant than the sum of the individual parts. Through such a hybrid philosophy, a new principle can be discovered, which has the propensity to propagate throughout multifaceted disciplines. FGIT 2009 was the first mega-conference that attempted to follow the above idea of hybridization in FGIT in the form of multiple events related to particular disciplines of IT, conducted by separate scientific committees, but coordinated in order to expose the most important contributions. It included the following international conferences: Advanced Software Engineering and Its Applications (ASEA), Bio-Science and Bio-Technology (BSBT), Control and Automation (CA), Database Theory and Application (DTA), Disaster Recovery and Business Continuity (DRBC; published independently), Future Generation Communication and Networking (FGCN) that was combined with Advanced Communication and Networking (ACN), Grid and Distributed Computing (GDC), Multimedia, Computer Graphics and Broadcasting (MulGraB), Security Technology (SecTech), Signal Processing, Image Processing and Pattern Recognition (SIP), and u- and e-Service, Science and Technology (UNESST).
This book constitutes the thoroughly refereed post-conference proceedings of the Second International Conference on Information Theoretic Security, ICITS 2007, held in Madrid, Spain, in May 2007. The 13 revised full papers presented in this volume were carefully reviewed and selected from 26 submissions. The conference also featured one invited keynote speech and three invited talks. The topics covered are authentication, group cryptography, private and reliable message transmission, secret sharing, and applications of information theory.
This book constitutes the thoroughly refereed post-conference proceedings of the 5th International ICST Conference, SecureComm 2009, held in September 2009 in Athens, Greece. The 19 revised full papers and 7 revised short papers were carefully reviewed and selected from 76 submissions. The papers cover various topics such as wireless network security, network intrusion detection, security and privacy for the general internet, malware and misbehavior, sensor networks, key management, credentials and authentications, as well as secure multicast and emerging technologies.
This book constitutes the refereed proceedings of the 12th International Conference on Information Security Conference, ISC 2009, held in Pisa, Italy, September 7-9, 2009. The 29 revised full papers and 9 revised short papers presented were carefully reviewed and selected from 105 submissions. The papers are organized in topical sections on analysis techniques, hash functions, database security and biometrics, algebraic attacks and proxy re-encryption, distributed system security, identity management and authentication, applied cryptography, access control, MAC and nonces, and P2P and Web services.
This book constitutes the refereed proceedings of the 9th International Conference on Next Generation Teletraffic and Wired/Wireless Advanced Networking, NEW2AN 2009, held in conjunction with the Second Conference on Smart Spaces, ruSMART 2009 in St. Petersburg, Russia, in September 2009. The 32 revised full papers presented were carefully reviewed and selected from a total of 82 submissions. The NEW2AN papers are organized in topical sections on teletraffic issues; traffic measurements, modeling, and control; peer-to-peer systems; security issues; wireless networks: ad hoc and mesh; and wireless networks: capacity and mobility. The ruSMART papers start with an invited talk followed by 10 papers on smart spaces.
In 1974, the British government admitted that its WWII secret intelligence organization had read Germany's ciphers on a massive scale. The intelligence from these decrypts influenced the Atlantic, the Eastern Front and Normandy. Why did the Germans never realize the Allies had so thoroughly penetrated their communications? As German intelligence experts conducted numerous internal investigations that all certified their ciphers' security, the Allies continued to break more ciphers and plugged their own communication leaks. How were the Allies able to so thoroughly exploit Germany's secret messages? How did they keep their tremendous success a secret? What flaws in Germany's organization allowed this counterintelligence failure and how can today's organizations learn to avoid similar disasters? This book, the first comparative study of WWII SIGINT (Signals Intelligence), analyzes the characteristics that allowed the Allies SIGINT success and that fostered the German blindness to Enigma's compromise.
Biometrics is a rapidly evolving field with applications ranging from accessing one's computer to gaining entry into a country. The deployment of large-scale biometric systems in both commercial and government applications has increased public awareness of this technology. Recent years have seen significant growth in biometric research, resulting in the development of innovative sensors, new algorithms, enhanced test methodologies and novel applications. This book brings together some of the prominent researchers in biometrics to contribute chapters describing the fundamentals as well as the latest innovations in their respective areas of expertise.
Privacy, Security and Trust within the Context of Pervasive Computing is an edited volume based on a workshop held at the Second International Conference on Pervasive Computing. The workshop was held April 18-23, 2004, in Vienna, Austria. The goal of the workshop was not to focus on specific, even novel mechanisms, but rather on the interfaces between mechanisms in different technical and social problem spaces. An investigation of the interfaces between the notions of context, privacy, security, and trust will result in a deeper understanding of the "atomic" problems, leading to a more complete understanding of the social and technical issues in pervasive computing.
The NordSec workshops were started in 1996 with the aim of bringing together researchers and practitioners within computer security in the Nordic countries, thereby establishing a forum for discussions and co-operation between universities, industry and computer societies. Since then, the workshop has developed into a fully fledged international information security conference, held in the Nordic countries on a round robin basis. The 14th Nordic Conference on Secure IT Systems was held in Oslo on 14-16 October 2009. Under the theme Identity and Privacy in the Internet Age, this year's conference explored policies, strategies and technologies for protecting identities and the growing flow of personal information passing through the Internet and mobile networks under an increasingly serious threat picture. Among the contemporary security issues discussed were security services modeling, Petri nets, attack graphs, electronic voting schemes, anonymous payment schemes, mobile ID-protocols, SIM cards, network embedded systems, trust, wireless sensor networks, privacy, privacy disclosure regulations, financial cryptography, PIN verification, temporal access control, random number generators, and more. As a precursor to the conference proper, the Nordic Security Day on Wednesday 14 October hosted talks by leading representatives from industry, academia and the government sector, and a press conference was given.
In recent years there has been growing scientific interest in the triangular relationship between knowledge, complexity and innovation systems. The concept of 'innovation systems' carries the idea that innovations do not originate as isolated discrete phenomena, but are generated through the interaction of a number of actors or agents. This set of actors and interactions possesses certain specific characteristics that tend to persist over time. Such characteristics are also shared by national, regional, sectoral and technological interaction systems. They can all be represented as sets of institutional actors and interactions, whose ultimate goal is the production and diffusion of knowledge. The major theoretical and policy problem posed by these systems is that knowledge is generated not only by individuals and organisations, but also by the often complex pattern of interaction between them. To understand how organisations create new products, new production techniques and new organisational forms is important. An even more fundamental need is to understand how organisations create new knowledge if this knowledge creation lies in the mobilisation and conversion of tacit knowledge. Although much has been written about the importance of knowledge in management, little attention has been paid to how knowledge is created and how the knowledge creation process is managed. The third component of the research triangle concerns complexity.
Fast Software Encryption 2009 was the 16th in a series of workshops on symmetric key cryptography. Since 2002, it has been sponsored by the International Association for Cryptologic Research (IACR). FSE 2009 was held in Leuven, Belgium, after previous venues in Cambridge, UK (1993, 1996), Leuven, Belgium (1994, 2002), Haifa, Israel (1997), Paris, France (1998, 2005), Rome, Italy (1999), New York, USA (2000), Yokohama, Japan (2001), Lund, Sweden (2003), New Delhi, India (2004), Graz, Austria (2006), Luxembourg, Luxembourg (2007), and Lausanne, Switzerland (2008). The workshop's main topic is symmetric key cryptography, including the design of fast and secure symmetric key primitives, such as block ciphers, stream ciphers, hash functions, message authentication codes, modes of operation and iteration, as well as the theoretical foundations of these primitives. This year, 76 papers were submitted to FSE, including a large portion of papers on hash functions, following the NIST SHA-3 competition, whose workshop was held just after FSE in the same location. From the 76 papers, 24 were accepted for presentation. It is my pleasure to thank the authors of all submissions for their high-quality research, which is the basis for the scientific value of the workshop. The review process was thorough (each submission received the attention of at least three reviewers), and at the end, besides the accepted papers, the Committee decided that the merits of the paper "Blockcipher-Based Hashing Revisited" entitled the authors to receive the best paper award. I wish to thank all Committee members and the referees for their hard and dedicated work.
As e-learning increases in popularity and reach, more people are taking online courses and need to understand the relevant security issues. This book discusses typical threats to e-learning projects, introducing how they have been and should be addressed.
This book provides a good introduction to classical elementary number theory and modern algorithmic number theory, and their applications in computing and information technology, including computer systems design, cryptography and network security. In this second edition, proofs of many theorems have been provided, and further additions and corrections have been made.
As cyberspace continues to rapidly expand, its infrastructure is now an integral part of the world's economy and social structure. Given this increasing interconnectivity and interdependence, what progress has been made in developing an ecosystem of safety and security? This study is the second phase of an initial attempt to survey and catalog the multitude of emerging organizations promoting global initiatives to secure cyberspace. The authors provide a breakdown and analysis of organizations by type, including international, regional, private-public, and non-governmental organizations. Concluding with a discussion of the progress made in recent years, the study explores current trends regarding the effectiveness and scope of coverage provided by these organizations and addresses several questions concerning the overall state of international cyber security. The authors would like to thank Mr. Anthony Rutkowski for generously providing his time, guidance, and support. The authors would also like to thank the International Telecommunication Union (ITU) Telecommunication Development Sector (ITU-D) and the United States National Science Foundation (NSF Grant R3772) for partially supporting the research conducted in this study. In addition, the authors would like to thank the Georgia Institute of Technology's Center for International Strategy, Technology, and Policy (CISTP) for assistance in hosting the Cyber Security Organization Catalog, and the Georgia Tech Information Security Center (GTISC) for cooperation and promotion of this study.