Coding theory & cryptology
Information is precious. It reduces our uncertainty in making decisions. Knowledge about the outcome of an uncertain event gives the possessor an advantage. It changes the course of lives, nations, and history itself. Information is the food of Maxwell's demon. His power comes from knowing which particles are hot and which particles are cold. His existence was paradoxical to classical physics and only the realization that information too was a source of power led to his taming. Information has recently become a commodity, traded and sold like orange juice or hog bellies. Colleges give degrees in information science and information management. Technology of the computer age has provided access to information in overwhelming quantity. Information has become something worth studying in its own right. The purpose of this volume is to introduce key developments and results in the area of generalized information theory, a theory that deals with uncertainty-based information within mathematical frameworks that are broader than classical set theory and probability theory. The volume is organized as follows.
This book introduces the turbo error-correction concept in simple language, including a general theory and the algorithms for decoding turbo-like codes. It presents a unified framework for the design and analysis of turbo codes and LDPC codes and their decoding algorithms. A major focus is on high-speed turbo decoding, which targets applications with data rates of several hundred megabits per second (Mbps).
Coordinated Multiuser Communications provides for the first time a unified treatment of multiuser detection and multiuser decoding in a single volume. Many communications systems, such as cellular mobile radio and wireless local area networks, are subject to multiple-access interference, caused by a multitude of users sharing a common transmission medium. The performance of receiver systems in such cases can be greatly improved by the application of joint detection and decoding methods. Multiuser detection and decoding not only improve system reliability and capacity, they also simplify the problem of resource allocation. Coordinated Multiuser Communications provides the reader with tools for the design and analysis of joint detection and joint decoding methods. These methods are developed within a unified framework of linear multiple-access channels, which includes code-division multiple access, multiple antenna channels and orthogonal frequency division multiple access. Emphasis is placed on practical implementation aspects and modern iterative processing techniques for systems both with and without integrated error control coding. Focusing on the theory and practice of unifying the access and transmission aspects of communications, this book is a valuable reference for students, researchers and practicing engineers.
This edition has been called startlingly up-to-date, and in this corrected second printing you can be sure that it's even more contemporaneous. It surveys from a unified point of view both the modern state and the trends of continuing development in various branches of number theory. Illuminated by elementary problems, the central ideas of modern theories are laid bare. Some topics covered include non-Abelian generalizations of class field theory, recursive computability and Diophantine equations, zeta- and L-functions. This substantially revised and expanded new edition contains several new sections, such as Wiles' proof of Fermat's Last Theorem, and relevant techniques coming from a synthesis of various theories.
Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes. It then examines codes based on Galois field theory as well as their application in BCH codes and especially the Reed-Solomon codes that have been used for error correction of data transmissions in space missions. The major outlook in coding theory seems to be geared toward stochastic processes, and this book takes a bold step in this direction. As research focuses on error correction and recovery of erasures, the book discusses belief propagation and distributions. It examines the low-density parity-check and erasure codes that have opened up new approaches to improve wide-area network data transmission. It also describes modern codes, such as the Luby transform and Raptor codes, that are enabling new directions in high-speed transmission of very large data to multiple users. This robust, self-contained text fully explains coding problems, illustrating them with more than 200 examples. Combining theory and computational techniques, it will appeal not only to students but also to industry professionals, researchers, and academics in areas such as coding theory and signal and image processing.
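As a minimal illustration of the linear block codes this text begins with (a sketch for orientation only, not material from the book), the following Python fragment encodes and syndrome-decodes the Hamming(7,4) code; the particular systematic generator and parity-check matrices are one common choice, assumed here for the example.

```python
import numpy as np

# One common systematic choice of generator (G) and parity-check (H) matrices
# for the Hamming(7,4) code; all arithmetic is over GF(2).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    """Encode 4 message bits into a 7-bit codeword."""
    return np.mod(np.array(msg) @ G, 2)

def decode(received):
    """Correct at most one bit error via the syndrome, then return the message bits."""
    r = np.array(received)
    syndrome = np.mod(H @ r, 2)
    if syndrome.any():
        # The syndrome equals the column of H at the error position.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                r[i] ^= 1
                break
    return r[:4]  # systematic code: the first 4 bits are the message

codeword = encode([1, 0, 1, 1])
codeword[2] ^= 1                 # flip one bit to simulate a channel error
print(decode(codeword))          # -> [1 0 1 1]
```

The same generator/parity-check machinery carries over to the larger codes the book treats (Golay, BCH, Reed-Solomon), with GF(2) replaced by larger Galois fields.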
The ubiquitous nature of the Internet is enabling a new generation of applications to support collaborative work among geographically distant users. Security in such an environment is of utmost importance to safeguard the privacy of the communication and to ensure the integrity of the applications. 'Secure group communications' (SGC) refers to a scenario in which a group of participants can receive and send messages to group members, in a way that outsiders are unable to glean any information even when they are able to intercept the messages. SGC is becoming extremely important for researchers and practitioners because many applications that require SGC are now widely used, such as teleconferencing, tele-medicine, real-time information services, distributed interactive simulations, collaborative work, grid computing, and the deployment of VPNs (Virtual Private Networks). Even though considerable research accomplishments have been achieved in SGC, few books exist on this very important topic. The purpose of this book is to provide a comprehensive survey of principles and state-of-the-art techniques for secure group communications over data networks. The book is targeted towards practitioners, researchers and students in the fields of networking, security, and software applications development. The book consists of 7 chapters, which are listed and described as follows.
This reference work looks at modern concepts of computer security. It introduces the basic mathematical background necessary to follow computer security concepts before moving on to modern developments in cryptography. The concepts are presented clearly and illustrated by numerous examples. Subjects covered include: private-key and public-key encryption, hashing, digital signatures, authentication, secret sharing, group-oriented cryptography, and many others. The sections on intrusion detection and access control provide examples of security systems implemented as part of an operating system. Database and network security is also discussed. The final chapters introduce modern e-business systems based on digital cash.
This book is about language that is designed to mean what it does not seem to mean. Ciphers and codes conceal messages and protect secrets. Symbol and magic hide meanings to delight or imperil. Languages made to baffle and confuse let insiders talk openly without being understood by those beyond the circle. Barry Blake looks at these and many more. He explores the history and uses of the slangs and argots of schools and trades. He traces the centuries-old cants used by sailors and criminals in Britain, among them Polari, the mix of Italian, Yiddish, and slang once spoken among strolling players and circus folk and taken up by gays in the twentieth century. He examines the sacred languages of ancient cults and religions, uncovers the workings of onomancy, spells, and gematria, looks into the obliqueness of allusion and parody, and celebrates the absurdities of euphemism and jargon. Secret Language takes the reader on fascinating excursions down obscure byways of language, ranging across time and culture. With revelations on every page it will entertain anyone with an urge to know more about the most arcane and curious uses of language.
The Third International Conference on Network Security and Applications (CNSA-2010) focused on all technical and practical aspects of security and its applications for wired and wireless networks. The goal of this conference was to bring together researchers and practitioners from academia and industry to focus on understanding modern security threats and countermeasures, and establishing new collaborations in these areas. Authors were invited to contribute to the conference by submitting articles that illustrate research results, projects, survey work and industrial experiences describing significant advances in the areas of security and its applications, including: * Network and Wireless Network Security * Mobile, Ad Hoc and Sensor Network Security * Peer-to-Peer Network Security * Database and System Security * Intrusion Detection and Prevention * Internet Security, and Applications Security and Network Management * E-mail Security, Spam, Phishing, E-mail Fraud * Virus, Worm, and Trojan Protection * Security Threats and Countermeasures (DDoS, MiM, Session Hijacking, Replay attack, etc.) * Ubiquitous Computing Security * Web 2.0 Security * Cryptographic Protocols * Performance Evaluations of Protocols and Security Applications. There were 182 submissions to the conference and the Program Committee selected 63 papers for publication. The book is organized as a collection of papers from the First International Workshop on Trust Management in P2P Systems (IWTMP2PS 2010), the First International Workshop on Database Management Systems (DMS-2010), and the First International Workshop on Mobile, Wireless and Networks Security (MWNS-2010).
Modern cryptology increasingly employs mathematically rigorous concepts and methods from complexity theory. Conversely, current research topics in complexity theory are often motivated by questions and problems from cryptology. This book takes account of this situation, and therefore its subject is what may be dubbed "cryptocomplexity", a kind of symbiosis of these two areas. This book is written for undergraduate and graduate students of computer science, mathematics, and engineering, and can be used for courses on complexity theory and cryptology, preferably by stressing their interrelation. Moreover, it may serve as a valuable source for researchers, teachers, and practitioners working in these fields. Starting from scratch, it works its way to the frontiers of current research in these fields and provides a detailed overview of their history and their current research topics and challenges.
This book constitutes the refereed proceedings of the 30th Annual International Conference on the Theory and Applications of Cryptographic Techniques, EUROCRYPT 2011, held in Tallinn, Estonia, in May 2011. The 31 papers, presented together with 2 invited talks, were carefully reviewed and selected from 167 submissions. The papers are organized in topical sections on lattice-based cryptography, implementation and side channels, homomorphic cryptography, signature schemes, information-theoretic cryptography, symmetric key cryptography, attacks and algorithms, secure computation, composability, key dependent message security, and public key encryption.
This book constitutes the refereed proceedings of the International Symposium on Information and Automation, ISIA 2010, held in Guangzhou, China, in November 2010. The 110 revised full papers presented were carefully reviewed and selected from numerous submissions. The symposium provides a forum for researchers, educators, engineers, and government officials to present and discuss their latest research results and exchange views on the future research directions in the general areas of Information and Automation.
Over the last decade, we have witnessed a growing dependency on information technology, resulting in a wide range of new opportunities. Clearly, it has become almost impossible to imagine life without a personal computer or laptop, or without a cell phone. Social network sites (SNS) are competing with face-to-face encounters and may even oust them. Most SNS adepts have hundreds of "friends," happily sharing pictures and profiles and endless chitchat. We are on the threshold of the Internet of Things, where every object will have its RFID tag. This will not only affect companies, who will be able to optimize their production and delivery processes, but also end users, who will be able to enjoy many new applications, ranging from smart shopping and smart fridges to geo-localized services. In the near future, elderly people will be able to stay longer at home due to clever health monitoring systems. The sky seems to be the limit. However, we have also seen the other side of the coin: viruses, Trojan horses, breaches of privacy, identity theft, and other security threats. Our real and virtual worlds are becoming increasingly vulnerable to attack. In order to encourage security research by both academia and industry and to stimulate the dissemination of results, conferences need to be organized. With the 11th edition of the joint IFIP TC-6 TC-11 Conference on Communications and Multimedia Security (CMS 2010), the organizers resumed the tradition of previous CMS conferences after a three-year recess.
An Introduction to Mathematical Cryptography provides an introduction to public key cryptography and the underlying mathematics required for the subject. Each of the eight chapters expands on a specific area of mathematical cryptography and provides an extensive list of exercises. It is a suitable text for advanced students in pure and applied mathematics and computer science, and it may also be used for self-study. The book provides a self-contained treatment of mathematical cryptography for the reader with limited mathematical background.
These proceedings contain the papers selected for presentation at CARDIS 2010, the 9th IFIP Conference on Smart Card Research and Advanced Application, hosted by the Institute of IT-Security and Security Law (ISL) of the University of Passau, Germany. CARDIS is organized by IFIP Working Groups WG 8.8 and WG 11.2. Since 1994, CARDIS has been the foremost international conference dedicated to smart card research and applications. Every second year leading researchers and practitioners meet to present new ideas and discuss recent developments in smart card technologies. The fast evolution in the field of information security requires adequate means for representing the user in human-machine interactions. Smart cards, and by extension smart devices with their processing power and their direct association with the user, are considered the first choice for this purpose. A wide range of areas including hardware design, operating systems, systems modelling, cryptography, and distributed systems contribute to this fast-growing technology. The submissions to CARDIS were reviewed by at least three members of the Program Committee, followed by a two-week discussion phase held electronically, where committee members could comment on all papers and all reviews. Finally, 16 papers were selected for presentation at CARDIS. There are many volunteers who offered their time and energy to put together the symposium and who deserve our acknowledgment. We want to thank all the members of the Program Committee and the external reviewers for their hard work in evaluating and discussing the submissions. We are also very grateful to Joachim Posegga, the General Chair of CARDIS 2010, and his team for the local conference management. Last, but certainly not least, our thanks go to all the authors who submitted papers and all the attendees. We hope you find the proceedings stimulating.
During the past two decades, many communication techniques have been developed to achieve various goals such as higher data rates, more robust link quality, and more user capacity in more rigorous channel conditions. The most well known are, for instance, CDMA, OFDM, MIMO, multiuser OFDM, and UWB systems. All these systems have their own unique superiority, while they also induce other drawbacks that limit the system performance. The conventional way to overcome these drawbacks is to impose most of the computational effort on the receiver side and keep the transmitter design much simpler than the receiver. The fact is, however, that by shifting reasonable computational effort to the transmitter, the receiver design can be greatly simplified. For instance, multiaccess interference (MAI) has long been considered to limit the performance of multiuser systems. Popular solutions to mitigate the MAI issue include multiuser detection (MUD) or sophisticated signal processing for interference cancellation such as PIC or SIC. However, those solutions impose a great burden on the receiver. In this case, precoding offers good solutions to achieve simple transceiver designs, as we will mention later in this book. This book is intended to provide a comprehensive review of precoding techniques for digital communications systems from a signal processing perspective. The variety of selected precoding techniques and their applications makes this book quite different from other texts about precoding techniques in digital communication engineering.
Security has been a human concern since the dawn of time. With the rise of the digital society, information security has rapidly grown to an area of serious study and ongoing research. While much research has focused on the technical aspects of computer security, far less attention has been given to the management issues of information risk and the economic concerns facing firms and nations. Managing Information Risk and the Economics of Security provides leading edge thinking on the security issues facing managers, policy makers, and individuals. Many of the chapters of this volume were presented and debated at the 2008 Workshop on the Economics of Information Security (WEIS), hosted by the Tuck School of Business at Dartmouth College. Sponsored by Tuck's Center for Digital Strategies and the Institute for Information Infrastructure Protection (I3P), the conference brought together over one hundred information security experts, researchers, academics, reporters, corporate executives, government officials, cyber crime investigators and prosecutors. The group represented the global nature of information security with participants from China, Italy, Germany, Canada, Australia, Denmark, Japan, Sweden, Switzerland, the United Kingdom and the US. This volume would not be possible without the dedicated work of Xia Zhao (of Dartmouth College and now the University of North Carolina, Greensboro), who acted as the technical editor.
A Classical Introduction to Cryptography: Applications for Communications Security introduces fundamentals of information and communication security by providing appropriate mathematical concepts to prove or break the security of cryptographic schemes. This advanced-level textbook covers conventional cryptographic primitives and cryptanalysis of these primitives; basic algebra and number theory for cryptologists; public key cryptography and cryptanalysis of these schemes; and other cryptographic protocols, e.g. secret sharing, zero-knowledge proofs and undeniable signature schemes. A Classical Introduction to Cryptography: Applications for Communications Security is designed for upper-level undergraduate and graduate-level students in computer science. This book is also suitable for researchers and practitioners in industry. A separate exercise/solution booklet is available as well; please go to www.springeronline.com under author: Vaudenay for additional details on how to purchase this booklet.
From the reviews: "This is a textbook in cryptography with emphasis on algebraic methods. It is supported by many exercises (with answers), making it appropriate for a course in mathematics or computer science. [...] Overall, this is an excellent expository text, and will be very useful to both the student and researcher." Mathematical Reviews
Turbo Code Applications: a journey from a paper to realization presents contemporary applications of turbo codes in thirteen technical chapters. Each chapter focuses on a particular communication technology utilizing turbo codes, and they are written by experts who have been working in related areas from around the world. This book is published to celebrate the 10th anniversary of the invention of turbo codes by Claude Berrou, Alain Glavieux, and Punya Thitimajshima (1993-2003). As has been known for more than a decade, the turbo code is an astonishing error control coding scheme whose performance approaches the Shannon limit. It has consequently been honored as one of the seventeen great innovations during the first fifty years of the foundation of information theory. With their amazing performance compared to that of other existing codes, turbo codes have been adopted into many communication systems and incorporated into various modern industrial standards. Numerous research works have been reported from universities and advanced companies worldwide. Evidently, they have successfully revolutionized digital communications. Turbo codes and their successors have been applied in most communications, starting from ground or terrestrial systems for data storage, ADSL modems, and fiber optic communications. Subsequently, they moved up to air-channel applications through employment in wireless communication systems, and then flew up to space through use in digital video broadcasting and satellite communications. Undoubtedly, with their excellent error correction potential, they have been selected to support data transmission in space exploration systems as well.
The past decade has seen tremendous growth in the demand for biometrics and data security technologies in applications ranging from law enforcement and immigration control to online security. The benefits of biometrics technologies are apparent as they become important technologies for the information security of governments, business enterprises, and individuals. At the same time, however, the use of biometrics has raised concerns as to issues of ethics, privacy, and the policy implications of its widespread use. The large-scale deployment of biometrics technologies in e-governance, e-security, and e-commerce has required that we launch an international dialogue on these issues, a dialogue that must involve key stakeholders and that must consider the legal, political, philosophical and cultural aspects of the deployment of biometrics technologies. The Third International Conference on Ethics and Policy of Biometrics and International Data Sharing was highly successful in facilitating such interaction among researchers, policymakers, consumers, and privacy groups. This conference was supported and funded as part of the RISE project in its ongoing effort to develop wide consensus and policy recommendations on ethical, medical, legal, social, cultural, and political concerns in the usage of biometrics and data security technologies. The potential concerns over the deployment of biometrics systems can be jointly addressed by developing smart biometrics technologies and by developing policies for the deployment of biometrics technologies that clearly demarcate conflicts of interest between stakeholders.
Details how intrusion detection works in network security, with comparisons to traditional methods such as firewalls and cryptography. Analyzes the challenges in interpreting and correlating intrusion detection alerts.
This volume constitutes the refereed proceedings of the 4th International Conference on Information Systems, Technology and Management, ICISTM 2010, held in Bangkok, Thailand, in March 2010. The 28 revised full papers presented together with 3 keynote lectures, 9 short papers, and 2 tutorial papers were carefully reviewed and selected from 86 submissions. The papers are organized in topical sections on information systems, information technology, information management, and applications.
How to draw plausible conclusions from uncertain and conflicting sources of evidence is one of the major intellectual challenges of Artificial Intelligence. It is a prerequisite of the smart technology needed to help humans cope with the information explosion of the modern world. In addition, computational modelling of uncertain reasoning is a key to understanding human rationality. Previous computational accounts of uncertain reasoning have fallen into two camps: purely symbolic and numeric. This book represents a major advance by presenting a unifying framework which unites these opposing camps. The Incidence Calculus can be viewed as both a symbolic and a numeric mechanism. Numeric values are assigned indirectly to evidence via the possible worlds in which that evidence is true. This facilitates purely symbolic reasoning using the possible worlds and numeric reasoning via the probabilities of those possible worlds. Moreover, the indirect assignment solves some difficult technical problems, like the combination of dependent sources of evidence, which had defeated earlier mechanisms. Weiru Liu generalises the Incidence Calculus and then compares it to a succession of earlier computational mechanisms for uncertain reasoning: Dempster-Shafer Theory, Assumption-Based Truth Maintenance, Probabilistic Logic, Rough Sets, etc. She shows how each of them is represented and interpreted in Incidence Calculus. The consequence is a unified mechanism which includes both symbolic and numeric mechanisms as special cases. It provides a bridge between symbolic and numeric approaches, retaining the advantages of both and overcoming some of their disadvantages.
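A tiny sketch of the indirect assignment described above (illustrative only; the proposition names and probabilities are invented for this example, not taken from the book): each proposition is tied to its incidence, the set of possible worlds in which it holds; symbolic reasoning is set manipulation on incidences, while numeric degrees of belief come from the probabilities of those worlds, so dependent sources are handled automatically through shared worlds.

```python
# A finite set of possible worlds, each with a probability (made-up numbers).
worlds = {"w1": 0.2, "w2": 0.3, "w3": 0.4, "w4": 0.1}

# Symbolic side: the incidence of a proposition is the set of worlds where it holds.
incidence = {
    "rain":      {"w1", "w2"},
    "sprinkler": {"w2", "w3"},
}

def prob(incidence_set):
    """Numeric side: a proposition's probability is the total weight of its incidence."""
    return sum(worlds[w] for w in incidence_set)

# Conjunction is set intersection on incidences; the dependence between the two
# sources is captured by the worlds they share, with no extra combination rule.
both = incidence["rain"] & incidence["sprinkler"]
print(prob(incidence["rain"]))   # 0.5
print(prob(both))                # 0.3
```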
This open access book covers the most cutting-edge and hot research topics and fields of post-quantum cryptography. The main purpose of this book is to focus on the computational complexity theory of lattice ciphers, especially the reduction principle of Ajtai, in order to fill the gap that post-quantum ciphers focus on the implementation of encryption and decryption algorithms while the theoretical proof is insufficient. In Chapters 3, 4 and 6, the author introduces the theory and technology of the LWE distribution, the LWE cipher and homomorphic encryption in detail. When using random analysis tools, there is a problem of "ambiguity" in both definition and algorithm. The greatest feature of this book is to use probability distributions to carry out rigorous mathematical definition and demonstration for various unclear or imprecise expressions, so as to make it a rigorous theoretical system for classroom teaching and dissemination. Chapters 5 and 7 further expand and improve the theory of cyclic lattices, ideal lattices and generalized NTRU cryptography. This book serves as a professional book for graduate students majoring in mathematics and cryptography, as well as a reference book for scientific and technological personnel engaged in cryptography research.
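As a rough sketch of the LWE-based encryption the book analyzes (a toy with hand-picked parameters far too small for any real security, and a simplified Regev-style scheme rather than the book's own constructions), the secret is hidden behind noisy inner products modulo q, and a single bit is encoded in the high half of the modulus:

```python
import random

# Toy LWE parameters (illustrative only; real schemes use much larger values).
n, q, m = 8, 97, 20                       # secret dimension, modulus, sample count
noise = lambda: random.randint(-1, 1)     # tiny error distribution

# Key generation: secret s and public samples (A, b = A*s + e mod q).
s = [random.randrange(q) for _ in range(n)]
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
b = [(sum(A[i][j] * s[j] for j in range(n)) + noise()) % q for i in range(m)]

def encrypt(bit):
    """Sum a random subset of the public samples and add bit * q/2 to the second part."""
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(ct):
    """Recover the bit by checking whether v - <u, s> mod q is nearer to 0 or to q/2."""
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

print([decrypt(encrypt(bit)) for bit in (0, 1, 1, 0)])   # -> [0, 1, 1, 0]
```

Decryption succeeds here because the accumulated error (at most m with this noise distribution) stays below q/4; the book's concern is proving hardness and setting such parameters rigorously rather than this toy arithmetic.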