This book constitutes the refereed proceedings of the 11th IFIP WG 5.5/SOCOLNET Advanced Doctoral Conference on Computing, Electrical and Industrial Systems, DoCEIS 2020, held in Costa de Caparica, Portugal, in July 2020. The 20 full papers and 24 short papers presented were carefully reviewed and selected from 91 submissions. The papers present selected results produced in engineering doctoral programs and focus on technological innovation for industry and service systems. Research results and ongoing work are presented, illustrated and discussed in the following areas: collaborative networks; decision systems; analysis and synthesis algorithms; communication systems; optimization systems; digital twins and smart manufacturing; power systems; energy control; power transportation; biomedical analysis and diagnosis; and instrumentation in health.
In this book the authors first describe the background of trusted platforms and trusted computing and speculate about the future. They then describe the technical features and architectures of trusted platforms from several different perspectives, finally explaining second-generation TPMs, including a technical description intended to supplement the Trusted Computing Group's TPM2 specifications. The intended audience is IT managers, engineers, and graduate students in information security.
Using an original mode of presentation, and emphasizing the computational nature of the subject, this book explores a number of the unsolved problems that still exist in coding theory. A well-established and highly relevant branch of mathematics, the theory of error-correcting codes is concerned with reliably transmitting data over a noisy channel. Despite frequent use in a range of contexts, the subject still contains interesting unsolved problems that have resisted solution by some of the most prominent mathematicians of recent decades. Employing Sage, a free open-source mathematics software system, to illustrate ideas, this book is intended for graduate students and researchers in algebraic coding theory. The work may be used as supplementary reading material in a graduate course on coding theory or for self-study.
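The book works in Sage, but the flavor of its subject can be sketched in plain Python. The following toy Hamming(7,4) encoder/decoder (an illustration of the general idea, not an example taken from the book) corrects any single-bit error introduced by a noisy channel:

```python
# Illustrative sketch: the Hamming(7,4) error-correcting code.
# Parity bits sit at codeword positions 1, 2 and 4 (1-indexed).

def encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4            # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4            # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4            # parity over positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Correct at most one flipped bit and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # syndrome bits encode the
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # position of the error
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3       # 0 means no error detected
    if pos:
        c[pos - 1] ^= 1              # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
received = encode(word)
received[2] ^= 1                     # the channel flips one bit
assert decode(received) == word      # ...and decoding recovers the data
```

In Sage itself, the same code is available through its coding theory module (e.g. via `codes.HammingCode`), together with decoders and weight-distribution tools.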
Privacy requirements have an increasing impact on the realization of modern applications. Commercial and legal regulations demand that privacy guarantees be provided whenever sensitive information is stored, processed, or communicated to external parties. Current approaches encrypt sensitive data, thus reducing query execution efficiency and preventing selective information release. Preserving Privacy in Data Outsourcing presents a comprehensive approach for protecting highly sensitive information when it is stored on systems that are not under the data owner's control. The approach illustrated combines access control and encryption, enforcing access control via structured encryption. This solution, coupled with efficient algorithms for key derivation and distribution, provides efficient and secure authorization management on outsourced data, allowing the data owner to outsource not only the data but the security policy itself. To reduce the amount of data to be encrypted, the book also investigates data fragmentation as a possible way to protect the privacy of data associations, with fragmentation serving as a complementary means of protection: associations broken by fragmentation will be visible only to users authorized (by knowing the proper key) to join fragments. The book finally investigates the problem of executing queries over data possibly distributed at different servers, where execution must be controlled to ensure that sensitive information and sensitive associations are visible only to authorized parties. Case studies are provided throughout the book. Professionals working in privacy, data mining, data protection, data outsourcing, electronic commerce, machine learning, and related fields will find this book a valuable asset, as will associations such as the ACM, IEEE, and Management Science. This book is also suitable as a secondary text or reference book for advanced-level students and researchers concentrating on computer science.
This book presents an overview of the state of the art in video coding technology. Specifically, it introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AVS2; describes the key technologies used in the AVS2 standard, including prediction coding, transform coding, entropy coding, and loop-filters; examines efficient tools for scene video coding and surveillance video, and the details of a promising intelligent video coding system; discusses optimization technologies in video coding systems; provides a review of image, video, and 3D content quality assessment algorithms; surveys the hot research topics in video compression.
Since their invention in the late seventies, public key cryptosystems have become an indispensable asset in establishing private and secure electronic communication, and this need, given the tremendous growth of the Internet, is likely to continue growing. Elliptic curve cryptosystems represent the state of the art for such systems. Elliptic Curves and Their Applications to Cryptography: An Introduction provides a comprehensive and self-contained introduction to elliptic curves and how they are employed to secure public key cryptosystems. Even though the elegant mathematical theory underlying cryptosystems is considerably more involved than for other systems, this text requires the reader to have only an elementary knowledge of basic algebra. The text nevertheless leads to problems at the forefront of current research, featuring chapters on point counting algorithms and security issues. The adopted unifying approach treats with equal care elliptic curves over fields of even characteristic, which are especially suited for hardware implementations, and curves over fields of odd characteristic, which have traditionally received more attention. Elliptic Curves and Their Applications: An Introduction has been used successfully for teaching advanced undergraduate courses. It will be of greatest interest to mathematicians, computer scientists, and engineers who are curious about elliptic curve cryptography in practice, without losing the beauty of the underlying mathematics.
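As a rough illustration of the arithmetic such a text develops (a toy sketch with made-up parameters, not an example from the book), here is affine point addition and scalar multiplication on a small curve over a field of odd characteristic, used Diffie-Hellman style:

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97); parameters chosen only for
# illustration (the subgroup here is tiny; real curves use ~256-bit primes).

P, A = 97, 2   # prime modulus and curve coefficient a

def add(p1, p2):
    """Add two affine points; None represents the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                      # inverse points
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def mul(k, pt):
    """Double-and-add scalar multiplication k*pt."""
    out = None
    while k:
        if k & 1:
            out = add(out, pt)
        pt = add(pt, pt)
        k >>= 1
    return out

G = (3, 6)                 # a point on the curve: 6^2 = 3^3 + 2*3 + 3 mod 97
a_key, b_key = 13, 29      # toy private keys
# Both parties derive the same shared point (the Diffie-Hellman idea):
assert mul(a_key, mul(b_key, G)) == mul(b_key, mul(a_key, G))
```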
Since publication of the initial papers in 2006, compressed sensing has captured the imagination of the international signal processing community, and the mathematical foundations are nowadays quite well understood. Parallel to the progress in mathematics, the potential applications of compressed sensing have been explored by many international groups of, in particular, engineers and applied mathematicians, achieving very promising advances in various areas such as communication theory, imaging sciences, optics, radar technology, sensor networks, or tomography. Since many applications have reached a mature state, the research center MATHEON in Berlin, which focuses on "Mathematics for Key Technologies", invited leading researchers on applications of compressed sensing from mathematics, computer science, and engineering to the "MATHEON Workshop 2013: Compressed Sensing and its Applications" in December 2013. It was the first workshop specifically focusing on the applications of compressed sensing. This book features contributions by the plenary and invited speakers of this workshop. To make it accessible to those not already familiar with compressed sensing, the book not only contains chapters on various applications of compressed sensing written by these speakers, but also provides a general introduction to the basics of the theory. The book is aimed at both graduate students and researchers in the areas of applied mathematics, computer science, and engineering, as well as other applied scientists interested in the potential and applications of the novel methodology of compressed sensing.
The book introduces new techniques which imply rigorous lower bounds on the complexity of some number theoretic and cryptographic problems. These methods and techniques are based on bounds of character sums and numbers of solutions of some polynomial equations over finite fields and residue rings. It also contains a number of open problems and proposals for further research. We obtain several lower bounds, exponential in terms of log p, on the degrees and orders of polynomials, algebraic functions, Boolean functions, and linear recurring sequences coinciding with values of the discrete logarithm modulo a prime p at sufficiently many points (the number of points can be as small as p^(1/2+e)). These functions are considered over the residue ring modulo p and over the residue ring modulo an arbitrary divisor d of p - 1. The case of d = 2 is of special interest since it corresponds to the representation of the rightmost bit of the discrete logarithm and defines whether the argument is a quadratic residue. We also obtain non-trivial upper bounds on the degree, sensitivity and Fourier coefficients of Boolean functions on bits of x deciding whether x is a quadratic residue. These results are used to obtain lower bounds on the parallel arithmetic and Boolean complexity of computing the discrete logarithm. For example, we prove that any unbounded fan-in Boolean circuit of sublogarithmic depth computing the discrete logarithm modulo p must be of superpolynomial size.
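The d = 2 case described above can be made concrete with a small sketch (an illustration of the notion, not the book's proofs): by Euler's criterion, x = g^n is a quadratic residue modulo p exactly when the rightmost bit of the discrete logarithm n is 0.

```python
# Euler's criterion: for an odd prime p, x is a quadratic residue mod p
# iff x^((p-1)/2) == 1 mod p.

def is_qr(x, p):
    """Decide quadratic residuosity of x modulo an odd prime p."""
    return pow(x, (p - 1) // 2, p) == 1

p, g = 23, 5          # 5 generates the multiplicative group modulo 23
for n in range(1, p - 1):
    x = pow(g, n, p)
    # g^n is a residue exactly when the exponent n is even, i.e. the
    # rightmost bit of the discrete logarithm of x is 0.
    assert is_qr(x, p) == (n % 2 == 0)
```

So residuosity is easy to decide, while recovering the remaining bits of the discrete logarithm is believed hard; the book's lower bounds concern functions that would compute those bits.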
These proceedings contain the papers of IFIP/SEC 2010. It was a special honour and privilege to chair the Program Committee and prepare the proceedings for this conference, which is the 25th in a series of well-established international conferences on security and privacy organized annually by Technical Committee 11 (TC-11) of IFIP. Moreover, in 2010 it is part of the IFIP World Computer Congress 2010 celebrating both the Golden Jubilee of IFIP (founded in 1960) and the Silver Jubilee of the SEC conference in the exciting city of Brisbane, Australia, during September 20-23. The call for papers went out with the challenging motto of "Security & Privacy Silver Linings in the Cloud", building a bridge between the long-standing issues of security and privacy and the most recent developments in information and communication technology. It attracted 102 submissions. All of them were evaluated on the basis of their significance, novelty, and technical quality by at least five members of the Program Committee. The Program Committee meeting was held electronically over a period of a week. Of the papers submitted, 25 were selected for presentation at the conference; the acceptance rate was therefore as low as 24.5%, making SEC 2010 a highly competitive forum. One of those 25 submissions could unfortunately not be included in the proceedings, as none of its authors registered in time to present the paper at the conference.
Covering topics in algebraic geometry, coding theory, and cryptography, this volume presents interdisciplinary group research completed for the February 2016 conference at the Institute for Pure and Applied Mathematics (IPAM) in cooperation with the Association for Women in Mathematics (AWM). The conference gathered research communities across disciplines to share ideas and problems in their fields and formed small research groups made up of graduate students, postdoctoral researchers, junior faculty, and group leaders who designed and led the projects. Peer reviewed and revised, each of this volume's five papers achieves the conference's goal of using algebraic geometry to address a problem in either coding theory or cryptography. Proposed variants of the McEliece cryptosystem based on different constructions of codes, constructions of locally recoverable codes from algebraic curves and surfaces, and algebraic approaches to the multicast network coding problem are only some of the topics covered in this volume. Researchers and graduate-level students interested in the interactions between algebraic geometry and both coding theory and cryptography will find this volume valuable.
The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.
This book contains extended and revised versions of the best papers that were presented during the fifteenth edition of the IFIP/IEEE WG10.5 International Conference on Very Large Scale Integration, a global System-on-a-Chip Design & CAD conference. The 15th conference was held at the Georgia Institute of Technology, Atlanta, USA (October 15-17, 2007). Previous conferences have taken place in Edinburgh, Trondheim, Vancouver, Munich, Grenoble, Tokyo, Gramado, Lisbon, Montpellier, Darmstadt, Perth and Nice. The purpose of this conference, sponsored by IFIP TC 10 Working Group 10.5 and by the IEEE Council on Electronic Design Automation (CEDA), is to provide a forum to exchange ideas and show industrial and academic research results in the field of microelectronics design. The current trend toward increasing chip integration and technology process advancements brings about stimulating new challenges both at the physical and system-design levels, as well as in the testing of these systems. VLSI-SoC conferences aim to address these exciting new issues.
The first cultural history of early modern cryptography, this collection brings together scholars in history, literature, music, the arts, mathematics, and computer science who study ciphering and deciphering from new materialist, media studies, cognitive studies, disability studies, and other theoretical perspectives. Essays analyze the material forms of ciphering as windows into the cultures of orality, manuscript, print, and publishing, revealing that early modern ciphering, and the complex history that preceded it in the medieval period, not only influenced political and military history but also played a central role in the emergence of the capitalist media state in the West, in religious reformation, and in the scientific revolution. Ciphered communication, whether in etched stone and bone, in musical notae, runic symbols, polyalphabetic substitution, algebraic equations, graphic typographies, or literary metaphors, took place in contested social spaces and offered a means of expression during times of political, economic, and personal upheaval. Ciphering shaped the early history of linguistics as a discipline, and it bridged theological and scientific rhetoric before and during the Reformation. Ciphering was an occult art, a mathematic language, and an aesthetic that influenced music, sculpture, painting, drama, poetry, and the early novel. This collection addresses gaps in cryptographic history, but more significantly, through cultural analyses of the rhetorical situations of ciphering and actual solved and unsolved medieval and early modern ciphers, it traces the influences of cryptographic writing and reading on literacy broadly defined as well as the cultures that generate, resist, and require that literacy. This volume offers a significant contribution to the history of the book, highlighting the broader cultural significance of textual materialities.
This volume contains the proceedings of IFIPTM 2009, the Third IFIP WG 11.11 International Conference on Trust Management, held at Purdue University in West Lafayette, Indiana, USA during June 15-19, 2009. IFIPTM 2009 provided a truly global platform for the reporting of research, development, policy and practice in the interdependent areas of privacy, security, and trust. Building on the traditions inherited from the highly successful iTrust conference series, the IFIPTM 2007 conference in Moncton, New Brunswick, Canada, and the IFIPTM 2008 conference in Trondheim, Norway, IFIPTM 2009 focused on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion about relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2009 was an open IFIP conference. The program of the conference featured both theoretical research papers and reports of real-world case studies. IFIPTM 2009 received 44 submissions. The Program Committee selected 17 papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include one invited paper and five demo descriptions. The highlights of IFIPTM 2009 included invited talks and tutorials by academic and governmental experts in the fields of trust management, privacy and security, including Eugene Spafford, Marianne Winslett, and Michael Novak. Running an international conference requires an immense effort from all parties involved. We would like to thank the Program Committee members and external referees for having provided timely and in-depth reviews of the submitted papers. We would also like to thank the Workshop, Tutorial, Demonstration, Local Arrangements, and Website Chairs for having provided great help organizing the conference.
The need for information privacy and security continues to grow and is increasingly recognized. In this regard, Privacy-preserving Attribute-based Credentials (Privacy-ABCs) are elegant techniques to provide secure yet privacy-respecting access control. This book addresses the federation and interchangeability of Privacy-ABC technologies. It defines a common, unified architecture for Privacy-ABC systems that allows their respective features to be compared and combined. Further, this book presents open reference implementations of selected Privacy-ABC systems and explains how to deploy them in actual production pilots, allowing provably accredited members of restricted communities to provide anonymous feedback on their community or its members. To date, credentials such as digitally signed pieces of personal information or other information used to authenticate or identify a user have not been designed to respect the users' privacy. They inevitably reveal the identity of the holder even though the application at hand often needs much less information, e.g. only the confirmation that the holder is a teenager or is eligible for social benefits. In contrast, Privacy-ABCs allow their holders to reveal only the minimal information required by the applications, without giving away their full identity. Privacy-ABCs thus facilitate the implementation of a trustworthy and at the same time privacy-respecting digital society. "The ABC4Trust project, as a multidisciplinary and European project, gives a technological response to questions linked to data protection." Viviane Reding (Former Vice-President of the European Commission, Member of the European Parliament)
The requirements for multimedia (especially video and audio) communications have increased rapidly over the last two decades in broad areas such as television, entertainment, interactive services, telecommunications, conferencing, medicine, security, business, traffic, defense and banking. Video and audio coding standards play the most important roles in multimedia communications. In order to meet these requirements, a series of video and audio coding standards have been developed, such as MPEG-2, MPEG-4, and MPEG-21 for audio and video by ISO/IEC, H.26x for video and G.72x for audio by ITU-T, Video Coder 1 (VC-1) for video by the Society of Motion Picture and Television Engineers (SMPTE), and RealVideo (RV) 9 for video by RealNetworks. AVS China is the abbreviation for the Audio Video Coding Standard of China. This new standard includes four main technical areas, which are systems, video, audio and digital rights management (DRM), and some supporting documents such as consistency verification. The second part of the standard, known as AVS1-P2 (Video - Jizhun), was approved as the national standard of China in 2006, and several final drafts of the standard have been completed, including AVS1-P1 (System - Broadcast), AVS1-P2 (Video - Zengqiang), AVS1-P3 (Audio - Double track), AVS1-P3 (Audio - 5.1), AVS1-P7 (Mobile Video), AVS-S-P2 (Video) and AVS-S-P3 (Audio). AVS China provides a technical solution for many applications such as digital broadcasting (SDTV and HDTV), high-density storage media, and Internet streaming media, and will be used in the domestic IPTV, satellite and possibly the cable TV market. Compared with other coding standards such as H.264/AVC, the advantages of the AVS video standard include similar performance, lower complexity, and lower implementation cost and licensing fees. This standard has attracted a great deal of attention from industries related to television, multimedia communications and even chip manufacturing around the world.
Many well-known companies have also joined the AVS Group as Full Members or Observing Members. The 163 members of the AVS Group include Texas Instruments (TI) Co., Agilent Technologies Co. Ltd., Envivio Inc., NDS, Philips Research East Asia, Aisino Corporation, LG, Alcatel Shanghai Bell Co. Ltd., Nokia (China) Investment (NCIC) Co. Ltd., Sony (China) Ltd., and Toshiba (China) Co. Ltd., as well as some high-level universities in China. Thus there is a pressing need from instructors, students, and engineers for a book dealing with the topic of AVS China and its performance comparisons with similar standards such as H.264, VC-1 and RV-9.
The protection of sensitive information against unauthorized access or fraudulent changes has been of prime concern throughout the centuries. Modern communication techniques, using computers connected through networks, make all data even more vulnerable to these threats. In addition, new issues have surfaced that did not exist previously, e.g. adding a signature to an electronic document. Cryptology addresses the above issues - it is at the foundation of all information security. The techniques employed to this end have become increasingly mathematical in nature. Fundamentals of Cryptology serves as an introduction to modern cryptographic methods. After a brief survey of classical cryptosystems, it concentrates on three main areas. First, stream ciphers and block ciphers are discussed. These systems have extremely fast implementations, but sender and receiver must share a secret key. Second, the book presents public key cryptosystems, which make it possible to protect data without a prearranged key. Their security is based on intractable mathematical problems, such as the factorization of large numbers. The remaining chapters cover a variety of topics, including zero-knowledge proofs, secret sharing schemes and authentication codes. Two appendices explain all mathematical prerequisites in detail: one presents elementary number theory (Euclid's Algorithm, the Chinese Remainder Theorem, quadratic residues, inversion formulas, and continued fractions) and the other introduces finite fields and their algebraic structure. Fundamentals of Cryptology is an updated and improved version of An Introduction to Cryptology, originally published in 1988. Apart from a revision of the existing material, there are many new sections, and two new chapters on elliptic curves and authentication codes, respectively.
In addition, the book is accompanied by a full-text electronic version on CD-ROM as an interactive Mathematica manuscript. Fundamentals of Cryptology will be of interest to computer scientists, mathematicians, researchers, students, and practitioners in the area of cryptography.
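Two of the number-theoretic prerequisites covered in the appendices, Euclid's Algorithm and the Chinese Remainder Theorem, can be sketched in a few lines of Python (an illustration of the standard algorithms, not material from the book or its CD-ROM):

```python
# Extended Euclidean algorithm and a two-modulus Chinese Remainder Theorem.

def ext_gcd(a, b):
    """Return (g, s, t) with g = gcd(a, b) = s*a + t*b."""
    if b == 0:
        return a, 1, 0
    g, s, t = ext_gcd(b, a % b)
    return g, t, s - (a // b) * t

def crt(r1, m1, r2, m2):
    """Solve x = r1 (mod m1), x = r2 (mod m2) for coprime m1, m2."""
    g, s, _ = ext_gcd(m1, m2)        # s is the inverse of m1 modulo m2
    assert g == 1, "moduli must be coprime"
    return (r1 + (r2 - r1) * s % m2 * m1) % (m1 * m2)

x = crt(2, 3, 3, 5)                  # x = 2 mod 3 and x = 3 mod 5
assert x % 3 == 2 and x % 5 == 3     # x = 8 is the unique solution mod 15
```

The same extended-gcd routine yields the modular inverses on which public key schemes such as RSA depend.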
The work introduces the fundamentals concerning the measurement of discrete information, the modeling of discrete sources with and without memory, as well as of channels and coding. The understanding of the theoretical matter is supported by many examples. One particular emphasis is put on the explanation of genomic coding. Many examples throughout the book are chosen from this particular area, and several parts of the book are devoted to this exciting application of coding.
This book investigates permutation polynomial (PP) based interleavers for turbo codes, covering all the main theoretical and practical findings related to topics such as: full coefficient conditions for PPs up to fifth degree; the number of all truly different PPs up to fifth degree; the number of truly different PPs under the Zhao and Fan sufficient conditions, for any degree (with direct formulas or with a simple algorithm); parallel decoding of turbo codes using PP interleavers by butterfly networks; upper bounds on the minimum distance for turbo codes with PP interleavers; and specific methods to design and find PP interleavers with good bit/frame error rate (BER/FER) performance. The theoretical results are explained in great detail to enhance readers' understanding. The book is intended for engineers in the telecommunications field, but the chapters dealing with the PP coefficient conditions and with the number of PPs are of interest to mathematicians working in the field.
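A minimal sketch of the central object (coefficients taken from the well-known LTE interleaver table for block length 40, not from this book): a quadratic PP f(x) = (f1*x + f2*x^2) mod N that permutes the block positions, together with the inverse mapping used for de-interleaving.

```python
# Quadratic permutation polynomial (QPP) interleaver for a block of N symbols.

def qpp(x, f1, f2, n):
    """Evaluate the quadratic permutation polynomial at position x."""
    return (f1 * x + f2 * x * x) % n

N, F1, F2 = 40, 3, 10                 # LTE parameters for block length 40
pi = [qpp(x, F1, F2, N) for x in range(N)]
assert sorted(pi) == list(range(N))   # f is a bijection on Z_N

# Interleave a block of symbols, then de-interleave via the inverse map.
block = list(range(N))
interleaved = [block[pi[i]] for i in range(N)]
inv = [0] * N
for i, p in enumerate(pi):
    inv[p] = i
assert [interleaved[inv[j]] for j in range(N)] == block
```

The coefficient conditions the book analyzes characterize exactly which pairs (f1, f2) make f a bijection for a given N; the QPP structure also enables the contention-free parallel decoding the blurb mentions.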
Intelligence results from the interaction of the brain, body and environment. The question addressed in this book is: can we measure the contribution of the body and its interaction with the environment? To answer this, we first present a comprehensive overview of the various ways in which a body reduces the amount of computation that the brain has to perform to solve a task. This chapter will broaden your understanding of how important inconspicuous physical processes and physical properties of the body are with respect to our cognitive abilities. This form of contribution to intelligence is called Morphological Intelligence. The main contribution of this book to the field is a detailed discussion of how Morphological Intelligence can be measured from observations alone. The required mathematical framework is provided so that readers unfamiliar with information theory will be able to understand and apply the measures. Case studies from biomechanics and soft robotics illustrate how the presented quantifications can, for example, be used to measure the contribution of muscle physics to jumping and to optimise the shape of a soft robotic hand. To summarise, this monograph presents various examples of how the physical properties of the body and the body's interaction with the environment contribute to intelligence. Furthermore, it treats theoretical and practical aspects of Morphological Intelligence and demonstrates their value in two case studies.
Information Macrodynamics (IMD) belongs to an interdisciplinary science that represents a new theoretical and computer-based methodology for a system informational description and improvement, including various activities in such areas as thinking, intelligent processes, communications, management, and other nonphysical subjects with their mutual interactions, informational superimposition, and the information transferred between interactions. The IMD is based on the implementation of a single concept by a unique mathematical principle and formalism, rather than on an artificial combination of many arbitrary, auxiliary concepts and/or postulates and different mathematical subjects, such as the game, automata, catastrophe, and logical operations theories. This concept is explored mathematically using classical mathematics such as the calculus of variations and probability theory, which are potent enough, without needing to develop new, specified mathematical systemic methods. The formal IMD model automatically includes the related results from other fields, such as linear, nonlinear, collective and chaotic dynamics, stability theory, theory of information, physical analogies of classical and quantum mechanics, irreversible thermodynamics, and kinetics. The main IMD goal is to reveal the information regularities, mathematically expressed by the considered variation principle (VP), as a mathematical tool to extract the regularities and define the model which describes them. The IMD regularities and mechanisms are the results of analytical solutions and are not obtained by logical argumentation, rational introduction, and reasonable discussion. The IMD's information computer modeling formalism includes a human being (as an observer, carrier and producer of information), with a restoration of the model during the object observations.
This book is the study of all codes of life with the standard methods of science. The genetic code and the codes of culture have been known for a long time and represent the historical foundation of this book. What is really new in this field is the study of all codes that came after the genetic code and before the codes of culture. The existence of these organic codes, however, is not only a major experimental fact. It is one of those facts that have extraordinary theoretical implications. The first is that most events of macroevolution were associated with the origin of new organic codes, and this gives us a completely new reconstruction of the history of life. The second implication is that codes involve meaning and we need therefore to introduce in biology not only the concept of information but also the concept of biological meaning. The third theoretical implication comes from the fact that the organic codes have been highly conserved in evolution, which means that they are the greatest invariants of life. The study of the organic codes, in short, is bringing to light new mechanisms that have operated in the history of life and new fundamental concepts in biology.
This volume contains the proceedings of IFIPTM 2010, the 4th IFIP WG 11.11 International Conference on Trust Management, held in Morioka, Iwate, Japan during June 16-18, 2010. IFIPTM 2010 provided a truly global platform for the reporting of research, development, policy, and practice in the interdependent areas of privacy, security, and trust. Building on the traditions inherited from the highly successful iTrust conference series, the IFIPTM 2007 conference in Moncton, New Brunswick, Canada, the IFIPTM 2008 conference in Trondheim, Norway, and the IFIPTM 2009 conference at Purdue University in Indiana, USA, IFIPTM 2010 focused on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion on relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2010 was an open IFIP conference. The program of the conference featured both theoretical research papers and reports of real-world case studies. IFIPTM 2010 received 61 submissions from 25 different countries: Japan (10), UK (6), USA (6), Canada (5), Germany (5), China (3), Denmark (2), India (2), Italy (2), Luxembourg (2), The Netherlands (2), Switzerland (2), Taiwan (2), Austria, Estonia, Finland, France, Ireland, Israel, Korea, Malaysia, Norway, Singapore, Spain, and Turkey. The Program Committee selected 18 full papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include two invited papers by academic experts in the fields of trust management, privacy and security, namely, Toshio Yamagishi and Pamela Briggs.
Turbo Codes: Desirable and Designable introduces the basics of turbo codes in their different flavors (more specifically, parallel concatenated convolutional turbo codes and block turbo codes). Through the application of a systemic design methodology that considers data transfer and storage as top-priority candidates for optimization, the authors show how turbo codes can be implemented and the attractive performance results that can be achieved in throughput, latency and energy consumption. These solutions and results make turbo codes close competitors to traditional coding schemes such as convolutional codes or algebraic codes. Finally, a real-life prototype of parallel concatenated convolutional (turbo) codes is presented. A complete turbo codes ASIC data-flow is described, together with on-board power, speed and coding gain measurements that demonstrate the effectiveness of the proposed solution.