Thomas M. Cover and B. Gopinath: The papers in this volume are the contributions to a special workshop on problems in communication and computation conducted in the summers of 1984 and 1985 in Morristown, New Jersey, and the summer of 1986 in Palo Alto, California. The structure of this workshop was unique: no recent results, no surveys. Instead, we asked for outstanding open problems in the field. There are many famous open problems, including the question P = NP?, the simplex conjecture in communication theory, the capacity region of the broadcast channel, and the two-helper problem in information theory. Beyond these well-defined problems are certain grand research goals. What is the general theory of information flow in stochastic networks? What is a comprehensive theory of computational complexity? What about a unification of algorithmic complexity and computational complexity? Is there a notion of energy-free computation? And if so, where do information theory, communication theory, computer science, and physics meet at the atomic level? Is there a duality between computation and communication? Finally, what is the ultimate impact of algorithmic complexity on probability theory? And what is its relationship to information theory? The idea was to present problems on the first day, try to solve them on the second day, and present the solutions on the third day. In actual fact, only one problem was solved during the meeting: El Gamal's problem on noisy communication over a common line.
This book constitutes the refereed proceedings of the 4th International Workshop on the Arithmetic of Finite Fields, WAIFI 2012, held in Bochum, Germany, in July 2012. The 13 revised full papers and 4 invited talks presented were carefully reviewed and selected from 29 submissions. The papers are organized in topical sections on coding theory and code-based cryptography, Boolean functions, finite field arithmetic, equations and functions, and polynomial factorization and permutation polynomials.
Space Division Multiple Access is one of the most promising methods for solving the capacity problem of wireless communication systems. This book defines formulae that can be used to evaluate the limit capacity of multipath wireless channels within a receiving region of limited size.
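To make the notion of a channel-capacity formula concrete, here is a minimal Python sketch using the standard MIMO capacity expression C = log2 det(I + (SNR/Nt) H Hᴴ) for a random multipath channel; the antenna counts and SNR are illustrative assumptions, and the formula is the generic textbook one, not the size-limited-region formulae derived in the book.

```python
import numpy as np

# Generic MIMO capacity illustration (textbook formula, not the book's own):
#   C = log2 det(I + (SNR / Nt) * H * H^H)   [bits/s/Hz]
rng = np.random.default_rng(1)
nt, nr, snr = 4, 4, 10.0                     # transmit/receive antennas, linear SNR (assumed)
# Rayleigh-fading multipath channel matrix.
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
capacity = np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * H @ H.conj().T)).real
print(f"{capacity:.2f} bits/s/Hz")
```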
This book constitutes the refereed proceedings of the 5th International Workshop on Multiple Access Communications, MACOM 2012, held in Maynooth, Ireland, in November 2012. The 13 full papers and 5 demo and poster papers presented were carefully reviewed and selected from various submissions. The papers are organized in topical sections on network coding, handling interference and localization techniques at PHY/MAC layers, wireless access networks, and medium access control.
Oversampled Delta-Sigma Modulators: Analysis, Applications, and Novel Topologies presents theorems and their mathematical proofs for the exact analysis of the quantization noise in delta-sigma modulators. Extensive mathematical equations are included throughout the book to analyze both single-stage and multi-stage architectures. It has been proved that appropriately set initial conditions generate tone-free output, provided that the modulator order is at least three. These results are applied to the design of a Fractional-N PLL frequency synthesizer to produce spurious-free RF waveforms. Furthermore, the book also presents time-interleaved topologies to increase the conversion bandwidth of delta-sigma modulators. The topologies have been generalized for any interleaving number and modulator order.
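As a concrete illustration of the basic mechanism, the following Python sketch simulates a plain first-order, one-bit delta-sigma modulator and shows that a simple decimation filter recovers the oversampled input. This is the textbook baseline, not the higher-order or time-interleaved architectures analysed in the book, and the oversampling ratio and test signal are assumptions made for the example.

```python
import numpy as np

def first_order_dsm(x):
    """One-bit first-order delta-sigma modulator (textbook form)."""
    integrator, fed_back = 0.0, 0.0
    bits = np.empty(len(x))
    for n, xn in enumerate(x):
        integrator += xn - fed_back        # accumulate the quantization error
        fed_back = 1.0 if integrator >= 0.0 else -1.0
        bits[n] = fed_back                 # 1-bit output stream
    return bits

osr = 256                                  # oversampling ratio (assumed)
t = np.arange(8192) / 8192.0
x = 0.5 * np.sin(2 * np.pi * 4 * t)        # slow input, well inside [-1, 1]
bits = first_order_dsm(x)
# Crude decimation: moving average over one oversampling window.
recovered = np.convolve(bits, np.ones(osr) / osr, mode="same")
print(np.max(np.abs(recovered - x)[osr:-osr]))   # small compared with the 2-level quantizer step
```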
Contents: Practical Corner: The Evolution of the Exchange Rate from "Sacrosanct" Parity to Flexible Monetary Policy Instrument.- Historical Studies: The Society for Business History: A Decade of Work. The Bankers Simon and Abraham Oppenheim 1812-1880: The Private Background to Their Professional Activity, Their Role in Politics and Ennoblement. Russian Business in the Brüning Era.- Reviews of Literature: A Review of the New Literature on Business History. A Review of the New Literature on Banking History. Reports on Conferences. The "German Yearbook on Business History" is a source of insights into the entrepreneurial economy of the 19th and 20th centuries. It contains translations of topical journal articles and informative reviews of results and trends in business history research. As in the previous Yearbooks, the authors of this volume are experts in economic theory and practice whose contributions cover a wide spectrum.
This book constitutes the thoroughly refereed post-conference proceedings of the 15th International Conference on Information Security and Cryptology, ICISC 2012, held in Seoul, Korea, in November 2012. The 32 revised full papers presented together with 3 invited talks were carefully selected from 120 submissions during two rounds of reviewing. The papers provide the latest results in research, development, and applications in the field of information security and cryptology. They are organized in topical sections on attack and defense, software and Web security, cryptanalysis, cryptographic protocol, identity-based encryption, efficient implementation, cloud computing security, side channel analysis, digital signature, and privacy enhancement.
The Orthogonal Frequency Division Multiplexing (OFDM) digital transmission technique has several advantages in broadcast and mobile communications applications. The main objective of this book is to give a good insight into the research efforts in this area and to provide the reader with a comprehensive overview of the scientific progress achieved in the last decade. Besides topics of the physical layer, such as coding, modulation and non-linearities, a special emphasis is put on system aspects and concepts, in particular regarding cellular networks and the use of multiple antenna techniques. The work extensively addresses the challenges of link adaptation, adaptive resource allocation and interference mitigation in such systems. Moreover, the domain of cross-layer design, i.e. the combination of physical layer aspects with issues of higher layers, is considered in detail. These results will facilitate and stimulate further innovation and development in the design of modern communication systems based on the powerful OFDM transmission technique.
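For orientation, the following minimal Python sketch shows the standard OFDM construction mentioned above (QPSK symbols multiplexed onto subcarriers with an IFFT, plus a cyclic prefix); the subcarrier count and prefix length are generic assumptions, not parameters taken from the book.

```python
import numpy as np

n_sub, cp_len = 64, 16                      # subcarriers and cyclic-prefix length (assumed)
rng = np.random.default_rng(0)

# One QPSK symbol per subcarrier.
bits = rng.integers(0, 2, size=2 * n_sub)
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

# Transmitter: the IFFT multiplexes the symbols, then the tail is prepended as a cyclic prefix.
time_signal = np.fft.ifft(symbols) * np.sqrt(n_sub)
ofdm_symbol = np.concatenate([time_signal[-cp_len:], time_signal])

# Receiver (ideal channel): drop the prefix, FFT back to the subcarrier domain.
recovered = np.fft.fft(ofdm_symbol[cp_len:]) / np.sqrt(n_sub)
print(np.allclose(recovered, symbols))      # True
```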
Driven by the increasing demand for capacity and Quality of Service in wireless cellular networks, and motivated by the distributed antenna system, the authors propose a cooperative communication architecture, the Group Cell architecture, which was initially brought forward in 2001. Years later, Coordinated Multi-Point Transmission and Reception (CoMP) for LTE-Advanced was put forward in April 2008 as a tool to improve the coverage of high-data-rate cells and the cell-edge throughput, and/or to increase system throughput. This book focuses mainly on the Group Cell architecture with multi-cell generalized coordination, a contrast analysis between the Group Cell architecture and CoMP, capacity analysis, the Slide Handover strategy, power allocation schemes of the Group Cell architecture to mitigate inter-cell interference and maximize system capacity, and the trial network implementation and performance evaluations of the Group Cell architecture.
Artificial Intelligence and Security in Computing Systems is a peer-reviewed conference volume focusing on three areas of practice and research progress in information technologies:
- Methods of Artificial Intelligence presents methods and algorithms which are the basis for applications of artificial intelligence environments.
- Multiagent Systems includes laboratory research on multiagent intelligent systems as well as their applications in transportation and information systems.
- Computer Security and Safety presents techniques and algorithms which will be of great interest to practitioners. In general, they focus on new cryptographic algorithms (including a symmetric key encryption scheme, hash functions, secret generation and sharing schemes, and secure data storage), a formal language for policy access control description and its implementation, and risk management methods (used for continuous analysis both in distributed network and software development projects).
For four decades, information theory has been viewed almost exclusively as a theory based upon the Shannon measure of uncertainty and information, usually referred to as Shannon entropy. Since the publication of Shannon's seminal paper in 1948, the theory has grown extremely rapidly and has been applied with varied success in almost all areas of human endeavor. At this time, the Shannon information theory is a well established and developed body of knowledge. Among its most significant recent contributions have been the use of the complementary principles of minimum and maximum entropy in dealing with a variety of fundamental systems problems such as predictive systems modelling, pattern recognition, image reconstruction, and the like. Since its inception in 1948, the Shannon theory has been viewed as a restricted information theory. It has often been argued that the theory is capable of dealing only with syntactic aspects of information, but not with its semantic and pragmatic aspects. This restriction was considered a virtue by some experts and a vice by others. More recently, however, various arguments have been made that the theory can be appropriately modified to account for semantic aspects of information as well. Some of the most convincing arguments in this regard are included in Fred Dretske's Knowledge and the Flow of Information (The M.I.T. Press, Cambridge, Mass., 1981) and in this book by Guy Jumarie.
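As a pointer to the Shannon measure referred to throughout, here is a minimal Python sketch (not taken from either book) computing the entropy H(p) = -Σ p_i log2 p_i of a discrete distribution.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.47 bits
```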
This book contains the thoroughly refereed post-conference proceedings of the 14th Information Hiding Conference, IH 2012, held in Berkeley, CA, USA, in May 2012. The 18 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers are organized in topical sections on multimedia forensics and counter-forensics, steganalysis, data hiding in unusual content, steganography, covert channels, anonymity and privacy, watermarking, and fingerprinting.
This brief focuses on radio resource allocation in a heterogeneous wireless medium. It presents radio resource allocation algorithms with decentralized implementation, which support both single-network and multi-homing services. The brief provides a set of cooperative networking algorithms, which rely on the concepts of short-term call traffic load prediction, network cooperation, convex optimization, and decomposition theory. In the proposed solutions, mobile terminals play an active role in the resource allocation operation, instead of their traditional role as passive service recipients in the networking environment.
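To hint at how convex optimization and decomposition theory yield decentralized allocations of the kind described, here is a hedged Python sketch of a standard dual-decomposition (pricing) scheme, not an algorithm from the brief: terminals with logarithmic utilities pick their own rates from a common price, and a subgradient update on the price enforces the shared capacity constraint. The utilities, capacity and step size are illustrative assumptions.

```python
N, capacity = 5, 10.0        # terminals and shared capacity (assumed)
price, step = 1.0, 0.05      # dual price and subgradient step size (assumed)

for _ in range(500):
    # Decentralized step: each terminal maximizes log(x) - price * x on its own,
    # whose optimum is x = 1 / price.
    rates = [1.0 / price] * N
    # Coordination step: raise the price when demand exceeds capacity, lower it otherwise.
    price = max(price + step * (sum(rates) - capacity), 1e-6)

print([round(r, 3) for r in rates])   # converges to capacity / N = 2.0 for each terminal
```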
The objective of the present edition of this monograph is the same as that of earlier editions, namely, to provide readers with some mathematical maturity a rigorous and modern introduction to the ideas and principal theorems of probabilistic information theory. It is not necessary that readers have any prior knowledge whatever of information theory. The rapid development of the subject has had the consequence that any one book can now cover only a fraction of the literature. The latter is often written by engineers for engineers, and the mathematical reader may have some difficulty with it. The mathematician who understands the content and methods of this monograph should be able to read the literature and start on research of his own in a subject of mathematical beauty and interest. The present edition differs from the second in the following: Chapter 6 has been completely replaced by one on arbitrarily varying channels. Chapter 7 has been greatly enlarged. Chapter 8 on semi-continuous channels has been drastically shortened, and Chapter 11 on sequential decoding completely removed. The new Chapters 11-15 consist entirely of material which has been developed only in the last few years. The topics discussed are rate distortion, source coding, multiple access channels, and degraded broadcast channels. Even the specialist will find a new approach in the treatment of these subjects. Many of the proofs are new, more perspicuous, and considerably shorter than the original ones.
The spreading of digital technology has resulted in a dramatic increase in the demand for data compression (DC) methods. At the same time, the appearance of highly integrated elements has made more and more complicated algorithms feasible. It is in the fields of speech and image transmission and the transmission and storage of biological signals (e.g., ECG, Body Surface Mapping) where the demand for DC algorithms is greatest. There is, however, a substantial gap between the theory and the practice of DC: an essentially nonconstructive information theoretical attitude and the attractive mathematics of source coding theory are contrasted with a mixture of ad hoc engineering methods. The classical Shannonian information theory is fundamentally different from the world of practical procedures. Theory places great emphasis on block coding, while practice is overwhelmingly dominated by theoretically intractable, mostly differential predictive coding (DPC), algorithms. A dialogue between theory and practice has been hindered by two profoundly different conceptions of a data source: practice, mostly because of speech compression considerations, favors nonstationary models, while the theory deals mostly with stationary ones.
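To make the contrast concrete, here is a minimal Python sketch of differential predictive coding (DPC/DPCM) with the simplest possible previous-sample predictor and a uniform quantizer; it illustrates the family of practical methods mentioned above, not any algorithm from the book, and the step size and test signal are assumptions.

```python
def dpcm_encode(samples, step=4):
    """Code each sample as the quantized error of a previous-sample prediction."""
    pred, codes = 0, []
    for s in samples:
        q = round((s - pred) / step)   # quantized prediction error: the transmitted code
        codes.append(q)
        pred += q * step               # track the decoder's reconstruction exactly
    return codes

def dpcm_decode(codes, step=4):
    pred, out = 0, []
    for q in codes:
        pred += q * step
        out.append(pred)
    return out

signal = [0, 3, 8, 14, 18, 20, 19, 15, 10, 4]
codes = dpcm_encode(signal)
print(codes)                # small integers, cheap to entropy-code
print(dpcm_decode(codes))   # reconstruction within +/- step/2 of the input
```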
This book constitutes the thoroughly refereed post-workshop proceedings of the 8th International Workshop on Radio Frequency Identification: Security and Privacy Issues, RFIDSec 2012, held in Nijmegen, The Netherlands, in July 2012. The 12 revised full papers presented were carefully reviewed and selected from 29 submissions for inclusion in the book. The papers focus on approaches to solve security and data protection issues in advanced contactless technologies.
This book contains the lectures given at the Conference on Dynamics and Randomness held at the Centro de Modelamiento Matematico of the Universidad de Chile from December 11th to 15th, 2000. This meeting brought together mathematicians, theoretical physicists, theoretical computer scientists, and graduate students interested in fields related to probability theory, ergodic theory, symbolic and topological dynamics. We would like to express our gratitude to all the participants of the conference and to the people who contributed to its organization. In particular, to Pierre Collet, Bernard Host and Mike Keane for their scientific advice. We want to thank especially the authors of each chapter for their well prepared manuscripts and the stimulating conferences they gave at Santiago. We are also indebted to our sponsors and supporting institutions, whose interest and help was essential to organize this meeting: ECOS-CONICYT, FONDAP Program in Applied Mathematics, French Cooperation, Fundacion Andes, Presidential Fellowship and Universidad de Chile. We are grateful to Ms. Gladys Cavallone for her excellent work during the preparation of the meeting as well as for the considerable task of unifying the typography of the different chapters of this book.
This book constitutes the refereed proceedings of the 7th International Workshop on Security, IWSEC 2012, held in Fukuoka, Japan, in November 2012. The 16 revised selected papers presented in this volume were carefully reviewed and selected from 53 submissions. They are organized in topical sections named: implementation; encryption and key exchange; cryptanalysis; and secure protocols.
Covering classical cryptography, modern cryptography, and steganography, this volume details how data can be kept secure and private. Each topic is presented and explained by describing various methods, techniques, and algorithms. Moreover, there are numerous helpful examples to reinforce the reader's understanding and expertise with these techniques and methodologies. Features & Benefits:
* Incorporates both data encryption and data hiding
* Supplies a wealth of exercises and solutions to help readers readily understand the material
* Presents information in an accessible, nonmathematical style
* Concentrates on specific methodologies that readers can choose from and pursue, for their data-security needs and goals
* Describes new topics, such as the advanced encryption standard (Rijndael), quantum cryptography, and elliptic-curve cryptography.
The book, with its accessible style, is an essential companion for all security practitioners and professionals who need to understand and effectively use both information hiding and encryption to protect digital data and communications. It is also suitable for self-study in the areas of programming, software engineering, and security.
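As a small taste of the data-hiding side, here is a minimal Python sketch of least-significant-bit (LSB) embedding, one classic steganographic technique of the kind surveyed in such books; the cover bytes and message are made up for illustration, and no method from this particular volume is implied.

```python
def hide(cover: bytes, message: bytes) -> bytes:
    """Embed the message bits into the least significant bit of successive cover bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def reveal(stego: bytes, n_bytes: int) -> bytes:
    """Collect the LSBs back into bytes."""
    lsbs = [b & 1 for b in stego[: 8 * n_bytes]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(lsbs[k : k + 8]))
        for k in range(0, len(lsbs), 8)
    )

cover = bytes(range(64))         # stand-in for pixel or audio sample data
stego = hide(cover, b"hi")
print(reveal(stego, 2))          # b'hi'
```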
"Approach your problems from the right end and begin with the answers. Then one day, perhaps you will find the final question." ('The Hermit Clad in Crane Feathers' in R. van Gulik's The Chinese Maze Murders.) "It isn't that they can't see the solution. It is that they can't see the problem." (G. K. Chesterton, The Scandal of Father Brown, 'The Point of a Pin'.) Growing specialization and diversification have brought a host of monographs and textbooks on increasingly specialized topics. However, the "tree" of knowledge of mathematics and related fields does not grow only by putting forth new branches. It also happens, quite often in fact, that branches which were thought to be completely disparate are suddenly seen to be related. Further, the kind and level of sophistication of mathematics applied in various sciences has changed drastically in recent years: measure theory is used (non-trivially) in regional and theoretical economics; algebraic geometry interacts with physics; the Minkowski lemma, coding theory and the structure of water meet one another in packing and covering theory; quantum fields, crystal defects and mathematical programming profit from homotopy theory; Lie algebras are relevant to filtering; and prediction and electrical engineering can use Stein spaces. And in addition to this there are such new emerging subdisciplines as "experimental mathematics", "CFD", "completely integrable systems", "chaos, synergetics and large-scale order", which are almost impossible to fit into the existing classification schemes. They draw upon widely different sections of mathematics.
DES, the Data Encryption Standard, is the best known and most widely used civilian cryptosystem. It was developed by IBM and adopted as a US national standard in the mid 1970s, and has resisted all attacks over the last 15 years. This book presents the first successful attack which can break the full 16-round DES faster than exhaustive search. It describes in full detail the novel technique of Differential Cryptanalysis and demonstrates its applicability to a wide variety of cryptosystems and hash functions, including FEAL, Khafre, REDOC-II, LOKI, Lucifer, Snefru, N-Hash, and many modified versions of DES. The methodology used offers valuable insights to anyone interested in data security and cryptography, and points out the intricacies of developing, evaluating, testing, and implementing such schemes. This book was written by two of the field's leading researchers, and describes state-of-the-art research in a clear and self-contained manner.
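To give a flavour of the technique, here is a minimal Python sketch of the central object in differential cryptanalysis, the difference distribution table of an S-box, which counts how often each input XOR-difference produces each output difference. The 4-bit S-box is an arbitrary permutation chosen for illustration, not one of DES's S-boxes, and the sketch is not the book's attack.

```python
# Arbitrary 4-bit S-box (a permutation of 0..15 chosen purely for illustration).
SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

def difference_distribution_table(sbox):
    n = len(sbox)
    table = [[0] * n for _ in range(n)]
    for dx in range(n):                      # input XOR-difference
        for x in range(n):
            dy = sbox[x] ^ sbox[x ^ dx]      # resulting output difference
            table[dx][dy] += 1
    return table

table = difference_distribution_table(SBOX)
# Differential cryptanalysis exploits unusually large entries (high-probability
# differentials) to distinguish correct round-key guesses from wrong ones.
best = max((table[dx][dy], dx, dy) for dx in range(1, 16) for dy in range(16))
print(best)   # (count, input difference, output difference)
```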
This volume is dedicated to the memory of Rudolf Ahlswede, who passed away in December 2010. The Festschrift contains 36 thoroughly refereed research papers from a memorial symposium, which took place in July 2011. The four macro-topics of this workshop (theory of games and strategic planning; combinatorial group testing and database mining; computational biology and string matching; information coding and spreading and patrolling on networks) provide a comprehensive picture of the vision Rudolf Ahlswede put forward of a broad and systematic theory of search.
This volume represents the refereed proceedings of the "Sixth International Conference on Finite Fields and Applications (Fq6)" held in the city of Oaxaca, Mexico, 22-26 May 2001. The conference was hosted by the Departamento de Matemáticas of the Universidad Autónoma Metropolitana-Iztapalapa, Mexico. This event continued a series of biennial international conferences on Finite Fields and Applications, following earlier meetings at the University of Nevada at Las Vegas (USA) in August 1991 and August 1993, the University of Glasgow (Scotland) in July 1995, the University of Waterloo (Canada) in August 1997, and at the University of Augsburg (Germany) in August 1999. The Organizing Committee of Fq6 consisted of Dieter Jungnickel (University of Augsburg, Germany), Neal Koblitz (University of Washington, USA), Alfred J. Menezes (University of Waterloo, Canada), Gary Mullen (The Pennsylvania State University, USA), Harald Niederreiter (National University of Singapore, Singapore), Vera Pless (University of Illinois, USA), Carlos Renteria (IPN, Mexico), Henning Stichtenoth (Essen University, Germany), and Horacio Tapia-Recillas, Chair (Universidad Autónoma Metropolitana-Iztapalapa, Mexico). The program of the conference consisted of four full days and one half day of sessions, with 7 invited plenary talks, close to 60 contributed talks, basic courses in finite fields, cryptography and coding theory, and a series of lectures at local educational institutions. Finite fields have an inherently fascinating structure and they are important tools in discrete mathematics.
This book constitutes revised selected papers from the 7th Conference on Theory of Quantum Computation, Communication, and Cryptography, TQC 2012, held in Tokyo, Japan, in May 2012. The 12 papers presented were carefully reviewed and selected for inclusion in this book. They contain original research on the rapidly growing, interdisciplinary field of quantum computation, communication and cryptography. Topics addressed include quantum algorithms, quantum computation models, quantum complexity theory, simulation of quantum systems, quantum programming languages, quantum cryptography, quantum communication, quantum estimation, quantum measurement, quantum tomography, completely positive maps, decoherence, quantum noise, quantum coding theory, fault-tolerant quantum computing, entanglement theory, and quantum teleportation.
What constitutes an identity, how do new technologies affect identity, and how do we manage identities in a globally networked information society? The increasing diversity of information and communication technologies and their equally wide range of usage in personal, professional and official capacities raise challenging questions of identity in a variety of contexts. The aim of the IFIP/FIDIS Summer Schools has been to encourage young academic and industry entrants to share their own ideas about privacy and identity management and to build up collegial relationships with others. As such, the Summer Schools have been introducing participants to the social implications of information technology through the process of informed discussion. The 4th International Summer School took place in Brno, Czech Republic, during September 1-7, 2008. It was organized by IFIP (International Federation for Information Processing) working groups 9.2 (Social Accountability), 9.6/11.7 (IT Misuse and the Law) and 11.6 (Identity Management) in cooperation with the EU FP6 Network of Excellence FIDIS and Masaryk University in Brno. The focus of the event was on security and privacy issues in the Internet environment, and aspects of identity management in relation to current and future technologies in a variety of contexts.