This new edition of a well-received textbook provides a concise introduction to both the theoretical and experimental aspects of quantum information at the graduate level. While the previous edition focused on theory, the book now incorporates discussions of experimental platforms. Several chapters on experimental implementations of quantum information protocols have been added: implementations using neutral atoms, trapped ions, optics, and solid-state systems are each presented in their own chapters. Previous chapters on entanglement, quantum measurements, quantum dynamics, quantum cryptography, and quantum algorithms have been thoroughly updated, and new additions include chapters on the stabilizer formalism and the Gottesman-Knill theorem as well as aspects of classical and quantum information theory. To facilitate learning, each chapter starts with a clear motivation for the topic and closes with exercises and a recommended reading list. Quantum Information Processing: Theory and Implementation will be essential to graduate students studying quantum information as well as researchers in other areas of physics who wish to gain knowledge in the field.
This book constitutes the refereed post-conference proceedings of the IFIP TC 3 Open Conference on Computers in Education, OCCE 2020, held in Mumbai, India, in January 2020. The 11 full papers and 4 short papers included in this volume were carefully reviewed and selected from 57 submissions. The papers discuss key emerging topics and evolving practices in the area of educational computing research. They are organized in the following topical sections: computing education; learners' and teachers' perspectives; teacher professional development; the industry perspective; and further aspects.
The topics covered in this book, written by researchers at the forefront of their field, represent some of the most relevant research areas in modern coding theory: codes and combinatorial structures, algebraic geometric codes, group codes, quantum codes, convolutional codes, network coding and cryptography. The book includes a survey paper on the interconnections of coding theory with constrained systems, written by an invited speaker, as well as 37 cutting-edge research communications presented at the 4th International Castle Meeting on Coding Theory and Applications (4ICMCTA), held at the Castle of Palmela in September 2014. The event's scientific program consisted of four invited talks and 39 regular talks by authors from 24 different countries. This conference provided an ideal opportunity for communicating new results, exchanging ideas, strengthening international cooperation, and introducing young researchers into the coding theory community.
This book describes a set of novel statistical algorithms designed to infer functional connectivity of large-scale neural assemblies. The algorithms are developed with the aim of maximizing computational accuracy and efficiency, while faithfully reconstructing both the inhibitory and excitatory functional links. The book reports on statistical methods to compute the most significant functional connectivity graph, and shows how to use graph theory to extract the topological features of the computed network. A particular feature is that the methods used and extended for the purposes of this work are reported in a fairly complete, yet concise manner, together with the mathematical fundamentals and explanations needed to understand their application. Furthermore, all these methods have been embedded in the user-friendly open source software named SpiCoDyn, which is also introduced here. All in all, this book provides researchers and graduate students in bioengineering, neurophysiology and computer science with a set of simplified and reduced models for studying functional connectivity in in silico biological neuronal networks, thus overcoming the complexity of brain circuits.
This book presents two practical physical attacks. It shows how attackers can reveal the secret key of symmetric as well as asymmetric cryptographic algorithms based on these attacks, and presents countermeasures on the software and the hardware level that can help to prevent them in the future. Though their theory has been known for several years, neither attack had been successfully implemented in practice, and so they have generally not been considered a serious threat. In short, their physical attack complexity has been overestimated and the implied security threat has been underestimated. First, the book introduces the photonic side channel, which offers not only temporal resolution, but also the highest possible spatial resolution. Due to the high cost of its initial implementation, it has not been taken seriously. The work shows both simple and differential photonic side channel analyses. Then, it presents a fault attack against pairing-based cryptography. Due to the need for at least two independent precise faults in a single pairing computation, it has not been taken seriously either. Based on these two attacks, the book demonstrates that the assessment of physical attack complexity is error-prone, and as such cryptography should not rely on it. Cryptographic technologies have to be protected against all physical attacks, whether they have already been successfully implemented or not. The development of countermeasures does not require the successful execution of an attack but can already be carried out as soon as the principle of a side channel or a fault attack is sufficiently understood.
This book covers novel research on the construction and analysis of optimal cryptographic functions such as almost perfect nonlinear (APN), almost bent (AB), planar and bent functions. These functions have optimal resistance to linear and/or differential attacks, which are the two most powerful attacks on symmetric cryptosystems. Besides cryptographic applications, these functions are significant in many branches of mathematics and information theory including coding theory, combinatorics, commutative algebra, finite geometry, sequence design and quantum information theory. The author analyzes equivalence relations for these functions and develops several new methods for the construction of their infinite families. In addition, the book offers solutions to two longstanding open problems: the characterization of APN and AB functions via Boolean functions, and the relation between two classes of bent functions.
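The bentness property mentioned in this blurb can be checked directly on small examples via the Walsh transform: a Boolean function on n variables is bent exactly when every Walsh coefficient has absolute value 2^(n/2). A minimal sketch, not taken from the book (the function names are my own):

```python
from itertools import product

def walsh(f, n):
    """Walsh spectrum W_f(a) = sum over x in F_2^n of (-1)^(f(x) XOR a.x)."""
    pts = list(product([0, 1], repeat=n))
    def dot(a, x):
        return sum(ai & xi for ai, xi in zip(a, x)) & 1
    return {a: sum((-1) ** (f(x) ^ dot(a, x)) for x in pts) for a in pts}

# f(x1, x2) = x1*x2 is the simplest bent function: every Walsh
# coefficient has absolute value 2^(n/2) = 2.
spec = walsh(lambda x: x[0] & x[1], 2)
print(all(abs(w) == 2 for w in spec.values()))  # True
```

By contrast, a linear function such as f(x) = x1 has a Walsh coefficient of absolute value 4, the flat spectrum failing in the most extreme way.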
This monograph describes and implements partially homomorphic encryption functions using a unified notation. After introducing the appropriate mathematical background, the authors offer a systematic examination of the following known algorithms: Rivest-Shamir-Adleman; Goldwasser-Micali; ElGamal; Benaloh; Naccache-Stern; Okamoto-Uchiyama; Paillier; Damgaard-Jurik; Boneh-Goh-Nissim; and Sander-Young-Yung. Over recent years partially and fully homomorphic encryption algorithms have been proposed and researchers have addressed issues related to their formulation, arithmetic, efficiency and security. Formidable efficiency barriers remain, but we now have a variety of algorithms that can be applied to various private computation problems in healthcare, finance and national security, and studying these functions may help us to understand the difficulties ahead. The book is valuable for researchers and graduate students in Computer Science, Engineering, and Mathematics who are engaged with Cryptology.
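None of the algorithms surveyed above appears here in code, but the core idea of partial homomorphism is easy to demonstrate: textbook ElGamal is multiplicatively homomorphic, since component-wise products of ciphertexts decrypt to products of plaintexts. A toy sketch with deliberately tiny, insecure parameters of my own choosing:

```python
import random

# Toy textbook ElGamal over a small prime group. Illustrative only:
# these parameters are far too small to be secure.
p = 467          # small prime modulus
g = 2            # group generator

x = random.randrange(1, p - 1)   # private key
h = pow(g, x, p)                 # public key

def encrypt(m):
    r = random.randrange(1, p - 1)
    return (pow(g, r, p), (m * pow(h, r, p)) % p)

def decrypt(c):
    c1, c2 = c
    # c2 * c1^(-x) = m * h^r * g^(-r*x) = m  (inverse via Fermat's little theorem)
    return (c2 * pow(c1, p - 1 - x, p)) % p

m1, m2 = 5, 7
ct1, ct2 = encrypt(m1), encrypt(m2)
# Multiplying ciphertexts component-wise yields an encryption of m1*m2 mod p.
prod = ((ct1[0] * ct2[0]) % p, (ct1[1] * ct2[1]) % p)
print(decrypt(prod))  # 35
```

Paillier exhibits the dual behavior, additive homomorphism, which is why the choice among these schemes depends on which operation a private computation needs.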
Algebraic and geometric methods have constituted a basic background and tool for people working on classic block coding theory and cryptography. Nowadays, new paradigms in coding theory and cryptography have arisen, such as network coding, S-boxes, APN functions, steganography and decoding by linear programming. Again, understanding the underlying procedures and symmetries of these topics requires a substantial body of non-trivial knowledge of algebra and geometry, which is used both to evaluate those methods and to search for new codes and cryptographic applications. This book presents those methods in a self-contained form.
This volume constitutes the refereed and revised post-conference proceedings of the 4th IFIP TC 5 DCITDRR International Conference on Information Technology in Disaster Risk Reduction, ITDRR 2019, held in Kyiv, Ukraine, in October 2019. The 17 full papers and 2 short papers presented were carefully reviewed and selected from 53 submissions. The papers focus on various aspects and challenges of coping with disaster risk reduction. The main topics include areas such as natural disasters, big data, cloud computing, Internet of Things, mobile computing, emergency management, disaster information processing, and disaster risk assessment and management.
Cryptography has experienced rapid development, with major advances recently in both secret and public key ciphers, cryptographic hash functions, cryptographic algorithms and multiparty protocols, including their software engineering correctness verification, and various methods of cryptanalysis. This textbook introduces the reader to these areas, offering an understanding of the essential and most interesting ideas, based on the authors' teaching and research experience. After introducing the basic mathematical and computational complexity concepts, and some historical context, including the story of Enigma, the authors explain symmetric and asymmetric cryptography, electronic signatures and hash functions, PGP systems, public key infrastructures, cryptographic protocols, and applications in network security. In each case the text presents the key technologies, algorithms, and protocols, along with methods of design and analysis, while the content is characterized by a visual style and all algorithms are presented in readable pseudocode or using simple graphics and diagrams. The book is suitable for undergraduate and graduate courses in computer science and engineering, particularly in the area of networking, and it is also a suitable reference text for self-study by practitioners and researchers. The authors assume only basic elementary mathematical experience; the text covers the foundational mathematics and computational complexity theory.
The book compiles technologies for enhancing and provisioning security, privacy and trust in cloud systems based on Quality of Service requirements. It is a timely contribution to a field that is gaining considerable research interest and momentum, and it provides comprehensive coverage of technologies related to cloud security, privacy and trust.
This book reveals the historical context and the evolution of the technically complex Allied Signals Intelligence (Sigint) activity against Japan from 1920 to 1945. It traces the all-important genesis and development of the cryptanalytic techniques used to break the main Japanese Navy code (JN-25) and the Japanese Army's Water Transport Code during WWII. This is the first book to describe, explain and analyze the code breaking techniques developed and used to provide this intelligence, thus closing the sole remaining gap in the published accounts of the Pacific War. The authors also explore the organization of cryptographic teams and issues of security, censorship, and leaks. Correcting gaps in previous research, this book illustrates how Sigint remained crucial to Allied planning throughout the war. It helped direct the advance to the Philippines from New Guinea, the sea battles and the submarine onslaught on merchant shipping. Written by well-known authorities on the history of cryptography and mathematics, Code Breaking in the Pacific is designed for cryptologists, mathematicians and researchers working in communications security. Advanced-level students interested in cryptology, the history of the Pacific War, mathematics or the history of computing will also find this book a valuable resource.
The volume "Storing and Transmitting Data" is based on Rudolf Ahlswede's introductory course on "Information Theory I" and presents an introduction to Shannon Theory. Readers, familiar or unfamiliar with the technical intricacies of Information Theory, will benefit considerably from working through the book; especially Chapter VI, with its lively comments and uncensored insider views from the world of science and research, offers informative and revealing insights. This is the first of several volumes that will serve as a collected research documentation of Rudolf Ahlswede's lectures on information theory. Each volume includes comments from an invited well-known expert. Holger Boche contributed his insights in the supplement of the present volume. Classical information processing concerns the main tasks of gaining knowledge and of storing, transmitting and hiding data. The first task is the prime goal of Statistics. For the next two, Shannon presented an impressive mathematical theory called Information Theory, which he based on probabilistic models. The theory largely involves the concept of codes with small error probabilities in spite of noise in the transmission, which is modeled by channels. The lectures presented in this work are suitable for graduate students in Mathematics, and also in Theoretical Computer Science, Physics, and Electrical Engineering with background in basic Mathematics. The lectures can be used as the basis for courses or to supplement courses in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find the basis of entire research programs.
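As a minimal taste of the Shannon-theoretic quantities such a course begins with (my own illustration, not Ahlswede's text): the entropy of a source measures the average number of bits per symbol that an optimal code needs.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits per symbol."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less,
# which is why its outcomes can be compressed below 1 bit on average.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469
```

Shannon's source coding theorem says these numbers are tight: no lossless code can beat the entropy rate, and codes approaching it exist.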
(Preliminary): The Orthogonal Frequency Division Multiplexing (OFDM) digital transmission technique has several advantages in broadcast and mobile communications applications. The main objective of this book is to give a good insight into these efforts, and provide the reader with a comprehensive overview of the scientific progress which was achieved in the last decade. Besides topics of the physical layer, such as coding, modulation and non-linearities, a special emphasis is put on system aspects and concepts, in particular regarding cellular networks and using multiple antenna techniques. The work extensively addresses challenges of link adaptation, adaptive resource allocation and interference mitigation in such systems. Moreover, the domain of cross-layer design, i.e. the combination of physical layer aspects and issues of higher layers, are considered in detail. These results will facilitate and stimulate further innovation and development in the design of modern communication systems, based on the powerful OFDM transmission technique.
The rapid increase in computing power and communication speed, coupled with the availability of computer storage facilities, has led to a new age of multimedia applications. Multimedia is practically everywhere and all around us; we can feel its presence in almost all applications, ranging from online video databases, IPTV and interactive multimedia to, more recently, multimedia-based social interaction. These new growing applications require high-quality data storage, easy access to multimedia content and reliable delivery. Moving ever closer to commercial deployment also aroused a higher awareness of security and intellectual property management issues. All the aforementioned requirements resulted in higher demands on various areas of research (signal processing, image/video processing and analysis, communication protocols, content search, watermarking, etc.). This book covers the most prominent research issues in multimedia and is divided into four main sections: i) content based retrieval, ii) storage and remote access, iii) watermarking and copyright protection and iv) multimedia applications. Chapter 1 of the first section presents an analysis of how color is used and why it is crucial in today's multimedia applications. In chapter 2 the authors give an overview of the advances in video abstraction for fast content browsing, transmission, retrieval and skimming in large video databases, and chapter 3 extends the discussion on video summarization even further. The content retrieval problem is tackled in chapter 4 by describing a novel method for producing meaningful segments suitable for MPEG-7 description based on binary partition trees (BPTs).
This book provides a comprehensive and in-depth study of automated firewall policy analysis for designing, configuring and managing distributed firewalls in large-scale enterprise networks. It presents methodologies, techniques and tools for researchers as well as professionals to understand the challenges and improve the state of the art of managing firewalls systematically in both research and application domains. Chapters explore set theory, managing firewall configuration globally and consistently, access control lists with encryption, and authentication mechanisms such as IPSec policies. The author also presents a high-level service-oriented firewall configuration language (called FLIP) and a methodology and framework for designing optimal distributed firewall architecture. The chapters illustrate the concepts, algorithms, implementations and case studies for each technique. Automated Firewall Analytics: Design, Configuration and Optimization is appropriate for researchers and professionals working with firewalls. Advanced-level students in computer science will find this material suitable as a secondary textbook or reference.
Analysis of information transfer has found rapid adoption in neuroscience, where a highly dynamic transfer of information continuously runs on top of the brain's slowly-changing anatomical connectivity. Measuring such transfer is crucial to understanding how flexible information routing and processing give rise to higher cognitive function. "Directed Information Measures in Neuroscience" reviews recent developments of concepts and tools for measuring information transfer, and their application to neurophysiological recordings and the analysis of interactions. Written by the most active researchers in the field, the book discusses the state of the art, future prospects and challenges on the way to an efficient assessment of neuronal information transfer. Highlights include the theoretical quantification and practical estimation of information transfer, description of transfer locally in space and time, multivariate directed measures, information decomposition among a set of stimulus/response variables, and the relation between interventional and observational causality. Applications to neural data sets and pointers to open source software highlight the usefulness of these measures in experimental neuroscience. With state-of-the-art mathematical developments, computational techniques and applications to real data sets, this book will be of benefit to all graduate students and researchers interested in detecting and understanding the information transfer between components of complex systems.
Cooperative and relay communications have recently become the most widely explored topics in communications, whereby users cooperate in transmitting their messages to the destination, instead of operating independently and competing with each other for channel resources as in conventional networks. As the field has progressed, cooperative communications have become a design concept rather than a specific transmission technology. This concept has revolutionized the design of wireless networks, allowing increased coverage, throughput, and transmission reliability even as conventional transmission techniques gradually reach their limits. Cooperative and relay technologies have also made their way toward next generation wireless standards, such as IEEE 802.16 (WiMAX) and LTE, and have been incorporated into many modern wireless applications, such as cognitive radio and secret communications. "Cooperative Communications and Networking: Technologies and System Design" provides a systematic introduction to the fundamental concepts of cooperative communications and relay technology, giving engineers, researchers, and graduate students sufficient knowledge of both the background of cooperative communications and networking and potential research directions, so that they can conduct advanced research and development in this area.
An ontology is a formal description of concepts and relationships that can exist for a community of human and/or machine agents. The notion of ontologies is crucial for the purpose of enabling knowledge sharing and reuse. The Handbook on Ontologies provides a comprehensive overview of the current status and future prospects of the field of ontologies, considering ontology languages, ontology engineering methods, example ontologies, infrastructures and technologies for ontologies, and how to bring all of this into ontology-based infrastructures and applications that are among the best of their kind. The field of ontologies has tremendously developed and grown in the five years since the first edition of the "Handbook on Ontologies." Therefore, this revision includes 21 completely new chapters as well as a major re-working of 15 chapters transferred to this second edition.
This book focuses on the application and development of information geometric methods in the analysis, classification and retrieval of images and signals. It provides introductory chapters to help those new to information geometry, and applies the theory to several applications. This area has developed rapidly over recent years, propelled by the major theoretical developments in information geometry, efficient data and image acquisition, and the desire to process and interpret large databases of digital information. The book addresses both the transfer of methodology to practitioners involved in database analysis and its efficient computational implementation.
The information infrastructure - comprising computers, embedded devices, networks and software systems - is vital to operations in every sector: information technology, telecommunications, energy, banking and finance, transportation systems, chemicals, agriculture and food, defense industrial base, public health and health care, national monuments and icons, drinking water and water treatment systems, commercial facilities, dams, emergency services, commercial nuclear reactors, materials and waste, postal and shipping, and government facilities. Global business and industry, governments, indeed society itself, cannot function if major components of the critical information infrastructure are degraded, disabled or destroyed. This book, Critical Infrastructure Protection III, is the third volume in the annual series produced by IFIP Working Group 11.10 on Critical Infrastructure Protection, an active international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts related to critical infrastructure protection. The book presents original research results and innovative applications in the area of infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This volume contains seventeen edited papers from the Third Annual IFIP Working Group 11.10 International Conference on Critical Infrastructure Protection, held at Dartmouth College, Hanover, New Hampshire, March 23-25, 2009. The papers were refereed by members of IFIP Working Group 11.10 and other internationally-recognized experts in critical infrastructure protection.
This book constitutes the refereed proceedings of the 16th IFIP WG 9.4 International Conference on Social Implications of Computers in Developing Countries, ICT4D 2020, which was supposed to be held in Salford, UK, in June 2020, but was held virtually instead due to the COVID-19 pandemic. The 18 revised full papers presented were carefully reviewed and selected from 29 submissions. The papers present a wide range of perspectives and disciplines including (but not limited to) public administration, entrepreneurship, business administration, information technology for development, information management systems, organization studies, philosophy, and management. They are organized in the following topical sections: digital platforms and gig economy; education and health; inclusion and participation; and business innovation and data privacy.
Cryptographic applications, such as the RSA algorithm, ElGamal cryptography, elliptic curve cryptography, the Rabin cryptosystem, the Diffie-Hellman key exchange algorithm, and the Digital Signature Standard, use modular exponentiation extensively. The performance of all these applications strongly depends on the efficient implementation of modular exponentiation and modular multiplication. Since 1984, when Montgomery first introduced a method to evaluate modular multiplications, many algorithmic modifications have been made to improve the efficiency of modular multiplication, but far less work has been done on modular exponentiation itself. This research monograph addresses the question: how can the performance of modular exponentiation, which is the crucial operation of many public-key cryptographic techniques, be improved? The book focuses on energy-efficient modular exponentiation for cryptographic hardware. Spread across five chapters, this well-researched text focuses in detail on bit-forwarding techniques and the corresponding hardware realizations. Readers will also discover advanced performance improvement techniques based on high-radix multiplication and cryptographic hardware based on multi-core architectures.
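The bit-scanning structure that such hardware techniques refine is visible in the classic square-and-multiply method for modular exponentiation. A minimal software sketch of my own, not the book's hardware formulation:

```python
def mod_exp(base, exp, mod):
    """Left-to-right square-and-multiply: one squaring per exponent bit,
    plus one extra multiplication for each bit that is set."""
    result = 1
    base %= mod
    for bit in bin(exp)[2:]:                  # scan bits MSB -> LSB
        result = (result * result) % mod      # always square
        if bit == '1':
            result = (result * base) % mod    # multiply on 1-bits
    return result

print(mod_exp(7, 560, 561))  # matches Python's built-in pow(7, 560, 561)
```

The cost asymmetry between the unconditional squarings and the data-dependent multiplications is exactly what optimizations such as high-radix multiplication target, and it is also the leakage that side-channel-resistant hardware must hide.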