Welcome to Loot.co.za!
This book focuses on how to apply network coding at different layers in wireless networks - including MAC, routing, and TCP - with a special focus on cognitive radio networks. It discusses how to select network-coding parameters (e.g., coding field, number of packets involved, redundant-information ratio) to suit varying wireless environments. The author explores how to deploy network coding at the MAC layer to improve network performance, and examines joint network coding with opportunistic routing to improve the routing success rate. Regarding TCP, the author considers how a transport-layer protocol working with network coding can overcome transmission errors, in particular how the ACK feedback of TCP can enhance the efficiency of network coding. The book is intended for researchers and postgraduate students, especially those interested in opportunistic routing and TCP in cognitive radio networks.
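To give a flavor of the technique the blurb describes, here is a minimal sketch of the classic two-flow XOR network-coding example (the butterfly relay). The packet contents are hypothetical and the code is not from the book:

```python
# Classic two-flow XOR network-coding example: a relay broadcasts
# p1 XOR p2 as one coded packet; each receiver already holds one of
# the two packets and recovers the other with a single XOR.

def xor_packets(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length packets byte by byte."""
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"HELLO"                     # packet from source 1 (hypothetical)
p2 = b"WORLD"                     # packet from source 2 (hypothetical)

coded = xor_packets(p1, p2)       # the relay transmits ONE coded packet

# Receiver A overheard p1, so it recovers p2:
assert xor_packets(coded, p1) == p2
# Receiver B overheard p2, so it recovers p1:
assert xor_packets(coded, p2) == p1
```

The relay thus uses one transmission where plain forwarding would need two, which is the basic throughput gain network coding offers at the MAC layer.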
Universal codes efficiently compress sequences generated by stationary and ergodic sources with unknown statistics, and they were originally designed for lossless data compression. It was later realized that they can also be used to solve important problems in the prediction and statistical analysis of time series, and this book describes recent results in this area. The first chapter introduces universal codes and describes their application to prediction and the statistical analysis of time series; the second chapter describes applications of selected statistical methods to cryptography, including attacks on block ciphers; and the third chapter describes a homogeneity test used to determine the authorship of literary texts. The book will be useful for researchers and advanced students in information theory, mathematical statistics, time-series analysis, and cryptography. It assumes that the reader has some grounding in statistics and in information theory.
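As an illustration of using universal-coding ideas for prediction, here is a minimal sketch of the Krichevsky-Trofimov (KT) estimator, a standard predictor from this literature (not code from the book):

```python
# Krichevsky-Trofimov (KT) estimator: a universal-coding predictor
# for binary sequences. The predicted probability that the next
# symbol is 1 is (n1 + 1/2) / (n0 + n1 + 1), where n0 and n1 count
# the 0s and 1s observed so far.

def kt_predict(seq):
    """Return the predicted probability that the next symbol is 1."""
    n1 = sum(seq)
    n0 = len(seq) - n1
    return (n1 + 0.5) / (n0 + n1 + 1)

print(kt_predict([]))            # 0.5 (no data yet: uniform guess)
print(kt_predict([1, 1, 1, 1]))  # 0.9 (mostly 1s observed)
```

The same add-half smoothing that makes the KT code universal for compression is what makes it a sensible predictor: it never assigns probability zero to an unseen symbol.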
This book provides a systematic and comparative description of the vast number of research issues related to the quality of data and information. It does so by delivering a sound, integrated and comprehensive overview of the state of the art and future development of data and information quality in databases and information systems. To this end, it presents an extensive description of the techniques that constitute the core of data and information quality research, including record linkage (also called object identification), data integration, and error localization and correction, and examines the related techniques in a comprehensive and original methodological framework. Quality dimension definitions and adopted models are also analyzed in detail, and differences between the proposed solutions are highlighted and discussed. Furthermore, while systematically describing data and information quality as an autonomous research area, paradigms and influences deriving from other areas, such as probability theory, statistical data analysis, data mining, knowledge representation, and machine learning, are also included. Last but not least, the book also highlights very practical solutions, such as methodologies, benchmarks for the most effective techniques, case studies, and examples. The book has been written primarily for researchers in the fields of databases and information management or in the natural sciences who are interested in investigating properties of data and information that have an impact on the quality of experiments, processes and real life. The material presented is also sufficiently self-contained for master's or PhD-level courses, and it covers all the fundamentals and topics without the need for other textbooks.
Data and information system administrators and practitioners, who deal with systems exposed to data-quality issues and as a result need a systematization of the field and practical methods in the area, will also benefit from the combination of concrete practical approaches with sound theoretical formalisms.
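As a toy illustration of the record linkage (object identification) task mentioned above, here is a hedged sketch using a generic string-similarity score; the field names and threshold are illustrative assumptions, not the book's methodology:

```python
# Toy record linkage: score candidate record pairs with a normalized
# string similarity and link pairs whose score clears a threshold.
import difflib

def similarity(a: str, b: str) -> float:
    """Similarity in [0, 1] based on matching character blocks."""
    return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()

records_a = [{"name": "Jon Smith"}, {"name": "Ann Lee"}]
records_b = [{"name": "John Smith"}, {"name": "Bob King"}]

THRESHOLD = 0.8                       # illustrative cutoff
links = [(ra["name"], rb["name"])
         for ra in records_a for rb in records_b
         if similarity(ra["name"], rb["name"]) >= THRESHOLD]
print(links)  # [('Jon Smith', 'John Smith')]
```

Real record-linkage systems compare multiple fields, weight them, and calibrate thresholds against labeled data; the sketch only shows the core match-score-and-decide loop.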
A new class of provably capacity-achieving error-correction codes, polar codes are suitable for many problems, such as lossless and lossy source coding, problems with side information, the multiple access channel, etc. In this first comprehensive book on the implementation of decoders for polar codes, the authors take a tutorial approach to explain the practical decoder implementation challenges and trade-offs in either software or hardware. They also demonstrate new trade-offs in latency, throughput, and complexity in software implementations for high-performance computing and GPGPUs, and hardware implementations using custom processing elements, full-custom application-specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs). Presenting a good overview of this research area and future directions, High-Speed Decoders for Polar Codes is perfect for any researcher or SDR practitioner looking into implementing efficient decoders for polar codes, as well as students and professors in a modern error correction class. As polar codes have been accepted to protect the control channel in the next-generation mobile communication standard (5G) developed by the 3GPP, the audience includes engineers who will have to implement decoders for such codes and hardware engineers designing the backbone of communication networks.
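For readers new to polar codes, here is a minimal sketch of the polar transform (the butterfly recursion) that encoders and decoders are built around; this is the generic textbook construction, not the book's implementation:

```python
# Polar transform x = u * F^{(x)n} over GF(2), with the kernel
# F = [[1, 0], [1, 1]], computed via the standard recursion
# (bit-reversal permutation omitted): the first half carries the
# XOR of the two input halves, the second half is transformed as-is.

def polar_encode(u):
    """Apply the length-2^n polar transform to a 0/1 list u."""
    n = len(u)
    if n == 1:
        return u[:]
    half = n // 2
    left = polar_encode([a ^ b for a, b in zip(u[:half], u[half:])])
    right = polar_encode(u[half:])
    return left + right

u = [1, 0, 1, 1]
x = polar_encode(u)
print(x)                          # [1, 1, 0, 1]
assert polar_encode(x) == u       # the transform is its own inverse mod 2
```

Successive-cancellation decoders, the book's main subject, walk this same butterfly structure in reverse, estimating one input bit at a time.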
This book focuses on the different representations and cryptographic properties of Boolean functions, and presents constructions of Boolean functions with good cryptographic properties. More specifically, it presents Walsh-spectrum descriptions of the traditional cryptographic properties of Boolean functions, including linear structure, propagation criterion, nonlinearity, and correlation immunity. Constructions of symmetric Boolean functions and of Boolean permutations with good cryptographic properties are studied in particular. The book is not meant to be comprehensive; rather, it focuses on the authors' own original research. To be self-contained, it introduces some basic concepts and properties. The book can serve as a reference for designers of cryptographic algorithms, particularly of stream ciphers and block ciphers, and for academics interested in the cryptographic properties of Boolean functions.
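As a small illustration of the Walsh-spectrum machinery such a book relies on, here is a sketch computing the Walsh-Hadamard spectrum and nonlinearity of a Boolean function from its truth table (an assumed example, not the authors' code):

```python
# Walsh-Hadamard spectrum of an n-variable Boolean function f, given
# as a truth table (list of 0/1 values of length 2**n). Nonlinearity
# is NL(f) = 2^(n-1) - max|W_f(w)| / 2.
from itertools import product

def walsh_spectrum(f, n):
    spectrum = []
    for w in range(2 ** n):
        s = 0
        for x in range(2 ** n):
            dot = bin(w & x).count("1") & 1      # inner product <w, x> mod 2
            s += (-1) ** (f[x] ^ dot)
        spectrum.append(s)
    return spectrum

def nonlinearity(f, n):
    return 2 ** (n - 1) - max(abs(v) for v in walsh_spectrum(f, n)) // 2

# Example: f(x1, x2) = x1 AND x2, a bent function on 2 variables.
f = [x1 & x2 for x1, x2 in product([0, 1], repeat=2)]  # truth table [0,0,0,1]
print(walsh_spectrum(f, 2))   # [2, 2, 2, -2]
print(nonlinearity(f, 2))     # 1, the maximum possible for n = 2
```

A flat spectrum (all entries of the same magnitude) is exactly what characterizes bent functions, the functions of maximal nonlinearity.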
This book explores the future of cyber technologies and cyber operations that will influence advances in social media, cyber security, cyber-physical systems, ethics, law, media, economics, infrastructure, military operations, and other elements of societal interaction in the coming decades. It provides a review of future disruptive technologies and innovations in cyber security, serves as a resource for wargame planning, offers a strategic vision of the future direction of cyber operations, and informs military strategists about the future of cyber warfare. Written by leading experts in the field, its chapters explore how future technical innovations will vastly increase the interconnectivity of our physical and social systems, and the growing need for resiliency in this vast and dynamic cyber infrastructure. The future of social media, autonomy, stateless finance, quantum information systems, the internet of things, the dark web, space satellite operations, and global network connectivity is explored, along with the transformation of the legal and ethical considerations that surround them. The international challenges of cyber alliances, capabilities, and interoperability are examined alongside the growing need for new laws, international oversight, and regulation to inform cybersecurity studies. The authors take a multi-disciplinary scope arranged in a big-picture framework, allowing both deep exploration of important topics and a high-level understanding of the field. Evolution of Cyber Technologies and Operations to 2035 is an excellent reference for professionals and researchers working in security, government, the military, economics, law, and related fields. Students will also find this book useful as a reference guide or secondary textbook.
This book provides a comprehensive introduction to advanced topics in the computational and algorithmic aspects of number theory, focusing on applications in cryptography. Readers will learn to develop fast algorithms, including quantum algorithms, to solve various classic and modern number theoretic problems. Key problems include prime number generation, primality testing, integer factorization, discrete logarithms, elliptic curve arithmetic, conjecture and numerical verification. The author discusses quantum algorithms for solving the Integer Factorization Problem (IFP), the Discrete Logarithm Problem (DLP), and the Elliptic Curve Discrete Logarithm Problem (ECDLP) and for attacking IFP, DLP and ECDLP based cryptographic systems. Chapters also cover various other quantum algorithms for Pell's equation, principal ideal, unit group, class group, Gauss sums, prime counting function, Riemann's hypothesis and the BSD conjecture. Quantum Computational Number Theory is self-contained and intended to be used either as a graduate text in computing, communications and mathematics, or as a basic reference in the related fields. Number theorists, cryptographers and professionals working in quantum computing, cryptography and network security will find this book a valuable asset.
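As a concrete taste of one of the classic problems listed (primality testing), here is a standard Miller-Rabin sketch; this is textbook material, not code from the book:

```python
# Miller-Rabin probabilistic primality test: write n-1 = d * 2^s with
# d odd, then check random witnesses a. A composite n passes one round
# with probability at most 1/4, so many rounds give high confidence.
import random

def is_probable_prime(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):        # quick trial division
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)                   # modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                   # a witnesses compositeness
    return True

print(is_probable_prime(2**61 - 1))  # True: a Mersenne prime
print(is_probable_prime(2**61 + 1))  # False: divisible by 3
```

This is the workhorse behind prime generation for RSA-style systems; the quantum algorithms the book treats attack the companion problems (factoring and discrete logarithms) rather than primality itself.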
This book presents two practical physical attacks. It shows how attackers can reveal the secret key of symmetric as well as asymmetric cryptographic algorithms based on these attacks, and presents countermeasures on the software and the hardware level that can help to prevent them in the future. Though their theory has been known for several years now, since neither attack has yet been successfully implemented in practice, they have generally not been considered a serious threat. In short, their physical attack complexity has been overestimated and the implied security threat has been underestimated. First, the book introduces the photonic side channel, which offers not only temporal resolution, but also the highest possible spatial resolution. Due to the high cost of its initial implementation, it has not been taken seriously. The work shows both simple and differential photonic side channel analyses. Then, it presents a fault attack against pairing-based cryptography. Due to the need for at least two independent precise faults in a single pairing computation, it has not been taken seriously either. Based on these two attacks, the book demonstrates that the assessment of physical attack complexity is error-prone, and as such cryptography should not rely on it. Cryptographic technologies have to be protected against all physical attacks, whether they have already been successfully implemented or not. The development of countermeasures does not require the successful execution of an attack but can already be carried out as soon as the principle of a side channel or a fault attack is sufficiently understood.
With a preface by Jean-Pierre Luminet. How should we picture an atom? What has become of Newton's apple? In what forms can the Universe be represented? What imagery do scientists prefer for describing our world? How are metaphors created and used? The paradigm shift brought about by physics in the 20th century requires us to transform our system of representations and rethink our frame of reference. From simple didactic comparison to heuristic metaphor, this book catalogues the "star images" at work in the dissemination of knowledge and sets out the eight main benefits inherent in this scientific imagery. The importance of making controlled use of these "reflections" goes far beyond a mere transfer of information. For this reason, a practical guide for scientists is offered. In question-and-answer form, it warns of the pitfalls to avoid while pointing out the most relevant metaphorical uses for understanding and being understood.
This book - an outgrowth of a topical summer school - sets out to introduce non-specialists from physics and engineering to the basic mathematical concepts of approximation and Fourier theory. After a general introduction, Part II of this volume contains basic material on the complex and harmonic analysis underlying the further developments presented. Part III deals with the essentials of approximation theory while Part IV completes the foundations by a tour of probability theory. Part V reviews some major applications in signal and control theory. In Part VI mathematical aspects of dynamical systems theory are discussed. Part VII, finally, is devoted to a modern approach to two physics problems: turbulence and the control and noise analysis in gravitational waves measurements.
Contemporary epistemological and cognitive studies, as well as recent trends in computer science and game theory have revealed an increasingly important and intimate relationship between Information, Interaction, and Agency. Agents perform actions based on the available information and in the presence of other interacting agents. From this perspective Information, Interaction, and Agency neatly ties together classical themes like rationality, decision-making and belief revision with games, strategies and learning in a multi-agent setting. Unified by the central notions Information, Interaction, and Agency, the essays in this volume provide refreshing methodological perspectives on belief revision, dynamic epistemic logic, von Neumann games, and evolutionary game theory; all of which in turn are central approaches to understanding our own rationality and that of other agents. Reprinted from Synthese, 139:2 and 142:2 (2004), Special Section Knowledge, Rationality, and Action.
Cryptology, a millennia-old "secret science," is gaining ever more practical importance for the protection of communication channels, databases, and software. Alongside its use in computer-based public messaging systems ("public keys"), computer-internal applications such as access authorization and the source protection of software are becoming increasingly common. The first part of the book covers secret writing and its use, i.e. cryptography, including the topical subject of "cryptography and citizens' fundamental rights." The second part discusses the unauthorized deciphering of secret writing, i.e. cryptanalysis, with particular guidance on assessing the security of a method. This third edition brings the work up to date. The book assumes only basic mathematical knowledge. Spiced with a wealth of exciting, amusing, and occasionally risque stories from the history of cryptology, it is an appealing read for lay readers as well.
Communication complexity is the mathematical study of scenarios where several parties need to communicate to achieve a common goal, a situation that naturally appears during computation. This introduction presents the most recent developments in an accessible form, providing the language to unify several disjoint research subareas. Written as a guide for a graduate course on communication complexity, it will interest a broad audience in computer science, from advanced undergraduates to researchers in areas ranging from theory to algorithm design to distributed computing. The first part presents basic theory in a clear and illustrative way, offering beginners an entry into the field. The second part describes applications including circuit complexity, proof complexity, streaming algorithms, extension complexity of polytopes, and distributed computing. Proofs throughout the text use ideas from a wide range of mathematics, including geometry, algebra, and probability. Each chapter contains numerous examples, figures, and exercises to aid understanding.
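As a flavor of the subject, here is a sketch of the standard randomized fingerprinting protocol for the EQUALITY problem, in which Alice sends O(log n) bits instead of her whole n-bit input; the prime range chosen here is an illustrative assumption:

```python
# Randomized protocol for EQUALITY: Alice and Bob hold n-bit strings
# x and y. Instead of sending all n bits, Alice sends x mod p for a
# random prime p of size about n^2, i.e. O(log n) bits. If x == y the
# protocol always accepts; if x != y it errs only when p divides x - y.
import random

def primes_up_to(m):
    """Sieve of Eratosthenes."""
    sieve = [True] * (m + 1)
    sieve[:2] = [False, False]
    for i in range(2, int(m ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [i for i, p in enumerate(sieve) if p]

def equality_protocol(x_bits: str, y_bits: str) -> bool:
    n = len(x_bits)
    p = random.choice(primes_up_to(max(n * n, 16)))
    fingerprint = int(x_bits, 2) % p        # Alice's short message
    return int(y_bits, 2) % p == fingerprint

x = "1011010011110001"
print(equality_protocol(x, x))  # True: never errs when x == y
```

The exponential gap between this O(log n) randomized cost and the Omega(n) deterministic lower bound for EQUALITY is one of the field's first and most quoted results.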
Auction theory is now an important component of an economist's training. The techniques and insights gained from the study of auction theory provide a useful starting point for those who want to venture into the economics of information, mechanism design, and regulatory economics. This book provides a step-by-step, self-contained treatment of the theory of auctions. It allows students and readers with a calculus background to work through all the basic results, covering the basic independent private-values model; the effects of introducing correlation in valuations on equilibrium behaviour and the seller's expected revenue; mechanism design; and the theory of multi-object auctions.
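As a worked example in the spirit of the independent private-values model, the symmetric equilibrium of a first-price sealed-bid auction with n bidders and values uniform on [0, 1] is b(v) = (n-1)v/n, giving expected revenue (n-1)/(n+1). This is a standard textbook result, not taken from the book; the simulation below simply checks the revenue prediction:

```python
# First-price auction, n bidders, i.i.d. values uniform on [0, 1].
# Equilibrium bid: b(v) = (n-1)/n * v. Expected revenue: (n-1)/(n+1),
# since E[max of n uniforms] = n/(n+1).
import random

def equilibrium_bid(v, n):
    return (n - 1) / n * v

def simulate_revenue(n, trials=200_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        values = [rng.random() for _ in range(n)]
        total += equilibrium_bid(max(values), n)   # highest bidder wins
    return total / trials

n = 3
print(round(simulate_revenue(n), 2))   # close to (n-1)/(n+1) = 0.5
```

The same revenue figure arises in the second-price auction, a first glimpse of the revenue equivalence theorem the book develops.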
Model checking is one of the most successful verification techniques and has been widely adopted in traditional computing and communication hardware and software industries. This book provides the first systematic introduction to model checking techniques applicable to quantum systems, with broad potential applications in the emerging industry of quantum computing and quantum communication as well as quantum physics. Suitable for use as a course textbook and for self-study, graduate and senior undergraduate students will appreciate the step-by-step explanations and the exercises included. Researchers and engineers in the related fields can further develop these techniques in their own work, with the final chapter outlining potential future applications.
This book is a useful text for advanced students of MIS and ICT courses, and for those studying ICT in related areas: Management and Organization Studies, Cultural Studies, and Technology and Innovation. As ICTs permeate every sphere of society - business, education, leisure, government, etc. - it is important to reflect the character and complexity of the interaction between people and computers, between society and technology. For example, the user may represent a much broader set of actors than 'the user' conventionally found in many texts: the operator, the customer, the citizen, the gendered individual, the entrepreneur, the 'poor', the student. Each actor uses ICT in different ways. This book examines these issues, deploying a number of methods such as Actor Network Theory, Socio-Technical Systems, and phenomenological approaches. Management concerns about strategy and productivity are covered together with issues of power, politics, and globalization. Topics range from long-standing themes in the study of IT in organizations, such as implementation, strategy, and evaluation, to general analysis of IT as socio-economic change. A distinguished group of contributors, including Bruno Latour, Saskia Sassen, Robert Galliers, Frank Land, Ian Angel, and Richard Boland, offers the reader a rich set of perspectives and ideas on the relationship between ICT and society, organizational knowledge, and innovation.
This book provides an elementary introduction to Information Theory and Coding Theory - two related aspects of the problem of how to transmit information efficiently and accurately. The first part of the book focuses on Information Theory, covering uniquely decodable and instantaneous codes, Huffman coding, entropy, information channels, and Shannon's Fundamental Theorem. In the second part, on Coding Theory, linear algebra is used to construct examples of such codes, such as the Hamming, Hadamard, Golay and Reed-Muller codes. The book emphasises carefully explained proofs and worked examples; exercises (with solutions) are integrated into the text as part of the learning process. Only some basic probability theory and linear algebra, together with a little calculus (as covered in most first-year university syllabuses), is assumed, making it suitable for second- and third-year undergraduates in mathematics, electronics and computer science.
Computer Media and Communication: A Reader is a collection of key texts selected for their significance to thought about computers as media. The book is divided into two parts. The chapters in the first part offer a chronological overview of how thinking about computers as a means of communication developed, while the second part offers far-reaching analyses of the implications of computer media for culture and society, while highlighting significant directions of current research. The book not only provides an insight into how thinking about computers as media has developed but also is an excellent guide for students and others interested in the field of media and communication studies. (This book is the first in the Oxford Readers in Media and Communication series under the General Editorship of Professors Brian Winston and Everette Dennis which will be an authoritative wide-ranging series of readings for media students. There are more than eighty institutions in the UK offering courses in the field at present and in the USA this number is ten times as great.)
The development of interpersonal skills in all health professions is of increasing interest to a wide range of teachers, students, practitioners and managers. This expanded and revised edition includes further information on reflection and counselling, and provides many activities and exercises to help the reader to devise learning strategies in the interpersonal domain. The author draws on a range of literature and research to provide a guide to teaching and learning interpersonal skills. This guide offers both the theory and practice of how to draw on people's life experience in order to enhance their interpersonal skills. Chapters are included on educational theory, managing learning groups and curriculum design. Short sections called "activities for improving interpersonal skills" provide brief exercises and tips that can further develop skills. Teachers, students, practitioners and managers in all health professions should find this book useful in acquiring interpersonal skills.
Error-correcting codes are an area of vital importance for the development of signal processing over the next 20 years. The purpose of this book is to provide a theoretical and practical introduction to the subject of error-correcting codes. This textbook is a reprint of Chapters 1-20 of the original hardback edition. It provides the reader with the tools necessary to implement modern error-processing schemes, and only a basic knowledge of linear algebra is assumed. More elementary than other books on the same subject, it will be considerably easier for most graduate students to follow. All the necessary mathematics is developed in parallel with the applications. The author has provided a simple step-by-step approach, with worked examples that motivate and explain the theory.
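As a concrete example of the kind of scheme such a book builds toward, here is a sketch of the classic Hamming(7,4) single-error-correcting code (standard material, not taken from this text):

```python
# Hamming(7,4): 4 data bits plus 3 parity bits at positions 1, 2, 4
# (1-indexed). For a valid codeword, the XOR of the positions of the
# set bits is zero; after a single bit flip, that XOR (the syndrome)
# equals the flipped position, so the receiver can correct it.

def hamming74_encode(data):
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(code):
    c = code[:]
    s = 0
    for i in range(7):
        if c[i]:
            s ^= i + 1                 # XOR of 1-indexed set positions
    if s:                              # nonzero syndrome: flip that bit
        c[s - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]    # extract the data bits

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[5] ^= 1                           # a single error in the channel
assert hamming74_decode(code) == data  # the error is corrected
```

Three parity bits per four data bits is the price of correcting any single-bit error; the linear-algebra view of this construction is where such textbooks usually begin.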
A "New York Times" Notable Book
Learn about an information-theoretic approach to managing interference in future generation wireless networks. Focusing on cooperative schemes motivated by Coordinated Multi-Point (CoMP) technology, the book develops a robust theoretical framework for interference management that uses recent advancements in backhaul design, and practical pre-coding schemes based on local cooperation, to deliver the increased speed and reliability promised by interference alignment. Gain insight into how simple, zero-forcing pre-coding schemes are optimal in locally connected interference networks, and discover how significant rate gains can be obtained by making cell association decisions and allocating backhaul resources based on centralized (cloud) processing and knowledge of network topology. Providing a link between information-theoretic analyses and interference management schemes that are easy to implement, this is an invaluable resource for researchers, graduate students and practicing engineers in wireless communications.
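To illustrate the zero-forcing idea mentioned above, here is a minimal two-transmitter, two-receiver sketch in which precoding with the channel inverse nulls cross-user interference; the channel matrix is a hypothetical example, not a scheme from the book:

```python
# Zero-forcing precoding, 2x2 case: precoding with W = H^{-1} makes
# the effective channel H * W the identity, so each receiver sees its
# own signal with no interference from the other user's stream.

def mat2_inverse(h):
    (a, b), (c, d) = h
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat2_mul(x, y):
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

H = [[1.0, 0.4],       # hypothetical gains: H[i][j] is the gain from
     [0.3, 1.0]]       # transmit antenna j to receiver i
W = mat2_inverse(H)    # zero-forcing precoder
E = mat2_mul(H, W)     # effective channel seen by the receivers

# Off-diagonal (interference) entries are nulled:
assert abs(E[0][1]) < 1e-9 and abs(E[1][0]) < 1e-9
assert abs(E[0][0] - 1) < 1e-9 and abs(E[1][1] - 1) < 1e-9
```

In the locally connected networks the book studies, the point is that such zero-forcing can be done with only local channel knowledge and limited backhaul cooperation, yet remains optimal.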
Marshall McLuhan's insights are fresher and more applicable today than when he first announced them to a startled world. A whole new generation is turning to his work to understand a global village made real by the information superhighway and the overwhelming challenge of electronic transformation. "Before anyone could perceive the electric form of the information revolution, McLuhan was publishing brilliant explanations of the perceptual changes being experienced by the users of mass media. He seemed futuristic to some and an enemy of print and literacy to others. He was, in reality, a deeply literate man of astonishing prescience. Tom Wolfe suggested aloud that McLuhan's work was as important culturally as that of Darwin or Freud. Agreement and scoffing ensued. Increasingly Wolfe's wonder seems justified." (From the Introduction.) Here, in one volume, are McLuhan's key ideas, drawn from his books, articles, correspondence, and published speeches. This book is the essential archive of his constantly surprising vision.
You may like...
- Encyclopedia of Information Science and… | Mehdi Khosrow-Pour, D.B.A. | Hardcover | R20,961 | Discovery Miles 209 610
- Research Methodologies, Innovations and… | Manuel Mora, Ovsei Gelman, … | Hardcover | R4,529 | Discovery Miles 45 290
- Encyclopedia of Information Science and… | Mehdi Khosrow-Pour, D.B.A. | Hardcover | R20,954 | Discovery Miles 209 540
- Quantum Zero-Error Information Theory | Elloa B. Guedes, Francisco Marcos De Assis, … | Hardcover | R3,601 | Discovery Miles 36 010