Image and Video Encryption provides a unified overview of techniques for the encryption of images and video data, ranging from commercial applications such as DVD or DVB to more research-oriented topics and recently published material. The volume introduces the different techniques from a unified viewpoint, then evaluates them with respect to their respective properties (e.g., security, speed). The authors experimentally compare different approaches proposed in the literature and include an extensive bibliography of the corresponding published material.
"European industry has already developed successful standards in the past, and I am very con?dent that on the basis of DVB-H, Mobile TV services can developtheeconomiesofscaletheyneedfortake-upacrossEuropeandaround the world," With these words of EU's Telecom Commissioner Viviane Reding, DVB-H is destined to be a dominating mobile TV technology in Europe and even in the world. I was ?rst getting in touch with the DVB technology when I was doing my PhD research in Brunel University in UK in 2002. At that time DVB-T was already a mature and widely used digital broadcast technology and anyone could easily buy a DVB-T receiver in the market to try the digital broadcast signals that have been already broadcasted in UK since 1998. Then the DVB technology world changed dramatically. As a more ?exible and robust terr- trial broadcast system targeting handsets, DVB-H was developed based on DVB-T. In 2003 the DVB-H community were continuously working to ?n- ize the standard. Finally in November 2004 DVB-H was adopted as an ETSI standard EN 302 304. I was lucky to see all these changes when I was doing my PhD research in DVB technology. And I was very proud to be involved in the di?erent DVB-H research projects since the beginning of the DVB-H standard development stage. I was also lucky enough that I am one of the ?rst persons who ?nished PhD degree by focusing on DVB-H research.
Personal motivation. The dream of creating artificial devices that reach or outperform human intelligence is an old one. It is also one of the dreams of my youth, which have never left me. What makes this challenge so interesting? A solution would have enormous implications for our society, and there are reasons to believe that the AI problem can be solved in my expected lifetime. So it's worth sticking to it for a lifetime, even if it takes 30 years or so to reap the benefits. The AI problem. The science of artificial intelligence (AI) may be defined as the construction of intelligent systems and their analysis. A natural definition of a system is anything that has an input and an output stream. Intelligence is more complicated. It can have many faces: creativity, solving problems, pattern recognition, classification, learning, induction, deduction, building analogies, optimization, surviving in an environment, language processing, and knowledge. A formal definition incorporating every aspect of intelligence, however, seems difficult. Most, if not all, known facets of intelligence can be formulated as goal-driven or, more precisely, as maximizing some utility function. It is therefore sufficient to study goal-driven AI; e.g., the (biological) goal of animals and humans is to survive and spread. The goal of AI systems should be to be useful to humans.
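The reduction of goal-driven intelligence to utility maximization described above can be sketched in a few lines. This is a hedged illustration only: the action names, outcome probabilities, and utilities below are invented for the example and do not come from the book.

```python
# Minimal sketch of a goal-driven agent as a utility maximizer: the agent
# picks the action whose outcome distribution has the highest expected
# utility. All names and numbers here are hypothetical.
def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

def choose_action(actions):
    """Return the action name maximizing expected utility."""
    return max(actions, key=lambda a: expected_utility(actions[a]))

actions = {
    "explore": [(0.5, 10.0), (0.5, -2.0)],  # E[U] = 4.0
    "exploit": [(1.0, 3.0)],                # E[U] = 3.0
}
print(choose_action(actions))  # -> explore
```

The same maximization structure covers the facets of intelligence listed above once each is phrased as a goal, which is the point the blurb is making.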
"Information Theory and Statistical Learning" presents theoretical and practical results about information theoretic methods used in the context of statistical learning. The book will present a comprehensive overview of the large range of different methods that have been developed in a multitude of contexts. Each chapter is written by an expert in the field. The book is intended for an interdisciplinary readership working in machine learning, applied statistics, artificial intelligence, biostatistics, computational biology, bioinformatics, web mining or related disciplines. Advance Praise for "Information Theory and Statistical Learning" "A new epoch has arrived for information sciences to integrate various disciplines such as information theory, machine learning, statistical inference, data mining, model selection etc. I am enthusiastic about recommending the present book to researchers and students, because it summarizes most of these new emerging subjects and methods, which are otherwise scattered in many places." Shun-ichi Amari, RIKEN Brain Science Institute, Professor-Emeritus at the University of Tokyo
This book aims to fill a growing need in the research community for a reference that describes the state-of-the-art in securing group communications. It focuses on tailoring the security solution to the underlying network architecture (such as the wireless cellular network or the ad hoc/sensor network), or to the application using the security methods (such as multimedia multicasts).
As organizations today are linking their systems across enterprise-wide networks and VPNs as well as increasing their exposure to customers, competitors, browsers and hackers on the Internet, it becomes increasingly imperative for Web professionals to be trained in techniques for effectively protecting their sites from internal and external threats. Each connection magnifies the vulnerability to attack. With the increased connectivity to the Internet and the wide availability of automated cracking tools, organizations can no longer simply rely on operating system security to protect their valuable corporate data. Furthermore, the exploding use of Web technologies for corporate intranets and Internet sites has escalated security risks to corporate data and information systems. Practical Internet Security reveals how the Internet is paving the way for secure communications within organizations and on the public Internet. This book provides the fundamental knowledge needed to analyze risks to a system and to implement a security policy that protects information assets from potential intrusion, damage, or theft. It provides dozens of real-life scenarios and examples, as well as hands-on instruction in securing Web communications and sites. You will learn the common vulnerabilities of Web sites, as well as how to carry out secure communications across unsecured networks. All system administrators and IT security managers will find this book an essential practical resource.
Autonomic Computing and Networking presents introductory and advanced topics on autonomic computing and networking with emphasis on architectures, protocols, services, privacy & security, simulation and implementation testbeds. Autonomic computing and networking are new computing and networking paradigms that allow the creation of self-managing and self-controlling computing and networking environments using techniques such as distributed algorithms and context-awareness to dynamically control networking functions without human intervention. Autonomic networking is characterized by recovery from failures and malfunctions, agility in changing networking environments, self-optimization and self-awareness. The self-control and management features can help to overcome the growing complexity and heterogeneity of existing communication networks and systems. The realization of fully autonomic heterogeneous networking introduces several research challenges in all aspects of computing and networking and related fields.
It was an honor and a privilege to chair the 24th IFIP International Information Security Conference (SEC 2009), a 24-year-old event that has become a tradition for information security professionals around the world. SEC 2009 was organized by the Technical Committee 11 (TC-11) of IFIP, and took place in Pafos, Cyprus, during May 18-20, 2009. It is an indication of good fortune for a Chair to serve a conference that takes place in a country with the natural beauty of Cyprus, an island where the hospitality and friendliness of the people have been going together, hand-in-hand, with its long history. This volume contains the papers selected for presentation at SEC 2009. In response to the call for papers, 176 papers were submitted to the conference. All of them were evaluated on the basis of their novelty and technical quality, and reviewed by at least two members of the conference Program Committee. Of the papers submitted, 39 were selected for presentation at the conference; the acceptance rate was as low as 22%, thus making the conference a highly competitive forum. It is the commitment of several people that makes international conferences possible. That also holds true for SEC 2009. The list of people who volunteered their time and energy to help is really long.
This professional book discusses privacy as multi-dimensional, and its opening chapters bring the economics of privacy to the fore. The book also covers identity-based signatures, spyware, and the placement of biometric security in an economically broken system, which results in a broken biometric system. The final chapters pair these systemic problems with practical individual strategies for preventing identity theft, usable by readers of any economic status. While a plethora of books on identity theft exists, this book combines both technical and economic aspects, presented from the perspective of the identified individual.
This book focuses on the analysis and design of low-density parity-check (LDPC) coded modulations, which are becoming part of several current and future communication systems, such as high-throughput terrestrial and satellite wireless networks. In this book, a two-sided perspective on the design of LDPC coded systems is proposed, encompassing both code/modulation optimization (transmitter side) and detection algorithm design (receiver side). After introducing key concepts on error control coding, in particular LDPC coding, and detection techniques, the book presents several relevant applications. More precisely, by using advanced performance evaluation techniques, such as extrinsic information transfer charts, the optimization of coded modulation schemes is considered for (i) memoryless channels, (ii) dispersive and partial response channels, and (iii) concatenated systems including differential encoding. This book is designed to be used by graduate students working in the field of communication theory, with particular emphasis on LDPC coded communication schemes, and industry experts working in related fields.
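The parity-check idea underlying LDPC codes can be shown in miniature: a word c is a valid codeword exactly when every row of the parity-check matrix H is satisfied, i.e. Hc = 0 (mod 2). This is a hedged sketch only; the tiny H below is the (7,4) Hamming code's matrix used as a stand-in, whereas real LDPC matrices are large and sparse, and the decoding algorithms the book treats are far more involved.

```python
# Toy parity-check illustration (not from the book): H is the (7,4) Hamming
# code's parity-check matrix, standing in for a real large sparse LDPC matrix.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(c):
    """Each parity check of c under H, reduced mod 2."""
    return [sum(h * x for h, x in zip(row, c)) % 2 for row in H]

def is_codeword(c):
    """c is a codeword iff all parity checks come out zero."""
    return all(s == 0 for s in syndrome(c))

codeword = [1, 0, 1, 0, 1, 0, 1]   # satisfies every check
corrupted = codeword[:]
corrupted[2] ^= 1                  # a single flipped bit breaks two checks
print(is_codeword(codeword), is_codeword(corrupted))  # -> True False
```

A nonzero syndrome is what an iterative LDPC decoder works to eliminate, passing messages between checks and bits until all checks are satisfied.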
Introduction The International Federation for Information Processing (IFIP) is a non-profit umbrella organization for national societies working in the field of information processing. It was founded in 1960 under the auspices of UNESCO. It is organized into several technical committees. This book represents the proceedings of the 2008 conference of technical committee 8 (TC8), which covers the field of information systems. TC8 aims to promote and encourage the advancement of research and practice of concepts, methods, techniques and issues related to information systems in organisations. TC8 has established eight working groups covering the following areas: design and evaluation of information systems; the interaction of information systems and the organization; decision support systems; e-business information systems: multi-disciplinary research and practice; information systems in public administration; smart cards, technology, applications and methods; and enterprise information systems. Further details of the technical committee and its working groups can be found on our website (ifiptc8.dsi.uminho.pt). This conference was part of IFIP's World Computer Congress in Milan, Italy which took place 7-10 September 2008. The occasion celebrated the 32nd anniversary of IFIP TC8. The call for papers invited researchers, educators, and practitioners to submit papers and panel proposals that advance concepts, methods, techniques, tools, issues, education, and practice of information systems in organizations. Thirty-one submissions were received.
Information is precious. It reduces our uncertainty in making decisions. Knowledge about the outcome of an uncertain event gives the possessor an advantage. It changes the course of lives, nations, and history itself. Information is the food of Maxwell's demon. His power comes from knowing which particles are hot and which particles are cold. His existence was paradoxical to classical physics and only the realization that information too was a source of power led to his taming. Information has recently become a commodity, traded and sold like orange juice or hog bellies. Colleges give degrees in information science and information management. Technology of the computer age has provided access to information in overwhelming quantity. Information has become something worth studying in its own right. The purpose of this volume is to introduce key developments and results in the area of generalized information theory, a theory that deals with uncertainty-based information within mathematical frameworks that are broader than classical set theory and probability theory. The volume is organized as follows.
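The claim that information reduces uncertainty has a precise classical form: Shannon entropy measures the uncertainty of a distribution in bits, and it shrinks as knowledge about the outcome grows. A minimal sketch, with distributions invented for the example:

```python
import math

# Shannon entropy in bits: H(X) = sum over outcomes of p * log2(1/p).
# The three distributions below are hypothetical, chosen to show uncertainty
# dropping as information about the outcome increases.
def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

fair_coin = [0.5, 0.5]    # no information: maximal uncertainty, 1 bit
biased_coin = [0.9, 0.1]  # partial information: roughly 0.469 bits
certain = [1.0]           # full information: zero uncertainty

print(entropy(fair_coin))                 # -> 1.0
print(round(entropy(biased_coin), 3))
print(entropy(certain))                   # -> 0.0
```

Generalized information theory, as the volume explains, extends this kind of uncertainty measure beyond classical probability theory.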
Coordinated Multiuser Communications provides for the first time a unified treatment of multiuser detection and multiuser decoding in a single volume. Many communications systems, such as cellular mobile radio and wireless local area networks, are subject to multiple-access interference, caused by a multitude of users sharing a common transmission medium. The performance of receiver systems in such cases can be greatly improved by the application of joint detection and decoding methods. Multiuser detection and decoding not only improve system reliability and capacity, they also simplify the problem of resource allocation. Coordinated Multiuser Communications provides the reader with tools for the design and analysis of joint detection and joint decoding methods. These methods are developed within a unified framework of linear multiple-access channels, which includes code-division multiple access, multiple antenna channels and orthogonal frequency division multiple access. Emphasis is placed on practical implementation aspects and modern iterative processing techniques for systems both with and without integrated error control coding. Focusing on the theory and practice of unifying accessing and transmission aspects of communications, this book is a valuable reference for students, researchers and practicing engineers.
This book introduces the turbo error-correction concept in simple language, including a general theory and algorithms for decoding turbo-like codes. It presents a unified framework for the design and analysis of turbo codes and LDPC codes and their decoding algorithms. A major focus is on high-speed turbo decoding, which targets applications with data rates of several hundred megabits per second (Mbps).
This book is the outcome of the successful NATO Advanced Study Institute on Pattern Recognition Theory and Applications, held at St. Anne's College, Oxford, in April 1981. The aim of the meeting was to review the recent advances in the theory of pattern recognition and to assess its current and future practical potential. The theme of the Institute - the decision making aspects of pattern recognition with the emphasis on the novel hybrid approaches - and its scope - a high level tutorial coverage of pattern recognition methodologies counterpointed with contributed papers on advanced theoretical topics and applications - are faithfully reflected by the volume. The material is divided into five sections: 1. Methodology 2. Image Understanding and Interpretation 3. Medical Applications 4. Speech Processing and Other Applications 5. Panel Discussions. The first section covers a broad spectrum of pattern recognition methodologies, including geometric, statistical, fuzzy set, syntactic, graph-theoretic and hybrid approaches. Its coverage of hybrid methods places the volume in a unique position among existing books on pattern recognition. The second section provides an extensive treatment of the topical problem of image understanding from both the artificial intelligence and pattern recognition points of view. The two application sections demonstrate the usefulness of the novel methodologies in traditional pattern recognition application areas. They address the problems of hardware/software implementation and of algorithm robustness, flexibility and general reliability. The final section reports on a panel discussion held during the Institute.
This reference work looks at modern concepts of computer security. It introduces the basic mathematical background necessary to follow computer security concepts before moving on to modern developments in cryptography. The concepts are presented clearly and illustrated by numerous examples. Subjects covered include: private-key and public-key encryption, hashing, digital signatures, authentication, secret sharing, group-oriented cryptography, and many others. The section on intrusion detection and access control provides examples of security systems implemented as part of an operating system. Database and network security is also discussed. The final chapters introduce modern e-business systems based on digital cash.
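Of the subjects listed above, secret sharing has a particularly compact classical illustration: 2-of-2 XOR secret sharing splits a secret into two random-looking shares, either of which alone reveals nothing, while together they restore the secret. A minimal sketch (the function names are illustrative, not taken from the book):

```python
import os

# 2-of-2 XOR secret sharing: share1 is uniformly random, share2 is the
# XOR of the secret with share1; XOR-ing the shares recovers the secret.
def split(secret: bytes):
    share1 = os.urandom(len(secret))
    share2 = bytes(a ^ b for a, b in zip(secret, share1))
    return share1, share2

def combine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

s1, s2 = split(b"group key")
assert combine(s1, s2) == b"group key"  # shares jointly reconstruct the secret
```

Because share1 is uniformly random, each share individually is statistically independent of the secret, which is the information-theoretic security property that general k-of-n schemes such as Shamir's extend.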
This book constitutes the refereed proceedings of the 14th International Conference on Information Security, ISC 2011, held in Xi'an, China, in October 2011. The 25 revised full papers were carefully reviewed and selected from 95 submissions. The papers are organized in topical sections on attacks; protocols; public-key cryptosystems; network security; software security; system security; database security; privacy; digital signatures.
This book constitutes the refereed proceedings of the 13th IMA International Conference on Cryptography and Coding, IMACC 2011, held in Oxford, UK in December 2011. The 27 revised full papers presented together with one invited contribution were carefully reviewed and selected from 57 submissions. The papers cover a wide range of topics in the field of mathematics and computer science, including coding theory, homomorphic encryption, symmetric and public key cryptosystems, cryptographic functions and protocols, efficient pairing and scalar multiplication implementation, knowledge proof, and security analysis.
This book constitutes the refereed proceedings of the 10th International Conference on Cryptology and Network Security, CANS 2011, held in Sanya, China, in December 2011. The 18 revised full papers presented were carefully reviewed and selected from 65 submissions. The book also includes two invited talks. The papers are organized in topical sections on symmetric cryptanalysis, symmetric ciphers, public key cryptography, protocol attacks, and privacy techniques.
This book constitutes the refereed proceedings of the 12th International Conference on Cryptology in India, INDOCRYPT 2011, held in Chennai, India, in December 2011. The 22 revised full papers presented together with the abstracts of 3 invited talks and 3 tutorials were carefully reviewed and selected from 127 submissions. The papers are organized in topical sections on side-channel attacks, secret-key cryptography, hash functions, pairings, and protocols.
This book is about language that is designed to mean what it does not seem to mean. Ciphers and codes conceal messages and protect secrets. Symbol and magic hide meanings to delight or imperil. Languages made to baffle and confuse let insiders talk openly without being understood by those beyond the circle. Barry Blake looks at these and many more. He explores the history and uses of the slangs and argots of schools and trades. He traces the centuries-old cants used by sailors and criminals in Britain, among them Polari, the mix of Italian, Yiddish, and slang once spoken among strolling players and circus folk and taken up by gays in the twentieth century. He examines the sacred languages of ancient cults and religions, uncovers the workings of onomancy, spells, and gematria, looks into the obliqueness of allusion and parody, and celebrates the absurdities of euphemism and jargon. Secret Language takes the reader on fascinating excursions down obscure byways of language, ranging across time and culture. With revelations on every page it will entertain anyone with an urge to know more about the most arcane and curious uses of language.
This book constitutes the refereed proceedings of the 8th International Conference on Autonomic and Trusted Computing, ATC 2011, held in Banff, Canada, in September 2011.
This book constitutes the refereed proceedings of the International Symposium on Information and Automation, ISIA 2010, held in Guangzhou, China, in November 2010. The 110 revised full papers presented were carefully reviewed and selected from numerous submissions. The symposium provides a forum for researchers, educators, engineers, and government officials to present and discuss their latest research results and exchange views on the future research directions in the general areas of Information and Automation.
This book constitutes the refereed proceedings of the IFIP WG 8.4/8.9 International Cross Domain Conference and Workshop on Availability, Reliability and Security - Multidisciplinary Research and Practice for Business, Enterprise and Health Information Systems, ARES 2011, held in Vienna, Austria, in August 2011. The 29 revised papers presented were carefully reviewed and selected for inclusion in the volume. The papers concentrate on the many aspects of availability, reliability and security for information systems as a discipline bridging the application fields and the well-defined computer science field. They are organized in three sections: multidisciplinary research and practice for business, enterprise and health information systems; massive information sharing and integration and electronic healthcare; and papers from the co-located International Workshop on Security and Cognitive Informatics for Homeland Defense.
The creation of the text really began in 1976 with the author being involved with a group of researchers at Stanford University and the Naval Ocean Systems Center, San Diego. At that time, adaptive techniques were more laboratory (and mental) curiosities than the accepted and pervasive categories of signal processing that they have become. Over the last 10 years, adaptive filters have become standard components in telephony, data communications, and signal detection and tracking systems. Their use and consumer acceptance will undoubtedly only increase in the future. The mathematical principles underlying adaptive signal processing were initially fascinating and were my first experience in seeing applied mathematics work for a paycheck. Since that time, the application of even more advanced mathematical techniques has kept the area of adaptive signal processing as exciting as those initial days. The text seeks to be a bridge between the open literature in the professional journals, which is usually quite concentrated, concise, and advanced, and the graduate classroom and research environment where underlying principles are often more important.