This book constitutes the thoroughly refereed joint post-proceedings of two international workshops: the 5th International Workshop on Data Privacy Management, DPM 2010, and the 3rd International Workshop on Autonomous and Spontaneous Security, SETOP 2010, co-located with the ESORICS 2010 symposium in Athens, Greece, in September 2010. The 9 revised full papers for DPM 2010, presented together with two keynote talks, are accompanied by 7 revised full papers of SETOP 2010; all papers were carefully reviewed and selected for inclusion in the book. The DPM 2010 papers cover topics such as the translation of high-level business goals into system-level privacy policies, administration of privacy-sensitive data, privacy data integration and engineering, privacy access control mechanisms, information-oriented security, and query execution on privacy-sensitive data for partial answers. The SETOP 2010 papers address several specific aspects of these topics, for instance the autonomic administration of security policies, secure P2P storage, RFID authentication, and anonymity in reputation systems.
This book constitutes the refereed proceedings of the 7th International Conference on Sequences and Their Applications, SETA 2012, held in Waterloo, Canada, in June 2012. The 28 full papers presented together with 2 invited papers in this volume were carefully reviewed and selected from 48 submissions. The papers are grouped in topical sections on perfect sequences; finite fields; Boolean functions; Golomb 80th birthday session; linear complexity; frequency hopping; correlation of sequences; bounds on sequences; cryptography; aperiodic correlation; and Walsh transform.
This book contains extended and revised versions of the best papers that were presented during the fifteenth edition of the IFIP/IEEE WG10.5 International Conference on Very Large Scale Integration, a global System-on-a-Chip Design & CAD conference. The 15th conference was held at the Georgia Institute of Technology, Atlanta, USA (October 15-17, 2007). Previous conferences have taken place in Edinburgh, Trondheim, Vancouver, Munich, Grenoble, Tokyo, Gramado, Lisbon, Montpellier, Darmstadt, Perth and Nice. The purpose of this conference, sponsored by IFIP TC 10 Working Group 10.5 and by the IEEE Council on Electronic Design Automation (CEDA), is to provide a forum to exchange ideas and show industrial and academic research results in the field of microelectronics design. The current trend toward increasing chip integration and technology process advancements brings about stimulating new challenges both at the physical and system-design levels, as well as in the testing of these systems. VLSI-SoC conferences aim to address these exciting new issues.
This book constitutes the thoroughly refereed proceedings of the 9th Theory of Cryptography Conference, TCC 2012, held in Taormina, Sicily, Italy, in March 2012. The 36 revised full papers presented were carefully reviewed and selected from 131 submissions. The papers are organized in topical sections on secure computation; (blind) signatures and threshold encryption; zero-knowledge and security models; leakage-resilience; hash functions; differential privacy; pseudorandomness; dedicated encryption; security amplification; resettable and parallel zero knowledge.
Having trouble deciding which coding scheme to employ, how to design a new scheme, or how to improve an existing system? This summary of the state-of-the-art in iterative coding makes this decision more straightforward. With emphasis on the underlying theory, techniques to analyse and design practical iterative coding systems are presented. Using Gallager's original ensemble of LDPC codes, the basic concepts are extended for several general codes, including the practically important class of turbo codes. The simplicity of the binary erasure channel is exploited to develop analytical techniques and intuition, which are then applied to general channel models. A chapter on factor graphs helps to unify the important topics of information theory, coding and communication theory. Covering the most recent advances, this text is ideal for graduate students in electrical engineering and computer science, and practitioners. Additional resources, including instructor's solutions and figures, available online: www.cambridge.org/9780521852296.
Introduction The goal of this book is to introduce XML to a bioinformatics audience. It does so by introducing the fundamentals of XML, Document Type Definitions (DTDs), XML Namespaces, XML Schema, and XML parsing, and illustrating these concepts with specific bioinformatics case studies. The book does not assume any previous knowledge of XML and is geared toward those who want a solid introduction to fundamental XML concepts. The book is divided into nine chapters: Chapter 1: Introduction to XML for Bioinformatics. This chapter provides an introduction to XML and describes the use of XML in biological data exchange. A bird's-eye view of our first case study, the Distributed Annotation System (DAS), is provided and we examine a sample DAS XML document. The chapter concludes with a discussion of the pros and cons of using XML in bioinformatic applications. Chapter 2: Fundamentals of XML and BSML. This chapter introduces the fundamental concepts of XML and the Bioinformatic Sequence Markup Language (BSML). We explore the origins of XML, define basic rules for XML document structure, and introduce XML Namespaces. We also explore several sample BSML documents and visualize these documents in the Rescentris Genomic Workspace™ Viewer.
In 1953, exactly 50 years ago to this day, the first volume of Studia Logica appeared under the auspices of The Philosophical Committee of The Polish Academy of Sciences. Now, five decades later, the present volume is dedicated to a celebration of this 50th Anniversary of Studia Logica. The volume features a series of papers by distinguished scholars reflecting both the aim and scope of this journal for symbolic logic.
The first edition of the monograph Information and Randomness: An Algorithmic Perspective by Cristian Calude was published in 1994. In my Foreword I said: "The research in algorithmic information theory is already some 30 years old. However, only the recent years have witnessed a really vigorous growth in this area... The present book by Calude fits very well in our series. Much original research is presented... making the approach richer in consequences than the classical one. Remarkably, however, the text is so self-contained and coherent that the book may also serve as a textbook. All proofs are given in the book and, thus, it is not necessary to consult other sources for classroom instruction." The vigorous growth in the study of algorithmic information theory has continued during the past few years, which is clearly visible in the present second edition. Many new results, examples, exercises and open problems have been added. The additions include two entirely new chapters: "Computably Enumerable Random Reals" and "Randomness and Incompleteness." The really comprehensive new bibliography makes the book very valuable for a researcher. The new results about the characterization of computably enumerable random reals, as well as the fascinating Omega Numbers, should contribute much to the value of the book as a textbook. The author has been directly involved in these results that have appeared in the prestigious journals Nature, New Scientist and Pour la Science.
Digital forensics deals with the acquisition, preservation, examination, analysis and presentation of electronic evidence. Networked computing, wireless communications and portable electronic devices have expanded the role of digital forensics beyond traditional computer crime investigations. Practically every crime now involves some aspect of digital evidence; digital forensics provides the techniques and tools to articulate this evidence. Digital forensics also has myriad intelligence applications. Furthermore, it has a vital role in information assurance - investigations of security breaches yield valuable information that can be used to design more secure systems. Advances in Digital Forensics describes original research results and innovative applications in the emerging discipline of digital forensics. In addition, it highlights some of the major technical and legal issues related to digital evidence and electronic crime investigations. The areas of coverage include:
This book is the first volume of a new series produced by the International Federation for Information Processing (IFIP) Working Group 11.9 on Digital Forensics, an international community of scientists, engineers and practitioners dedicated to advancing the state of the art of research and practice in digital forensics. The book contains a selection of twenty-five edited papers from the First Annual IFIP WG 11.9 Conference on Digital Forensics, held at the National Center for Forensic Science, Orlando, Florida, USA in February 2005. Advances in Digital Forensics is an important resource for researchers, faculty members and graduate students, as well as for practitioners and individuals engaged in research and development efforts for the law enforcement and intelligence communities. Mark Pollitt is President of Digital Evidence Professional Services, Inc., Ellicott City, Maryland, USA. Mr. Pollitt, who is retired from the Federal Bureau of Investigation (FBI), served as the Chief of the FBI's Computer Analysis Response Team, and Director of the Regional Computer Forensic Laboratory National Program. Sujeet Shenoi is the F.P. Walter Professor of Computer Science and a principal with the Center for Information Security at the University of Tulsa, Tulsa, Oklahoma, USA. For more information about the 300 other books in the IFIP series, please visit www.springeronline.com. For more information about IFIP, please visit www.ifip.org.
Juraj Hromkovic takes the reader on an elegant route through the theoretical fundamentals of computer science. The author shows that theoretical computer science is a fascinating discipline, full of spectacular contributions and miracles. The book also presents the development of the computer scientist's way of thinking as well as fundamental concepts such as approximation and randomization in algorithmics, and the basic ideas of cryptography and interconnection network design.
This book has a long history of more than 20 years. The first attempt to write a monograph on the information-theoretic approach to thermodynamics was made by one of the authors (RSI) in 1974, when he published, in preprint form, two volumes of the book "Information Theory and Thermodynamics" concerning classical and quantum information theory, [153] (220 pp.), [154] (185 pp.). In spite of the encouraging remarks by some of the readers, the physical part of this book was never written except for the first chapter. Now this material is written completely anew and to a much greater extent. A few years earlier, in 1970, the second author of the present book (AK), a doctoral student and collaborator of RSI in Toruń, published in Polish, also as a preprint, his habilitation dissertation "Information-theoretical decision scheme in quantum statistical mechanics" [196] (96 pp.). This small monograph presented his original results in the physical part of the theory developed in the Toruń school. Unfortunately, this preprint was never published in English. The present book contains all these results in a much more modern and developed form.
Cryptography is one of the most active areas in current mathematics research and applications. This book focuses on cryptography along with two related areas: the study of probabilistic proof systems, and the theory of computational pseudorandomness. Following a common theme that explores the interplay between randomness and computation, the important notions in each field are covered, as well as novel ideas and insights.
Objectives Computer and communication practice relies on data compression and dictionary search methods. These methods lean on a rapidly developing theory, and the purpose of this book is to expound that theory from a new viewpoint. We start from the very beginning and finish with the latest achievements of the theory, some of them in print for the first time. The book is intended to serve as both a monograph and a self-contained textbook. Information retrieval is the subject of the treatises by D. Knuth (1973) and K. Mehlhorn (1987). Data compression is the subject of source coding, a chapter of information theory whose up-to-date state is presented in the books of Storer (1988), Lynch (1985), and T. Bell et al. (1990). The difference between them and the present book is as follows. First, we include information retrieval in source coding instead of discussing it separately; information-theoretic methods have proved very effective in information search. Second, for many years the target of source coding theory was the estimation of the maximal degree of data compression. This target is practically hit today: the sought degree is now known for most of the sources. We believe that the next target must be the estimation of the price of approaching that degree, so we are concerned with the trade-off between complexity and quality of coding. Third, we pay special attention to universal families that contain a good compressing map for every source in a set.
Cryptography, secret writing, is enjoying a scientific renaissance following the seminal discovery in 1977 of public-key cryptography and applications in computers and communications. This book gives a broad overview of public-key cryptography - its essence and advantages, various public-key cryptosystems, and protocols - as well as a comprehensive introduction to classical cryptography and cryptanalysis. The second edition has been revised and enlarged especially in its treatment of cryptographic protocols. From a review of the first edition: "This is a comprehensive review ... there can be no doubt that this will be accepted as a standard text. At the same time, it is clearly and entertainingly written ... and can certainly stand alone." (Alex M. Andrew, Kybernetes, March 1992)
Second International Workshop on Formal Aspects in Security and Trust is an essential reference for both academic and professional researchers in the field of security and trust. Because of the complexity and scale of deployment of emerging ICT systems based on web service and grid computing concepts, we also need to develop new, scalable, and more flexible foundational models of pervasive security enforcement across organizational borders and in situations where there is high uncertainty about the identity and trustworthiness of the participating networked entities. On the other hand, the increasingly complex set of activities sharing different resources but managed with different policies calls for new and business-enabling models of trust between members of virtual organizations and communities that span the boundaries of physical enterprises and loosely structured groups of individuals. The papers presented in this volume address the challenges posed by "ambient intelligence space" as a future paradigm and the need for a set of concepts, tools and methodologies to enable the user's trust and confidence in the underlying computing infrastructure. This state-of-the-art volume presents selected papers from the 2nd International Workshop on Formal Aspects in Security and Trust, held in conjunction with the 18th IFIP World Computer Congress, August 2004, in Toulouse, France. The collection will be important not only for computer security experts and researchers but also for teachers and administrators interested in security methodologies and research.
Multimedia Encryption and Watermarking presents a comprehensive survey of contemporary multimedia encryption and watermarking techniques, which enable a secure exchange of multimedia intellectual property. Part I, Digital Rights Management (DRM) for Multimedia, introduces DRM concepts and models for multimedia content protection, and presents the key players. Part II, Multimedia Cryptography, provides an overview of modern cryptography, with the focus on modern image, video, speech, and audio encryption techniques. This book also provides an advanced concept of visual and audio sharing techniques. Part III, Digital Watermarking, introduces the concept of watermarking for multimedia, classifies watermarking applications, and evaluates various multimedia watermarking concepts and techniques, including digital watermarking techniques for binary images. Multimedia Encryption and Watermarking is designed for researchers and practitioners, as well as scientists and engineers who design and develop systems for the protection of digital multimedia content. This volume is also suitable as a textbook for graduate courses on multimedia security.
2.1 E-Government: e-Governance and e-Democracy The term Electronic Government (e-Government), as an expression, was coined after the example of Electronic Commerce. In spite of being a relatively recent expression, e-Government designates a field of activity that has been with us for several decades and which has attained a high level of penetration in many countries. What has been observed over the recent years is a shift in the breadth of the e-Government concept. The ideas behind e-Governance and e-Democracy to some extent promise big changes in public administration. The demand now is not simply to deliver a service on-line; it is to deliver complex and new services, which are all citizen-centric. Another important demand is related to the improvement of citizens' participation in governmental processes and decisions, so that the governments' transparency and legitimacy are reinforced. In order to fulfill these new demands, a lot of research has been done over the recent years (see Section 3), but many challenges are still to be faced, not only in the technological field, but also in the political and social aspects.
Quality of Protection: Security Measurements and Metrics is an edited volume based on the Quality of Protection Workshop in Milano, Italy (September 2005). This volume discusses how security research can progress towards a quality of protection in security comparable to the quality of service in networking, and towards software measurements and metrics as in empirical software engineering. Information security in the business setting has matured in the last few decades. Standards such as ISO 17799, the Common Criteria (ISO 15408), and a number of industry certifications and risk analysis methodologies have raised the bar for good security solutions from a business perspective. Designed for a professional audience composed of researchers and practitioners in industry, Quality of Protection: Security Measurements and Metrics is also suitable for advanced-level students in computer science.
This book grew out of our lectures given in the Oberseminar on 'Coding Theory and Number Theory' at the Mathematics Institute of the Würzburg University in the Summer Semester, 2001. Coding theory combines mathematical elegance and some engineering problems to an unusual degree. The major advantage of studying coding theory is the beauty of this particular combination of mathematics and engineering. In this book we wish to introduce some practical problems to the mathematician and to address these as an essential part of the development of modern number theory. The book consists of five chapters and an appendix. Chapter 1 may mostly be dropped from an introductory course of linear codes. In Chapter 2 we discuss some relations between the number of solutions of a diagonal equation over finite fields and the weight distribution of cyclic codes. Chapter 3 begins by reviewing some basic facts from elliptic curves over finite fields and modular forms, and shows that the weight distribution of the Melas codes is represented by means of the trace of the Hecke operators acting on the space of cusp forms. Chapter 4 is a systematic study of the algebraic-geometric codes. For a long time, the study of algebraic curves over finite fields was the province of pure mathematicians. In the period 1977-1982, V. D. Goppa discovered an amazing connection between the theory of algebraic curves over finite fields and the theory of q-ary codes.
Due to the rapid growth of digital communication and electronic data exchange, information security has become a crucial issue in industry, business, and administration. Modern cryptography provides essential techniques for securing information and protecting data. In the first part, this book covers the key concepts of cryptography on an undergraduate level, from encryption and digital signatures to cryptographic protocols. Essential techniques are demonstrated in protocols for key exchange, user identification, electronic elections and digital cash. In the second part, more advanced topics are addressed, such as the bit security of one-way functions and computationally perfect pseudorandom bit generators. The security of cryptographic schemes is a central topic. Typical examples of provably secure encryption and signature schemes and their security proofs are given. Though particular attention is given to the mathematical foundations, no special background in mathematics is presumed. The necessary algebra, number theory and probability theory are included in the appendix. Each chapter closes with a collection of exercises. The second edition contains corrections, revisions and new material, including a complete description of the AES, an extended section on cryptographic hash functions, a new section on random oracle proofs, and a new section on public-key encryption schemes that are provably secure against adaptively-chosen-ciphertext attacks.
Is knowledge an economic good? What are the characteristics of the institutions regulating the production and diffusion of knowledge? The cumulation of knowledge is a key determinant of economic growth, but only recently has knowledge moved to the core of economic analysis. Recent literature also gives profound insights into phenomena like scientific progress and artistic and craft development, which have rarely been addressed as socio-economic institutions, being the domain of sociologists and historians rather than economists. This volume adopts a multidisciplinary approach to bring knowledge into the focus of attention, as a key economic issue.
Building on a range of disciplines, from biology and anthropology to philosophy and linguistics, this book draws on the expertise of leading names in the study of organic, mental and cultural codes brought together by the emerging discipline of biosemiotics. The book's 18 chapters present a range of experimental evidence which suggests that the genetic code was only the first in a long series of organic codes, and that it has been the appearance of new codes (organic, mental and cultural) that paved the way for the major transitions in the history of life. While the existence of many organic codes has been proposed since the 1980s, this volume represents the first multi-authored attempt to deal with the range of codes relevant to life, and to reveal the ubiquitous role of coding mechanisms in both organic and mental evolution. This creates the conditions for a synthesis of biology and linguistics that finally overcomes the old divide between nature and culture. The book will appeal to all those interested in the origins and evolution of life, including biologists (from molecular and cellular biologists to evolutionary and developmental biologists), ecologists, anthropologists, psychologists, philosophers of science, linguists, and researchers interested in the history of science, the origins of life, artificial life and intelligence, and information theory and communication technology.
Algorithmic Information Theory treats the mathematics of many important areas in digital information processing. It has been written as a read-and-learn book on concrete mathematics, for teachers, students and practitioners in electronic engineering, computer science and mathematics. The presentation is dense, and the examples and exercises are numerous. It is based on lectures on information technology (Data Compaction, Cryptography, Polynomial Coding) for engineers.
The related fields of fractal image encoding and fractal image analysis have blossomed in recent years. This book, originating from a NATO Advanced Study Institute held in 1995, presents work by leading researchers. It develops the subjects at an introductory level, but it also includes some recent and exciting results in both fields.
This volume contains review articles and original results obtained in various fields of modern science using mathematical simulation methods. The articles are based on the plenary and selected section reports that were made and discussed at the Fourth International Mathematical Simulation Conference, held in Moscow on June 27 through July 1, 2000. The conference was devoted to the following scientific areas: * mathematical and computer models of discrete systems; * non-linear excitation in condensed media; * complex systems evolution; * mathematical models in economics; * kinematics of non-equilibrium processes; * dynamics and structure of molecular and biomolecular systems; * mathematical transfer models in non-linear systems; * numerical simulation and algorithms; * turbulence and determined chaos; * chemical physics of polymers. The conference was supported by the Russian Ministry of Education, the Russian Foundation for Basic Research and the Federal Program "Integration". This volume contains the following sections: 1. models of non-linear phenomena in physics; 2. numerical methods and computer simulations; 3. mathematical computer models of discrete systems; 4. mathematical models in economics; 5. non-linear models in chemical physics and physical chemistry; 6. mathematical models of transport processes in complex systems. In Sections One and Five a number of fundamental and sufficiently general problems concerning the simulation of real physical and physical-chemical systems are discussed.