This book details multimodal biometrics and its exceptional utility for increasingly reliable human recognition systems, and reveals the substantial advantages of multimodal systems over conventional identification methods.
This book contains extended and revised versions of the best papers presented at the fifteenth edition of the IFIP/IEEE WG 10.5 International Conference on Very Large Scale Integration, a global System-on-a-Chip design and CAD conference. The 15th conference was held at the Georgia Institute of Technology, Atlanta, USA (October 15-17, 2007). Previous conferences have taken place in Edinburgh, Trondheim, Vancouver, Munich, Grenoble, Tokyo, Gramado, Lisbon, Montpellier, Darmstadt, Perth and Nice. The purpose of this conference, sponsored by IFIP TC 10 Working Group 10.5 and by the IEEE Council on Electronic Design Automation (CEDA), is to provide a forum to exchange ideas and present industrial and academic research results in the field of microelectronics design. The current trend toward increasing chip integration and technology process advancements brings about stimulating new challenges at both the physical and system-design levels, as well as in the testing of these systems. VLSI-SoC conferences aim to address these exciting new issues.
From the reviews: "This book nicely complements the existing literature on information and coding theory by concentrating on arbitrary nonstationary and/or nonergodic sources and channels with arbitrarily large alphabets. Even with such generality the authors have managed to successfully reach a highly unconventional but very fertile exposition rendering new insights into many problems." -- MATHEMATICAL REVIEWS
In 1953, exactly 50 years ago to this day, the first volume of Studia Logica appeared under the auspices of The Philosophical Committee of The Polish Academy of Sciences. Now, five decades later, the present volume is dedicated to a celebration of this 50th Anniversary of Studia Logica. The volume features a series of papers by distinguished scholars reflecting both the aim and scope of this journal for symbolic logic.
The first edition of the monograph Information and Randomness: An Algorithmic Perspective by Cristian Calude was published in 1994. In my Foreword I said: "The research in algorithmic information theory is already some 30 years old. However, only the recent years have witnessed a really vigorous growth in this area. ... The present book by Calude fits very well in our series. Much original research is presented ... making the approach richer in consequences than the classical one. Remarkably, however, the text is so self-contained and coherent that the book may also serve as a textbook. All proofs are given in the book and, thus, it is not necessary to consult other sources for classroom instruction." The vigorous growth in the study of algorithmic information theory has continued during the past few years, which is clearly visible in the present second edition. Many new results, examples, exercises and open problems have been added. The additions include two entirely new chapters: "Computably Enumerable Random Reals" and "Randomness and Incompleteness." The really comprehensive new bibliography makes the book very valuable for a researcher. The new results about the characterization of computably enumerable random reals, as well as the fascinating Omega Numbers, should contribute much to the value of the book as a textbook. The author has been directly involved in these results, which have appeared in the prestigious journals Nature, New Scientist and Pour la Science.
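For readers meeting these notions for the first time, a standard definition (my addition, not an excerpt from the book): Chaitin's Omega number is the halting probability of a prefix-free universal machine U,

    \Omega_U = \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|},

where |p| is the length of program p in bits. Omega is a computably enumerable real whose binary expansion is algorithmically random; it is the object behind the new chapter on computably enumerable random reals.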
Quality of Protection: Security Measurements and Metrics is an edited volume based on the Quality of Protection Workshop in Milano, Italy (September 2005). This volume discusses how security research can progress towards a notion of quality of protection in security comparable to quality of service in networking and to software measurement and metrics in empirical software engineering. Information security in the business setting has matured in the last few decades. Standards such as ISO 17799, the Common Criteria (ISO 15408), and a number of industry certifications and risk analysis methodologies have raised the bar for good security solutions from a business perspective. Designed for a professional audience composed of researchers and practitioners in industry, Quality of Protection: Security Measurements and Metrics is also suitable for advanced-level students in computer science.
2.1 E-Government: e-Governance and e-Democracy. The term Electronic Government (e-Government) was coined after the example of Electronic Commerce. In spite of being a relatively recent expression, e-Government designates a field of activity that has been with us for several decades and that has attained a high level of penetration in many countries. What has been observed over recent years is a shift in the breadth of the e-Government concept. The ideas behind e-Governance and e-Democracy promise big changes in public administration. The demand now is not simply to deliver a service on-line; it is to deliver complex, new, citizen-centric services. Another important demand is to improve citizens' participation in governmental processes and decisions, so that governments' transparency and legitimacy are strengthened. To fulfill these new demands, a lot of research has been done over recent years (see Section 3), but many challenges are still to be faced, not only in the technological field but also in political and social aspects.
This book contains the lectures given at the 2nd Conference on Dynamics and Randomness held at the Centro de Modelamiento Matematico of the Universidad de Chile, from December 9th to 13th, 2002. This meeting brought together mathematicians, theoretical physicists, theoretical computer scientists, and graduate students interested in fields related to probability theory, ergodic theory, and symbolic and topological dynamics. We would like to express our gratitude to all the participants of the conference and to the people who contributed to its organization; in particular, to Pierre Collet, Bernard Rost and Karl Petersen for their scientific advice. We want to thank warmly the authors of each chapter for their stimulating lectures and for their manuscripts devoted to a variety of appealing subjects in probability and dynamics: to Jean Bertoin for his course on Some aspects of random fragmentation in continuous time; to Anton Bovier for his course on Metastability and ageing in stochastic dynamics; to Steve Lalley for his course on Algebraic systems of generating functions and return probabilities for random walks; to Elon Lindenstrauss for his course on Recurrent measures and measure rigidity; to Sylvie Meleard for her course on Stochastic particle approximations for two-dimensional Navier-Stokes equations; and to Anatoly Vershik for his course on Random and universal metric spaces.
This book grew out of our lectures given in the Oberseminar on 'Coding Theory and Number Theory' at the Mathematics Institute of the Würzburg University in the Summer Semester, 2001. Coding theory combines mathematical elegance and some engineering problems to an unusual degree, and the major advantage of studying it is the beauty of this particular combination of mathematics and engineering. In this book we wish to introduce some practical problems to the mathematician and to address these as an essential part of the development of modern number theory. The book consists of five chapters and an appendix. Chapter 1 may mostly be dropped from an introductory course on linear codes. In Chapter 2 we discuss some relations between the number of solutions of a diagonal equation over finite fields and the weight distribution of cyclic codes. Chapter 3 begins by reviewing some basic facts from elliptic curves over finite fields and modular forms, and shows that the weight distribution of the Melas codes is represented by means of the trace of the Hecke operators acting on the space of cusp forms. Chapter 4 is a systematic study of the algebraic-geometric codes. For a long time, the study of algebraic curves over finite fields was the province of pure mathematicians. In the period 1977-1982, V. D. Goppa discovered an amazing connection between the theory of algebraic curves over finite fields and the theory of q-ary codes.
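For orientation, a standard definition (not quoted from the book): the weight distribution (A_0, ..., A_n) of a length-n code C records the number A_w of codewords of Hamming weight w, and is usually packaged in the weight enumerator

    W_C(x, y) = \sum_{w=0}^{n} A_w \, x^{n-w} y^{w}.

The relations in Chapter 2 express these A_w, for cyclic codes, in terms of the number of solutions of diagonal equations over finite fields.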
Every thought is a throw of dice. (Stéphane Mallarmé) This book is the last of a trilogy reporting a part of our research work over nearly thirty years (we discard our non-conventional results in automatic control theory and applications on the one hand, and fuzzy sets on the other), and its main key words are Information Theory, Entropy, Maximum Entropy Principle, Linguistics, Thermodynamics, Quantum Mechanics, Fractals, Fractional Brownian Motion, Stochastic Differential Equations of Order n, Stochastic Optimal Control, Computer Vision. Our obsession has always been the same: Shannon's information theory should play a basic role in the foundations of science, but subject to the condition that it be suitably generalized to allow us to deal with problems that are not necessarily related to communication engineering. With this objective in mind, two questions are of utmost importance: (i) How can we introduce the meaning or significance of information into Shannon's information theory? (ii) How can we define and/or measure the amount of information involved in a form or a pattern without using a probabilistic scheme? Suitable answers to these problems are obligatory if we want to apply Shannon's theory to science with some chance of success. For instance, its use in biology has been very disappointing, for the very reason that the meaning of information is there of basic importance, yet is not involved in this approach.
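For context, the standard definitions behind the blurb (my addition, not the authors' text): the Shannon entropy of a discrete distribution (p_1, ..., p_n) is

    H = -\sum_{i=1}^{n} p_i \log p_i,

and the maximum entropy principle selects, among all distributions compatible with the known constraints (for example, fixed moments), the one that maximizes H. The book's program is to generalize this machinery beyond communication engineering.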
Even in the age of ubiquitous computing, the importance of the Internet will not change, and we still need to solve conventional security issues. In addition, we need to deal with new issues such as security in the P2P environment, privacy issues in the use of smart cards, and RFID systems. Security and Privacy in the Age of Ubiquitous Computing addresses these issues and more by exploring a wide scope of topics. The volume presents a selection of papers from the proceedings of the 20th IFIP International Information Security Conference held from May 30 to June 1, 2005 in Chiba, Japan. Topics covered include cryptography applications, authentication, privacy and anonymity, DRM and content security, computer forensics, Internet and web security, security in sensor networks, intrusion detection, commercial and industrial security, authorization and access control, information warfare, and critical infrastructure protection. These papers represent the most current research in information security, including research funded in part by DARPA and the National Science Foundation.
Multimedia Encryption and Watermarking presents a comprehensive survey of contemporary multimedia encryption and watermarking techniques, which enable a secure exchange of multimedia intellectual property. Part I, Digital Rights Management (DRM) for Multimedia, introduces DRM concepts and models for multimedia content protection, and presents the key players. Part II, Multimedia Cryptography, provides an overview of modern cryptography, with a focus on modern image, video, speech, and audio encryption techniques, and also covers advanced visual and audio sharing techniques. Part III, Digital Watermarking, introduces the concept of watermarking for multimedia, classifies watermarking applications, and evaluates various multimedia watermarking concepts and techniques, including digital watermarking techniques for binary images. Multimedia Encryption and Watermarking is designed for researchers and practitioners, as well as scientists and engineers who design and develop systems for the protection of digital multimedia content. This volume is also suitable as a textbook for graduate courses on multimedia security.
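As a toy illustration of the watermarking idea surveyed in Part III (my sketch under simplifying assumptions, not a technique from the book): a fragile watermark can be embedded by overwriting the least significant bits of image samples, and read back the same way.

    import numpy as np

    def embed_lsb(samples: np.ndarray, bits: list) -> np.ndarray:
        """Embed watermark bits into the least significant bits of the
        first len(bits) samples (e.g. 8-bit grayscale pixels)."""
        marked = samples.copy()
        for i, b in enumerate(bits):
            marked[i] = (marked[i] & 0xFE) | b  # clear the LSB, then set it to b
        return marked

    def extract_lsb(samples: np.ndarray, n_bits: int) -> list:
        """Recover the embedded bits by reading the LSBs back."""
        return [int(s) & 1 for s in samples[:n_bits]]

    pixels = np.array([120, 37, 255, 0, 64, 200, 13, 90], dtype=np.uint8)
    mark = [1, 0, 1, 1]
    assert extract_lsb(embed_lsb(pixels, mark), len(mark)) == mark

Schemes of the kind the book evaluates are far more robust than this; LSB embedding survives no compression or filtering, which is precisely why real watermarking is a research topic.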
Digital forensics deals with the acquisition, preservation, examination, analysis and presentation of electronic evidence. Networked computing, wireless communications and portable electronic devices have expanded the role of digital forensics beyond traditional computer crime investigations. Practically every crime now involves some aspect of digital evidence; digital forensics provides the techniques and tools to articulate this evidence. Digital forensics also has myriad intelligence applications. Furthermore, it has a vital role in information assurance - investigations of security breaches yield valuable information that can be used to design more secure systems. Advances in Digital Forensics describes original research results and innovative applications in the emerging discipline of digital forensics. In addition, it highlights some of the major technical and legal issues related to digital evidence and electronic crime investigations. The areas of coverage include:
This book is the first volume of a new series produced by the International Federation for Information Processing (IFIP) Working Group 11.9 on Digital Forensics, an international community of scientists, engineers and practitioners dedicated to advancing the state of the art of research and practice in digital forensics. The book contains a selection of twenty-five edited papers from the First Annual IFIP WG 11.9 Conference on Digital Forensics, held at the National Center for Forensic Science, Orlando, Florida, USA in February 2005. Advances in Digital Forensics is an important resource for researchers, faculty members and graduate students, as well as for practitioners and individuals engaged in research and development efforts for the law enforcement and intelligence communities. Mark Pollitt is President of Digital Evidence Professional Services, Inc., Ellicott City, Maryland, USA. Mr. Pollitt, who is retired from the Federal Bureau of Investigation (FBI), served as the Chief of the FBI's Computer Analysis Response Team, and Director of the Regional Computer Forensic Laboratory National Program. Sujeet Shenoi is the F.P. Walter Professor of Computer Science and a principal with the Center for Information Security at the University of Tulsa, Tulsa, Oklahoma, USA. For more information about the 300 other books in the IFIP series, please visit www.springeronline.com. For more information about IFIP, please visit www.ifip.org.
Cryptography is one of the most active areas in current mathematics research and applications. This book focuses on cryptography along with two related areas: the study of probabilistic proof systems, and the theory of computational pseudorandomness. Following a common theme that explores the interplay between randomness and computation, the important notions in each field are covered, as well as novel ideas and insights.
Juraj Hromkovic takes the reader on an elegant route through the theoretical fundamentals of computer science. The author shows that theoretical computer science is a fascinating discipline, full of spectacular contributions and miracles. The book also presents the development of the computer scientist's way of thinking as well as fundamental concepts such as approximation and randomization in algorithmics, and the basic ideas of cryptography and interconnection network design.
This book aims to capture recent advances in motion compensation for efficient video compression. It investigates linearly combined motion-compensated signals and generalizes the well-known superposition for bidirectional prediction in B-pictures. The number of superimposed signals and the selection of reference pictures are important aspects of the discussion. The application-oriented part of the book applies this concept to the well-known ITU-T Recommendation H.263 and continues with the improvements achieved by superimposed motion-compensated signals for the emerging ITU-T Recommendation H.264 and ISO/IEC MPEG-4 (Part 10). In addition, it discusses a new approach to wavelet-based video coding, a technology currently being investigated by MPEG with a view to a new video compression standard for the mid-term future.
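A minimal sketch of the superposition idea (my illustration under assumed conventions, not code from the book): a superimposed prediction linearly combines motion-compensated blocks from several reference pictures; with two references and equal weights it reduces to classical bidirectional B-picture prediction.

    import numpy as np

    def motion_compensate(ref, top, left, mv, size=8):
        """Fetch the size x size block that motion vector mv = (dy, dx)
        points to in the reference picture."""
        y, x = top + mv[0], left + mv[1]
        return ref[y:y + size, x:x + size].astype(np.float64)

    def superimposed_prediction(refs, mvs, weights, top, left, size=8):
        """Linearly combine several motion-compensated signals."""
        assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to one"
        pred = np.zeros((size, size))
        for ref, mv, w in zip(refs, mvs, weights):
            pred += w * motion_compensate(ref, top, left, mv, size)
        return pred

    past = np.random.randint(0, 256, (64, 64))
    future = np.random.randint(0, 256, (64, 64))
    # Equal weights over two references: bidirectional prediction.
    block = superimposed_prediction([past, future], [(0, 1), (0, -1)],
                                    [0.5, 0.5], top=16, left=16)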
Objectives. Computer and communication practice relies on data compression and dictionary search methods, which lean on a rapidly developing theory. Its exposition from a new viewpoint is the purpose of this book. We start from the very beginning and finish with the latest achievements of the theory, some of them in print for the first time. The book is intended to serve as both a monograph and a self-contained textbook. Information retrieval is the subject of the treatises by D. Knuth (1973) and K. Mehlhorn (1987). Data compression is the subject of source coding, a chapter of information theory whose up-to-date state is presented in the books of Storer (1988), Lynch (1985), and T. Bell et al. (1990). The present book differs from them as follows. First, we include information retrieval in source coding instead of discussing it separately; information-theoretic methods have proved very effective in information search. Second, for many years the target of source coding theory was the estimation of the maximal degree of data compression. This target has practically been hit today: the sought degree is now known for most sources. We believe the next target must be the estimation of the price of approaching that degree, so we are concerned with the trade-off between the complexity and the quality of coding. Third, we pay special attention to universal families that contain a good compressing map for every source in a given set.
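As a concrete taste of universal coding (my example; the universal families the book studies are far more general): the Elias gamma code represents any positive integer without prior knowledge of the source statistics, spending about 2 log n bits on the value n.

    def elias_gamma_encode(n: int) -> str:
        """Prefix the binary form of n >= 1 with len-1 zeros,
        so a decoder knows how many bits to read."""
        assert n >= 1
        binary = bin(n)[2:]                       # 9 -> '1001'
        return '0' * (len(binary) - 1) + binary   # 9 -> '0001001'

    def elias_gamma_decode(bits: str) -> int:
        """Count k leading zeros, then read k + 1 bits as the value."""
        k = 0
        while bits[k] == '0':
            k += 1
        return int(bits[k:2 * k + 1], 2)

    for n in (1, 2, 9, 1000):
        assert elias_gamma_decode(elias_gamma_encode(n)) == n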
This book presents a specific and unified approach to Knowledge Discovery and Data Mining, termed IFN, for Information Fuzzy Network methodology. Data Mining (DM) is the science of modelling and generalizing common patterns from large sets of multi-type data. DM is a part of KDD, the overall process of Knowledge Discovery in Databases. The accessibility and abundance of information today make this a topic of particular importance and need. The book has three main parts, complemented by appendices as well as software and project data that are accessible from the book's web site (http://www.eng.tau.ac.il/~maimon/ifn-kdg). Part I (Chapters 1-4) starts with the topic of KDD and DM in general and makes reference to other works in the field, especially those related to the information-theoretic approach. The remainder of the book presents our work, starting with the IFN theory and algorithms. Part II (Chapters 5-6) discusses the methodology of application and includes case studies. Then in Part III (Chapters 7-9) a comparative study is presented, concluding with some advanced methods and open problems. The IFN, being a generic methodology, applies to a variety of fields, such as manufacturing, finance, health care, medicine, insurance, and human resources. The appendices expand on the relevant theoretical background and present descriptions of sample projects (including detailed results).
Cryptography, secret writing, is enjoying a scientific renaissance following the seminal discovery in 1977 of public-key cryptography and its applications in computers and communications. This book gives a broad overview of public-key cryptography - its essence and advantages, various public-key cryptosystems, and protocols - as well as a comprehensive introduction to classical cryptography and cryptanalysis. The second edition has been revised and enlarged, especially in its treatment of cryptographic protocols. From a review of the first edition: "This is a comprehensive review ... there can be no doubt that this will be accepted as a standard text. At the same time, it is clearly and entertainingly written ... and can certainly stand alone." - Alex M. Andrew, Kybernetes, March 1992
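To make the "essence" of public-key cryptography concrete, here is a toy RSA key pair with deliberately tiny, insecure parameters (my sketch, not an example from the book): anyone may encrypt with the public pair (n, e), but decryption requires the private exponent d.

    from math import gcd

    # Toy primes -- real keys use primes hundreds of digits long.
    p, q = 61, 53
    n = p * q                      # public modulus (3233)
    phi = (p - 1) * (q - 1)        # 3120
    e = 17                         # public exponent, coprime to phi
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)

    message = 65
    ciphertext = pow(message, e, n)    # anyone may encrypt with (n, e)
    recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
    assert recovered == message

The security rests on the difficulty of recovering p and q (and hence d) from n alone, which is why real moduli are thousands of bits long.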
Second International Workshop on Formal Aspects in Security and Trust is an essential reference for both academic and professional researchers in the field of security and trust. Because of the complexity and scale of deployment of emerging ICT systems based on web service and grid computing concepts, we need to develop new, scalable, and more flexible foundational models of pervasive security enforcement across organizational borders and in situations where there is high uncertainty about the identity and trustworthiness of the participating networked entities. At the same time, the increasingly complex set of business activities sharing different resources but managed with different policies calls for new and business-enabling models of trust between members of virtual organizations and communities that span the boundaries of physical enterprises and loosely structured groups of individuals. The papers presented in this volume address the challenges posed by "ambient intelligence space" as a future paradigm and the need for a set of concepts, tools and methodologies to enable the user's trust and confidence in the underlying computing infrastructure. This state-of-the-art volume presents selected papers from the 2nd International Workshop on Formal Aspects in Security and Trust, held in conjunction with the 18th IFIP World Computer Congress, August 2004, in Toulouse, France. The collection will be important not only for computer security experts and researchers but also for teachers and administrators interested in security methodologies and research.
This book provides a good introduction to classical elementary number theory and modern algorithmic number theory, and to their applications in computing and information technology, including computer systems design, cryptography and network security. In this second edition, proofs of many theorems have been provided, and further additions and corrections made.
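One staple of the algorithmic number theory such a book applies to cryptography is fast modular exponentiation; here is a standard square-and-multiply sketch (my illustration, not an excerpt from the text):

    def mod_pow(base: int, exp: int, mod: int) -> int:
        """Compute base**exp % mod with O(log exp) multiplications;
        this is the workhorse of RSA and Diffie-Hellman."""
        result, base = 1, base % mod
        while exp > 0:
            if exp & 1:                   # low bit set: fold base in
                result = result * base % mod
            base = base * base % mod      # square for the next bit
            exp >>= 1
        return result

    assert mod_pow(7, 128, 13) == pow(7, 128, 13)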
This volume contains review articles and original results obtained in various fields of modern science using mathematical simulation methods. The articles are based on the plenary and selected section reports that were presented and discussed at the Fourth International Mathematical Simulation Conference, held in Moscow from June 27 through July 1, 2000. The conference was devoted to the following scientific areas: mathematical and computer models of discrete systems; non-linear excitation in condensed media; complex systems evolution; mathematical models in economics; kinetics of non-equilibrium processes; dynamics and structure of molecular and biomolecular systems; mathematical transfer models in non-linear systems; numerical simulation and algorithms; turbulence and deterministic chaos; and the chemical physics of polymers. The conference was supported by the Russian Ministry of Education, the Russian Foundation for Basic Research and the Federal Program "Integration". This volume contains the following sections: 1. models of non-linear phenomena in physics; 2. numerical methods and computer simulations; 3. mathematical computer models of discrete systems; 4. mathematical models in economics; 5. non-linear models in chemical physics and physical chemistry; 6. mathematical models of transport processes in complex systems. In Sections One and Five a number of fundamental and sufficiently general problems concerning the simulation of real physical and physical-chemical systems are discussed.
Face recognition has been actively studied over the past decade and continues to be a big research challenge. Just recently, researchers have begun to investigate face recognition under unconstrained conditions. Unconstrained Face Recognition provides a comprehensive review of this biometric, especially face recognition from video, assembling a collection of novel approaches that are able to recognize human faces under various unconstrained situations. The underlying basis of these approaches is that, unlike conventional face recognition algorithms, they exploit the inherent characteristics of the unconstrained situation and thus improve the recognition performance when compared with conventional algorithms. Unconstrained Face Recognition is structured to meet the needs of a professional audience of researchers and practitioners in industry. This volume is also suitable for advanced-level students in computer science.
My thanks are due to the many people who have assisted in the work reported here and in the preparation of this book. The work is incomplete and this account of it rougher than it might be. Such virtues as it has owe much to others; the faults are all mine. My work leading to this book began when David Boulton and I attempted to develop a method for intrinsic classification. Given data on a sample from some population, we aimed to discover whether the population should be considered to be a mixture of different types, classes or species of thing, and, if so, how many classes were present, what each class looked like, and which things in the sample belonged to which class. I saw the problem as one of Bayesian inference, but with prior probability densities replaced by discrete probabilities reflecting the precision to which the data would allow parameters to be estimated. Boulton, however, proposed that a classification of the sample was a way of briefly encoding the data: once each class was described and each thing assigned to a class, the data for a thing would be partially implied by the characteristics of its class, and hence require little further description. After some weeks' arguing our cases, we decided on the maths for each approach, and soon discovered they gave essentially the same results. Without Boulton's insight, we may never have made the connection between inference and brief encoding, which is the heart of this work.
Is knowledge an economic good? What are the characteristics of the institutions regulating the production and diffusion of knowledge? The cumulation of knowledge is a key determinant of economic growth, but only recently has knowledge moved to the core of economic analysis. Recent literature also gives profound insights into events like scientific progress and artistic and craft development, which have rarely been addressed as socio-economic institutions, being the domain of sociologists and historians rather than economists. This volume adopts a multidisciplinary approach to bring knowledge into the focus of attention, as a key economic issue.
You may like...
- Protecting Privacy through Homomorphic… by Kristin Lauter, Wei Dai, … (Hardcover): R2,977 (Discovery Miles 29 770)
- Bitcoin and Cryptocurrency Technologies… by Keizer Soeze (Hardcover)
- New Research on the Voynich Manuscript… by National Security Agency (Hardcover): R503 (Discovery Miles 5 030)
- Analysis, Cryptography And Information… by Panos M. Pardalos, Nicholas J. Daras, … (Hardcover): R2,473 (Discovery Miles 24 730)
- Re-imagining Diffusion and Adoption of… by Sujeet K. Sharma, Yogesh K. Dwivedi, … (Hardcover): R4,632 (Discovery Miles 46 320)
- Blockchain 2035 - The Digital DNA of… by Andrew D Knapp, Jared C Tate (Hardcover): R1,410 (Discovery Miles 14 100)