This book is the study of all codes of life with the standard methods of science. The genetic code and the codes of culture have been known for a long time and represent the historical foundation of this book. What is really new in this field is the study of all codes that came after the genetic code and before the codes of culture. The existence of these organic codes, however, is not only a major experimental fact. It is one of those facts that have extraordinary theoretical implications. The first is that most events of macroevolution were associated with the origin of new organic codes, and this gives us a completely new reconstruction of the history of life. The second implication is that codes involve meaning and we need therefore to introduce in biology not only the concept of information but also the concept of biological meaning. The third theoretical implication comes from the fact that the organic codes have been highly conserved in evolution, which means that they are the greatest invariants of life. The study of the organic codes, in short, is bringing to light new mechanisms that have operated in the history of life and new fundamental concepts in biology.
Information Macrodynamics (IMD) belongs to an interdisciplinary science that represents a new theoretical and computer-based methodology for the informational description and improvement of systems, including various activities in such areas as thinking, intelligent processes, communications, management, and other nonphysical subjects with their mutual interactions, informational superimposition, and the information transferred between interactions. IMD is based on the implementation of a single concept by a unique mathematical principle and formalism, rather than on an artificial combination of many arbitrary, auxiliary concepts and/or postulates and different mathematical subjects, such as game, automata, catastrophe, or logical-operations theories. This concept is explored mathematically using classical mathematics such as the calculus of variations and probability theory, which are potent enough without needing to develop new, specialized systemic mathematical methods. The formal IMD model automatically includes the related results from other fields, such as linear, nonlinear, collective and chaotic dynamics, stability theory, information theory, physical analogies of classical and quantum mechanics, irreversible thermodynamics, and kinetics. The main IMD goal is to reveal the information regularities, mathematically expressed by the considered variation principle (VP), as a mathematical tool to extract the regularities and define the model that describes them. The IMD regularities and mechanisms are the results of analytical solutions and are not obtained by logical argumentation, rational introduction, or reasonable discussion. The IMD information computer modeling formalism includes a human being (as an observer, carrier and producer of information), with a restoration of the model during the object observations.
The book is concerned with contemporary methodologies used for automatic text summarization. It proposes interesting approaches to solving well-known problems in text summarization using computational intelligence (CI) techniques, including cognitive approaches. A better understanding of the cognitive basis of the summarization task is still an open research issue, and the extent of its use in text summarization is highlighted for further exploration. With text growing ever larger and researchers having little time to spare for extensive reading, summarized information helps readers grasp a context in less time. This book helps students and researchers to summarize text documents automatically in an efficient and effective way. The computational approaches and research techniques presented guide the reader to achieving text summarization with ease. The summarized text generated supports readers in learning the context or the domain at a quicker pace. The book offers a reasonable number of illustrations and examples, convenient for readers to understand and implement for their own use. The aim is not merely to explain what text summarization is, but to enable readers to perform text summarization using various approaches. The book also describes measures that help to evaluate, determine and explore the best options for text summarization for any specific purpose. The illustrations are based on the social media and healthcare domains, which shows that the approaches can be applied to summarization in any domain. A new approach to text summarization based on cognitive intelligence is presented for further exploration in the field.
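To make the extractive end of this family of methods concrete, here is a minimal sketch (not taken from the book; all names are illustrative) that scores sentences by corpus word frequency and keeps the top-ranked ones:

```python
import re
from collections import Counter

def summarize(text, n=2):
    """Frequency-based extractive summarization: score each sentence by
    the average corpus frequency of its words, then return the top-n
    sentences in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = sorted(sentences, key=score, reverse=True)[:n]
    return " ".join(s for s in sentences if s in top)

doc = ("Sensors stream data all day. Summaries save reading time. "
       "Summaries of sensor data save analysts time every day.")
print(summarize(doc, n=1))
```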
This volume contains the proceedings of IFIPTM 2010, the 4th IFIP WG 11.11 International Conference on Trust Management, held in Morioka, Iwate, Japan during June 16-18, 2010. IFIPTM 2010 provided a truly global platform for the reporting of research, development, policy, and practice in the interdependent areas of privacy, security, and trust. Building on the traditions inherited from the highly successful iTrust conference series, the IFIPTM 2007 conference in Moncton, New Brunswick, Canada, the IFIPTM 2008 conference in Trondheim, Norway, and the IFIPTM 2009 conference at Purdue University in Indiana, USA, IFIPTM 2010 focused on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion on relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2010 was an open IFIP conference. The program of the conference featured both theoretical research papers and reports of real-world case studies. IFIPTM 2010 received 61 submissions from 25 different countries: Japan (10), UK (6), USA (6), Canada (5), Germany (5), China (3), Denmark (2), India (2), Italy (2), Luxembourg (2), The Netherlands (2), Switzerland (2), Taiwan (2), Austria, Estonia, Finland, France, Ireland, Israel, Korea, Malaysia, Norway, Singapore, Spain, Turkey. The Program Committee selected 18 full papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include two invited papers by academic experts in the fields of trust management, privacy and security, namely, Toshio Yamagishi and Pamela Briggs.
This volume brings together a multidisciplinary group of scholars from diverse fields including computer science, engineering, archival science, law, business, psychology, economics, medicine and more to discuss the trade-offs between different "layers" in designing the use of blockchain/Distributed Ledger Technology (DLT) for social trust, trust in data and records, and trust in systems. Blockchain technology has emerged as a solution to the problem of trust in data and records, as well as trust in social, political and economic institutions, due to its profound potential as a digital trust infrastructure. Blockchain is a DLT in which confirmed and validated sets of transactions are stored in blocks that are chained together to make tampering more difficult and render records immutable. This book is dedicated to exploring and disseminating the latest findings on the relationships between socio-political and economic data, record-keeping, and technical aspects of blockchain.
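The chaining mechanism described above is easy to show in miniature. The following sketch (illustrative only, not any production DLT) links each block to its predecessor's hash, so altering one record invalidates every later link:

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    """Link a new block to the current tip via the tip's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "txs": transactions})

def verify(chain):
    """Recompute every link; any tampered block breaks the chain."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, ["alice pays bob 5"])
append_block(chain, ["bob pays carol 2"])
print(verify(chain))                        # True
chain[0]["txs"][0] = "alice pays bob 500"   # tamper with a record
print(verify(chain))                        # False: tampering detected
```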
In the context of life sciences, we are constantly confronted with information that possesses precise semantic values and appears essentially immersed in a specific evolutionary trend. In such a framework, Nature appears, in Monod's words, as a tinkerer characterized by the presence of precise principles of self-organization. However, while Monod was obliged to incorporate his brilliant intuitions into the framework of first-order cybernetics and a theory of information with an exclusively syntactic character such as that defined by Shannon, research advances in recent decades have led not only to the definition of a second-order cybernetics but also to an exploration of the boundaries of semantic information. As H. Atlan states, on a biological level "the function self-organizes together with its meaning". Hence the need to refer to a conceptual theory of complexity and to a theory of self-organization characterized in an intentional sense. There is also a need to introduce, at the genetic level, a distinction between coder and ruler as well as the opportunity to define a real software space for natural evolution. The recourse to non-standard model theory, the opening to a new general semantics, and the innovative definition of the relationship between coder and ruler can be considered, today, among the most powerful theoretical tools at our disposal in order to correctly define the contours of that new conceptual revolution increasingly referred to as metabiology. This book focuses on identifying and investigating the role played by these particular theoretical tools in the development of this new scientific paradigm. Nature "speaks" by means of mathematical forms: we can observe these forms, but they are, at the same time, inside us as they populate our organs of cognition. In this context, the volume highlights how metabiology appears primarily to refer to the growth itself of our instruments of participatory knowledge of the world.
Turbo Codes: Desirable and Designable introduces the basics of turbo codes in their different flavors (more specifically, parallel concatenated convolutional turbo codes and block turbo codes). Through the application of a systematic design methodology that treats data transfer and storage as top-priority candidates for optimization, the authors show how turbo codes can be implemented and the attractive performance results that can be achieved in throughput, latency and energy consumption. These solutions and results make turbo codes close competitors to traditional coding schemes such as convolutional codes or algebraic codes. Finally, a real-life prototype of parallel concatenated convolutional (turbo) codes is presented. A complete turbo-code ASIC data flow is described together with on-board power, speed and coding-gain measurements that demonstrate the effectiveness of the proposed solution.
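As an illustration of the parallel-concatenation idea, here is a generic rate-1/3 turbo encoder sketch; the generator polynomials (octal (1, 5/7)) and the random interleaver are assumptions for demonstration, not the prototype documented in the book:

```python
import random

def rsc_parity(bits):
    """Parity stream of a recursive systematic convolutional encoder
    with octal generators (1, 5/7): feedback 1+D+D^2, feedforward 1+D^2."""
    s1 = s2 = 0
    out = []
    for b in bits:
        fb = b ^ s1 ^ s2      # feedback combines the input with both delays
        out.append(fb ^ s2)   # feedforward taps 1 and D^2
        s1, s2 = fb, s1       # shift the register
    return out

def turbo_encode(bits, interleaver):
    """Rate-1/3 parallel concatenation: systematic bits, parity on the
    natural order, and parity on the interleaved order."""
    p1 = rsc_parity(bits)
    p2 = rsc_parity([bits[i] for i in interleaver])
    return bits, p1, p2

msg = [random.randint(0, 1) for _ in range(16)]
pi = random.sample(range(len(msg)), len(msg))  # illustrative random interleaver
systematic, parity1, parity2 = turbo_encode(msg, pi)
print(systematic, parity1, parity2, sep="\n")
```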
This book constitutes the proceedings of the 15th IFIP WG 11.12 International Symposium on Human Aspects of Information Security and Assurance, HAISA 2021, held virtually in July 2021. The 18 papers presented in this volume were carefully reviewed and selected from 30 submissions. They are organized in the following topical sections: attitudes and perspectives; cyber security education; and people and technology.
This book focuses on information geometry manifolds of structured data/information and their advanced applications featuring new and fruitful interactions between several branches of science: information science, mathematics and physics. It addresses interrelations between different mathematical domains like shape spaces, probability/optimization & algorithms on manifolds, relational and discrete metric spaces, computational and Hessian information geometry, algebraic/infinite dimensional/Banach information manifolds, divergence geometry, tensor-valued morphology, optimal transport theory, manifold & topology learning, and applications like geometries of audio-processing, inverse problems and signal processing. The book collects the most important contributions to the conference GSI'2017 - Geometric Science of Information.
Yunmin Zhu: In the past two decades, multisensor or multi-source information fusion techniques have attracted more and more attention in practice, where observations are processed in a distributed manner and decisions or estimates are made at the individual processors, and processed data (or compressed observations) are then transmitted to a fusion center where the final global decision or estimate is made. A system with multiple distributed sensors has many advantages over one with a single sensor. These include an increase in the capability, reliability, robustness and survivability of the system. Distributed decision or estimation fusion problems for cases with statistically independent observations or observation noises have received significant attention (see Varshney's book Distributed Detection and Data Fusion, New York: Springer-Verlag, 1997, and Bar-Shalom's book Multitarget-Multisensor Tracking: Advanced Applications, vol. 1-3, Artech House, 1990, 1992, 2000). Problems with statistically dependent observations or observation noises are more difficult and have received much less study. In practice, however, one often sees decision or estimation fusion problems with statistically dependent observations or observation noises. For instance, when several sensors are used to detect a random signal in the presence of observation noise, the sensor observations cannot be statistically independent when the signal is present. This book provides a more complete treatment of the fundamentals of multisensor decision and estimation fusion in order to deal with general random observations or observation noises that are correlated across the sensors.
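To make the independent-noise baseline concrete (a standard textbook construction, not the book's general correlated-noise treatment), here is a minimal sketch of a fusion center combining two unbiased scalar estimates by inverse-variance weighting:

```python
def fuse(estimates):
    """Fuse independent unbiased estimates (value, variance) of one
    scalar: weight each by the inverse of its variance; the fused
    variance satisfies 1/var = sum_i 1/var_i."""
    weights = [1.0 / var for _, var in estimates]
    fused_var = 1.0 / sum(weights)
    fused_val = fused_var * sum(w * x for w, (x, _) in zip(weights, estimates))
    return fused_val, fused_var

# Two sensors observe the same quantity with different noise levels;
# the fused estimate leans toward the more precise sensor.
print(fuse([(10.2, 4.0), (9.6, 1.0)]))  # (9.72, 0.8)
```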
The five-volume set IFIP AICT 630, 631, 632, 633, and 634 constitutes the refereed proceedings of the International IFIP WG 5.7 Conference on Advances in Production Management Systems, APMS 2021, held in Nantes, France, in September 2021.* The 378 papers presented were carefully reviewed and selected from 529 submissions. They discuss artificial intelligence techniques, decision aid and new and renewed paradigms for sustainable and resilient production systems at four-wall factory and value chain levels. The papers are organized in the following topical sections:
Part I: artificial intelligence based optimization techniques for demand-driven manufacturing; hybrid approaches for production planning and scheduling; intelligent systems for manufacturing planning and control in the industry 4.0; learning and robust decision support systems for agile manufacturing environments; low-code and model-driven engineering for production system; meta-heuristics and optimization techniques for energy-oriented manufacturing systems; metaheuristics for production systems; modern analytics and new AI-based smart techniques for replenishment and production planning under uncertainty; system identification for manufacturing control applications; and the future of lean thinking and practice.
Part II: digital transformation of SME manufacturers: the crucial role of standard; digital transformations towards supply chain resiliency; engineering of smart-product-service-systems of the future; lean and Six Sigma in services healthcare; new trends and challenges in reconfigurable, flexible or agile production system; production management in food supply chains; and sustainability in production planning and lot-sizing.
Part III: autonomous robots in delivery logistics; digital transformation approaches in production management; finance-driven supply chain; gastronomic service system design; modern scheduling and applications in industry 4.0; recent advances in sustainable manufacturing; regular session: green production and circularity concepts; regular session: improvement models and methods for green and innovative systems; regular session: supply chain and routing management; regular session: robotics and human aspects; regular session: classification and data management methods; smart supply chain and production in society 5.0 era; and supply chain risk management under coronavirus.
Part IV: AI for resilience in global supply chain networks in the context of pandemic disruptions; blockchain in the operations and supply chain management; data-based services as key enablers for smart products, manufacturing and assembly; data-driven methods for supply chain optimization; digital twins based on systems engineering and semantic modeling; digital twins in companies first developments and future challenges; human-centered artificial intelligence in smart manufacturing for the operator 4.0; operations management in engineer-to-order manufacturing; product and asset life cycle management for smart and sustainable manufacturing systems; robotics technologies for control, smart manufacturing and logistics; serious games analytics: improving games and learning support; smart and sustainable production and supply chains; smart methods and techniques for sustainable supply chain management; the new digital lean manufacturing paradigm; and the role of emerging technologies in disaster relief operations: lessons from COVID-19.
Part V: data-driven platforms and applications in production and logistics: digital twins and AI for sustainability; regular session: new approaches for routing problem solving; regular session: improvement of design and operation of manufacturing systems; regular session: crossdock and transportation issues; regular session: maintenance improvement and lifecycle management; regular session: additive manufacturing and mass customization; regular session: frameworks and conceptual modelling for systems and services efficiency; regular session: optimization of production and transportation systems; regular session: optimization of supply chain agility and reconfigurability; regular session: advanced modelling approaches; regular session: simulation and optimization of systems performances; regular session: AI-based approaches for quality and performance improvement of production systems; and regular session: risk and performance management of supply chains.
*The conference was held online.
The idea of this book comes from the observation that sensor networks represent a topic of interest from both theoretical and practical perspectives. The title underlines that sensor networks offer the unique opportunity of clearly linking theory with practice. In fact, owing to their typically low cost, academic researchers have the opportunity of implementing sensor network testbeds to check the validity of their theories, algorithms, protocols, etc., in reality. Likewise, a practitioner has the opportunity of understanding the principles behind the sensor networks in use and, thus, how to properly tune some accessible network parameters to improve the performance. On the basis of the observations above, the book has been structured in three parts: Part I is denoted as "Theory," since the topics of its five chapters are apparently "detached" from real scenarios; Part II is denoted as "Theory and Practice," since the topics of its three chapters, although theoretical, have a clear connection with specific practical scenarios; Part III is denoted as "Practice," since the topics of its five chapters are clearly related to practical applications.
The authors give a detailed summary of the fundamentals and the historical background of digital communication. This includes an overview of the encoding principles and algorithms for textual information, audio information, as well as images, graphics, and video on the Internet. Furthermore, the fundamentals of computer networking, digital security and cryptography are covered. Thus, the book provides a well-founded introduction to the communication technology of computer networks, the Internet and the WWW. Numerous pictures and images, a subject index, a detailed list of historical personalities, and a glossary for each chapter increase the practical benefit of this book, which is well suited for undergraduate students as well as for working practitioners.
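One classic instance of the text-encoding principles such a book covers is Huffman coding. The sketch below (a standard construction, not necessarily the book's presentation) builds a prefix-free code table from symbol frequencies:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free code by repeatedly merging the two least
    frequent subtrees; frequent symbols receive shorter codewords."""
    heap = [[freq, [sym, ""]] for sym, freq in Counter(text).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]   # left branch
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]   # right branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heap[0][1:])

codes = huffman_codes("mississippi")
print(codes)  # frequent 'i' and 's' get shorter codes than 'm' and 'p'
```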
Unique selling point: industry-standard book for merchants, banks, and consulting firms looking to learn more about PCI DSS compliance.
Core audience: retailers (both physical and electronic), firms that handle credit or debit cards (such as merchant banks and processors), and firms that deliver PCI DSS products and services.
Place in the market: currently there are no PCI DSS 4.0 books.
Information theory is an exceptional field in many ways. Technically, it is one of the rare fields in which mathematical results and insights have led directly to significant engineering payoffs. Professionally, it is a field that has sustained a remarkable degree of community, collegiality and high standards. James L. Massey, whose work in the field is honored here, embodies the highest standards of the profession in his own career. The book covers the latest work on: block coding, convolutional coding, cryptography, and information theory. The 44 contributions represent a cross-section of the world's leading scholars, scientists and researchers in information theory and communication. The book is rounded off with an index and a bibliography of publications by James Massey.
This clearly written and enlightening textbook provides a concise, introductory guide to the key mathematical concepts and techniques used by computer scientists. Topics and features: ideal for self-study, offering many pedagogical features such as chapter-opening key topics, chapter introductions and summaries, review questions, and a glossary; places our current state of knowledge within the context of the contributions made by early civilizations, such as the ancient Babylonians, Egyptians and Greeks; examines the building blocks of mathematics, including sets, relations and functions; presents an introduction to logic, formal methods and software engineering; explains the fundamentals of number theory, and its application in cryptography; describes the basics of coding theory, language theory, and graph theory; discusses the concept of computability and decidability; includes concise coverage of calculus, probability and statistics, matrices, complex numbers and quaternions.
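As a taste of the number-theory-meets-cryptography material such a text typically covers, here is a toy RSA round trip with deliberately tiny primes (illustrative only; real keys are vastly larger):

```python
# Toy RSA with tiny primes (insecure, purely illustrative).
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient: 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e

msg = 65
cipher = pow(msg, e, n)    # encrypt: msg^e mod n
plain = pow(cipher, d, n)  # decrypt: cipher^d mod n
assert plain == msg
print(cipher, plain)       # 2790 65
```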
Since humans began writing, they have been communicating in code. This obsession with secrecy has had dramatic effects on the outcome of wars, monarchies and individual lives. With clear mathematical, linguistic and technological demonstrations of many of the codes, as well as illustrations of some of the remarkable personalities behind them – many courageous, some villainous – The Code Book traces the fascinating development of codes and code-breaking from military espionage in Ancient Greece to modern computer ciphers, to reveal how the remarkable science of cryptography has often changed the course of history. Amongst many extraordinary examples, Simon Singh relates in detail the story of Mary, Queen of Scots, trapped by her own code and put to death by Elizabeth I; the strange history of the Beale Ciphers, describing the hidden location of a fortune in gold, buried somewhere in Virginia in the nineteenth century and still not found; and the monumental efforts in code-making and code-breaking that influenced the outcomes of the First and Second World Wars. Now, with the Information Age bringing the possibility of a truly unbreakable code ever nearer, and cryptography one of the major debates of our times, Singh investigates the challenge that technology has brought to personal privacy today. Dramatic, compelling and remarkably far-reaching, The Code Book will forever alter your view of history, what drives it and how private your last e-mail really was.
Cryptography is a field that is constantly advancing, due to exponential growth in new technologies within the past few decades. Applying strategic algorithms to cryptic issues can help save time and energy in solving the expanding problems within this field. Algorithmic Strategies for Solving Complex Problems in Cryptography is an essential reference source that discusses the evolution and current trends in cryptology, and it offers new insight into how to use strategic algorithms to aid in solving intricate difficulties within this domain. Featuring relevant topics such as hash functions, homomorphic encryption schemes, two-party computation, and integer factoring, this publication is ideal for academicians, graduate students, engineers, professionals, and researchers interested in expanding their knowledge of current trends and techniques within the cryptology field.
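Integer factoring, one of the listed topics, has a compact classical algorithm worth seeing in code. Below is a sketch of Pollard's rho method (a well-known technique, not an excerpt from this publication):

```python
from math import gcd

def pollard_rho(n, c=1):
    """Pollard's rho: find a nontrivial factor of composite n using the
    iteration x -> x^2 + c (mod n) with Floyd cycle detection."""
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n          # tortoise: one step
        y = (y * y + c) % n          # hare: two steps
        y = (y * y + c) % n
        d = gcd(abs(x - y), n)
    # If the walk collapsed to the trivial factor n, retry with a new c.
    return d if d != n else pollard_rho(n, c + 1)

print(pollard_rho(8051))  # 8051 = 83 * 97
```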
At its peak in January 1945, 10,000 people worked at Bletchley Park, reading 4000 messages a day, decrypting German and Japanese communications and helping the Allies to victory. But while we know that Bletchley was the centre of Britain's World War II code-breaking, how did its efforts actually change the course of the war? Enigma: How Breaking the Code Helped Win World War II tells the story of Bletchley's role in defeating U-boats in the Atlantic, breaking the Japanese codes, helping the Allies to victory in North Africa, deciphering the German military intelligence code, learning of most German positions in western Europe before the Normandy Landings, defeating the Italian Navy in the Mediterranean, and helping sink the German battleship Scharnhorst off Norway. In tracing these events, the book also delves into the stories of major Bletchley characters, 'boffins' such as Alan Turing and Gordon Welchman, and 'Debs' such as Joan Clarke and Margaret Rock. An accessible work of military history that ranges across air, land and naval warfare, the book also touches on the story of early computer science. Illustrated with 120 black-&-white and colour photographs, artworks and maps, Enigma: How Breaking the Code Helped Win World War II is an authoritative and novel perspective on WWII history.
Modern cryptology increasingly employs mathematically rigorous concepts and methods from complexity theory. Conversely, current research topics in complexity theory are often motivated by questions and problems from cryptology. This book takes account of this situation, and therefore its subject is what may be dubbed "cryptocomplexity", a kind of symbiosis of these two areas. This book is written for undergraduate and graduate students of computer science, mathematics, and engineering, and can be used for courses on complexity theory and cryptology, preferably by stressing their interrelation. Moreover, it may serve as a valuable source for researchers, teachers, and practitioners working in these fields. Starting from scratch, it works its way to the frontiers of current research in these fields and provides a detailed overview of their history and their current research topics and challenges.
This book explores alternative ways of accomplishing secure information transfer with incoherent multi-photon pulses in contrast to conventional Quantum Key Distribution techniques. Most of the techniques presented in this book do not need conventional encryption. Furthermore, the book presents a technique whereby any symmetric key can be securely transferred using the polarization channel of an optical fiber for conventional data encryption. The work presented in this book has largely been practically realized, albeit in a laboratory environment, to offer proof of concept rather than building a rugged instrument that can withstand the rigors of a commercial environment.
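For contrast with the multi-photon techniques the book develops, here is a toy simulation of the sifting step of conventional BB84, the baseline it departs from (idealized: no noise, no eavesdropper; names are illustrative):

```python
import random

def bb84_sift(n=32):
    """Idealized BB84 key sifting: Alice sends random bits in random
    polarization bases; Bob measures in random bases; both keep only
    the positions where the bases agree (about half of them)."""
    alice_bits = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]
    bob_bases = [random.choice("+x") for _ in range(n)]
    # With matching bases Bob reads the bit exactly; mismatched
    # positions give random outcomes and are discarded during sifting.
    return [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
            if ab == bb]

print(bb84_sift())  # a shared key of roughly n/2 bits
```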
Genetic programming (GP), one of the most advanced forms of evolutionary computation, has been highly successful as a technique for getting computers to automatically solve problems without having to tell them explicitly how. Since its inception more than ten years ago, GP has been used to solve practical problems in a variety of application fields. Alongside these ad hoc engineering approaches, interest increased in how and why GP works. This book provides a coherent consolidation of recent work on the theoretical foundations of GP. A concise introduction to GP and genetic algorithms (GA) is followed by a discussion of fitness landscapes and other theoretical approaches to natural and artificial evolution. Having surveyed early approaches to GP theory, it presents a new exact schema analysis, showing that it applies to GP as well as to the simpler GAs. New results on the potentially infinite number of possible programs are followed by two chapters applying these new techniques.
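The simpler GA that the book introduces alongside GP is easy to show in miniature. This toy GA evolves a bitstring toward all ones (the standard "OneMax" demo, not an example from the book):

```python
import random

def onemax_ga(length=20, pop_size=30, generations=50, p_mut=0.05):
    """Tiny generational GA: 2-way tournament selection, one-point
    crossover, bit-flip mutation; fitness = number of ones."""
    fitness = sum
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # 2-way tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = random.randrange(1, length)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (random.random() < p_mut)     # bit-flip mutation
                     for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

print(onemax_ga())  # typically all or nearly all ones
```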