Welcome to Loot.co.za!
The first half of the book provides an introduction to general topology, with ample space given to exercises and carefully selected applications. The second half of the text includes topics in asymmetric topology, a field motivated by applications in computer science. Recurring themes include the interactions of topology with order theory and mathematics designed to model loss-of-resolution situations.
While cryptology draws on concepts and methods from complexity theory, research in complexity theory is in turn often motivated by questions from cryptology. This volume highlights the close interweaving of these two fields and offers an accessible introduction to the fascinating area of "cryptocomplexity". The book contains numerous figures and exercises as well as a detailed index and bibliography. It is suitable for students of computer science, mathematics, or engineering.
Quality assurance is an essential aspect for ensuring the success of corporations worldwide. Consistent quality requirements across organizations of similar types ensure that these requirements can be accurately and easily evaluated. Shaping the Future Through Standardization is an essential scholarly book that examines quality and standardization within diverse organizations globally with a special focus on future perspectives, including how standards and standardization may shape the future. Featuring a wide range of topics such as economics, pedagogy, and management, this book is ideal for academicians, researchers, decision makers, policymakers, managers, corporate professionals, and students.
This book is written for anyone who is interested in how a field of research evolves and in the fundamental role of understanding the uncertainties involved at different levels of analysis, ranging from macroscopic views to meso- and microscopic ones. We introduce a series of computational and visual analytic techniques from research areas such as text mining, deep learning, information visualization, and science mapping, so that readers can apply these tools to the study of a subject matter of their choice. In addition, we set this diverse set of methods in an integrative context that draws upon insights from philosophical, sociological, and evolutionary theories of what drives the advances of science, so that readers can ground their own research in an enriched theoretical foundation.

Scientific knowledge is complex. A subject matter is typically built on its own set of concepts, theories, methodologies, and findings, discovered by generations of researchers and practitioners. Scientific knowledge, as known to the scientific community as a whole, undergoes constant change. Some changes are long-lasting, whereas others may be short-lived. How can we keep abreast of the state of the art as science advances? How can we effectively and precisely convey the status of the current science to the general public as well as to scientists across different disciplines? The study of scientific knowledge has been overwhelmingly focused on scientific knowledge per se; in contrast, the status of scientific knowledge at various levels of granularity has been largely overlooked. This book aims to highlight the role of uncertainties in developing a better understanding of the status of scientific knowledge at a particular time, and of how that status evolves over the course of research.
Furthermore, we demonstrate how the knowledge of the types of uncertainties associated with scientific claims serves as an integral and critical part of our domain expertise.
Mathematical inequalities are essential tools in mathematics, natural science and engineering. This book gives an overview on recent advances. Some generalizations and improvements for the classical and well-known inequalities are described. They will be applied and further developed in many fields. Applications of the inequalities to entropy theory and quantum physics are also included.
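As one concrete illustration of this interplay (an assumed example, not necessarily one treated in the book), the classical log-sum inequality directly yields the non-negativity of relative entropy; a minimal numerical check in Python:

```python
import numpy as np

# Log-sum inequality: for positive a_i, b_i,
#   sum_i a_i log(a_i / b_i) >= (sum_i a_i) * log(sum_i a_i / sum_i b_i)
rng = np.random.default_rng(1)
a = rng.random(5) + 0.1
b = rng.random(5) + 0.1

lhs = np.sum(a * np.log(a / b))
rhs = a.sum() * np.log(a.sum() / b.sum())

# Normalizing a and b into probability distributions makes the
# right-hand side zero, recovering D(p || q) >= 0.
p, q = a / a.sum(), b / b.sum()
kl = np.sum(p * np.log(p / q))
```

Here the non-negativity of the Kullback-Leibler divergence, a cornerstone of entropy theory, appears as a special case of the general inequality.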
As digital transformations continue to accelerate in the world, discourses of big data have come to dominate in a number of fields, from politics and economics, to media and education. But how can we really understand the digital world when so much of the writing through which we grapple with it remains deeply problematic? In a compelling new work of feminist critical theory, Bassett, Kember and O'Riordan scrutinise many of the assumptions of a masculinist digital world, highlighting the tendency of digital humanities scholarship to venerate and essentialise technical forms, and to adopt gendered writing and citation practices. Contesting these writings, practices and politics, the authors foreground feminist traditions and contributions to the field, offering alternative modes of knowledge production, and a radically different, poetic writing style. Through this prism, Furious brings into focus themes including the automation of home and domestic work, the Anthropocene, and intersectional feminist technofutures.
This monograph provides a tutorial review of the Lattice-Reduction-Aided and Integer-Forcing approaches to equalization in MIMO communications. The authors highlight the similarities and differences of both approaches while summarizing, in a unified way, the various criteria for selecting the integer linear combinations available in the literature. This presents the reader with a clear overview of the differing equalization techniques and enables them to be adopted for any particular system under development. The authors proceed to consider the demands on the signal constellations and coding schemes in detail. The monograph provides a concise overview of recent developments in equalization for the widely used MIMO communication systems. It is of interest to researchers, practitioners and students alike.
Group testing emerged as a research area from the US Government's need to screen recruits for syphilis during the Second World War. Rather than testing each recruit individually, a more efficient method involving the minimal number of tests was required. The central problem of group testing is thus: given a number of items and a number of defectives, how many tests are required to accurately discover the defective items, and how can this be achieved? Group testing has since found applications in medical testing, biology, telecommunications, information technology, data science, and more. The focus of this survey is on the non-adaptive setting of group testing. In this setting, the test pools are designed in advance, enabling them to be implemented in parallel. The survey gives a comprehensive and thorough treatment of the subject from an information theoretic perspective. It covers several related developments: efficient algorithms with practical storage and computation requirements, achievability bounds for optimal decoding methods, and algorithm-independent converse bounds. It assesses the theoretical guarantees not only in terms of scaling laws, but also in terms of the constant factors, leading to the notion of the rate of group testing, indicating the amount of information learned per test. Considering both noiseless and noisy settings, it identifies several regimes where existing algorithms are provably optimal or near-optimal, as well as regimes where there remains greater potential for improvement. This monograph is an accessible treatment of an important topic for researchers and students in Information Theory.
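To make the non-adaptive setting concrete, here is a minimal sketch of a random (Bernoulli) test design decoded with the standard COMP rule, which rules out any item appearing in a negative pool; the parameters below are illustrative, not drawn from the survey:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, T = 100, 3, 40                 # items, defectives, tests (illustrative)
defective = rng.choice(n, size=k, replace=False)

# Non-adaptive design: each item joins each pool independently with prob. 0.1,
# so all T pools are fixed in advance and can be run in parallel.
A = rng.random((T, n)) < 0.1         # T x n Boolean test matrix
x = np.zeros(n, dtype=bool)
x[defective] = True
y = (A.astype(int) @ x.astype(int)) > 0   # a pool is positive iff it holds a defective

# COMP decoding: any item appearing in a negative pool cannot be defective.
declared = ~(A[~y].any(axis=0))      # remaining candidate defectives
```

COMP never misses a true defective (a defective item only ever appears in positive pools), so `declared` is always a superset of the true defective set; whether it equals that set depends on the number of tests T relative to n and k.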
Technological advancements have become an integral part of life, impacting the way we work, communicate, make decisions, learn, and play. As technology continually progresses, humans are being outpaced by its capabilities, and it is important for businesses, organizations, and individuals to understand how to optimize data and to implement new methods for more efficient knowledge discovery and information management and retrieval. Innovative Applications of Knowledge Discovery and Information Resources Management offers in-depth coverage on the pervasiveness of technological change with a collection of material on topics such as the impact of permeable work-life boundaries, burnout and turnover, big data usage, and computer-based learning. It proves a worthy source for academicians, practitioners, IT leaders, IT professionals, and advanced-level students interested in examining the ways in which technology is changing the world.
Distributed source coding is one of the key enablers for efficient cooperative communication. The potential applications range from wireless sensor networks, ad-hoc networks, and surveillance networks, to robust low-complexity video coding, stereo/multiview video coding, HDTV, hyper-spectral and multispectral imaging, and biometrics. The book is divided into three sections: theory, algorithms, and applications. Part one covers the background of information theory with an emphasis on DSC; part two discusses designs of algorithmic solutions for DSC problems, covering the three most important DSC problems: Slepian-Wolf, Wyner-Ziv, and multiterminal (MT) source coding; and part three is dedicated to a variety of potential DSC applications.
Key features:
* Clear explanation of distributed source coding theory and algorithms, including both lossless and lossy designs.
* Rich applications of distributed source coding, covering multimedia communication and data security applications.
* Self-contained content for beginners, from basic information theory to practical code implementation.
The book provides fundamental knowledge for engineers and computer scientists to access the topic of distributed source coding. It is also suitable for senior undergraduate and first year graduate students in electrical engineering; computer engineering; signal processing; image/video processing; and information theory and communications.
Luciano Floridi develops an original ethical framework for dealing with the new challenges posed by Information and Communication Technologies (ICTs). ICTs have profoundly changed many aspects of life, including the nature of entertainment, work, communication, education, health care, industrial production and business, social relations, and conflicts. They have had a radical and widespread impact on our moral lives and on contemporary ethical debates. Privacy, ownership, freedom of speech, responsibility, technological determinism, the digital divide, and pornography online are only some of the pressing issues that characterise the ethical discourse in the information society. They are the subject of Information Ethics (IE), the new philosophical area of research that investigates the ethical impact of ICTs on human life and society. Since the seventies, IE has been a standard topic in many curricula. In recent years, there has been a flourishing of new university courses, international conferences, workshops, professional organizations, specialized periodicals and research centres. However, investigations have so far been largely influenced by professional and technical approaches, addressing mainly legal, social, cultural and technological problems. This book is the first philosophical monograph entirely and exclusively dedicated to it. Floridi lays down, for the first time, the conceptual foundations for IE. He does so systematically, by pursuing three goals: a) a metatheoretical goal: it describes what IE is, its problems, approaches and methods; b) an introductory goal: it helps the reader to gain a better grasp of the complex and multifarious nature of the various concepts and phenomena related to computer ethics; c) an analytic goal: it answers several key theoretical questions of great philosophical interest, arising from the investigation of the ethical implications of ICTs. 
Although entirely independent of The Philosophy of Information (OUP, 2011), Floridi's previous book, The Ethics of Information complements it as new work on the foundations of the philosophy of information.
Intelligence results from the interaction of the brain, body, and environment. The question addressed in this book is: can we measure the contribution of the body and its interaction with the environment? To answer this, we first present a comprehensive overview of the various ways in which a body reduces the amount of computation that the brain has to perform to solve a task. This overview will broaden your understanding of how important seemingly inconspicuous physical processes and physical properties of the body are with respect to our cognitive abilities. This form of contribution to intelligence is called Morphological Intelligence. The main contribution of this book to the field is a detailed discussion of how Morphological Intelligence can be measured from observations alone. The required mathematical framework is provided so that readers unfamiliar with information theory will be able to understand and apply the measures. Case studies from biomechanics and soft robotics illustrate how the presented quantifications can, for example, be used to measure the contribution of muscle physics to jumping and to optimise the shape of a soft robotic hand. To summarise, this monograph presents various examples of how the physical properties of the body and the body's interaction with the environment contribute to intelligence. Furthermore, it treats theoretical and practical aspects of Morphological Intelligence and demonstrates their value in two case studies.
Ordinal Computability discusses models of computation obtained by generalizing classical models, such as Turing machines or register machines, to transfinite working time and space. In particular, recognizability, randomness, and applications to other areas of mathematics are covered.
Contents: 1. Lossless Coding. 2. Universal Coding on Finite Alphabets. 3. Universal Coding on Infinite Alphabets. 4. Model Order Estimation. Notation. Index.
Developing increasingly computationally-efficient codes for communication and compression has been the major goal of information and coding theory. There have been significant advances towards this goal in the last couple of decades with the emergence of turbo codes, sparse-graph codes, and polar codes. The world has seen faster network speeds and greater storage due to many of these developments. Sparse Regression Codes (SPARCs) are a promising new class of codes for achieving the Shannon limits of a communication channel. This monograph presents a unified and comprehensive overview of sparse regression codes, covering theory, algorithms, and practical implementation aspects. Written by recognized experts in the field, it describes the use of SPARCs for efficient communication over AWGN channels, for lossy compression, and for multi-terminal communication. Researchers and students in modern communication and network systems will find Sparse Regression Codes an essential resource in understanding these new techniques that will have a significant impact on such systems in the years to come.
Photonics has long been considered an attractive substrate for next generation implementations of machine-learning concepts. Reservoir Computing tremendously facilitated the realization of recurrent neural networks in analogue hardware. This concept exploits the properties of complex nonlinear dynamical systems, giving rise to photonic reservoirs implemented by semiconductor lasers, telecommunication modulators and integrated photonic chips.
Written by one of the founding fathers of Quantum Information, this book gives an accessible (albeit mathematically rigorous), self-contained introduction to quantum information theory. The central role is played by the concept of quantum channel and its entropic and information characteristics. In this revised edition, the main results have been updated to reflect the most recent developments in this very active field of research.
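The entropic characteristics of channels mentioned above build on the von Neumann entropy of a density matrix; a minimal sketch, assuming states are given as NumPy arrays:

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]      # by convention, 0 * log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: zero entropy
mixed = np.eye(2) / 2                       # maximally mixed qubit: one bit
```

Quantities such as the Holevo capacity of a channel are built from differences of such entropies evaluated on channel outputs.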
This book focuses on information geometry manifolds of structured data/information and their advanced applications featuring new and fruitful interactions between several branches of science: information science, mathematics and physics. It addresses interrelations between different mathematical domains like shape spaces, probability/optimization & algorithms on manifolds, relational and discrete metric spaces, computational and Hessian information geometry, algebraic/infinite dimensional/Banach information manifolds, divergence geometry, tensor-valued morphology, optimal transport theory, manifold & topology learning, and applications like geometries of audio-processing, inverse problems and signal processing. The book collects the most important contributions to the conference GSI'2017 - Geometric Science of Information.
You may like...
Navigating Information Literacy
Theo Bothma, Erica Cosijn, …
Paperback
Reliability Modelling with Information…
S.M. Sunoj, G Rajesh, …
Hardcover
Machine Learning - A First Course for…
Andreas Lindholm, Niklas Wahlstroem, …
Hardcover