This book constitutes the refereed post-conference proceedings of the 4th International Conference on Intelligence Science, ICIS 2020, held in Durgapur, India, in February 2021 (originally November 2020). The 23 full papers and 4 short papers presented were carefully reviewed and selected from 42 submissions. One extended abstract is also included. They deal with key issues in brain cognition; uncertain theory; machine learning; data intelligence; language cognition; vision cognition; perceptual intelligence; intelligent robot; and medical artificial intelligence.
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

New in this edition:
- Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
- Expanded discussion of results from ergodic theory relevant to information theory
- Expanded treatment of B-processes - processes formed by stationary coding of memoryless sources
- New material on trading off information and distortion, including the Marton inequality
- New material on the properties of optimal and asymptotically optimal source codes
- New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
This is an examination of information policy in environmental sustainability. It covers issues such as information, knowledge and models; environmental regulation; and information policy.
The organizational world today has been characterized in various terms: turmoil, chaos, the age of paradox and unreason. Common to all these characterizations is that conventional wisdom fails in responding to novel challenges triggered by the pervasive and radical change of organizations. Information, knowledge, information workers and information technology are at the epicenter of these changes and surprises. This book explores new organizational designs, such as the network and virtual organization, from the information perspective. In addition, it proposes a model of the nontraditional organization in which information work revolves around teams that directly serve customers. This model was put to the test, and elements of the nontraditional organization were identified in firms that have been around for quite some time: the public accounting industry, and specifically its technologically most advanced segment. The book aims at transferring experience and facilitating interest in methods of organizing suitable for the information age.
This book opens a novel dimension in the 50-year history of the mathematical theory of "information" since the birth of Shannon theory. First of all, in place of the traditional notions of entropy and mutual information, it introduces the completely new and highly unconventional approach of the "information-spectrum" as a basic but powerful tool for constructing the general theory of information. Reconstructing step by step all the essential major topics in information theory from the viewpoint of this "information-spectrum", this comprehensive work provides an accessible introduction to a new type of mathematical theory of information that focuses mainly on general nonstationary and/or nonergodic sources and channels, in clear contrast with the traditional theories of information. This book is a new, non-traditional theoretical reference for communication professionals and statisticians specializing in information theory.
This volume contains a selection of original contributions from internationally reputed scholars in the field of risk management in socio-technical systems with high hazard potential. Its first major section addresses fundamental psychological and socio-technical concepts in the field of risk perception, risk management and learning systems for safety improvement. The second section deals with the variety of procedures for system safety analysis. It covers strategies for analyzing automation problems and safety culture, as well as the analysis of social dynamics in field settings and of field experiments. Its third part then illustrates the utilization of basic concepts and analytic approaches by way of case studies of designing man-machine systems in various industrial sectors such as intensive care wards, aviation, offshore oil drilling and the chemical industry. In linking basic theoretical conceptual notions and analytic strategies to detailed case studies in the area of hazardous work organizations, the volume differs from and complements more theoretical works such as Human Error (J. Reason, 1990) and more general approaches such as New Technologies and Human Error (J. Rasmussen, K. Duncan, J. Leplat, Eds.).
The reach of algebraic curves in cryptography goes far beyond elliptic curve or public key cryptography, yet these other application areas have not been systematically covered in the literature. Addressing this gap, Algebraic Curves in Cryptography explores the rich uses of algebraic curves in a range of cryptographic applications, such as secret sharing, frameproof codes, and broadcast encryption. Suitable for researchers and graduate students in mathematics and computer science, this self-contained book is one of the first to focus on many topics in cryptography involving algebraic curves. After supplying the necessary background on algebraic curves, the authors discuss error-correcting codes, including algebraic geometry codes, and provide an introduction to elliptic curves. Each chapter in the remainder of the book deals with a selected topic in cryptography (other than elliptic curve cryptography). The topics covered include secret sharing schemes, authentication codes, frameproof codes, key distribution schemes, broadcast encryption, and sequences. Chapters begin with introductory material before featuring the application of algebraic curves.
Originally published in 1995, Large Deviations for Performance Analysis consists of two synergistic parts. The first half develops the theory of large deviations from the beginning, through recent results on the theory for processes with boundaries, keeping to a very narrow path: continuous-time, discrete-state processes. By developing only what is needed for the applications, the theory is kept to a manageable level, both in terms of length and in terms of difficulty. Within its scope, the treatment is detailed, comprehensive and self-contained. As the book shows, there are sufficiently many interesting applications of jump Markov processes to warrant a special treatment. The second half is a collection of applications developed at Bell Laboratories. The applications cover large areas of the theory of communication networks: circuit switched transmission, packet transmission, multiple access channels, and the M/M/1 queue. Aspects of parallel computation are covered as well, including basics of job allocation, rollback-based parallel simulation, assorted priority queueing models that might be used in performance models of various computer architectures, and asymptotic coupling of processors. These applications are thoroughly analysed using the tools developed in the first half of the book.
In The State of State Theory: State Projects, Repression, and Multi-Sites of Power, Glasberg, Willis, and Shannon argue that state theories should be amended to account both for theoretical developments broadly in the contemporary period as well as the multiple sites of power along which the state governs. Using state projects and policies around political economy, sexuality and family, food, welfare policy, racial formation, and social movements as narrative accounts in how the state operates, the authors argue for a complex and intersectional approach to state theory. In doing so, they expand outside of the canon to engage with perspectives within critical race theory, queer theory, and beyond to build theoretical tools for a contemporary and critical state theory capable of providing the foundations for understanding how the state governs, what is at stake in its governance, and, importantly, how people resist and engage with state power.
This volume aims to pique the interest of the many researchers in the fields of infinite-dimensional analysis and quantum probability. These fields have undergone increasingly significant developments and have found many new applications, in particular to classical probability and to different branches of physics. These fields are rather wide and strongly interdisciplinary in nature. To this end, we strove to build bridges among these interdisciplinary fields in our Workshop on IDAQP and their Applications, held at the Institute for Mathematical Sciences, National University of Singapore, from 3 to 7 March 2014. Readers will find that this volume contains all the exciting contributions by well-known researchers in search of new directions in these fields.
This book is about the definition of the Shannon measure of information and some derived quantities, such as conditional information and mutual information. Unlike many books, which refer to Shannon's Measure of Information (SMI) as "entropy," this book makes a clear distinction between the SMI and entropy. In the last chapter, entropy is derived as a special case of SMI. Ample examples are provided to help the reader understand the different concepts discussed in this book. As with previous books by the author, this book aims at a clear and mystery-free presentation of the central concept in information theory: Shannon's Measure of Information. It presents the fundamental concepts of information theory in simple, friendly language, devoid of the fancy and pompous statements made by authors of popular science books who write on this subject. It is unique in its presentation of Shannon's measure of information and the clear distinction between this concept and the thermodynamic entropy. Although some mathematical knowledge is required of the reader, the emphasis is on the concepts and their meaning rather than on the mathematical details of the theory.
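To give a concrete flavor of the concept this blurb centers on (a sketch of our own, not taken from the book): for a discrete distribution p, Shannon's measure of information is SMI(p) = -sum_i p_i log2 p_i, computable in a few lines of Python.

```python
import math

def smi(probs):
    """Shannon's measure of information (SMI), in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(smi([0.5, 0.5]))   # a fair coin carries exactly 1 bit
print(smi([1.0]))        # a certain outcome carries no information
print(smi([0.9, 0.1]))   # a biased coin carries less than 1 bit
```

The maximum, log2(n) bits for n equally likely outcomes, and the drop under bias are the properties the book builds on before relating SMI to thermodynamic entropy.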
This book provides the most recent methodological advances in applying advanced modeling techniques to road pricing. It is valuable for any academic or professional reader interested in, or involved in, implementing road pricing.
Interpersonal relationships are the core of our societal system and have been since before the dawn of civilization. In today's world, friends, lovers, companions, and confidants make valuable contributions to our everyday lives. These are the relationships whose members are not automatically participants as a result of their birth and kin affiliations; the focus is on relationships that must be forged in a world that is sometimes indifferent and sometimes hostile. Yet there is still much that is not known about how these relationships evolve, how partners communicate in ongoing relationships, how people keep their relationships together, and how they cope when they fall apart. Central to this book is the underlying theme of evolving interpersonal relationships from the initial encounter to the mature alliance.
The book is a collection of papers by experts in the fields of information and complexity. Information is a basic structure of the world, while complexity is a fundamental property of systems and processes. There are intrinsic relations between information and complexity. Research in information theory, the theory of complexity, and their interrelations is very active. The book will expand knowledge on information, complexity and their relations, representing the most recent and advanced studies and achievements in this area. The goal of the book is to present the topic from different perspectives - mathematical, informational, philosophical, methodological, etc.
As economic activity has become more information-intensive and ideas about the information society have been canvassed widely, information technology has overshadowed thinking about the role of communication and information. In the advanced economies, investment in information-handling equipment has grown rapidly in importance, and almost throughout the world telecommunications facilities are advocated as the leading edge of development. This wide-ranging collection charts the responses of the economics discipline to these changes, initially slow but gathering pace, as communication and information have moved from the sidelines to centre stage. This book will be an indispensable reference source for all those in the economics community and those interested in information science, library studies and communication.
This book aims to synthesize different directions in knowledge studies into a unified theory of knowledge and knowledge processes. It explicates important relations between knowledge and information, and provides readers with an understanding of the essence and structure of knowledge, explicating operations and processes that are based on knowledge and vital for society. The book also highlights how the theory of knowledge paves the way for more advanced design and utilization of computers and networks.
The book is a unique exploration of a spectrum of unexpected analogs to psychopathologies likely to afflict real-time critical systems, written by a specialist in the epidemiology of mental disorders. Its purpose is to develop a set of information-theoretic statistical tools for analyzing the instabilities of real-time cognitive systems at varying scales and levels of organization, with special focus on high-level machine function. The book should be of particular interest to industry and academic scientists, as well as government regulators, concerned with driverless cars on intelligent roads. Many of the same concerns also afflict high-end automated weapons systems. The book should appeal to students, researchers, and industrial and governmental administrators facing the design, operation, and maintenance of real-time critical systems ranging across manufacturing facilities, transportation, finance, and military operations.
"It can be used as supplementary material for teaching thermodynamics and statistical physics at an undergraduate or postgraduate level and can be a great read for undergraduate and postgraduate students of sciences and engineering." - Contemporary Physics

In this unique book, the reader is invited to experience the joy of appreciating something which has eluded understanding for many years: entropy and the Second Law of Thermodynamics. The book has a two-pronged message: first, that the Second Law is not infinitely incomprehensible, as commonly stated in most textbooks on thermodynamics, but can in fact be comprehended through sheer common sense; and second, that entropy is not a mysterious quantity that has resisted understanding but a simple, familiar and easily comprehensible concept. Written in an accessible style, the book guides the reader through an abundance of dice games and examples from everyday life. The author paves the way for readers to discover for themselves what entropy is, how it changes, and, most importantly, why it always changes in one direction in a spontaneous process. In this new edition, seven simulated games are included so that the reader can actually experiment with the games described in the book. These simulated games are meant to enhance the reader's understanding and sense of joy upon discovering the Second Law of Thermodynamics. All errors in the previous edition have been corrected, and a whole new section (7.7) has been added in which the meaning of entropy is explained in simple language.
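The dice-game argument can be made concrete with a few lines of code (a sketch of our own, not one of the book's seven simulated games): with n two-state "dice", the number of microstates in the macrostate "k dice show a given face" is the binomial coefficient C(n, k), and the mixed, half-and-half macrostate overwhelmingly dominates, which is why a spontaneous process drifts toward it.

```python
import math

n = 100  # number of two-state "dice"

# multiplicity of each macrostate "k dice show the marked face": C(n, k)
counts = [math.comb(n, k) for k in range(n + 1)]

most_likely = counts.index(max(counts))
print(most_likely)                   # 50: the half-and-half macrostate
print(counts[most_likely] / 2 ** n)  # its probability, roughly 0.08
```

Even though any single half-and-half configuration is no likelier than the all-same one, the half-and-half macrostate contains vastly more microstates, and for large n virtually all probability concentrates near it.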
Information Theory is studied from the following points of view: (1) the theory of entropy as amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties are examined, where the latter entropy is extended to be a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied as well as AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the environment of functional analysis and operator theory as well as probability theory. Ergodic, mixing, and AMS channels are also considered in detail with some illustrations. In this second edition, channel operators are studied in many aspects, which generalize ordinary channels. Also Gaussian channels are considered in detail together with Gaussian measures on a Hilbert space. The Special Topics chapter deals with features such as generalized capacity, channels with an intermediate noncommutative system, and the von Neumann algebra method for channels. Finally, quantum (noncommutative) information channels are examined in an independent chapter, which may be regarded as an introduction to quantum information theory. Von Neumann entropy is introduced and its generalization to a C*-algebra setting is given. Basic results on quantum channels and entropy transmission are also considered.
Big Data and Information Theory are a binding force between various areas of knowledge that allow for societal advancement. The rapid development of data analytics and information theory allows companies to store vast amounts of information about production, inventory, service, and consumer activities. More powerful CPUs and cloud computing make it possible to do complex optimization instead of using heuristic algorithms, as well as instant rather than offline decision-making. The era of "big data" brings challenges that include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. Big data calls for better integration of optimization, statistics, and data mining. In response to these challenges this book brings together leading researchers and engineers to exchange and share their experiences and research results about big data and information theory applications in various areas. This book covers a broad range of topics including statistics, data mining, data warehouse implementation, engineering management in large-scale infrastructure systems, data-driven sustainable supply chain networks, information technology service offshoring project issues, online rumors governance, preliminary cost estimation, and information system project selection. The chapters in this book were originally published in the International Journal of Management Science and Engineering Management.
The aim of this book is to explain in simple language what we know and what we do not know about information and entropy - two of the most frequently discussed topics in recent literature - and whether they are relevant to life and the entire universe. Entropy is commonly interpreted as a measure of disorder. This interpretation has caused a great amount of "disorder" in the literature. One of the aims of this book is to put some "order" in this "disorder". The book explains with a minimum amount of mathematics what information theory is and how it is related to thermodynamic entropy. Then it critically examines the application of these concepts to the question of "What is life?" and whether or not they can be applied to the entire universe.
Psychological research into human cognition and judgment reveals a wide range of biases and shortcomings. Whether we form impressions of other people, recall episodes from memory, report our attitudes in an opinion poll, or make important decisions, we often get it wrong. The errors made are not trivial and often seem to violate common sense and basic logic. A closer look at the underlying processes, however, suggests that many of the well known fallacies do not necessarily reflect inherent shortcomings of human judgment. Rather, they partially reflect that research participants bring the tacit assumptions that govern the conduct of conversation in daily life to the research situation. According to these assumptions, communicated information comes with a guarantee of relevance and listeners are entitled to assume that the speaker tries to be informative, truthful, relevant, and clear. Moreover, listeners interpret the speakers' utterances on the assumption that they are trying to live up to these ideals. This book introduces social science researchers to the "logic of conversation" developed by Paul Grice, a philosopher of language, who proposed the cooperative principle and a set of maxims on which conversationalists implicitly rely. The author applies this framework to a wide range of topics, including research on person perception, decision making, and the emergence of context effects in attitude measurement and public opinion research. Experimental studies reveal that the biases generally seen in such research are, in part, a function of violations of Gricean conversational norms. The author discusses implications for the design of experiments and questionnaires and addresses the socially contextualized nature of human judgment.
Your software needs to leverage multiple cores, handle thousands of users and terabytes of data, and continue working in the face of both hardware and software failure. Concurrency and parallelism are the keys, and Seven Concurrency Models in Seven Weeks equips you for this new world. See how emerging technologies such as actors and functional programming address issues with traditional threads-and-locks development. Learn how to exploit the parallelism in your computer's GPU and leverage clusters of machines with MapReduce and Stream Processing. And do it all with the confidence that comes from using tools that help you write crystal clear, high-quality code. This book will show you how to exploit different parallel architectures to improve your code's performance, scalability, and resilience. Learn about the perils of traditional threads-and-locks programming and how to overcome them through careful design and by working with the standard library. See how actors enable software running on geographically distributed computers to collaborate, handle failure, and create systems that stay up 24/7/365. Understand why shared mutable state is the enemy of robust concurrent code, and see how functional programming together with technologies such as Software Transactional Memory (STM) and automatic parallelism help you tame it. You'll learn about the untapped potential within every GPU and how GPGPU software can unleash it. You'll see how to use MapReduce to harness massive clusters to solve previously intractable problems, and how, in concert with Stream Processing, big data can be tamed. With an understanding of the strengths and weaknesses of each of the different models and hardware architectures, you'll be empowered to tackle any problem with confidence. What You Need: The example code can be compiled and executed on *nix, OS X, or Windows. Instructions on how to download the supporting build systems are given in each chapter.
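As a toy illustration of the MapReduce model the blurb mentions (a single-process Python sketch of our own, not the book's example code): a map phase turns each input record into partial per-key counts, and a reduce phase merges them. Real systems run both phases distributed across a cluster.

```python
from collections import Counter
from functools import reduce

def mapper(line):
    # map phase: turn one input record into partial (word, count) pairs
    return Counter(line.lower().split())

def reducer(acc, partial):
    # reduce phase: merge partial counts (Counter addition sums per key)
    return acc + partial

lines = ["the quick brown fox", "the lazy dog", "the fox"]
total = reduce(reducer, map(mapper, lines), Counter())
print(total["the"])  # 3
print(total["fox"])  # 2
```

Because the reducer is associative, partial results from different machines can be merged in any grouping, which is what lets the model scale out.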