This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
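The Hamming codes mentioned in the blurb can be illustrated with a short, self-contained sketch (an illustration, not material from the book): Hamming(7,4) protects four data bits with three parity bits and corrects any single-bit error, because the recomputed parity checks spell out the position of the flipped bit.

```python
# Minimal sketch of Hamming(7,4) single-error correction.
# Codeword layout (1-based positions): p1 p2 d1 p3 d2 d3 d4.

def encode(d):
    # d: list of 4 data bits [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # checks positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # checks positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # checks positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    # Recompute the parities; the syndrome (s3 s2 s1) read as a binary
    # number is the 1-based position of the error, or 0 if none.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c = c[:]
        c[pos - 1] ^= 1        # correct the single flipped bit
    return [c[2], c[4], c[5], c[6]]  # recover d1, d2, d3, d4

msg = [1, 0, 1, 1]
word = encode(msg)
word[4] ^= 1                   # simulate a single-bit channel error
assert decode(word) == msg     # the error is located and corrected
```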
This book offers an introduction to ten key topics in quantum information science and quantum coherent phenomena, aimed at graduate-student level. The chapters cover some of the most recent developments in this dynamic research field where theoretical and experimental physics, combined with computer science, provide a fascinating arena for groundbreaking new concepts in information processing. The book addresses both the theoretical and experimental aspects of the subject, and clearly demonstrates how progress in experimental techniques has stimulated a great deal of theoretical effort and vice versa. Experiments are shifting from simply preparing and measuring quantum states to controlling and manipulating them, and the book outlines how the first real applications, notably quantum key distribution for secure communication, are starting to emerge. The chapters cover quantum retrodiction, ultracold quantum gases in optical lattices, optomechanics, quantum algorithms, quantum key distribution, quantum control based on measurement, orbital angular momentum of light, entanglement theory, trapped ions and quantum metrology, and open quantum systems subject to decoherence. The contributing authors have been chosen not just on the basis of their scientific expertise, but also because of their ability to offer pedagogical and well-written contributions which will be of interest to students and established researchers.
Statistical modeling is a critical tool in scientific research. This book provides comprehensive explanations of the concepts and philosophy of statistical modeling, together with a wide range of practical and numerical examples. The authors expect this work to be of great value not just to statisticians but also to researchers and practitioners in various fields of research such as information science, computer science, engineering, bioinformatics, economics, marketing and environmental science. It's a crucial area of study, as statistical models are used to understand phenomena with uncertainty and to determine the structure of complex systems. They're also used to control such systems, as well as to make reliable predictions in various natural and social science fields.
New Research in Information Behaviour, co-edited by Professor Amanda Spink and Dr. Jannica Heinstrom, provides an understanding of the new directions, leading-edge theories and models in information behaviour. Information behaviour is conceptualized as complex human information-related processes that are embedded within an individual's everyday social and life processes. The book presents chapters by a range of scholars who show new research directions that often challenge the established views and paradigms of information behaviour studies. Beginning with an evolutionary framework, the book builds our understanding of information behaviours over various epochs of human existence, from the Palaeolithic Era and within pre-literate societies to contemporary behaviours by 21st century humans. Drawing upon social and psychological science theories, the book presents a more integrated and holistic understanding of information behaviours. This book is directly relevant to information scientists, information professionals and librarians, social and evolutionary psychologists, social scientists and people interested in understanding more about their own information behaviours.
This book constitutes the refereed post-conference proceedings of the 4th International Conference on Intelligence Science, ICIS 2020, held in Durgapur, India, in February 2021 (originally November 2020). The 23 full papers and 4 short papers presented were carefully reviewed and selected from 42 submissions. One extended abstract is also included. They deal with key issues in brain cognition; uncertainty theory; machine learning; data intelligence; language cognition; vision cognition; perceptual intelligence; intelligent robots; and medical artificial intelligence.
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition:
- Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
- Expanded discussion of results from ergodic theory relevant to information theory
- Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources
- New material on trading off information and distortion, including the Marton inequality
- New material on the properties of optimal and asymptotically optimal source codes
- New material on the relationships of source coding and rate-constrained simulation or modeling of random processes
Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
Originally published in 1995, Large Deviations for Performance Analysis consists of two synergistic parts. The first half develops the theory of large deviations from the beginning, through recent results on the theory for processes with boundaries, keeping to a very narrow path: continuous-time, discrete-state processes. By developing only what is needed for the applications, the theory is kept to a manageable level, both in terms of length and in terms of difficulty. Within its scope, the treatment is detailed, comprehensive and self-contained. As the book shows, there are sufficiently many interesting applications of jump Markov processes to warrant a special treatment. The second half is a collection of applications developed at Bell Laboratories. The applications cover large areas of the theory of communication networks: circuit-switched transmission, packet transmission, multiple access channels, and the M/M/1 queue. Aspects of parallel computation are covered as well, including basics of job allocation, rollback-based parallel simulation, assorted priority queueing models that might be used in performance models of various computer architectures, and asymptotic coupling of processors. These applications are thoroughly analysed using the tools developed in the first half of the book.
As digital transformations continue to accelerate in the world, discourses of big data have come to dominate in a number of fields, from politics and economics, to media and education. But how can we really understand the digital world when so much of the writing through which we grapple with it remains deeply problematic? In a compelling new work of feminist critical theory, Bassett, Kember and O'Riordan scrutinise many of the assumptions of a masculinist digital world, highlighting the tendency of digital humanities scholarship to venerate and essentialise technical forms, and to adopt gendered writing and citation practices. Contesting these writings, practices and politics, the authors foreground feminist traditions and contributions to the field, offering alternative modes of knowledge production, and a radically different, poetic writing style. Through this prism, Furious brings into focus themes including the automation of home and domestic work, the Anthropocene, and intersectional feminist technofutures.
This is an examination of information policy in environmental sustainability. It covers issues such as information, knowledge and models; environmental regulation; and information policy.
The organizational world today has been characterized in various terms - turmoil, chaos, the age of paradox and unreason. Common to all these characterizations is that the conventional wisdom fails in responding to novel challenges triggered by the pervasive and radical change of organizations. Information, knowledge, the information worker and information technology are at the epicenter of these changes and surprises. This book explores new organizational designs, such as the network and virtual organization, from the information perspective. In addition, it proposes a model of the nontraditional organization in which information work evolves around teams that directly serve customers. This model was put to the test, and elements of the nontraditional organization were identified in firms that have been around for quite some time - the public accounting industry, and specifically its technologically most advanced segment. The book aims at transferring experience and facilitating interest in methods of organizing suitable for the information age.
Psychological research into human cognition and judgment reveals a wide range of biases and shortcomings. Whether we form impressions of other people, recall episodes from memory, report our attitudes in an opinion poll, or make important decisions, we often get it wrong. The errors made are not trivial and often seem to violate common sense and basic logic. A closer look at the underlying processes, however, suggests that many of the well known fallacies do not necessarily reflect inherent shortcomings of human judgment. Rather, they partially reflect that research participants bring the tacit assumptions that govern the conduct of conversation in daily life to the research situation. According to these assumptions, communicated information comes with a guarantee of relevance and listeners are entitled to assume that the speaker tries to be informative, truthful, relevant, and clear. Moreover, listeners interpret the speakers' utterances on the assumption that they are trying to live up to these ideals. This book introduces social science researchers to the "logic of conversation" developed by Paul Grice, a philosopher of language, who proposed the cooperative principle and a set of maxims on which conversationalists implicitly rely. The author applies this framework to a wide range of topics, including research on person perception, decision making, and the emergence of context effects in attitude measurement and public opinion research. Experimental studies reveal that the biases generally seen in such research are, in part, a function of violations of Gricean conversational norms. The author discusses implications for the design of experiments and questionnaires and addresses the socially contextualized nature of human judgment.
This book opens a novel dimension in the 50-year history of the mathematical theory of "information" since the birth of Shannon theory. First of all, it introduces, in place of the traditional notions of entropy and mutual information, the completely new and highly unconventional approach of the "information-spectrum" as a basic but powerful tool for constructing the general theory of information. Reconstructing step-by-step all the essential major topics in information theory from the viewpoint of such an "information-spectrum", this comprehensive work provides an accessible introduction to the new type of mathematical theory of information that focuses mainly on general nonstationary and/or nonergodic sources and channels, in clear contrast with the traditional theories of information. This book is a new non-traditional theoretical reference for communication professionals and statisticians specializing in information theory.
This volume contains a selection of original contributions from internationally reputed scholars in the field of risk management in socio-technical systems with high hazard potential. Its first major section addresses fundamental psychological and socio-technical concepts in the field of risk perception, risk management and learning systems for safety improvement. The second section deals with the variety of procedures for system safety analysis. It covers strategies for analyzing automation problems and safety culture, as well as the analysis of social dynamics in field settings and of field experiments. Its third part then illustrates the utilization of basic concepts and analytic approaches by way of case studies of designing man-machine systems in various industrial sectors such as intensive care wards, aviation, offshore oil drilling and the chemical industry. In linking basic theoretical conceptual notions and analytic strategies to detailed case studies in the area of hazardous work organizations, the volume differs from and complements more theoretical works such as Human Error (J. Reason, 1990) and more general approaches such as New Technologies and Human Error (J. Rasmussen, K. Duncan, J. Leplat, Eds.).
A black swan is a highly improbable event with three principal characteristics: it is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was. The astonishing success of Google was a black swan; so was 9/11. For Nassim Nicholas Taleb, black swans underlie almost everything about our world, from the rise of religions to events in our own personal lives.
In The State of State Theory: State Projects, Repression, and Multi-Sites of Power, Glasberg, Willis, and Shannon argue that state theories should be amended to account both for theoretical developments broadly in the contemporary period as well as the multiple sites of power along which the state governs. Using state projects and policies around political economy, sexuality and family, food, welfare policy, racial formation, and social movements as narrative accounts in how the state operates, the authors argue for a complex and intersectional approach to state theory. In doing so, they expand outside of the canon to engage with perspectives within critical race theory, queer theory, and beyond to build theoretical tools for a contemporary and critical state theory capable of providing the foundations for understanding how the state governs, what is at stake in its governance, and, importantly, how people resist and engage with state power.
Ecosystems, the human brain, ant colonies, and economic networks are all complex systems displaying collective behaviour, or emergence, beyond the sum of their parts. Complexity science is the systematic investigation of these emergent phenomena, and stretches across disciplines, from physics and mathematics to biological and social sciences. This introductory textbook provides detailed coverage of this rapidly growing field, accommodating readers from a variety of backgrounds and with varying levels of mathematical skill. Part I presents the underlying principles of complexity science, to ensure students have a solid understanding of the conceptual framework. The second part introduces the key mathematical tools central to complexity science, gradually developing the mathematical formalism, with more advanced material provided in boxes. A broad range of end-of-chapter problems and extended projects offer opportunities for homework assignments and student research projects, with solutions available to instructors online. Key terms are highlighted in bold and listed in a glossary for easy reference, while annotated reading lists offer the option for extended reading and research.
This volume aims to pique the interest of many researchers in the fields of infinite dimensional analysis and quantum probability. These fields have undergone increasingly significant developments and have found many new applications, in particular to classical probability and to different branches of physics. These fields are rather wide and of a strongly interdisciplinary nature. For such a purpose, we strove to build bridges among these interdisciplinary fields in our Workshop on IDAQP and their Applications, held at the Institute for Mathematical Sciences, National University of Singapore, from 3-7 March 2014. Readers will find that this volume contains all the exciting contributions by well-known researchers in search of new directions in these fields.
The Science of Deep Learning emerged from courses taught by the author that have provided thousands of students with training and experience for their academic studies, and prepared them for careers in deep learning, machine learning, and artificial intelligence in top companies in industry and academia. The book begins by covering the foundations of deep learning, followed by key deep learning architectures. Subsequent parts on generative models and reinforcement learning may be used as part of a deep learning course or as part of a course on each topic. The book includes state-of-the-art topics such as Transformers, graph neural networks, variational autoencoders, and deep reinforcement learning, with a broad range of applications. The appendices provide equations for computing gradients in backpropagation and optimization, and best practices in scientific writing and reviewing. The text presents an up-to-date guide to the field built upon clear visualizations using a unified notation and equations, lowering the barrier to entry for the reader. The accompanying website provides complementary code and hundreds of exercises with solutions.
This book is about the definition of the Shannon measure of information, and some derived quantities such as conditional information and mutual information. Unlike many books, which refer to Shannon's Measure of Information (SMI) as 'Entropy', this book makes a clear distinction between the SMI and entropy. In the last chapter, entropy is derived as a special case of SMI. Ample examples are provided which help the reader in understanding the different concepts discussed in this book. As with previous books by the author, this book aims at a clear and mystery-free presentation of the central concept in information theory - Shannon's Measure of Information. This book presents the fundamental concepts of information theory in friendly, simple language, and is devoid of the fancy and pompous statements made by authors of popular science books who write on this subject. It is unique in its presentation of Shannon's measure of information, and the clear distinction between this concept and the thermodynamic entropy. Although some mathematical knowledge is required of the reader, the emphasis is on the concepts and their meaning rather than on the mathematical details of the theory.
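The central quantity the blurb describes can be stated concretely. As a rough sketch (an illustration under the standard definition, not an excerpt from the book), the SMI of a discrete probability distribution p_1, ..., p_n is -sum p_i log2 p_i, measured in bits:

```python
import math

# Sketch of Shannon's Measure of Information (SMI) for a discrete
# distribution, in bits. The helper name `smi` is our own choice.

def smi(probs):
    # SMI = -sum p_i * log2(p_i), with the convention 0 * log(0) = 0
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(smi([0.5, 0.5]))   # a fair coin carries exactly 1 bit per toss
print(smi([0.9, 0.1]))   # a biased coin carries less, about 0.469 bits
print(smi([1.0]))        # a certain outcome carries no information: 0
```

The thermodynamic entropy the book contrasts with SMI arises, in the author's treatment, as a special case of this quantity applied to the distribution of microstates.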
This text explores what speech, music and other sounds have in common. It gives a description of the way perspective, rhythm, textual quality and other aspects of sound are used to communicate emotion and meaning. It draws on a wealth of examples from radio (disc jockey and newsreading speech, radio plays, advertising jingles, news signature tunes), film soundtracks, such as The Piano, The X Files and Disney animation films, music ranging from medieval plain chant to drum'n'bass, and everyday soundscapes.
The book is a collection of papers by experts in the fields of information and complexity. Information is a basic structure of the world, while complexity is a fundamental property of systems and processes, and there are intrinsic relations between the two. Research in information theory, the theory of complexity and their interrelations is very active. The book expands knowledge on information, complexity and their relations, representing the most recent and advanced studies and achievements in this area. The goal of the book is to present the topic from different perspectives - mathematical, informational, philosophical, methodological, etc.
This book aims to synthesize different directions in knowledge studies into a unified theory of knowledge and knowledge processes. It explicates important relations between knowledge and information, and provides readers with an understanding of the essence and structure of knowledge, explicating the operations and processes that are based on knowledge and vital for society. The book also highlights how the theory of knowledge paves the way for more advanced design and utilization of computers and networks.
Interpersonal relationships are the core of our societal system and have been since before the dawn of civilization. In today's world, friends, lovers, companions, and confidants make valuable contributions to our everyday lives. These are the relationships whose members are not automatically participants as a result of their birth and kin affiliations. The focus is on those relationships that must be forged from a sometimes indifferent, sometimes hostile world. Yet there is still much that is not known about how these relationships evolve, how partners communicate in ongoing relationships, how people keep their relationships together, and how they cope when they fall apart. Central to this book is the underlying theme of evolving interpersonal relationships from the initial encounter to the mature alliance.
The book is a unique exploration of a spectrum of unexpected analogs to psychopathologies likely to afflict real-time critical systems, written by a specialist in the epidemiology of mental disorders. Its purpose is to develop a set of information-theoretic statistical tools for analyzing the instabilities of real-time cognitive systems at varying scales and levels of organization, with special focus on high-level machine function. The book should be of particular interest to industry and academic scientists, and to government regulators, concerned with driverless cars on intelligent roads; many of the same concerns also afflict high-end automated weapons systems. It should appeal to students, researchers, and industrial and governmental administrators facing the design, operation, and maintenance of real-time critical systems ranging across manufacturing facilities, transportation, finance, and military operations.