Welcome to Loot.co.za!
Information theory is studied from the following points of view: (1) the theory of entropy as a measure of the amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties are examined, and the latter entropy is extended to a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as well as AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the environment of functional analysis and operator theory as well as probability theory. Ergodic, mixing, and AMS channels are also considered in detail with some illustrations. In this second edition, channel operators, which generalize ordinary channels, are studied in many aspects. Gaussian channels are also considered in detail, together with Gaussian measures on a Hilbert space. The Special Topics chapter deals with features such as generalized capacity, channels with an intermediate noncommutative system, and the von Neumann algebra method for channels. Finally, quantum (noncommutative) information channels are examined in an independent chapter, which may be regarded as an introduction to quantum information theory. Von Neumann entropy is introduced and its generalization to a C*-algebra setting is given. Basic results on quantum channels and entropy transmission are also considered.
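The Shannon entropy mentioned above has a short, concrete definition; the following minimal Python sketch (not taken from the book, function name ours) computes it for a discrete probability distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i)) over p_i > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, hence carries less information.
print(shannon_entropy([0.9, 0.1]))
```

Entropy is maximized by the uniform distribution and drops to zero for a certain outcome, which is the sense in which it measures "amount of information."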
Our world and the people within it are increasingly interpreted and classified by automated systems. At the same time, automated classifications influence what happens in the physical world. These entanglements change what it means to interact with governance, and shift what elements of our identity are knowable and meaningful. In this cyber-physical world, or 'world state', what is the role for law? Specifically, how should law address the claim that computational systems know us better than we know ourselves? Monitoring Laws traces the history of government profiling from the invention of photography through to emerging applications of computer vision for personality and behavioral analysis. It asks what dimensions of profiling have provoked legal intervention in the past, and what is different about contemporary profiling that requires updating our legal tools. This work should be read by anyone interested in how computation is changing society and governance, and what it is about people that law should protect in a computational world.
'It can be used as a supplementary material for teaching thermodynamics and statistical physics at an undergraduate or postgraduate level and can be a great read for undergraduate and postgraduate students of Sciences and Engineering.' (Contemporary Physics) In this unique book, the reader is invited to experience the joy of appreciating something which has eluded understanding for many years: entropy and the Second Law of Thermodynamics. The book has a two-pronged message: first, that the Second Law is not infinitely incomprehensible, as commonly stated in most textbooks on thermodynamics, but can, in fact, be comprehended through sheer common sense; and second, that entropy is not a mysterious quantity that has resisted understanding but a simple, familiar and easily comprehensible concept. Written in an accessible style, the book guides the reader through an abundance of dice games and examples from everyday life. The author paves the way for readers to discover for themselves what entropy is, how it changes, and, most importantly, why it always changes in one direction in a spontaneous process. In this new edition, seven simulated games are included so that the reader can actually experiment with the games described in the book. These simulated games are meant to enhance the readers' understanding and sense of joy upon discovering the Second Law of Thermodynamics. All errors in the previous edition have been corrected, and a whole new section (7.7) has been added in which the meaning of entropy is explained in simple language.
Information we receive from and create together with our social networks is becoming increasingly important. Social information has a great impact on our information behaviour, and there are many possible angles and layers in studying social aspects in information science. This book presents some of these angles. Social Information Research, co-edited by Gunilla Widen and Kim Holmberg, communicates current research looking into different aspects of social information as part of information behaviour research. There is a special emphasis on the new innovations supporting contemporary information behaviour and the social media context in which it sits. As a concept, social information has been studied in biology, psychology and sociology, among other disciplines. This book is relevant for various actors in the library and information science field and will be useful for researchers, educators, and practitioners, both coordinating empirical research on social information and providing an overview of some of the present research on the topic.
Thirty years after its publication, Marshall McLuhan's The Medium is the Massage remains his most entertaining, provocative, and piquant book. With every technological and social "advance", McLuhan's proclamation that "the media work us over completely" becomes more evident and plain. In his words, "So pervasive are they in their personal, political, economic, aesthetic, psychological, moral, ethical and social consequences that they leave no part of us untouched, unaffected, or unaltered." McLuhan's remarkable observation that "societies have always been shaped more by the nature of the media by which men communicate than by the content of the communication" is undoubtedly more relevant today than ever before. With the rise of the internet and the explosion of the digital revolution, there has never been a better time to revisit Marshall McLuhan.
The aim of this book is to explain in simple language what we know and what we do not know about information and entropy - two of the most frequently discussed topics in recent literature - and whether they are relevant to life and the entire universe. Entropy is commonly interpreted as a measure of disorder. This interpretation has caused a great amount of 'disorder' in the literature. One of the aims of this book is to put some 'order' in this 'disorder'. The book explains with a minimum amount of mathematics what information theory is and how it is related to thermodynamic entropy. Then it critically examines the application of these concepts to the question of 'What is life?' and whether or not they can be applied to the entire universe.
Your software needs to leverage multiple cores, handle thousands of users and terabytes of data, and continue working in the face of both hardware and software failure. Concurrency and parallelism are the keys, and Seven Concurrency Models in Seven Weeks equips you for this new world. See how emerging technologies such as actors and functional programming address issues with traditional threads-and-locks development. Learn how to exploit the parallelism in your computer's GPU and leverage clusters of machines with MapReduce and Stream Processing. And do it all with the confidence that comes from using tools that help you write crystal clear, high-quality code. This book will show you how to exploit different parallel architectures to improve your code's performance, scalability, and resilience. Learn about the perils of traditional threads-and-locks programming and how to overcome them through careful design and by working with the standard library. See how actors enable software running on geographically distributed computers to collaborate, handle failure, and create systems that stay up 24/7/365. Understand why shared mutable state is the enemy of robust concurrent code, and see how functional programming together with technologies such as Software Transactional Memory (STM) and automatic parallelism help you tame it. You'll learn about the untapped potential within every GPU and how GPGPU software can unleash it. You'll see how to use MapReduce to harness massive clusters to solve previously intractable problems, and how, in concert with Stream Processing, big data can be tamed. With an understanding of the strengths and weaknesses of each of the different models and hardware architectures, you'll be empowered to tackle any problem with confidence. What You Need: The example code can be compiled and executed on *nix, OS X, or Windows. Instructions on how to download the supporting build systems are given in each chapter.
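The MapReduce idea the blurb refers to can be sketched in a few lines. This is our own illustrative Python example (the book itself works across several languages): a word count in which each map step owns its data and the reduce step merges results associatively, so no shared mutable state is ever touched.

```python
from collections import Counter
from functools import reduce

# Map step: each chunk is counted independently; no shared mutable state.
def count_words(chunk):
    return Counter(chunk.split())

# Reduce step: partial counts are merged associatively, so the merge
# order (and hence the degree of parallelism) does not matter.
def merge(a, b):
    return a + b

chunks = ["to be or not to be", "to err is human"]
total = reduce(merge, map(count_words, chunks), Counter())
print(total["to"])  # 3
```

Because `count_words` is a pure function, the `map` call could be swapped for a process pool or a cluster scheduler without changing the result, which is exactly what makes the model resilient and scalable.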
Psychological research into human cognition and judgment reveals a wide range of biases and shortcomings. Whether we form impressions of other people, recall episodes from memory, report our attitudes in an opinion poll, or make important decisions, we often get it wrong. The errors made are not trivial and often seem to violate common sense and basic logic. A closer look at the underlying processes, however, suggests that many of the well known fallacies do not necessarily reflect inherent shortcomings of human judgment. Rather, they partially reflect that research participants bring the tacit assumptions that govern the conduct of conversation in daily life to the research situation. According to these assumptions, communicated information comes with a guarantee of relevance and listeners are entitled to assume that the speaker tries to be informative, truthful, relevant, and clear. Moreover, listeners interpret the speakers' utterances on the assumption that they are trying to live up to these ideals. This book introduces social science researchers to the "logic of conversation" developed by Paul Grice, a philosopher of language, who proposed the cooperative principle and a set of maxims on which conversationalists implicitly rely. The author applies this framework to a wide range of topics, including research on person perception, decision making, and the emergence of context effects in attitude measurement and public opinion research. Experimental studies reveal that the biases generally seen in such research are, in part, a function of violations of Gricean conversational norms. The author discusses implications for the design of experiments and questionnaires and addresses the socially contextualized nature of human judgment.
We are often told that we are "living in an information society" or that we are "information workers." But what exactly do these claims mean, and how might they be verified? In this important methodological study, Alistair S. Duff cuts through the rhetoric to get to the bottom of the "information society thesis." Wide-ranging in coverage, this study will be of interest to scholars in information science, communication and media studies and social theory. It is a key text for the newly-unified specialism of information society studies, and an indispensable guide to the future of this discipline.
Big Data and Information Theory are a binding force between various areas of knowledge that allow for societal advancement. Rapid development of data analytics and information theory allows companies to store vast amounts of information about production, inventory, service, and consumer activities. More powerful CPUs and cloud computing make it possible to do complex optimization instead of using heuristic algorithms, as well as instant rather than offline decision-making. The era of big data poses challenges that include analysis, capture, curation, search, sharing, storage, transfer, visualization, and the prevention of privacy violations. Big data calls for better integration of optimization, statistics, and data mining. In response to these challenges, this book brings together leading researchers and engineers to exchange and share their experiences and research results about big data and information theory applications in various areas. This book covers a broad range of topics including statistics, data mining, data warehouse implementation, engineering management in large-scale infrastructure systems, data-driven sustainable supply chain networks, information technology service offshoring project issues, online rumour governance, preliminary cost estimation, and information system project selection. The chapters in this book were originally published in the journal International Journal of Management Science and Engineering Management.
"Coalescent Argumentation" is based on the concept that arguments can function from agreement rather than disagreement. To prove this idea, Gilbert first discusses how several components, namely the emotional, visceral (physical), and kisceral (intuitive), are utilized in an argumentative setting by people every day. These components, also characterized as "modes," are vital to argumentative communication because they affect both the argument and the resulting outcome.
This is a book about creating information systems within firms that will truly give managers the information they need to make informed business decisions. The author contends that information is part of an ecological system in which it undergoes a process of evolution and adaptation to the requirements of the local users. The book explains ecological planning tools that guide managers to develop information systems to meet their changing needs.
This work addresses the notion of compression ratios greater than what has been known for random sequential strings in binary and larger radix-based systems as applied to those traditionally found in Kolmogorov complexity. A culmination of the author's decade-long research that began with his discovery of a compressible random sequential string, the book maintains a theoretical-statistical level of introduction suitable for mathematical physicists. It discusses the application of ternary-, quaternary-, and quinary-based systems in statistical communication theory, computing, and physics.
Information services are economic and organizational activities for informing people. Because informing is changing rapidly under the influence of internet-technologies, this book presents in Chapter 1 fundamental notions of information and knowledge, based on philosopher C.W. Churchman's inquiring systems. This results in the identification of three product-oriented design theory aspects: content, use value and revenue. Chapter 2 describes how one can cope with these aspects by presenting process-oriented design theory. Both design theory insights are applied in chapters on information services challenges, their business concepts and processes, their architectures and exploitation. The final chapter discusses three case studies that integrate the insights from previous chapters, and it discusses some ideas for future research. This book gives students a coherent start to the topic of information services from a design science perspective, with a balance between technical and managerial aspects. Therefore, this book is useful for modern curricula of management, communication science and information systems. Because of its design science approach, it also explains design science principles. The book also serves professionals and academics in search of a foundational understanding of informing as a science and management practice.
Networks surround us, from social networks to protein-protein interaction networks within the cells of our bodies. The theory of random graphs provides a necessary framework for understanding their structure and development. This text provides an accessible introduction to this rapidly expanding subject. It covers all the basic features of random graphs - component structure, matchings and Hamilton cycles, connectivity and chromatic number - before discussing models of real-world networks, including intersection graphs, preferential attachment graphs and small-world models. Based on the authors' own teaching experience, it can be used as a textbook for a one-semester course on random graphs and networks at advanced undergraduate or graduate level. The text includes numerous exercises, with a particular focus on developing students' skills in asymptotic analysis. More challenging problems are accompanied by hints or suggestions for further reading.
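The simplest random-graph model such texts usually begin with is the Erdos-Renyi G(n, p) model, in which each of the n(n-1)/2 possible edges appears independently with probability p. A minimal Python sketch of our own (function name and seeding are our choices, not from the book):

```python
import random

def gnp(n, p, seed=42):
    """Sample an Erdos-Renyi G(n, p) graph as a set of edges (u, v), u < v."""
    rng = random.Random(seed)  # fixed seed makes the sample reproducible
    edges = set()
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                edges.add((u, v))
    return edges

g = gnp(100, 0.05)
# The expected number of edges is p * n * (n - 1) / 2 = 0.05 * 4950 = 247.5.
print(len(g))
```

Properties like component structure and chromatic number are then studied as n grows, which is where the asymptotic analysis the blurb mentions comes in.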
This review volume consists of a set of chapters written by leading scholars, most of them founders of their fields. It explores the connections of Randomness to other areas of scientific knowledge, especially its fruitful relationship to Computability and Complexity Theory, and also to areas such as Probability, Statistics, Information Theory, Biology, Physics, Quantum Mechanics, Learning Theory and Artificial Intelligence. The contributors cover these topics without neglecting important philosophical dimensions, sometimes going beyond the purely technical to formulate age-old questions relating to matters such as determinism and free will. The scope of Randomness Through Computation is novel. Each contributor shares their personal views and anecdotes on the various reasons and motivations which led them to the study of Randomness. Using a question and answer format, they share their visions from their several distinctive vantage points.
The last few years have witnessed rapid advancements in information and coding theory research and applications. This book provides a comprehensive guide to selected topics, both ongoing and emerging, in information and coding theory. Consisting of contributions from well-known and high-profile researchers in their respective specialties, topics that are covered include source coding; channel capacity; linear complexity; code construction, existence and analysis; bounds on codes and designs; space-time coding; LDPC codes; and codes and cryptography. All of the chapters are integrated in a manner that renders the book a supplementary reference volume or textbook for use in both undergraduate and graduate courses on information and coding theory. As such, it will be a valuable text for students at both undergraduate and graduate levels as well as instructors, researchers, engineers, and practitioners in these fields. Supporting PowerPoint slides are available upon request for all instructors who adopt this book as a course text.
This guide represents the first serious academic assessment of the relationships between peoples in Africa and of African descent and Afro mass media around the world. Experts on communications in sub-Saharan and North Africa and the Caribbean and African-American media in the United States characterize the settings and philosophical contexts for media in the countries that they survey; the development of often difficult relationships between government, society, and the media; the education and training of media personnel; and the implications of new technologies and future challenges. This comparative study of Afro mass media, the impact of social and political systems, of culture and ideology, of different communications mechanisms, and of special problems is designed for students, teachers, and professionals in all areas of communications and mass media, and in government, sociology, economics, and African and African-American studies.
This unique volume presents a new approach, the general theory of information, to the scientific understanding of information phenomena. Based on a thorough analysis of information processes in nature, technology, and society, as well as on the main directions in information theory, this theory synthesizes existing directions into a unified system. The book explains how this theory opens new kinds of possibilities for information technology, information sciences, computer science, knowledge engineering, psychology, linguistics, social sciences, and education. The book also gives a broad introduction to the main mathematically-based directions in information theory. The general theory of information provides a unified context for existing directions in information studies, making it possible to elaborate on a comprehensive definition of information; explain relations between information, data, and knowledge; and demonstrate how different mathematical models of information and information processes are related. Explanation of information essence and functioning is given, as well as answers to the following questions: how information is related to knowledge and data; how information is modeled by mathematical structures; and how these models are used to better understand computers and the Internet, cognition and education, communication and computation.
This extraordinary three-volume work, written in an engaging and rigorous style by a world authority in the field, provides an accessible, comprehensive introduction to the full spectrum of mathematical and statistical techniques underpinning contemporary methods in data-driven learning and inference. This first volume, Foundations, introduces core topics in inference and learning, such as matrix theory, linear algebra, random variables, convex optimization and stochastic optimization, and prepares students for studying their practical application in later volumes. A consistent structure and pedagogy is employed throughout this volume to reinforce student understanding, with over 600 end-of-chapter problems (including solutions for instructors), 100 figures, 180 solved examples, datasets and downloadable Matlab code. Supported by sister volumes Inference and Learning, and unique in its scale and depth, this textbook sequence is ideal for early-career researchers and graduate students across many courses in signal processing, machine learning, statistical analysis, data science and inference.
From the reviews: "The 2nd (slightly enlarged) edition of the van Lint's book is a short, concise, mathematically rigorous introduction to the subject. Basic notions and ideas are clearly presented from the mathematician's point of view and illustrated on various special classes of codes...This nice book is a must for every mathematician wishing to introduce himself to the algebraic theory of coding." European Mathematical Society Newsletter, 1993 "Despite the existence of so many other books on coding theory, this present volume will continue to hold its place as one of the standard texts...." The Mathematical Gazette, 1993
An effective blend of carefully explained theory and practical applications, this text imparts the fundamentals of both information theory and data compression. Although the two topics are related, this unique text allows either topic to be presented independently, and it was specifically designed so that the data compression section requires no prior knowledge of information theory.
"i In the Sky" is a collection of essays by more than 40 experts, including such leading writers as Charles Handy, Don Tapscott, and Kevin Warwick, giving their personal vision of the future of information. Information here is given its widest meaning and includes such subjects as the Internet, electronic commerce, cybernetics, robotics, artificial intelligence, and even computers as fashion accessories. Information as phenomenon pervades all areas of life, and its evolution has consequences for everyone. Many of the essays have as their central themes the future of computer intelligence; library and information services; interactive Internet marketing; networked learning in higher education; the linking of technology enabling remote and online communication to the deconstruction of the modern corporation; artificial intelligence; scholarly communication; smart houses; intelligent appliances; etc.