Navigating Information Literacy captures a range of skills and topics essential for students who intend to position themselves in academic or workplace environments that are globally connected and competitive. The clear, well-structured and informative text leads the reader through all aspects of information literacy and provides practical advice and relevant examples from a variety of international contexts.
Ithiel de Sola Pool was a pioneering social scientist, a distinguished scholar of the political process, and one of the most original thinkers in the development of the social sciences. Passionately engaged in politics, he continued in a leadership role throughout his life, building the MIT Political Science Department into an outstanding group. He organized international teams of social scientists and collaborated widely to develop the understanding of social change. He was a frequent adviser to governments as consultant and in-house critic, and a successful advocate of limits on government regulation. "Politics in Wired Nations" presents his writings on the social and political impact of different communication systems and new telecommunications technology. Included in this volume is the first study of trends in a global information society, and the first study of social networks and the "small world" phenomenon that creates new relationships and routes of informal influence and political power, both domestic and international. Also included are Pool's essays on the politics of foreign trade, the influence of American businessmen on Congress, and the changeable "unnatural" institutions of the modern world (e.g., bureaucracies, mega-cities, and nation-states). Pool describes a nonviolent revolution in freedom and political control that is possible as the world changes from the era of one-way mass communications--targeted to national audiences--to a new era of abundant, high-capacity, low-cost, interactive, and user-controlled communications on a global scale. He discusses policy choices for freedom, the battlegrounds ahead, and the risks of government involvement in the regulation of new telecommunication technologies.
This comprehensive and innovative Research Handbook tackles the pressing issues confronting us at the dawn of the global network society, including freedom of speech, government transparency and the digital divide. Representing a milestone in information policy research, this new volume edited by Alistair Duff brings together leading contributors from a wide range of disciplines to discuss important topics such as genetic information, news and privacy, and provides case studies on cyber harms, freedom of information and national digitization policy. Engaging with controversial problems of public policy including freedom of expression, copyright and information inequality, the Research Handbook on Information Policy offers a well-rounded exploration of the history and future of this vital field. Systematically addressing both general theory and specific issues, as well as providing international perspectives, this Research Handbook will be of particular interest to academics and students in the disciplines of information science, journalism and media studies, politics, sociology, philosophy and law.
The regulation and flow of information continues to have a critical impact upon how people live their lives and the way society functions. In recent times, disinformation and privacy violation have become the 'information pollution' of the 21st century. This book explores ways and means of protecting the 'information environment' by drawing upon four theories of contemporary environmentalism: welfare economics, the commons, ecology, and public choice theory. Welfare economics highlights the need to focus on costs (as well as benefits) when evaluating regulatory structures. The commons encourages queries about the validity of propertisation. Ecology speaks to the importance of diversity and resilience. And public choice theory hazards against the regulatory effect of concentrated interests. The lessons from each inspire the proposed information environmental governance framework. By neatly capturing the metaphorical relationship between the physical environment and the information environment, Robert Cunningham explores progressive regulatory pathways for the digital age. This book will be a thought-provoking read for scholars and students with an interest in intellectual property or the regulation of information.
The phenomenal international bestseller that shows us how to stop trying to predict everything - and take advantage of uncertainty What have the invention of the wheel, Pompeii, the Wall Street Crash, Harry Potter and the internet got in common? Why are all forecasters con-artists? Why should you never run for a train or read a newspaper? This book is all about Black Swans: the random events that underlie our lives, from bestsellers to world disasters. Their impact is huge; they're impossible to predict; yet after they happen we always try to rationalize them. 'Taleb is a bouncy and even exhilarating guide ... I came to relish what he said, and even develop a sneaking affection for him as a person' Will Self, Independent on Sunday 'He leaps like some superhero of the mind' Boyd Tonkin, Independent
Proofs play a central role in advanced mathematics and theoretical computer science, yet many students struggle the first time they take a course in which proofs play a significant role. This bestselling text's third edition helps students transition from solving problems to proving theorems by teaching them the techniques needed to read and write proofs. Featuring over 150 new exercises and a new chapter on number theory, this new edition introduces students to the world of advanced mathematics through the mastery of proofs. The book begins with the basic concepts of logic and set theory to familiarize students with the language of mathematics and how it is interpreted. These concepts are used as the basis for an analysis of techniques that can be used to build up complex proofs step by step, using detailed 'scratch work' sections to expose the machinery of proofs about numbers, sets, relations, and functions. Assuming no background beyond standard high school mathematics, this book will be useful to anyone interested in logic and proofs: computer scientists, philosophers, linguists, and, of course, mathematicians.
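As an illustration of the style of argument such a course develops (a standard textbook example, not necessarily one drawn from this book), here is a short proof by induction:

```latex
\textbf{Claim.} For every integer $n \ge 1$, $\;1 + 3 + 5 + \dots + (2n-1) = n^2$.

\textbf{Proof (by induction on $n$).}
\emph{Base case:} for $n = 1$ the left side is $1$, and $1^2 = 1$.
\emph{Inductive step:} assume $1 + 3 + \dots + (2k-1) = k^2$ for some $k \ge 1$.
Adding the next odd number $2k+1$ to both sides gives
\[ 1 + 3 + \dots + (2k-1) + (2k+1) = k^2 + 2k + 1 = (k+1)^2, \]
which is the statement for $n = k+1$. By induction the claim holds for all $n \ge 1$. $\qed$
```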
Make the most of your Mac with this witty, authoritative guide to macOS Big Sur. Apple updates its Mac operating system every year, adding new features with every revision. But after twenty years of this updating cycle without a printed user guide to help customers, feature bloat and complexity have begun to weigh down the works. For thirty years, the Mac faithful have turned to David Pogue's Mac books to guide them. With Mac Unlocked, New York Times bestselling author Pogue introduces readers to the most radical Mac software redesign in Apple history, macOS Big Sur. Beginning Mac users and Windows refugees will gain an understanding of the Mac philosophy; Mac veterans will find a concise guide to what's new in Big Sur, including its stunning visual and sonic redesign, the new Control Center for quick settings changes, and the built-in security auditing features. With 300 annotated illustrations, sparkling humor, and crystal-clear prose, Mac Unlocked is the new gold-standard guide to the Mac.
The World Wide Web is truly astounding. It has changed the way we interact, learn and innovate. It is the largest sociotechnical system humankind has created and is advancing at a pace that leaves most in awe. It is an unavoidable fact that the future of the world is now inextricably linked to the future of the Web. Almost every day it appears to change, to get better and increase its hold on us. For all this we are starting to see underlying stability emerge. The way that Web sites rank in terms of popularity, for example, appears to follow laws with which we are familiar. What is fascinating is that these laws were first discovered, not in fields like computer science or information technology, but in what we regard as more fundamental disciplines like biology, physics and mathematics. Consequently the Web, although synthetic at its surface, seems to be quite 'natural' deeper down, and one of the driving aims of the new field of Web Science is to discover how far down such 'naturalness' goes. If the Web is natural to its core, that raises some fundamental questions. It forces us, for example, to ask if the central properties of the Web might be more elemental than the truths we cling to from our understandings of the physical world. In essence, it demands that we question the very nature of information. Understanding Information and Computation is about such questions and one possible route to potentially mind-blowing answers.
The Science of Deep Learning emerged from courses taught by the author that have provided thousands of students with training and experience for their academic studies, and prepared them for careers in deep learning, machine learning, and artificial intelligence in top companies in industry and academia. The book begins by covering the foundations of deep learning, followed by key deep learning architectures. Subsequent parts on generative models and reinforcement learning may be used as part of a deep learning course or as part of a course on each topic. The book includes state-of-the-art topics such as Transformers, graph neural networks, variational autoencoders, and deep reinforcement learning, with a broad range of applications. The appendices provide equations for computing gradients in backpropagation and optimization, and best practices in scientific writing and reviewing. The text presents an up-to-date guide to the field built upon clear visualizations using a unified notation and equations, lowering the barrier to entry for the reader. The accompanying website provides complementary code and hundreds of exercises with solutions.
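The gradients computed in backpropagation, which the book's appendices derive, can be illustrated on a toy case. The sketch below (illustrative only, not taken from the book) differentiates the squared loss of a single linear "neuron" by the chain rule and checks the result against a finite-difference approximation:

```python
# Minimal sketch: for L(w) = (w*x - y)^2, the chain rule gives
# dL/dw = 2*(w*x - y)*x -- the one-step version of backpropagation.
# We verify it numerically with a central finite difference.

def loss(w, x, y):
    return (w * x - y) ** 2

def grad_analytic(w, x, y):
    return 2.0 * (w * x - y) * x

def grad_numeric(w, x, y, eps=1e-6):
    return (loss(w + eps, x, y) - loss(w - eps, x, y)) / (2 * eps)

w, x, y = 0.5, 2.0, 3.0
ga = grad_analytic(w, x, y)   # 2*(1.0 - 3.0)*2.0 = -8.0
gn = grad_numeric(w, x, y)
print(ga, gn)
```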
The last two decades have seen the development of a number of models that have proven particularly important in advancing understanding of message-production processes. Now a "second generation" of theories appears to be emerging, one that reflects considerable conceptual advances over earlier models. "Message Production: Advances in Communication Theory" focuses on these new developments in theoretical approaches to verbal and nonverbal message production. The chapters reflect a number of characteristics and trends resident in these theories.
This volume includes edited and revised versions of the papers delivered and discussed at the recent Advertising and Consumer Psychology Conference. Following the theme of the conference -- "Measuring Advertising Effectiveness" -- the book blends academic psychology, marketing theory, survey methodology, and practical experience, while simultaneously addressing the problems and limitations of advertising.
The recent evolution of western societies has been characterized by an increasing emphasis on information and communication. As the amount of available information increases, however, the user -- worker, student, citizen -- faces a new problem: selecting and accessing relevant information. More than ever it is crucial to find efficient ways for users to interact with information systems in a way that prevents them from being overwhelmed or simply missing their targets. As a result, hypertext systems have been developed as a means of facilitating the interactions between readers and text. In hypertext, information is organized as a network in which nodes are text chunks (e.g., lists of items, paragraphs, pages) and links are relationships between the nodes (e.g., semantic associations, expansions, definitions, examples -- virtually any kind of relation that can be imagined between two text passages). Unfortunately, the many ways in which these hypertext interfaces can be designed have introduced a complexity that extends far beyond the processing abilities of regular users. Therefore, it has become widely recognized that a more rational approach, based on a thorough analysis of information users' needs, capacities, capabilities, and skills, is needed. This volume seeks to meet that need.
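The node-and-link structure described above can be sketched directly. The example below is illustrative only (the node names and link types are invented, not from any particular hypertext system): nodes are text chunks, and links are typed relationships between them.

```python
# Nodes: text chunks keyed by an identifier.
nodes = {
    "intro":      "Hypertext organizes information as a network.",
    "definition": "A node is a self-contained chunk of text.",
    "example":    "This page you are reading is itself a node.",
}

# Links: (source node, target node, relationship type).
links = [
    ("intro", "definition", "definition"),
    ("intro", "example", "example"),
]

def neighbors(node_id):
    """Return the nodes reachable from node_id in one hop."""
    return [dst for src, dst, _ in links if src == node_id]

print(neighbors("intro"))  # ['definition', 'example']
```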
Coding theory came into existence in the late 1940s and is concerned with devising efficient encoding and decoding procedures.
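A minimal sketch of the encode/decode idea is the 3-fold repetition code, one of the simplest error-correcting codes: send each bit three times, and decode by majority vote, which corrects any single bit flip per transmitted triple.

```python
def encode(bits):
    """Repeat each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority-vote each consecutive triple back to one bit."""
    out = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)
    return out

msg = [1, 0, 1]
sent = encode(msg)          # [1,1,1, 0,0,0, 1,1,1]
sent[1] ^= 1                # flip one bit in transit
assert decode(sent) == msg  # the single error is corrected
```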
The Universal Service Desk (USD) - Implementing, controlling and improving service delivery defines what a USD is, why it is valuable to an organisation and how to build and implement one. It also discusses the evolution of the USD as part of integrated workplace management. Understand the essentials of any USD - buy this book today!
Multi-level and interdisciplinary in character, communication has become a variable field -- one in which the level of analysis varies. This has had important ramifications for the study of communication because, to some extent, the questions one asks are determined by the methods one has available to answer them. As a result, communication research is characterized by the plethora of both qualitative and quantitative approaches used by its practitioners. These include survey and experimental methods, and content, historical, and rhetorical analyses.
This book introduces machine learning for readers with some background in basic linear algebra, statistics, probability, and programming. In a coherent statistical framework it covers a selection of supervised machine learning methods, from the most fundamental (k-NN, decision trees, linear and logistic regression) to more advanced methods (deep neural networks, support vector machines, Gaussian processes, random forests and boosting), plus commonly-used unsupervised methods (generative modeling, k-means, PCA, autoencoders and generative adversarial networks). Careful explanations and pseudo-code are presented for all methods. The authors maintain a focus on the fundamentals by drawing connections between methods and discussing general concepts such as loss functions, maximum likelihood, the bias-variance decomposition, ensemble averaging, kernels and the Bayesian approach along with generally useful tools such as regularization, cross validation, evaluation metrics and optimization methods. The final chapters offer practical advice for solving real-world supervised machine learning problems and on ethical aspects of modern machine learning.
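The most fundamental method the book starts from, k-NN, fits in a few lines. The sketch below (toy data, not an example from the book) predicts the majority label among the k training points closest in Euclidean distance:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of ((x, y), label) pairs; query: an (x, y) point."""
    def dist2(p):
        return (p[0] - query[0]) ** 2 + (p[1] - query[1]) ** 2
    nearest = sorted(train, key=lambda pair: dist2(pair[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (0.5, 0.5)))  # 'a'
print(knn_predict(train, (5.5, 5.5)))  # 'b'
```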
The first book on the subject, with illustrative examples, some original results, and self-contained material; suitable as a reference book.
Information is central to the evolution of biological complexity, a physical system relying on a continuous supply of energy. Biology provides superb examples of the consequent Darwinian selection of mechanisms for efficient energy utilisation. Genetic information, underpinned by the Watson-Crick base-pairing rules, is largely encoded by DNA, a molecule uniquely adapted to its roles in information storage and utilisation. This volume addresses two fundamental questions. Firstly, what properties of the molecule have enabled it to become the predominant genetic material in the biological world today? And secondly, to what extent have the informational properties of the molecule contributed to the expansion of biological diversity and the stability of ecosystems? The author argues that bringing these two seemingly unrelated topics together enables Schroedinger's What is Life?, published before the structure of DNA was known, to be revisited and his ideas examined in the context of our current biological understanding.
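The Watson-Crick base-pairing rules referred to above are simple enough to state in code: A pairs with T, and G pairs with C. A minimal sketch (standard biology, not a program from the book): the reverse complement of one DNA strand gives the sequence of its partner strand.

```python
# Watson-Crick pairing: A-T and G-C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand):
    """Sequence of the complementary strand, read 5' to 3'."""
    return "".join(PAIR[base] for base in reversed(strand))

print(reverse_complement("GATTACA"))  # 'TGTAATC'
```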
Originally published in 1995, Large Deviations for Performance Analysis consists of two synergistic parts. The first half develops the theory of large deviations from the beginning, through recent results on the theory for processes with boundaries, keeping to a very narrow path: continuous-time, discrete-state processes. By developing only what is needed for the applications, the theory is kept to a manageable level, both in terms of length and in terms of difficulty. Within its scope, the treatment is detailed, comprehensive and self-contained. As the book shows, there are sufficiently many interesting applications of jump Markov processes to warrant a special treatment. The second half is a collection of applications developed at Bell Laboratories. The applications cover large areas of the theory of communication networks: circuit switched transmission, packet transmission, multiple access channels, and the M/M/1 queue. Aspects of parallel computation are covered as well, including basics of job allocation, rollback-based parallel simulation, assorted priority queueing models that might be used in performance models of various computer architectures, and asymptotic coupling of processors. These applications are thoroughly analysed using the tools developed in the first half of the book.
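For the M/M/1 queue mentioned among the applications, the standard steady-state formulas (textbook queueing theory, not the book's large-deviations analysis) are easy to compute: with arrival rate lam and service rate mu, utilisation is rho = lam/mu, the mean number in the system is L = rho/(1-rho), and the mean time in the system is W = 1/(mu-lam), consistent with Little's law L = lam*W.

```python
def mm1_stats(lam, mu):
    """Steady-state M/M/1 statistics; requires lam < mu for stability."""
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu             # server utilisation
    L = rho / (1 - rho)        # mean number in system
    W = 1 / (mu - lam)         # mean time in system
    return rho, L, W

rho, L, W = mm1_stats(lam=2.0, mu=3.0)
print(rho, L, W)  # 0.666..., 2.0, 1.0  (note L = lam * W)
```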
Our world and the people within it are increasingly interpreted and classified by automated systems. At the same time, automated classifications influence what happens in the physical world. These entanglements change what it means to interact with governance, and shift what elements of our identity are knowable and meaningful. In this cyber-physical world, or 'world state', what is the role for law? Specifically, how should law address the claim that computational systems know us better than we know ourselves? Monitoring Laws traces the history of government profiling from the invention of photography through to emerging applications of computer vision for personality and behavioral analysis. It asks what dimensions of profiling have provoked legal intervention in the past, and what is different about contemporary profiling that requires updating our legal tools. This work should be read by anyone interested in how computation is changing society and governance, and what it is about people that law should protect in a computational world.
This compact course is written for the mathematically literate reader who wants to learn to analyze data in a principled fashion. The language of mathematics enables clear exposition that can go quite deep, quite quickly, and naturally supports an axiomatic and inductive approach to data analysis. Starting with a good grounding in probability, the reader moves to statistical inference via topics of great practical importance - simulation and sampling, as well as experimental design and data collection - that are typically displaced from introductory accounts. The core of the book then covers both standard methods and such advanced topics as multiple testing, meta-analysis, and causal inference.
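The simulation-and-sampling starting point described above can be illustrated with a toy Monte Carlo experiment (illustrative only, not an example from the book): estimate the probability that two dice sum to 7 and compare it with the exact value 6/36 = 1/6.

```python
import random

def estimate_p_seven(n_trials, seed=0):
    """Monte Carlo estimate of P(two fair dice sum to 7)."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n_trials)
        if rng.randint(1, 6) + rng.randint(1, 6) == 7
    )
    return hits / n_trials

est = estimate_p_seven(100_000)
print(est, 1 / 6)  # the estimate should be close to 0.1667
```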
This volume contains a selection of original contributions from internationally reputed scholars in the field of risk management in socio-technical systems with high hazard potential. Its first major section addresses fundamental psychological and socio-technical concepts in the field of risk perception, risk management and learning systems for safety improvement. The second section deals with the variety of procedures for system safety analysis. It covers strategies of analyzing automation problems and of safety culture, as well as the analysis of social dynamics in field settings and of field experiments. Its third part then illustrates the utilization of basic concepts and analytic approaches by way of case studies of designing man-machine systems in various industrial sectors such as intensive care wards, aviation, offshore oil drilling and the chemical industry. In linking basic theoretical conceptual notions and analytic strategies to detailed case studies in the area of hazardous work organizations, the volume differs from and complements more theoretical works such as Human Error (J. Reason, 1990) and more general approaches such as New Technologies and Human Error (J. Rasmussen, K. Duncan, J. Leplat, Eds.).
Heavy tails -- extreme events or values more common than expected -- emerge everywhere: the economy, natural events, and social and information networks are just a few examples. Yet after decades of progress, they are still treated as mysterious, surprising, and even controversial, primarily because the necessary mathematical models and statistical methods are not widely known. This book, for the first time, provides a rigorous introduction to heavy-tailed distributions accessible to anyone who knows elementary probability. It tackles and tames the zoo of terminology for models and properties, demystifying topics such as the generalized central limit theorem and regular variation. It tracks the natural emergence of heavy-tailed distributions from a wide variety of general processes, building intuition. And it reveals the controversy surrounding heavy tails to be the result of flawed statistics, then equips readers to identify and estimate with confidence. Over 100 exercises complete this engaging package.
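A quick way to see heavy tails in action (a standard illustration, not an exercise from the book) is to draw Pareto(alpha) samples by the inverse-CDF transform x = u**(-1/alpha) for u uniform on (0, 1]. With alpha = 1.5 the mean is finite but the variance is infinite, so a handful of huge values dominate the sum:

```python
import random

def pareto_sample(alpha, n, seed=0):
    """Draw n Pareto(alpha) samples (support [1, inf)) via inverse CDF."""
    rng = random.Random(seed)
    # 1 - random() lies in (0, 1], avoiding a zero under the negative power.
    return [(1 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

xs = pareto_sample(alpha=1.5, n=100_000)
top_share = max(xs) / sum(xs)  # fraction of the total from one sample
print(f"largest of {len(xs)} samples carries {top_share:.1%} of the sum")
```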