The essays collected in this volume address the full range of pedagogical and programmatic issues specifically facing technical communication teachers and programme directors in the computer age. The authors locate computers and computing activities within the richly textured cultural contexts of a technological society, focusing on the technical communication instructional issues that remain most important as old versions of hardware and software are endlessly replaced by new ones. Part One, "Broadening Notions of Computer Literacy", complicates mechanistic approaches to computer-related instruction by locating the design and use of hardware and software within social, cultural, political, ethical and legal contexts. Part Two examines how teachers and programme directors can encourage critical literacies in their classrooms and programmes. At the same time, it considers how computer technologies such as the World Wide Web, hypertext, electronic mail, Internet discussion groups and real-time conferencing environments might challenge traditional notions of technical communication pedagogical practice. Building on the first two sections, Part Three, "Examining Computer-Supported Communication Facilities from Pedagogical Perspectives", explores a wide range of instructional and political challenges in designing and supporting the robust computing needs of technical communication programmes. Part Four, "Planning for Technological Changes in Technical Communication Programmes", outlines some long-term ways of thinking about computers and technical communications that are instructionally and institutionally productive for students, teachers and programme directors.
Make the most of your Mac with this witty, authoritative guide to macOS Big Sur. Apple updates its Mac operating system every year, adding new features with every revision. But after twenty years of this updating cycle without a printed user guide to help customers, feature bloat and complexity have begun to weigh down the works. For thirty years, the Mac faithful have turned to David Pogue's Mac books to guide them. With Mac Unlocked, New York Times bestselling author Pogue introduces readers to the most radical Mac software redesign in Apple history, macOS Big Sur. Beginning Mac users and Windows refugees will gain an understanding of the Mac philosophy; Mac veterans will find a concise guide to what's new in Big Sur, including its stunning visual and sonic redesign, the new Control Center for quick settings changes, and the built-in security auditing features. With 300 annotated illustrations, sparkling humor, and crystal-clear prose, Mac Unlocked is the new gold-standard guide to the Mac.
The World Wide Web is truly astounding. It has changed the way we interact, learn and innovate. It is the largest sociotechnical system humankind has created and is advancing at a pace that leaves most in awe. It is an unavoidable fact that the future of the world is now inextricably linked to the future of the Web. Almost every day it appears to change, to get better and increase its hold on us. For all this we are starting to see underlying stability emerge. The way that Web sites rank in terms of popularity, for example, appears to follow laws with which we are familiar. What is fascinating is that these laws were first discovered, not in fields like computer science or information technology, but in what we regard as more fundamental disciplines like biology, physics and mathematics. Consequently the Web, although synthetic at its surface, seems to be quite 'natural' deeper down, and one of the driving aims of the new field of Web Science is to discover how far down such 'naturalness' goes. If the Web is natural to its core, that raises some fundamental questions. It forces us, for example, to ask if the central properties of the Web might be more elemental than the truths we cling to from our understandings of the physical world. In essence, it demands that we question the very nature of information. Understanding Information and Computation is about such questions and one possible route to potentially mind-blowing answers.
This book addresses the issue of Machine Learning (ML) attacks on Integrated Circuits through Physical Unclonable Functions (PUFs). It provides the mathematical proofs of the vulnerability of various PUF families, including Arbiter, XOR Arbiter, ring-oscillator, and bistable ring PUFs, to ML attacks. To achieve this goal, it develops a generic framework for the assessment of these PUFs based on two main approaches. First, with regard to the inherent physical characteristics, it establishes fit-for-purpose mathematical representations of the PUFs mentioned above, which adequately reflect the physical behavior of these primitives. To this end, notions and formalizations that are already familiar to the ML theory world are reintroduced in order to give a better understanding of why, how, and to what extent ML attacks against PUFs can be feasible in practice. Second, the book explores polynomial time ML algorithms, which can learn the PUFs under the appropriate representation. More importantly, in contrast to previous ML approaches, the framework presented here ensures not only the accuracy of the model mimicking the behavior of the PUF, but also the delivery of such a model. Besides off-the-shelf ML algorithms, the book applies a set of algorithms hailing from the field of property testing, which can help to evaluate the security of PUFs. They serve as a "toolbox", from which PUF designers and manufacturers can choose the indicators most relevant for their requirements. Last but not least, on the basis of learning theory concepts, the book explicitly states that the PUF families cannot be considered an ultimate solution to the problem of insecure ICs. As such, it provides essential insights into both academic research on PUFs and their design and manufacture.
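For readers unfamiliar with what a "model-building" attack on a PUF looks like, the sketch below illustrates the standard idea from the modeling-attack literature, not the book's own framework: an Arbiter PUF's response can be written as the sign of a linear function of a parity transform of the challenge bits, so observed challenge-response pairs can be fitted with off-the-shelf logistic regression. The stage count, sample sizes, and all variable names are assumptions chosen for this example.

```python
# Illustrative sketch (not the book's framework): the additive delay model of an
# Arbiter PUF and a logistic-regression model-building attack on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_stages, n_crps = 64, 20000          # assumed sizes for the demonstration

def parity_features(challenges):
    """Map 0/1 challenge bits to the parity (Phi) features of the linear
    delay model: Phi_i = prod_{j >= i} (1 - 2*c_j), plus a constant term."""
    signs = 1 - 2 * challenges                      # {0,1} -> {+1,-1}
    phi = np.cumprod(signs[:, ::-1], axis=1)[:, ::-1]
    return np.hstack([phi, np.ones((challenges.shape[0], 1))])

# Simulate one PUF instance: a secret vector of stage delay differences.
w_secret = rng.normal(size=n_stages + 1)
challenges = rng.integers(0, 2, size=(n_crps, n_stages))
responses = (parity_features(challenges) @ w_secret > 0).astype(int)

# The attack: learn a linear model from the observed challenge-response pairs.
X_train, X_test, y_train, y_test = train_test_split(
    parity_features(challenges), responses, test_size=0.2, random_state=0)
attack_model = LogisticRegression(max_iter=2000).fit(X_train, y_train)
print(f"prediction accuracy on unseen challenges: {attack_model.score(X_test, y_test):.3f}")
```

Because the simulated response is linear in the parity features, the learned model predicts unseen challenges almost perfectly, which is the essence of why delay-based PUFs of this kind are considered learnable.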
This book is dedicated to Prof. J. Kapur and his contributions to the field of entropy measures and maximum entropy applications. Eminent scholars in various fields of applied information theory have been invited to contribute to this Festschrift, collected on the occasion of his 75th birthday. The articles cover topics in the areas of physical, biological, engineering and social sciences such as information technology, soft computing, nonlinear systems or molecular biology, with thematic coherence. The volume will be useful to researchers working in these different fields, enabling them to see the underlying unity and power of entropy optimization frameworks.
The advancement of technology in today's world has led to the progression of several professional fields. This includes the classroom, as teachers have begun using new technological strategies to increase student involvement and motivation. ICT innovation including virtual reality and blended learning methods has changed the scope of classroom environments across the globe; however, significant research is lacking in this area. ICTs and Innovation for Didactics of Social Sciences is a fundamental reference focused on didactics of social sciences and ICTs including issues related to innovation, resources, and strategies for teachers that can link to the transformation of social sciences teaching and learning as well as societal transformation. While highlighting topics such as blended learning, augmented reality, and virtual classrooms, this book is ideally designed for researchers, administrators, educators, practitioners, and students interested in understanding current relevant ICT resources and innovative strategies for the didactics of social sciences and didactic possibilities in relation to concrete conceptual contents, resolution of problems, planning, decision making, development of social skills, attention, and motivation, promoting the necessary technological literacy.
This book is a tribute to Kenichi Morita's ideas and achievements in theoretical computer science, reversibility and computationally universal mathematical machines. It offers a unique source of information on universality and reversibility in computation and is an indispensable book for computer scientists, mathematicians, physicists and engineers. Morita is renowned for his works on two-dimensional language accepting automata, complexity of Turing machines, universality of cellular automata, regular and context-free array grammars, and undecidability. His high-impact works include findings on parallel generation and parsing of array languages by means of reversible automata, construction of a reversible automaton from Fredkin gates, solving a firing squad synchronization problem in reversible cellular automata, self-reproduction in reversible cellular spaces, universal reversible two-counter machines, solution of nondeterministic polynomial (NP) problems in hyperbolic cellular automata, reversible P-systems, a new universal reversible logic element with memory, and reversibility in asynchronous cellular automata. Kenichi Morita's achievements in reversibility, universality and theory of computation are celebrated in over twenty high-profile contributions from his colleagues, collaborators, students and friends. The theoretical constructs presented in this book are amazing in their diversity and depth of intellectual insight, addressing: queue automata, hyperbolic cellular automata, Abelian invertible automata, number-conserving cellular automata, Brownian circuits, chemical automata, logical gates implemented via glider collisions, computation in swarm networks, picture arrays, universal reversible counter machines, input-position-restricted models of language acceptors, descriptional complexity and persistence of cellular automata, partitioned cellular automata, firing squad synchronization algorithms, reversible asynchronous automata, reversible simulations of ranking trees, Shor's factorization algorithms, and power consumption of cellular automata.
This research text addresses the logical aspects of the visualization of information with papers especially commissioned for this book. The authors explore the logical properties of diagrams, charts, maps, and the like, and their use in problem solving and in teaching basic reasoning skills. As computers make visual presentations of information even more commonplace, it becomes increasingly important for the research community to develop an understanding of such tools.
Information is precious. It reduces our uncertainty in making decisions. Knowledge about the outcome of an uncertain event gives the possessor an advantage. It changes the course of lives, nations, and history itself. Information is the food of Maxwell's demon. His power comes from knowing which particles are hot and which particles are cold. His existence was paradoxical to classical physics and only the realization that information too was a source of power led to his taming. Information has recently become a commodity, traded and sold like orange juice or hog bellies. Colleges give degrees in information science and information management. Technology of the computer age has provided access to information in overwhelming quantity. Information has become something worth studying in its own right. The purpose of this volume is to introduce key developments and results in the area of generalized information theory, a theory that deals with uncertainty-based information within mathematical frameworks that are broader than classical set theory and probability theory. The volume is organized as follows.
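As a reference point for readers new to the topic, the two classical uncertainty measures that generalized information theory sets out to broaden are the Hartley measure (uncertainty about which of finitely many alternatives holds) and the Shannon entropy of a probability distribution. The brief reminder below is a standard summary added here for orientation, not an excerpt from the volume.

```latex
% Classical baselines that generalized information theory extends
% (a textbook reminder, not a quotation from this volume):
% the Hartley measure of a finite set A of alternatives, and the
% Shannon entropy of a probability distribution p on a finite set.
\[
  H(A) \;=\; \log_2 |A| ,
  \qquad
  H(p) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i .
\]
```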
Does the Information Age promise egalitarianism and democracy, or will it simply reinforce long-standing social and economic inequalities? This collection of essays analyzes the emerging role of African-Americans in post-industrial society from a variety of communications research perspectives. Accepting W.J. Wilson's theory of a socially and economically isolated African-American underclass, Barber and Tait ask the logical question: what next? "The Information Society and the Black Community" is a critical examination of the prospects and pitfalls of a historically disadvantaged group in a period of rapid technological advances and economic growth. Adopting Frank Webster's theory of the Information Society as a framework for organization and development, the book is divided into five sections that look at technological, economic, occupational, spatial, and cultural aspects of the relationship between the African-American community and the Information Society. Part One analyzes data on African-American use of information technology, and examines how the new flow of information might affect African-American social and cultural images. Part Two focuses on African-American participation in the ownership and control of information industries. Part Three treats professional training and employment patterns affecting African-Americans in the Information Age. Part Four centers around the potential uses of information technology in solving social, political, and economic problems. Part Five addresses the growing connections of the African-American community to Africa and the rest of the world via information technology.
Health communication research examines the role of communication in health professional/client relationships and in promoting patient adherence, the flow of information within and between health organizations, the design and effectiveness of health information for various audiences and the planning and evaluation of health care policy. Other important areas treated in this book are cultural and social factors influencing health communication, ethical issues affecting communication, and education in communication within medical schools. Medical students, physicians, policy makers, students and faculty in communications and sociology, as well as social services professionals should find this reference an important tool.
Basic Concepts in Information Theory and Coding is an outgrowth of a one semester introductory course that has been taught at the University of Southern California since the mid-1960s. Lecture notes from that course have evolved in response to student reaction, new technological and theoretical developments, and the insights of faculty members who have taught the course (including the three of us). In presenting this material, we have made it accessible to a broad audience by limiting prerequisites to basic calculus and the elementary concepts of discrete probability theory. To keep the material suitable for a one-semester course, we have limited its scope to discrete information theory and a general discussion of coding theory without detailed treatment of algorithms for encoding and decoding for various specific code classes. Readers will find that this book offers an unusually thorough treatment of noiseless self-synchronizing codes, as well as the advantage of problem sections that have been honed by reactions and interactions of several generations of bright students, while Agent 00111 provides a context for the discussion of abstract concepts.
The increasing diversity of Information Communication Technologies and their equally diverse range of uses in personal, professional and official capacities raise challenging questions of identity in a variety of contexts. Each communication exchange contains an identifier which may, or may not, be intended by the parties involved. What constitutes an identity, how do new technologies affect identity, how do we manage identities in a globally networked information society? From the 6th to the 10th August 2007, IFIP (International Federation for Information Processing) working groups 9.2 (Social Accountability), 9.6/11.7 (IT Misuse and the Law) and 11.6 (Identity Management) held their 3rd International Summer School on "The Future of Identity in the Information Society" in cooperation with the EU Network of Excellence FIDIS at Karlstad University. The Summer School addressed the theme of Identity Management in relation to current and future technologies in a variety of contexts. The aim of the IFIP summer schools has been to introduce participants to the social implications of Information Technology through the process of informed discussion. Following the holistic approach advocated by the involved IFIP working groups, a diverse group of participants ranging from young doctoral students to leading researchers in the field were encouraged to engage in discussion, dialogue and debate in an informal and supportive setting. The interdisciplinary, and international, emphasis of the Summer School allowed for a broader understanding of the issues in the technical and social spheres.
Upon hearing that Ronald Coase had been awarded the Nobel Prize, a fellow economist's first response was to ask with whom Coase had shared the Prize. Whether this response was idiosyncratic or not, I do not know; I expect not. Part of this type of reaction can no doubt be explained by the fact that Coase has often been characterized as an economist who wrote only two significant or influential papers: "The Nature of the Firm" (1937) and "The Problem of Social Cost" (1960). And by typical professional standards of "significant" and "influential" (i.e., widely read and cited), this perception embodies a great deal of truth, even subsequent to Coase's receipt of the Prize. This is not to say that there have not been other important works - "The Marginal Cost Controversy" (1946) and "The Lighthouse in Economics" (1974) come immediately to mind here - only that in a random sample of, say, one hundred economists, one would likely find few who could list a Coase bibliography beyond the two classic pieces noted above, in spite of Coase's significant publication record. The purpose of this collection is to assess the development of, tensions within, and prospects for Coasean Economics - those aspects of economic analysis that have evolved out of Coase's path-breaking work. Two major strands of research can be identified here: law and economics and the New Institutional Economics.
"Every thought is a throw of dice." (Stéphane Mallarmé) This book is the last one of a trilogy which reports a part of our research work over nearly thirty years (we discard our non-conventional results in automatic control theory and applications on the one hand, and fuzzy sets on the other), and its main key words are Information Theory, Entropy, Maximum Entropy Principle, Linguistics, Thermodynamics, Quantum Mechanics, Fractals, Fractional Brownian Motion, Stochastic Differential Equations of Order n, Stochastic Optimal Control, Computer Vision. Our obsession has always been the same: Shannon's information theory should play a basic role in the foundations of sciences, but subject to the condition that it be suitably generalized to allow us to deal with problems which are not necessarily related to communication engineering. With this objective in mind, two questions are of utmost importance: (i) How can we introduce meaning or significance of information in Shannon's information theory? (ii) How can we define and/or measure the amount of information involved in a form or a pattern without using a probabilistic scheme? It is obligatory to find suitable answers to these problems if we want to apply Shannon's theory to science with some chance of success. For instance, its use in biology has been very disappointing, for the very reason that the meaning of information is there of basic importance, and is not involved in this approach.
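Since the maximum entropy principle is one of the key words listed above, it may help to recall its textbook statement; the formulation below is a standard summary added for context, not the authors' own derivation.

```latex
% The maximum entropy principle in its textbook form (a standard summary,
% not taken from the trilogy): choose the distribution that maximizes the
% Shannon entropy subject to normalization and known expectation constraints.
\[
  \max_{p}\; -\sum_i p_i \ln p_i
  \quad\text{s.t.}\quad
  \sum_i p_i = 1 ,
  \qquad
  \sum_i p_i\, g_k(x_i) = \bar{g}_k ,
\]
\[
  \text{with solution}\qquad
  p_i \;=\; \frac{1}{Z(\lambda)}\,
  \exp\!\Big(-\sum_k \lambda_k\, g_k(x_i)\Big),
  \qquad
  Z(\lambda) \;=\; \sum_i \exp\!\Big(-\sum_k \lambda_k\, g_k(x_i)\Big).
\]
```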
This volume includes edited and revised versions of the papers delivered and discussed at the recent Advertising and Consumer Psychology Conference. Following the theme of the conference -- "Measuring Advertising Effectiveness" -- the book blends academic psychology, marketing theory, survey methodology, and practical experience, while simultaneously addressing the problems and limitations of advertising.
Coding theory came into existence in the late 1940s and is concerned with devising efficient encoding and decoding procedures.
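As a concrete illustration of what "efficient encoding and decoding procedures" means in practice, the sketch below implements the classic Hamming(7,4) code, which adds three parity bits to four data bits and corrects any single transmission error. It is a generic textbook example, not material from this title.

```python
# A minimal sketch of an encoding/decoding procedure: the Hamming(7,4) code,
# which corrects any single flipped bit in a 7-bit codeword.

def hamming74_encode(d):
    """Encode 4 data bits d = [d1, d2, d3, d4] into a 7-bit codeword.
    Parity bits sit at positions 1, 2 and 4 (1-based)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4          # covers positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_decode(r):
    """Return the corrected 4 data bits from a received 7-bit word r,
    fixing at most one flipped bit via the syndrome."""
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]
    s4 = r[3] ^ r[4] ^ r[5] ^ r[6]
    error_pos = s1 + 2 * s2 + 4 * s4      # 0 means no error detected
    if error_pos:
        r = list(r)
        r[error_pos - 1] ^= 1
    return [r[2], r[4], r[5], r[6]]

# Usage: flip one bit of a codeword and recover the original data.
data = [1, 0, 1, 1]
received = list(hamming74_encode(data))
received[5] ^= 1                          # the channel flips bit 6
assert hamming74_decode(received) == data
```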
The recent evolution of western societies has been characterized by an increasing emphasis on information and communication. As the amount of available information increases, however, the user -- worker, student, citizen -- faces a new problem: selecting and accessing relevant information. More than ever it is crucial to find efficient ways for users to interact with information systems in a way that prevents them from being overwhelmed or simply missing their targets. As a result, hypertext systems have been developed as a means of facilitating the interactions between readers and text. In hypertext, information is organized as a network in which nodes are text chunks (e.g., lists of items, paragraphs, pages) and links are relationships between the nodes (e.g., semantic associations, expansions, definitions, examples -- virtually any kind of relation that can be imagined between two text passages). Unfortunately, the many ways in which these hypertext interfaces can be designed have caused a complexity that extends far beyond the processing abilities of regular users. Therefore, it has become widely recognized that a more rational approach based on a thorough analysis of information users' needs, capacities, capabilities, and skills is needed. This volume seeks to meet that need.
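To make the node-and-link picture described above concrete, here is a minimal sketch of such a structure; the class name, fields, and example relation labels are assumptions chosen for illustration, not an interface proposed in the volume.

```python
# A minimal sketch of the hypertext model: text chunks as nodes and typed
# relationships as links. All names here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Node:
    title: str
    text: str
    links: list = field(default_factory=list)   # outgoing (relation, Node) pairs

    def link(self, relation, target):
        """Attach a typed link, e.g. 'definition', 'example', 'expansion'."""
        self.links.append((relation, target))

# Build a tiny network and follow a link the way a reader would.
overview = Node("Hypertext", "Information organized as a network of text chunks.")
example = Node("Example", "An encyclopedia article whose terms open other articles.")
overview.link("example", example)

for relation, target in overview.links:
    print(f"{overview.title} --{relation}--> {target.title}: {target.text}")
```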
Marked by its multi-level interdisciplinary character, communication has become a variable field -- one in which the level of analysis varies. This has had important ramifications for the study of communication because, to some extent, the questions one asks are determined by the methods one has available to answer them. As a result, communication research is characterized by the plethora of both qualitative and quantitative approaches used by its practitioners. These include survey and experimental methods, and content, historical, and rhetorical analyses.