Health communication research examines the role of communication in health professional/client relationships and in promoting patient adherence, the flow of information within and between health organizations, the design and effectiveness of health information for various audiences, and the planning and evaluation of health care policy. Other important areas treated in this book are cultural and social factors influencing health communication, ethical issues affecting communication, and education in communication within medical schools. Medical students, physicians, policy makers, students and faculty in communications and sociology, as well as social services professionals, should find this reference an important tool.
Basic Concepts in Information Theory and Coding is an outgrowth of a one-semester introductory course that has been taught at the University of Southern California since the mid-1960s. Lecture notes from that course have evolved in response to student reaction, new technological and theoretical developments, and the insights of faculty members who have taught the course (including the three of us). In presenting this material, we have made it accessible to a broad audience by limiting prerequisites to basic calculus and the elementary concepts of discrete probability theory. To keep the material suitable for a one-semester course, we have limited its scope to discrete information theory and a general discussion of coding theory without detailed treatment of algorithms for encoding and decoding for various specific code classes. Readers will find that this book offers an unusually thorough treatment of noiseless self-synchronizing codes, as well as the advantage of problem sections that have been honed by reactions and interactions of several generations of bright students, while Agent 00111 provides a context for the discussion of abstract concepts.
Every Thing Must Go argues that the only kind of metaphysics that can contribute to objective knowledge is one based specifically on contemporary science as it really is, and not on philosophers' a priori intuitions, common sense, or simplifications of science. In addition to showing how recent metaphysics has drifted away from connection with all other serious scholarly inquiry as a result of not heeding this restriction, Ladyman and Ross demonstrate how to build a metaphysics compatible with current fundamental physics ("ontic structural realism"), which, when combined with their metaphysics of the special sciences ("rainforest realism"), can be used to unify physics with the other sciences without reducing these sciences to physics itself. Taking science metaphysically seriously, Ladyman and Ross argue, means that metaphysicians must abandon the picture of the world as composed of self-subsistent individual objects, and the paradigm of causation as the collision of such objects.
The increasing diversity of Information Communication Technologies and their equally diverse range of uses in personal, professional and official capacities raise challenging questions of identity in a variety of contexts. Each communication exchange contains an identifier which may, or may not, be intended by the parties involved. What constitutes an identity, how do new technologies affect identity, and how do we manage identities in a globally networked information society? From the 6th to the 10th of August 2007, IFIP (International Federation for Information Processing) working groups 9.2 (Social Accountability), 9.6/11.7 (IT Misuse and the Law) and 11.6 (Identity Management) held their 3rd International Summer School on "The Future of Identity in the Information Society" in cooperation with the EU Network of Excellence FIDIS at Karlstad University. The Summer School addressed the theme of identity management in relation to current and future technologies in a variety of contexts. The aim of the IFIP summer schools has been to introduce participants to the social implications of information technology through the process of informed discussion. Following the holistic approach advocated by the involved IFIP working groups, a diverse group of participants ranging from young doctoral students to leading researchers in the field were encouraged to engage in discussion, dialogue and debate in an informal and supportive setting. The interdisciplinary, and international, emphasis of the Summer School allowed for a broader understanding of the issues in the technical and social spheres.
Upon hearing that Ronald Coase had been awarded the Nobel Prize, a fellow economist's first response was to ask with whom Coase had shared the Prize. Whether this response was idiosyncratic or not, I do not know; I expect not. Part of this type of reaction can no doubt be explained by the fact that Coase has often been characterized as an economist who wrote only two significant or influential papers: "The Nature of the Firm" (1937) and "The Problem of Social Cost" (1960). And by typical professional standards of "significant" and "influential" (i.e., widely read and cited), this perception embodies a great deal of truth, even subsequent to Coase's receipt of the Prize. This is not to say that there have not been other important works - "The Marginal Cost Controversy" (1946) and "The Lighthouse in Economics" (1974) come immediately to mind here - only that in a random sample of, say, one hundred economists, one would likely find few who could list a Coase bibliography beyond the two classic pieces noted above, in spite of Coase's significant publication record. The purpose of this collection is to assess the development of, tensions within, and prospects for Coasean Economics - those aspects of economic analysis that have evolved out of Coase's path-breaking work. Two major strands of research can be identified here: law and economics and the New Institutional Economics.
"Every thought is a throw of dice." (Stéphane Mallarmé) This book is the last of a trilogy which reports a part of our research work over nearly thirty years (we discard our non-conventional results in automatic control theory and applications on the one hand, and fuzzy sets on the other), and its main key words are Information Theory, Entropy, Maximum Entropy Principle, Linguistics, Thermodynamics, Quantum Mechanics, Fractals, Fractional Brownian Motion, Stochastic Differential Equations of Order n, Stochastic Optimal Control, and Computer Vision. Our obsession has always been the same: Shannon's information theory should play a basic role in the foundations of the sciences, but subject to the condition that it be suitably generalized to allow us to deal with problems which are not necessarily related to communication engineering. With this objective in mind, two questions are of utmost importance: (i) How can we introduce the meaning or significance of information into Shannon's information theory? (ii) How can we define and/or measure the amount of information involved in a form or a pattern without using a probabilistic scheme? It is obligatory to find suitable answers to these problems if we want to apply Shannon's theory to science with some chance of success. For instance, its use in biology has been very disappointing, for the very reason that the meaning of information is there of basic importance, and is not involved in this approach.
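For reference, the quantity these two questions set out to generalize is Shannon's entropy of a discrete source, a standard formula not restated in the blurb above:

\[ H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i \]

in bits, where p_i is the probability of the i-th source symbol. Question (i) asks how meaning can be attached to this purely statistical quantity; question (ii) asks how information can be measured when no probabilities p_i are available at all.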
Following the emergence of quantum computing, the subsequent quantum revolution will be that of interconnecting individual quantum computers at the global level. In the same way that classical computers only realised their full potential with the emergence of the internet, a fully realised quantum internet is the next stage of evolution for quantum computation. This cutting-edge book examines in detail how the quantum internet would evolve in practice, focusing not only on the technology itself but also on the implications it will have economically and politically, with numerous non-technical sections throughout the text providing broader context to the discussion. The book begins with a description of classical networks before introducing the key concepts behind quantum networks, such as quantum internet protocols, quantum cryptography, and cloud quantum computing. It is written in an engaging style and is accessible to graduate students in physics, engineering, computer science and mathematics.
This volume includes edited and revised versions of the papers delivered and discussed at the recent Advertising and Consumer Psychology Conference. Following the theme of the conference -- "Measuring Advertising Effectiveness" -- the book blends academic psychology, marketing theory, survey methodology, and practical experience, while simultaneously addressing the problems and limitations of advertising.
The recent evolution of western societies has been characterized by an increasing emphasis on information and communication. As the amount of available information increases, however, the user -- worker, student, citizen -- faces a new problem: selecting and accessing relevant information. More than ever it is crucial to find efficient ways for users to interact with information systems in a way that prevents them from being overwhelmed or simply missing their targets. As a result, hypertext systems have been developed as a means of facilitating the interactions between readers and text. In hypertext, information is organized as a network in which nodes are text chunks (e.g., lists of items, paragraphs, pages) and links are relationships between the nodes (e.g., semantic associations, expansions, definitions, examples -- virtually any kind of relation that can be imagined between two text passages). Unfortunately, the many ways in which these hypertext interfaces can be designed have created a complexity that extends far beyond the processing abilities of regular users. Therefore, it has become widely recognized that a more rational approach based on a thorough analysis of information users' needs, capacities, capabilities, and skills is needed. This volume seeks to meet that need.
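The node-and-link organization described above maps naturally onto a directed, labelled graph. The following minimal Python sketch is purely illustrative (it is not taken from the book, and all names are invented):

# A hypertext network: nodes are text chunks, links are typed
# relationships between nodes (illustrative names throughout).
class Hypertext:
    def __init__(self):
        self.nodes = {}    # node id -> text chunk
        self.links = {}    # node id -> list of (relation, target id)

    def add_node(self, node_id, text):
        self.nodes[node_id] = text
        self.links.setdefault(node_id, [])

    def add_link(self, source, relation, target):
        # A relation can be any association between two passages:
        # "definition", "example", "expansion", a semantic link, etc.
        self.links[source].append((relation, target))

    def neighbours(self, node_id):
        return self.links.get(node_id, [])

ht = Hypertext()
ht.add_node("entropy", "Entropy measures average uncertainty.")
ht.add_node("coin", "A fair coin has an entropy of one bit.")
ht.add_link("entropy", "example", "coin")
print(ht.neighbours("entropy"))    # [('example', 'coin')]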
Coding theory came into existence in the late 1940s and is concerned with devising efficient encoding and decoding procedures.
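As a toy illustration of encoding and decoding (not drawn from any particular text), a rate-1/3 repetition code corrects any single bit error per codeword by majority vote:

# Rate-1/3 repetition code: repeat each bit three times,
# decode by majority vote over each group of three.
def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(coded):
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [1, 0, 1]
received = encode(msg)
received[1] ^= 1    # simulate a single channel error
assert decode(received) == msg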
Marked by its multi-level, interdisciplinary character, communication has become a variable field -- one in which the level of analysis varies. This has had important ramifications for the study of communication because, to some extent, the questions one asks are determined by the methods one has available to answer them. As a result, communication research is characterized by a plethora of both qualitative and quantitative approaches used by its practitioners. These include survey and experimental methods, and content, historical, and rhetorical analyses.
Is knowledge an economic good? What are the characteristics of the institutions regulating the production and diffusion of knowledge? The accumulation of knowledge is a key determinant of economic growth, but only recently has knowledge moved to the core of economic analysis. Recent literature also gives profound insights into events like scientific progress and artistic and craft development, which have rarely been addressed as socio-economic institutions, being the domain of sociologists and historians rather than economists. This volume adopts a multidisciplinary approach to bring knowledge into the focus of attention, as a key economic issue.
Knowledge has in recent years become a key driver for growth of regions and nations. This volume empirically investigates the emergence of the knowledge economy in the late 20th century from a regional point of view. It first deals with the theoretical background for understanding the knowledge economy, with knowledge spillovers and development externalities. It then examines aspects of the relationship between knowledge inputs and innovative outputs in the information, computer and telecommunications sector (ICT) of the economy at the regional level. Case studies focusing on a wide variety of sectors, countries and regions finally illustrate important regional innovation issues.
Selected Areas in Cryptography brings together in one place important contributions and up-to-date research results in this fast-moving area. It serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
The Universal Service Desk (USD): Implementing, Controlling and Improving Service Delivery defines what a USD is, why it is valuable to an organisation, and how to build and implement one. It also discusses the evolution of the USD as part of integrated workplace management. Understand the essentials of any USD - buy this book today!
This extraordinary three-volume work, written in an engaging and rigorous style by a world authority in the field, provides an accessible, comprehensive introduction to the full spectrum of mathematical and statistical techniques underpinning contemporary methods in data-driven learning and inference. This second volume, Inference, builds on the foundational topics established in volume I to introduce students to techniques for inferring unknown variables and quantities, including Bayesian inference, Markov chain Monte Carlo (MCMC) methods, maximum-likelihood estimation, hidden Markov models, Bayesian networks, and reinforcement learning. A consistent structure and pedagogy are employed throughout this volume to reinforce student understanding, with over 350 end-of-chapter problems (including solutions for instructors), 180 solved examples, almost 200 figures, datasets and downloadable Matlab code. Supported by the sister volumes Foundations and Learning, and unique in its scale and depth, this textbook sequence is ideal for early-career researchers and graduate students across many courses in signal processing, machine learning, statistical analysis, data science and inference.
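For orientation, the Bayesian inference at the centre of this volume rests on Bayes' rule, the standard identity

\[ p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)}, \]

where p(\theta) is the prior over the unknown \theta, p(y \mid \theta) the likelihood of the data y, and p(\theta \mid y) the resulting posterior.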
This book describes concepts and tools needed for water resources management, including methods for modeling, simulation, optimization, big data analysis, data mining, remote sensing, geographic information systems, game theory, conflict resolution, system dynamics, agent-based models, multiobjective, multicriteria and multiattribute decision making, and risk and uncertainty analysis, for better and sustainable management of water resources and consumption, thus mitigating the present and future global water shortage crisis. It presents the applications of these tools through case studies which demonstrate the benefits of proper management of water resources systems. This book acts as a reference for students, professors, industrial practitioners, and stakeholders in the field of water resources and hydrology.
This book provides awareness of methods used for functional encryption in the academic and professional communities. The book covers functional encryption algorithms and their modern applications in developing secure systems via entity authentication, message authentication, software security, cyber security, hardware security, the Internet of Things (IoT), cloud security, smart card technology, CAPTCHA, digital signatures, and digital watermarking. This book is organized into fifteen chapters; topics include foundations of functional encryption, the impact of group theory in cryptosystems, elliptic curve cryptography, the XTR algorithm, pairing-based cryptography, NTRU algorithms, ring units, the Cocks IBE scheme, Boneh-Franklin IBE, Sakai-Kasahara IBE, hierarchical identity-based encryption, attribute-based encryption, extensions of IBE and related primitives, and digital signatures. Explains the latest functional encryption algorithms in a simple way with examples; includes applications of functional encryption in information security, application security, and network security; relevant to academics, research scholars, software developers, etc.
This book discusses machine learning and artificial intelligence (AI) for agricultural economics. It is written with a view towards bringing the benefits of advanced analytics and prognostics capabilities to small-scale farmers worldwide. This volume provides data science and software engineering teams with the skills and tools to fully utilize economic models to develop the software capabilities necessary for creating lifesaving applications. The book introduces essential agricultural economic concepts from the perspective of full-scale software development, with an emphasis on creating niche blue-ocean products. Chapters detail several agricultural economic and AI reference architectures with a focus on data integration, algorithm development, regression, prognostics model development and mathematical optimization. Upgrading traditional AI software development paradigms to function in dynamic agricultural and economic markets, this volume will be of great use to researchers and students in agricultural economics, data science, engineering, and machine learning, as well as engineers and industry professionals in the public and private sectors.
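As a hedged sketch of the regression step mentioned above (the data and variable names are invented, not drawn from the book), an ordinary least-squares fit of crop yield against seasonal rainfall:

import numpy as np

# Toy data: seasonal rainfall (mm) and crop yield (t/ha).
rainfall = np.array([300.0, 450.0, 500.0, 620.0, 710.0])
yields = np.array([1.1, 1.8, 2.0, 2.6, 2.9])

# Fit yield = a * rainfall + b by ordinary least squares.
A = np.vstack([rainfall, np.ones_like(rainfall)]).T
(a, b), *_ = np.linalg.lstsq(A, yields, rcond=None)
print(f"predicted yield at 550 mm: {a * 550 + b:.2f} t/ha")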
In this book, H. S. Green, a former student of Max Born and well known as an author in physics and in philosophy of science, presents an individual and modern approach to theoretical physics and related fundamental problems. Starting from first principles, the links between physics and information science are unveiled step by step: modern information theory and the classical theory of the Turing machine are combined to create a new interpretation of quantum computability, which is then applied to field theory, gravitation and submicroscopic measurement theory and culminates in a detailed examination of the role of the conscious observer in physical measurements. The result is a highly readable book that unifies a wide range of scientific knowledge and is essential reading for all scientists and philosophers of science interested in the interpretation and the implications of the interaction between information science and basic physical theories.
How to draw plausible conclusions from uncertain and conflicting sources of evidence is one of the major intellectual challenges of Artificial Intelligence. It is a prerequisite of the smart technology needed to help humans cope with the information explosion of the modern world. In addition, computational modelling of uncertain reasoning is a key to understanding human rationality. Previous computational accounts of uncertain reasoning have fallen into two camps: purely symbolic and numeric. This book represents a major advance by presenting a unifying framework which unites these opposing camps. The Incidence Calculus can be viewed as both a symbolic and a numeric mechanism. Numeric values are assigned indirectly to evidence via the possible worlds in which that evidence is true. This facilitates purely symbolic reasoning using the possible worlds and numeric reasoning via the probabilities of those possible worlds. Moreover, the indirect assignment solves some difficult technical problems, like the combination of dependent sources of evidence, which had defeated earlier mechanisms. Weiru Liu generalises the Incidence Calculus and then compares it to a succession of earlier computational mechanisms for uncertain reasoning: Dempster-Shafer Theory, Assumption-Based Truth Maintenance, Probabilistic Logic, Rough Sets, etc. She shows how each of them is represented and interpreted in Incidence Calculus. The consequence is a unified mechanism which includes both symbolic and numeric mechanisms as special cases. It provides a bridge between symbolic and numeric approaches, retaining the advantages of both and overcoming some of their disadvantages.
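A minimal sketch of the incidence idea described above (an illustrative reconstruction, not code from the book): each proposition is assigned the set of possible worlds in which it is true, logical combination becomes a set operation, and probability is the total weight of the incidence set. Because dependent propositions share worlds, their combination is handled correctly by simple intersection:

# Possible worlds and their probabilities (toy example).
worlds = {"w1": 0.2, "w2": 0.3, "w3": 0.5}

# Incidence of a proposition: the set of worlds where it is true.
inc_rain = {"w1", "w2"}
inc_cloudy = {"w2", "w3"}

def prob(incidence):
    # Probability = total weight of the incidence set.
    return sum(worlds[w] for w in incidence)

inc_both = inc_rain & inc_cloudy          # conjunction = intersection
inc_not_rain = set(worlds) - inc_rain     # negation = complement

print(prob(inc_rain))   # 0.5
print(prob(inc_both))   # 0.3 (dependence between the two is respected)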
You may like...
Encyclopedia of Information Science and… - Mehdi Khosrow-Pour, D.B.A. (Hardcover, R20,967)
Encyclopedia of Information Science and… - Mehdi Khosrow-Pour, D.B.A. (Hardcover, R20,954)
NMR Quantum Information Processing - Ivan Oliveira, Roberto Sarthour Jr., … (Hardcover, R3,754)
Engineering and the Ultimate - An… - Jonathan Bartlett, Dominic Halsmer, … (Hardcover, R701)