The geometry of curves has fascinated mathematicians for 2500 years, and the theory has become highly abstract. Recently links have been made with the subject of error correction, leading to the creation of geometric Goppa codes, a new and important area of coding theory. This book is an updated and extended version of the last part of the successful book Error-Correcting Codes and Finite Fields. It provides an elementary introduction to Goppa codes, and includes many examples, calculations, and applications. The book is in two parts with an emphasis on motivation, and applications of the theory take precedence over proofs of theorems. The formal theory is, however, provided in the second part of the book, and several of the concepts and proofs have been simplified without sacrificing rigour.
The phenomenal international bestseller that shows us how to stop trying to predict everything - and take advantage of uncertainty What have the invention of the wheel, Pompeii, the Wall Street Crash, Harry Potter and the internet got in common? Why are all forecasters con-artists? Why should you never run for a train or read a newspaper? This book is all about Black Swans: the random events that underlie our lives, from bestsellers to world disasters. Their impact is huge; they're impossible to predict; yet after they happen we always try to rationalize them. 'Taleb is a bouncy and even exhilarating guide ... I came to relish what he said, and even develop a sneaking affection for him as a person' Will Self, Independent on Sunday 'He leaps like some superhero of the mind' Boyd Tonkin, Independent
This book provides a much-needed short, reliable and stimulating guide to the mass media in present-day society. Incisive, surprising and provocative, it will become an essential text for thinking and writing about the mass media.
Probability theory has been extraordinarily successful at describing a variety of phenomena, from the behaviour of gases to the transmission of messages, and is, besides, a powerful tool with applications throughout mathematics. At its heart are a number of concepts familiar in one guise or another to many: Gauss' bell-shaped curve, the law of averages, and so on, concepts that crop up in so many settings they are in some sense universal. This universality is predicted by probability theory to a remarkable degree. This book explains that theory and investigates its ramifications. Assuming a good working knowledge of basic analysis, real and complex, the author maps out a route from basic probability, via random walks, Brownian motion, the law of large numbers and the central limit theorem, to aspects of ergodic theorems, equilibrium and nonequilibrium statistical mechanics, communication over a noisy channel, and random matrices. Numerous examples and exercises enrich the text.
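The route from random walks to the central limit theorem that this blurb describes can be illustrated with a minimal sketch (the code and names below are illustrative, not from the book): rescaling the endpoint of a simple ±1 random walk by the square root of the number of steps drives its distribution toward a standard normal, so the sample variance of the rescaled endpoints should be close to 1.

```python
import math
import random

def random_walk_endpoint(steps, rng):
    """Sum of `steps` independent +/-1 increments."""
    return sum(rng.choice((-1, 1)) for _ in range(steps))

def clt_demo(steps=400, trials=2000, seed=0):
    """Empirically check that endpoint / sqrt(steps) has mean ~0 and variance ~1."""
    rng = random.Random(seed)
    scaled = [random_walk_endpoint(steps, rng) / math.sqrt(steps)
              for _ in range(trials)]
    mean = sum(scaled) / trials
    var = sum((x - mean) ** 2 for x in scaled) / trials
    return mean, var
```

Increasing `steps` and `trials` tightens the agreement, which is exactly the universality the text refers to.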
This book focuses on statistical modelling of software source code in order to resolve issues associated with the software development process. Writing and maintaining source code is a costly business, and software developers must constantly rely on large existing code bases. Statistical modelling identifies patterns in software artifacts and uses them to predict possible issues.
Recently, much attention has been paid to image processing with multi-resolution and hierarchical structures such as pyramids and trees. This volume deals with recursive pyramids, which combine the advantages of available multiresolution structures and which are convenient both for global and local image processing. Recursive pyramids are based on regular hierarchical (recursive) structures containing data on image fragments of different sizes. Such an image representation technique enables the effective manipulation of pictorial information as well as the development of special hardware or data structures. The major aspects of this book are two original mathematical models of greyscale and binary images represented by recursive structures. Image compression, transmission and processing are discussed using these models. A number of applications are presented, including optical character recognition, expert systems and special computer architecture for pictorial data processing. The majority of results are presented as algorithms applicable to discrete information fields of arbitrary dimensions (e.g. 2-D or 3-D images). The book is divided into six chapters: Chapter 1 provides a brief introduction. Chapter 2 then deals with recursive structures and their properties. Chapter 3 introduces pyramidal image models. Image coding and the progressive transmission of images with gradual refinement are discussed in Chapter 4. Chapters 5 and 6 are devoted to image processing with pyramidal-recursive structures and applications. The volume concludes with a comprehensive bibliography. This work should interest applied mathematicians and computer scientists whose work involves computer vision, information theory and other aspects of image representation techniques.
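The 2x2 averaging that underlies such multiresolution pyramids can be sketched in a few lines (an illustrative sketch assuming a square, power-of-two greyscale image given as a list of rows; not code from the book):

```python
def downsample(image):
    """Average non-overlapping 2x2 blocks of a greyscale image (list of rows)."""
    h, w = len(image), len(image[0])
    return [[(image[2*r][2*c] + image[2*r][2*c+1] +
              image[2*r+1][2*c] + image[2*r+1][2*c+1]) / 4.0
             for c in range(w // 2)]
            for r in range(h // 2)]

def build_pyramid(image):
    """Return all levels, from full resolution down to a single 1x1 summary."""
    levels = [image]
    while len(levels[-1]) > 1:
        levels.append(downsample(levels[-1]))
    return levels
```

Each level halves the resolution, so an N x N image yields log2(N) + 1 levels; the coarse levels support global processing while the fine levels retain local detail.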
This book introduces the reader to Serres' unique manner of 'doing philosophy' that can be traced throughout his entire oeuvre: namely as a novel manner of bearing witness. It explores how Serres takes note of a range of epistemologically unsettling situations, which he understands as arising from the short-circuit of a proprietary notion of capital with a praxis of science that commits itself to a form of reasoning which privileges the most direct path (simple method) in order to expend minimal efforts while pursuing maximal efficiency. In Serres' universal economy, value is considered as a function of rarity, not as a stock of resources. This book demonstrates how Michel Serres has developed an architectonics that is coefficient with nature. Mathematics and Information in the Philosophy of Michel Serres acquaints the reader with Serres' monist manner of addressing the universality and the power of knowledge - that is at once also the anonymous and empty faculty of incandescent, inventive thought. The chapters of the book demarcate, problematize and contextualize some of the epistemologically unsettling situations Serres addresses, whilst also examining the particular manner in which he responds to and converses with these situations.
Administrators of academic professional and technical communication (PTSC) programs have long relied upon lore--stories of what works--to understand and communicate about the work of program administration. Stories are interesting, telling, engaging, and necessary. But the ephemeral stories narrated at conferences and deliberated at department meetings usually suffice only to solve immediate problems and address day-to-day concerns and activities. This edited collection captures some of those stories and layers them with theoretical perspectives and reflection to enhance their usefulness to the PTSC program administration community at large. Like the ephemeral stories PTSC program administrators are accustomed to, the stories told in this volume are set within specific institutional contexts that reflect specific institutional challenges. They emphasize the intellectual traces--the debts the authors owe to those who have informed and transformed their administrative work. In so doing, this collection creates another conversation--albeit a robust, diverse, and theoretically informed one--around which program leaders might define or redefine their roles and re-envision their administrative work as the rich, complex, intellectual engagement that we find it to be. This volume asks authors to move beyond a notion of administration as an activity based solely in institutional details and processes. In so doing, they emphasize theory as they share their reflections on core administrative processes and significant moments in the histories of their associated programs, thereby affording opportunities for critical examination in conjunction with practical advice.
This book gives the definitive mathematical answer to what thermodynamics really is: a variational calculus applied to probability distributions. Extending Gibbs's notion of ensemble, the author imagines the ensemble of all possible probability distributions and assigns probabilities to them by selection rules that are fairly general. The calculus of the most probable distribution in the ensemble produces the entire network of mathematical relationships we recognize as thermodynamics. The first part of the book develops the theory for discrete and continuous distributions, while the second part applies this thermodynamic calculus to problems in population balance theory and shows how the emergence of a giant component in aggregation and the shattering transition in fragmentation may be treated as formal phase transitions. While the book is intended as a research monograph, the material is self-contained and the style sufficiently tutorial to be accessible for self-paced study by an advanced graduate student in such fields as physics, chemistry, and engineering.
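The variational idea - selecting the most probable distribution subject to a constraint - has a familiar concrete instance: maximizing entropy at fixed mean energy yields the Gibbs distribution p_i proportional to exp(-beta * E_i). A minimal sketch of that special case (illustrative, not the book's notation or generality):

```python
import math

def gibbs_distribution(energies, beta):
    """Maximum-entropy distribution at fixed mean energy: p_i ~ exp(-beta * E_i)."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def entropy(p):
    """Gibbs/Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

At beta = 0 the constraint is inactive and the calculus returns the uniform distribution, the entropy maximum; increasing beta shifts weight toward low-energy states.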
As a collection of ideas and methodologies, systems thinking has made an impact in organizations and in particular in the information systems field. However, this main emphasis on organizations limits the scope of systems thinking and practice. There is a need first to use systems thinking in addressing societal problems, and second to enable people involved in developing the information society to reflect on the impacts of systems and technologies in society as a whole. Thus, there are opportunities to review the scope and potential of systems thinking and practice to deal with information society-related issues. Systems Practice in the Information Society provides students of information systems as well as practicing information systems managers with concepts and strategies to enable them to understand and use systems thinking methodologies and address challenges posed by the development of information-based societies. This book brings experiences, ideas, and applications of systemic thinking in designing and evaluating socio-technological initiatives. Using a number of cultural contexts, this book explores how organizations, including governments, can enable better access to information and communication technologies and improve the quality of life of individuals.
This book deals with the autoregressive method for digital processing of random oscillations. The method is based on a one-to-one transformation of the numeric factors of the Yule series model to the characteristics of a linear elastic system. This parametric approach makes it possible to develop a formal procedure for processing experimental data to obtain estimates of the logarithmic decrement and natural frequency of random oscillations. A straightforward mathematical description of the procedure makes it possible to optimize the discretization of oscillation realizations, providing efficient estimates. The derived analytical expressions for the confidence intervals of the estimates enable a priori evaluation of their accuracy. Experimental validation of the method is also provided. Statistical applications for the analysis of mechanical systems arise from the fact that the loads experienced by machinery and various structures often cannot be described by deterministic vibration theory. Therefore, a sufficient description of real oscillatory processes (vibrations) calls for the use of random functions. In engineering practice, linear vibration theory (modeling phenomena by common linear differential equations) is generally used. Fundamental concepts of this theory, such as natural frequency, oscillation decrement, and resonance, account for its wide use in different technical tasks. In technical applications two types of research task exist: direct and inverse. The former determines the stochastic characteristics of the system output X(t) resulting from a random input process E(t) when the object model is considered known. The direct task makes it possible to evaluate the effect of an operational environment on the designed object and to predict its operation under various loads. The inverse task is aimed at evaluating the object model from known processes E(t) and X(t), i.e. finding the model (equation) factors.
This task usually arises when testing prototypes, in order to identify (or verify) the model experimentally. To characterize random processes, the notion of a "shaping dynamic system" is commonly used. This concept allows one to consider the observed process as the output of a hypothetical system whose input is stationary Gauss-distributed ("white") noise; the process may then be exhaustively described in terms of the parameters of that system. In the case of random oscillations, the "shaping system" is an elastic system described by the common second-order differential equation Ẍ(t) + 2hẊ(t) + ω₀²X(t) = E(t), where ω₀ = 2π/T₀ is the natural frequency, T₀ is the oscillation period, and h is the damping factor. As a result, the process X(t) can be characterized in terms of the system parameters - the natural frequency and the logarithmic oscillation decrement Λ = hT₀ - as well as the process variance. These parameters are evaluated by processing experimental data in frequency-domain or time-domain representations of the oscillations. It must be noted that the concept behind evaluating these parameters did not change much during the last century. For instance, when the spectral density is used, evaluation of the decrement is linked to bandwidth measurements at the half-power points of the observed oscillations. For a time-domain representation, evaluation of the decrement requires measuring covariance values delayed by time intervals divisible by T₀. Both estimation procedures are derived from a continuous description of the phenomena under study, so the accuracy of the estimates is linked directly to the adequacy of the discrete representation of random oscillations. This approach is similar to the concept of transforming differential equations into difference equations by approximating derivatives with the corresponding finite differences. The resulting discrete model, being an approximation, features a methodical error which can be decreased but never eliminated.
To render such a representation more accurate, it is imperative to decrease the discretization interval and increase the realization size, which grows the requirements for computing power. The spectral density and covariance function estimates comprise a non-parametric (non-formal) approach. In principle, any non-formal approach is a kind of art, i.e. the results depend on the performer's skills. Due to the interference of subjective factors in spectral or covariance estimates of random signals, the accuracy of the results cannot be properly determined or justified. To avoid the abovementioned difficulties, the application of linear time-series models with well-developed procedures for parameter estimation is more advantageous. This book develops and presents a method for the analysis of random oscillations using a parametric model that corresponds discretely (with no approximation error) to a linear elastic system. As a result, a one-to-one transformation of the model's numerical factors to the logarithmic decrement Λ and natural frequency ω₀ of random oscillations is established, which makes it possible to develop a formal processing procedure that obtains estimates of Λ and ω₀ from experimental data. The proposed approach allows researchers to replace traditional subjective techniques with a formal processing procedure providing efficient estimates with analytically defined statistical uncertainties.
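A minimal sketch of this kind of procedure - fitting a Yule (AR(2)) model x_t = a1*x_{t-1} + a2*x_{t-2} + e_t by least squares, then mapping its factors back to damping and frequency - can be written in a few lines. The mapping used below (a1 = 2*exp(-h*Δ)*cos(ω1*Δ), a2 = -exp(-2*h*Δ) for sampling interval Δ and damped frequency ω1) is the standard one for a sampled damped oscillator; the function names are illustrative and not taken verbatim from the book.

```python
import math
import random

def simulate_ar2(a1, a2, n, seed=0):
    """Generate an AR(2) series driven by unit-variance Gaussian white noise."""
    rng = random.Random(seed)
    x = [0.0, 0.0]
    for _ in range(n):
        x.append(a1 * x[-1] + a2 * x[-2] + rng.gauss(0.0, 1.0))
    return x[2:]

def fit_ar2(x):
    """Least-squares estimates of a1, a2 via the 2x2 normal equations."""
    s11 = s22 = s12 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        s11 += x[t-1] * x[t-1]; s22 += x[t-2] * x[t-2]
        s12 += x[t-1] * x[t-2]
        b1 += x[t] * x[t-1]; b2 += x[t] * x[t-2]
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det

def decrement_and_frequency(a1, a2, dt=1.0):
    """Invert the factor transform: recover damping h and damped frequency w1."""
    h = -math.log(-a2) / (2.0 * dt)
    w1 = math.acos(a1 / (2.0 * math.sqrt(-a2))) / dt
    return h, w1
```

Because the AR(2) model corresponds to the sampled oscillator exactly rather than by finite-difference approximation, no methodical discretization error enters the estimates.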
A highly contentious, very readable and totally up-to-the-minute investigation of women's natural relationship with modern technology, an association which, Plant argues, will trigger a new sexual revolution. Zeros and Ones is an intelligent, provocative and accessible investigation of the intersection between women, feminism, machines and in particular, information technology. Arguing that the computer is rewriting the old conceptions of man and his world, it suggests that the telecoms revolution is also a sexual revolution which undermines the fundamental assumptions crucial to patriarchal culture. Historical, contemporary and future developments in telecommunications and in IT are interwoven with the past, present and future of feminism, women and sexual difference, and a wealth of connections, parallels and affinities between machines and women are uncovered as a result. Challenging the belief that man was ever in control of either his own agency, the planet, or his machines, this book argues it is seriously undermined by the new scientific paradigms emergent from theories of chaos, complexity and connectionism, all of which suggest that the old distinctions between man, woman, nature and technology need to be radically reassessed.
The latest edition of this classic is updated with new problem sets and material. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
This book presents a comprehensive mathematical theory that explains precisely what information flow is, how it can be assessed quantitatively - so bringing precise meaning to the intuition that certain information leaks are small enough to be tolerated - and how systems can be constructed that achieve rigorous, quantitative information-flow guarantees in those terms. It addresses the fundamental challenge that functional and practical requirements frequently conflict with the goal of preserving confidentiality, making perfect security unattainable. Topics include: a systematic presentation of how unwanted information flow, i.e., "leaks", can be quantified in operationally significant ways and then bounded, both with respect to estimated benefit for an attacking adversary and by comparisons between alternative implementations; a detailed study of capacity, refinement, and Dalenius leakage, supporting robust leakage assessments; a unification of information-theoretic channels and information-leaking sequential programs within the same framework; and a collection of case studies, showing how the theory can be applied to interesting realistic scenarios. The text is unified, self-contained and comprehensive, accessible to students and researchers with some knowledge of discrete probability and undergraduate mathematics, and contains exercises to facilitate its use as a course textbook.
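The core quantities in such a theory - prior and posterior Bayes vulnerability, and the leakage of a channel - can be computed directly for small discrete channels. A minimal sketch (illustrative function names; the channel is given as a row-stochastic matrix with channel[x][y] = P(y|x)):

```python
import math

def bayes_vulnerability(prior):
    """Probability that an adversary guesses the secret in one try, before observation."""
    return max(prior)

def posterior_vulnerability(prior, channel):
    """Expected one-try guessing probability after observing the channel output."""
    n_outputs = len(channel[0])
    return sum(max(prior[x] * channel[x][y] for x in range(len(prior)))
               for y in range(n_outputs))

def multiplicative_leakage(prior, channel):
    """Multiplicative Bayes leakage of the channel, in bits."""
    return math.log2(posterior_vulnerability(prior, channel)
                     / bayes_vulnerability(prior))
```

A channel that reveals the secret exactly leaks log2(n) bits on a uniform prior over n secrets, while a channel whose rows are identical leaks nothing - the two extremes the theory interpolates between.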
At the dawn of the twenty-first century, education about and through the media has become a worldwide phenomenon, and is playing an increasingly important role in educational reform. The theory and practice of media education have profited greatly from recent and intensive development and application of new information and telecommunications technologies. Consequently, the importance of media and information literacy is taking on an even greater urgency. With this in mind, the contributors to this volume survey what has taken place over the last decade in different parts of the world, examine the current state of theoretical, conceptual, and research development, and consider where media education is going and where it ought to go. With two-thirds of its 22 contributions coming from outside the United States, "Media Literacy in the Information Age" is a genuine international effort, with many leading media and information educators in the world taking part. The work converts the notion of globalism from a slogan into a working hypothesis. The concerns in this volume are with literacy not just in computer technology, but as a broad concern of the educational process.
Luciano Floridi develops an original ethical framework for dealing with the new challenges posed by Information and Communication Technologies (ICTs). ICTs have profoundly changed many aspects of life, including the nature of entertainment, work, communication, education, health care, industrial production and business, social relations, and conflicts. They have had a radical and widespread impact on our moral lives and on contemporary ethical debates. Privacy, ownership, freedom of speech, responsibility, technological determinism, the digital divide, and pornography online are only some of the pressing issues that characterise the ethical discourse in the information society. They are the subject of Information Ethics (IE), the new philosophical area of research that investigates the ethical impact of ICTs on human life and society. Since the seventies, IE has been a standard topic in many curricula. In recent years, there has been a flourishing of new university courses, international conferences, workshops, professional organizations, specialized periodicals and research centres. However, investigations have so far been largely influenced by professional and technical approaches, addressing mainly legal, social, cultural and technological problems. This book is the first philosophical monograph entirely and exclusively dedicated to it. Floridi lays down, for the first time, the conceptual foundations for IE. He does so systematically, by pursuing three goals: a) a metatheoretical goal: it describes what IE is, its problems, approaches and methods; b) an introductory goal: it helps the reader to gain a better grasp of the complex and multifarious nature of the various concepts and phenomena related to computer ethics; c) an analytic goal: it answers several key theoretical questions of great philosophical interest, arising from the investigation of the ethical implications of ICTs. 
Although entirely independent of The Philosophy of Information (OUP, 2011), Floridi's previous book, The Ethics of Information complements it as new work on the foundations of the philosophy of information.
There is a need for general theoretical principles describing/explaining effective design -- those which demonstrate "unity" and enhance comprehension and usability. Theories of cohesion from linguistics and of comprehension in psychology are likely sources of such general principles. Unfortunately, linguistic approaches to discourse unity have focused exclusively on semantic elements such as synonymy or anaphora, and have ignored other linguistic elements such as syntactic parallelism and phonological alliteration. They have also overlooked the non-linguistic elements -- visual factors such as typography or color, and auditory components such as pitch or duration. In addition, linguistic approaches have met with criticism because they have failed to explain the relationship between semantic cohesive elements and coherence. On the other hand, psychological approaches to discourse comprehension have considered the impact of a wider range of discourse elements -- typographical cuing of key terms to enhance comprehension -- but have failed to provide general theoretical explanations for such observations.
If the carriers of information are governed by quantum mechanics, new principles for information processing apply. This graduate textbook introduces the underlying mathematical theory for quantum communication, computation, and cryptography. A focus lies on the concept of quantum channels, understanding figures of merit, e.g. fidelities and entropies in the quantum world, and understanding the interrelationship of various quantum information processing protocols.
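Figures of merit such as the von Neumann entropy can be computed by hand for a single qubit, since the eigenvalues of a 2x2 density matrix follow from its trace and determinant. An illustrative sketch (not from the book):

```python
import math

def eig2(rho):
    """Eigenvalues of a 2x2 Hermitian matrix, e.g. a qubit density matrix."""
    tr = (rho[0][0] + rho[1][1]).real
    det = (rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]).real
    d = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    return (tr + d) / 2.0, (tr - d) / 2.0

def von_neumann_entropy(rho):
    """S(rho) = -sum_i lambda_i log2 lambda_i, in bits."""
    return -sum(l * math.log2(l) for l in eig2(rho) if l > 1e-12)
```

A pure state has entropy 0 and the maximally mixed qubit has entropy 1 bit, the two endpoints against which channel outputs are compared.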
This book provides a comprehensive explanation of forward error correction, which is a vital part of communication systems. The book is written in such a way to make the subject easy and understandable for the reader. The book starts with a review of linear algebra to provide a basis for the text. The author then goes on to cover linear block codes, syndrome error correction, cyclic codes, Galois fields, BCH codes, Reed Solomon codes, and convolutional codes. Examples are provided throughout the text.
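Syndrome error correction, one of the topics listed, can be demonstrated with the classic (7,4) Hamming code: the columns of the parity-check matrix are the numbers 1 through 7 in binary, so the syndrome of a corrupted word, read as a binary number, points directly at the error position. A minimal sketch (illustrative, not the author's notation):

```python
# Parity-check matrix of the (7,4) Hamming code; column j is j in binary.
H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(word):
    """Syndrome of a 7-bit word: H * word over GF(2)."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

def correct(word):
    """Correct up to one bit error; the syndrome is the 1-based error position."""
    s = syndrome(word)
    pos = s[0] * 4 + s[1] * 2 + s[2]
    fixed = list(word)
    if pos:
        fixed[pos - 1] ^= 1
    return fixed
```

A zero syndrome means the word is a codeword; any single-bit flip produces the nonzero syndrome equal to its position, which is what makes single-error correction possible.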
The book is a concise, self-contained and fully updated introduction to automata theory - a fundamental topic of computer science and engineering. The material is presented in a rigorous yet accessible way and is supplied with a wealth of examples, exercises and down-to-earth explanatory notes. An ideal text for a spectrum of one-term courses in computer science, at both the senior undergraduate and graduate levels.
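A deterministic finite automaton, the starting object of such a course, takes only a few lines to implement as a transition-table lookup. An illustrative sketch (the names are hypothetical):

```python
def make_dfa(states, alphabet, delta, start, accepting):
    """Return a function that runs the DFA (Q, Sigma, delta, q0, F) on a string."""
    def run(word):
        state = start
        for symbol in word:
            state = delta[(state, symbol)]
        return state in accepting
    return run

# Example: a two-state DFA accepting binary strings with an even number of 1s.
delta = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd", ("odd", "1"): "even",
}
even_ones = make_dfa({"even", "odd"}, {"0", "1"}, delta, "even", {"even"})
```

The empty string is accepted because the start state is itself accepting, a small point such courses often use as an exercise.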
How to build and maintain strong data organizations--the Dummies way Data Governance For Dummies offers an accessible first step for decision makers into understanding how data governance works and how to apply it to an organization in a way that improves results and doesn't disrupt. Prep your organization to handle the data explosion (if you know, you know) and learn how to manage this valuable asset. Take full control of your organization's data with all the info and how-tos you need. This book walks you through making accurate data readily available and maintaining it in a secure environment. It serves as your step-by-step guide to extracting every ounce of value from your data. Identify the impact and value of data in your business Design governance programs that fit your organization Discover and adopt tools that measure performance and need Address data needs and build a more data-centric business culture This is the perfect handbook for professionals in the world of data analysis and business intelligence, plus the people who interact with data on a daily basis. And, as always, Dummies explains things in terms anyone can understand, making it easy to learn everything you need to know.
The Science of Deep Learning emerged from courses taught by the author that have provided thousands of students with training and experience for their academic studies, and prepared them for careers in deep learning, machine learning, and artificial intelligence in top companies in industry and academia. The book begins by covering the foundations of deep learning, followed by key deep learning architectures. Subsequent parts on generative models and reinforcement learning may be used as part of a deep learning course or as part of a course on each topic. The book includes state-of-the-art topics such as Transformers, graph neural networks, variational autoencoders, and deep reinforcement learning, with a broad range of applications. The appendices provide equations for computing gradients in backpropagation and optimization, and best practices in scientific writing and reviewing. The text presents an up-to-date guide to the field built upon clear visualizations using a unified notation and equations, lowering the barrier to entry for the reader. The accompanying website provides complementary code and hundreds of exercises with solutions.
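The backpropagation gradients covered in the appendices reduce, for a single sigmoid neuron with squared loss, to one chain-rule product that can be checked against a central finite difference. An illustrative sketch (not the book's code):

```python
import math

def forward(w, b, x):
    """Single sigmoid neuron: y = sigma(w*x + b)."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def grad_loss_w(w, b, x, target):
    """Analytic dL/dw for L = (y - target)^2, using sigma'(z) = y*(1-y)."""
    y = forward(w, b, x)
    return 2.0 * (y - target) * y * (1.0 - y) * x

def numeric_grad_w(w, b, x, target, eps=1e-6):
    """Central finite-difference check of the same gradient."""
    loss = lambda ww: (forward(ww, b, x) - target) ** 2
    return (loss(w + eps) - loss(w - eps)) / (2.0 * eps)
```

Gradient checking of exactly this kind is a standard sanity test before trusting a hand-derived backward pass.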
This book focuses on current practices in scientific and technical communication, historical aspects, and the characteristics and bibliographic control of various forms of scientific and technical literature. It integrates the inventory approach for scientific and technical communication.