Every thought is a throw of dice. — Stéphane Mallarmé

This book is the last of a trilogy reporting part of our research work
over nearly thirty years (we set aside our non-conventional results in
automatic control theory and applications on the one hand, and in
fuzzy sets on the other). Its main key words are Information Theory,
Entropy, Maximum Entropy Principle, Linguistics, Thermodynamics,
Quantum Mechanics, Fractals, Fractional Brownian Motion, Stochastic
Differential Equations of Order n, Stochastic Optimal Control, and
Computer Vision. Our obsession has always been the same: Shannon's
information theory should play a basic role in the foundations of the
sciences, subject to the condition that it be suitably generalized to
allow us to deal with problems not necessarily related to
communication engineering. With this objective in mind, two questions
are of utmost importance: (i) How can we introduce the meaning or
significance of information into Shannon's information theory?
(ii) How can we define and/or measure the amount of information
involved in a form or a pattern without using a probabilistic scheme?
Suitable answers to these questions are obligatory if we want to apply
Shannon's theory to science with any chance of success. For instance,
its use in biology has been very disappointing, for the very reason
that the meaning of information is of basic importance there, yet is
not involved in this approach.
For four decades, information theory has been viewed almost
exclusively as a theory based upon the Shannon measure of
uncertainty and information, usually referred to as Shannon
entropy. Since the publication of Shannon's seminal paper in 1948,
the theory has grown extremely rapidly and has been applied with
varied success in almost all areas of human endeavor. At this time,
the Shannon information theory is a well-established and developed
body of knowledge. Among its most significant recent contributions
have been the use of the complementary principles of minimum and
maximum entropy in dealing with a variety of fundamental systems
problems such as predictive systems modelling, pattern
recognition, image reconstruction, and the like. Since its
inception in 1948, the Shannon theory has been viewed as a
restricted information theory. It has often been argued that the
theory is capable of dealing only with syntactic aspects of
information, but not with its semantic and pragmatic aspects. This
restriction was considered a virtue by some experts and a vice by
others. More recently, however, various arguments have been made
that the theory can be appropriately modified to account for
semantic aspects of information as well. Some of the most
convincing arguments in this regard are included in Fred Dretske's
Knowledge and the Flow of Information (The M.I.T. Press, Cambridge,
Mass., 1981) and in this book by Guy Jumarie.