Analysis of information transfer has found rapid adoption in
neuroscience, where a highly dynamic transfer of information
continuously runs on top of the brain's slowly changing anatomical
connectivity. Measuring such transfer is crucial to understanding
how flexible information routing and processing give rise to higher
cognitive function. "Directed Information Measures in Neuroscience"
reviews recent developments in concepts and tools for measuring information transfer and their application to neurophysiological recordings and the analysis of interactions. Written by the most active researchers in the field, the book discusses the state of the art,
future prospects and challenges on the way to an efficient
assessment of neuronal information transfer. Highlights include the
theoretical quantification and practical estimation of information
transfer, description of transfer locally in space and time,
multivariate directed measures, information decomposition among a set of stimulus/response variables, and the relation between
interventional and observational causality. Applications to neural
data sets and pointers to open source software highlight the
usefulness of these measures in experimental neuroscience. With
state-of-the-art mathematical developments, computational
techniques and applications to real data sets, this book will be of
benefit to all graduate students and researchers interested in
detecting and understanding the information transfer between
components of complex systems.
The nature of distributed computation in complex systems has often
been described in terms of memory, communication and processing.
This thesis presents a complete information-theoretic framework to
quantify these operations on information (i.e. information storage,
transfer and modification), and in particular their dynamics in
space and time. The framework is applied to cellular automata, and
delivers important insights into the fundamental nature of
distributed computation and the dynamics of complex systems (e.g.
that gliders are dominant information transfer agents).
Applications to several important network models, including random
Boolean networks, suggest that the capabilities for information storage and coherent transfer are maximised near the critical
regime in certain order-chaos phase transitions. Further
applications to study and design information structure in the
contexts of computational neuroscience and guided self-organisation
underline the practical utility of the techniques presented here.
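As a rough illustration of the storage component of such a framework: with history length 1, active information storage reduces to the mutual information between a process's previous and next values. The plug-in estimator below is a sketch under that assumption; the function name and the toy series are illustrative, not code from the thesis.

```python
from collections import Counter
from math import log2

def active_info_storage(y):
    """Plug-in estimate, in bits, of active information storage with
    history length 1: the mutual information I(Y_t ; Y_{t+1}) between
    a discrete series' previous and next values."""
    pairs = list(zip(y[:-1], y[1:]))
    n = len(pairs)
    c_pair = Counter(pairs)                  # joint counts of (prev, next)
    c_prev = Counter(p for p, _ in pairs)    # marginal counts of prev
    c_next = Counter(q for _, q in pairs)    # marginal counts of next
    ais = 0.0
    for (a, b), k in c_pair.items():
        # p(a,b) * log2( p(a,b) / (p(a) p(b)) ), all from empirical counts
        ais += (k / n) * log2(k * n / (c_prev[a] * c_next[b]))
    return ais

# A strictly alternating series is fully predictable from one step of
# history, so its stored information is close to 1 bit.
print(active_info_storage([0, 1] * 200))
```

A constant series, by contrast, yields zero: its next value carries no information that needs to be "stored" beyond the trivial.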
This book considers a relatively new metric for complex systems, transfer entropy, derived from a series of measurements, usually a
time series. After a qualitative introduction and a chapter that
explains the key ideas from statistics required to understand the
text, the authors then present information theory and transfer
entropy in depth. A key feature of the approach is the authors'
work to show the relationship between information flow and
complexity. The later chapters demonstrate information transfer in canonical systems and present applications, for example in neuroscience and in finance. The book will be of value to advanced undergraduate
and graduate students and researchers in the areas of computer
science, neuroscience, physics, and engineering.