The Local Information Dynamics of Distributed Computation in Complex Systems

Title: The Local Information Dynamics of Distributed Computation in Complex Systems
Author: Joseph T. Lizier
Publisher: Springer Science & Business Media
Pages: 249
Release: 2012-11-06
Genre: Technology & Engineering
ISBN: 3642329527

The nature of distributed computation in complex systems has often been described in terms of memory, communication and processing. This thesis presents a complete information-theoretic framework to quantify these operations on information (i.e. information storage, transfer and modification), and in particular their dynamics in space and time. The framework is applied to cellular automata, where it delivers important insights into the fundamental nature of distributed computation and the dynamics of complex systems (e.g. that gliders are the dominant information transfer agents). Applications to several important network models, including random Boolean networks, suggest that the capabilities for information storage and coherent information transfer are maximised near the critical regime in certain order-chaos phase transitions. Further applications to the study and design of information structure in the contexts of computational neuroscience and guided self-organisation underline the practical utility of the techniques presented here.
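
As a concrete illustration of the kind of measure this framework provides, the sketch below computes local transfer entropy on an elementary cellular automaton (rule 110). It is a minimal sketch, assuming binary states, a destination history length of k = 1, and plug-in probabilities pooled over all cells and time steps; it is not the thesis's own implementation, which uses longer histories and more careful estimation.

```python
# Minimal sketch: local transfer entropy on a binary cellular automaton,
# assuming history length k = 1 and plug-in probabilities pooled over all
# cells and time steps (illustrative choices, not the thesis's settings).
import numpy as np
from collections import Counter

def run_eca(rule, width=80, steps=400, seed=0):
    """Evolve an elementary cellular automaton on a ring of cells."""
    rng = np.random.default_rng(seed)
    table = [(rule >> i) & 1 for i in range(8)]
    state = rng.integers(0, 2, width)
    history = [state.copy()]
    for _ in range(steps):
        left, right = np.roll(state, 1), np.roll(state, -1)
        idx = 4 * left + 2 * state + right
        state = np.array([table[i] for i in idx])
        history.append(state.copy())
    return np.array(history)                      # shape: (steps + 1, width)

def local_transfer_entropy(ca):
    """Local TE (bits) into each cell from its left neighbour, k = 1."""
    dest_next = ca[1:, :]                         # x_{j, n+1}
    dest_past = ca[:-1, :]                        # x_{j, n}
    src_past = np.roll(ca, 1, axis=1)[:-1, :]     # x_{i, n}, left neighbour
    triples = list(zip(src_past.ravel(), dest_past.ravel(), dest_next.ravel()))
    c_sdx = Counter(triples)
    c_sd = Counter((s, d) for s, d, _ in triples)
    c_dx = Counter((d, x) for _, d, x in triples)
    c_d = Counter(d for _, d, _ in triples)
    local = np.empty(len(triples))
    for k, (s, d, x) in enumerate(triples):
        p_x_given_sd = c_sdx[(s, d, x)] / c_sd[(s, d)]
        p_x_given_d = c_dx[(d, x)] / c_d[d]
        local[k] = np.log2(p_x_given_sd / p_x_given_d)
    return local.reshape(dest_next.shape)

ca = run_eca(rule=110)
te = local_transfer_entropy(ca)
print(f"mean local TE from the left neighbour: {te.mean():.4f} bits")
```

Plotting the returned array of local values as a space-time diagram is the kind of analysis the thesis performs: in glider-supporting rules such as rule 110, the highest local transfer values are expected to coincide with the moving coherent structures, which is the sense in which gliders act as information transfer agents.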

Guided Self-Organization: Inception

Title: Guided Self-Organization: Inception
Author: Mikhail Prokopenko
Publisher: Springer Science & Business Media
Pages: 488
Release: 2013-12-19
Genre: Technology & Engineering
ISBN: 3642537340

Is it possible to guide the process of self-organisation towards specific patterns and outcomes? Wouldn't this be self-contradictory? After all, a self-organising process assumes a transition into a more organised form, or towards a more structured functionality, in the absence of centralised control. How, then, can we place the guiding elements so that they do not override the rich choices potentially discoverable by an uncontrolled process? This book presents different approaches to resolving this paradox. In doing so, the studies presented address a broad range of phenomena, from autopoietic systems to morphological computation, and from small-world networks to information cascades in swarms. A large variety of methods is employed, from spontaneous symmetry breaking to information dynamics to evolutionary algorithms, creating a rich spectrum that reflects this emerging field. Demonstrating several foundational theories and frameworks, as well as innovative practical implementations, Guided Self-Organization: Inception will be an invaluable tool for advanced students and researchers in a multiplicity of fields across computer science, physics and biology, including information theory, robotics, dynamical systems, graph theory, artificial life, multi-agent systems, theory of computation and machine learning.

Directed Information Measures in Neuroscience

Title: Directed Information Measures in Neuroscience
Author: Michael Wibral
Publisher: Springer
Pages: 234
Release: 2014-03-20
Genre: Technology & Engineering
ISBN: 3642544746

Analysis of information transfer has found rapid adoption in neuroscience, where a highly dynamic transfer of information continuously runs on top of the brain's slowly changing anatomical connectivity. Measuring such transfer is crucial to understanding how flexible information routing and processing give rise to higher cognitive function. Directed Information Measures in Neuroscience reviews recent developments in concepts and tools for measuring information transfer and their application to neurophysiological recordings and the analysis of interactions. Written by some of the most active researchers in the field, the book discusses the state of the art, future prospects and challenges on the way to an efficient assessment of neuronal information transfer. Highlights include the theoretical quantification and practical estimation of information transfer, the description of transfer locally in space and time, multivariate directed measures, information decomposition among a set of stimulus/response variables, and the relation between interventional and observational causality. Applications to neural data sets and pointers to open-source software highlight the usefulness of these measures in experimental neuroscience. With state-of-the-art mathematical developments, computational techniques and applications to real data sets, this book will be of benefit to all graduate students and researchers interested in detecting and understanding the information transfer between components of complex systems.
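
To make the practical estimation issues concrete, here is a hedged sketch of a typical analysis step: discretise two continuous recordings, compute a plug-in transfer entropy from one to the other, and assess significance against time-shifted surrogates of the source. The bin count, the single-sample history and the surrogate scheme are illustrative assumptions rather than prescriptions from the book; in practice one would rely on the dedicated estimators and open-source toolboxes the book points to.

```python
# Hedged sketch: plug-in transfer entropy on discretised signals plus a
# circular-shift surrogate test. Bin count, single-sample history and the
# surrogate scheme are illustrative assumptions, not the book's prescriptions.
import numpy as np
from collections import Counter

def plugin_te(src, dst):
    """Plug-in transfer entropy (bits) from discrete src to dst, one-sample history."""
    s, d, x = src[:-1], dst[:-1], dst[1:]
    n = len(x)
    c_sdx, c_sd = Counter(zip(s, d, x)), Counter(zip(s, d))
    c_dx, c_d = Counter(zip(d, x)), Counter(d)
    te = 0.0
    for (si, di, xi), cnt in c_sdx.items():
        p_x_given_sd = cnt / c_sd[(si, di)]
        p_x_given_d = c_dx[(di, xi)] / c_d[di]
        te += (cnt / n) * np.log2(p_x_given_sd / p_x_given_d)
    return te

def surrogate_test(src, dst, n_surr=200, seed=0):
    """Compare observed TE against circularly time-shifted source surrogates."""
    rng = np.random.default_rng(seed)
    observed = plugin_te(src, dst)
    null = np.array([plugin_te(np.roll(src, rng.integers(20, len(src) - 20)), dst)
                     for _ in range(n_surr)])
    p_value = (np.sum(null >= observed) + 1) / (n_surr + 1)
    return observed, p_value

# Toy "recordings": y is driven by x with a one-sample lag.
rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)
edges = np.quantile(np.concatenate([x, y]), [1 / 3, 2 / 3])   # three roughly equiprobable bins
xd, yd = np.digitize(x, edges), np.digitize(y, edges)
te, p = surrogate_test(xd, yd)
print(f"TE(x -> y) = {te:.3f} bits, surrogate p = {p:.3f}")
```

The surrogate step illustrates why bias correction matters: plug-in estimates are positively biased on finite data, so the observed value is only meaningful relative to a null distribution in which the source's temporal relation to the target has been destroyed.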

Information-based methods for neuroimaging: analyzing structure, function and dynamics

Title: Information-based methods for neuroimaging: analyzing structure, function and dynamics
Author: Jesus M. Cortés
Publisher: Frontiers Media SA
Pages: 192
Release: 2015-05-07
Genre: Neurosciences. Biological psychiatry. Neuropsychiatry
ISBN: 2889195023

The aim of this Research Topic is to discuss the state of the art in the use of information-based methods for the analysis of neuroimaging data. Information-based methods, typically built as extensions of the Shannon entropy, are at the basis of model-free approaches which, being based on probability distributions rather than on specific expectations, can account for all possible non-linearities present in the data in a model-independent fashion. Mutual-information-like methods can also be applied to interacting dynamical variables described by time series, thus addressing the uncertainty reduction (or information) in one variable obtained by conditioning on another set of variables. In recent years, different information-based methods have been shown to be flexible and powerful tools for analyzing neuroimaging data, with a wide range of methodologies, including formulations based on bivariate vs. multivariate representations, frequency vs. time domains, etc. Apart from methodological issues, the information bit as a common unit represents a convenient way to open the road for comparison and integration between different measurements of neuroimaging data in three complementary contexts: structural connectivity, dynamical (functional and effective) connectivity, and modelling of brain activity.

Applications are ubiquitous, from the resting state in healthy subjects to modulations of consciousness and other aspects of pathophysiology. Mutual-information-based methods have provided new insights about common principles in brain organization, showing the existence of an active default network when the brain is at rest. It is not clear, however, how this default network is generated, how its different modules interact, or how it disappears in the presence of stimulation. Some of these open questions at the functional level might find their mechanisms in structural correlates. A key question is the link between structure and function and the use of structural priors for understanding functional connectivity measures.

As far as effective connectivity is concerned, a common framework has recently been proposed for transfer entropy and Granger causality, a well-established methodology originally based on autoregressive models. This framework can open the way to new theories and applications. This Research Topic brings together contributions from researchers with different backgrounds who are either developing new approaches or applying existing methodologies to new data, and we hope it will set the basis for discussing the development and validation of new information-based methodologies for the understanding of brain structure, function, and dynamics.
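
The transfer-entropy/Granger-causality link mentioned above can be made concrete with a small numerical check: for jointly Gaussian (linear) processes, Granger causality equals twice the transfer entropy measured in nats. The sketch below uses lag-1 least-squares regressions and a toy autoregressive system; both are illustrative assumptions rather than details taken from this Research Topic.

```python
# Minimal sketch of the Gaussian equivalence between Granger causality and
# transfer entropy (GC = 2 * TE in nats). Lag-1 regressions and the toy AR
# system are illustrative assumptions, not details from the Research Topic.
import numpy as np

def residual_variance(y, regressors):
    """Variance of least-squares residuals of y on the given regressors (plus intercept)."""
    X = np.column_stack(regressors + [np.ones(len(y))])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def granger_causality(x, y):
    """Lag-1 Granger causality x -> y: log variance ratio, restricted vs. full model."""
    y_now, y_past, x_past = y[1:], y[:-1], x[:-1]
    var_restricted = residual_variance(y_now, [y_past])
    var_full = residual_variance(y_now, [y_past, x_past])
    return np.log(var_restricted / var_full)

def gaussian_transfer_entropy(x, y):
    """Transfer entropy x -> y in nats under a Gaussian (linear) model: GC / 2."""
    return 0.5 * granger_causality(x, y)

# Toy linear system where y is driven by the past of x.
rng = np.random.default_rng(0)
n = 20000
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.6 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()
print(f"GC(x -> y) = {granger_causality(x, y):.4f}")
print(f"TE(x -> y) = {gaussian_transfer_entropy(x, y):.4f} nats (half the GC value)")
```

The point of the check is that, for linear-Gaussian data, the model-free transfer entropy and the autoregressive Granger framework measure the same quantity up to a factor of two, which is what makes a common framework for the two approaches possible.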

Transfer Entropy

Title: Transfer Entropy
Author: Deniz Gençağa
Publisher: MDPI
Pages: 335
Release: 2018-08-24
Genre: Mathematics
ISBN: 3038429198

This book is a printed edition of the Special Issue "Transfer Entropy" that was published in the journal Entropy.

From Matter to Life

Title: From Matter to Life
Author: Sara Imari Walker
Publisher: Cambridge University Press
Pages: 517
Release: 2017-02-23
Genre: Science
ISBN: 1108116507

Recent advances suggest that the concept of information might hold the key to unravelling the mystery of life's nature and origin. Fresh insights from a broad and authoritative range of articulate and respected experts focus on the transition from matter to life, and hence reconcile the deep conceptual schism between the way we describe physical and biological systems. A unique cross-disciplinary perspective, drawing on expertise from philosophy, biology, chemistry, physics, and cognitive and social sciences, provides a new way to look at the deepest questions of our existence. This book addresses the role of information in life, and how it can make a difference to what we know about the world. Students, researchers, and all those interested in what life is and how it began will gain insights into the nature of life and its origins that touch on nearly every domain of science.

Complexity, Criticality and Computation (C³)

Title: Complexity, Criticality and Computation (C³)
Author: Mikhail Prokopenko
Publisher: MDPI
Pages: 269
Release: 2018-04-06
Genre: Computers
ISBN: 3038425141

This book is a printed edition of the Special Issue "Complexity, Criticality and Computation (C³)" that was published in the journal Entropy.