An embodied agent influences its environment and is influenced by it. We use the sensorimotor loop to model these interactions and quantify the information flows in the system by information-theoretic measures. This includes a measure for the interaction among the agent's body and its environment, often referred to as morphological computation.
Helmholtz Machines (HMs) are a class of generative models composed of two Sigmoid Belief Networks (SBNs), acting respectively as an encoder and a decoder. These models are commonly trained using a two-step optimization algorithm called Wake-Sleep (WS) and, more recently, by improved versions such as Reweighted Wake-Sleep (RWS) and Bidirectional Helmholtz Machines (BiHM). The locality of the connections in an SBN induces sparsity in the Fisher Information Matrices associated with the probabilistic models, in the form of a finely-grained block-diagonal structure.
Front Psychol
November 2021
The Integrated Information Theory provides a quantitative approach to consciousness and can be applied to neural networks. An embodied agent controlled by such a network influences and is influenced by its environment. This involves, on the one hand, morphological computation within goal-directed action and, on the other hand, integrated information within the controller, the agent's brain.
Complexity measures in the context of the Integrated Information Theory of consciousness try to quantify the strength of the causal connections between different neurons. This is done by minimizing the KL-divergence between a full system and one without causal cross-connections. Various measures have been proposed and compared in this setting.
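As a minimal sketch of the kind of quantity involved (not the paper's exact construction), the KL-divergence between a joint distribution over two binary units and a model with no cross-connections — here simply the product of the marginals — can be computed directly:

```python
import numpy as np

# Hypothetical joint distribution over two binary neurons (rows: x, columns: y).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

# Disconnected model: product of the marginals of x and y.
p_x = p_xy.sum(axis=1, keepdims=True)
p_y = p_xy.sum(axis=0, keepdims=True)
q_xy = p_x * p_y

# KL divergence D(p || q) in nats; for this choice of q it equals
# the mutual information between the two units.
kl = np.sum(p_xy * np.log(p_xy / q_xy))
print(kl)
```

The measures compared in the paper minimize such a divergence over richer families of "split" models; this example only shows the simplest spatial case.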
A core property of robust systems is given by the invariance of their function against the removal of some of their structural components. This intuition has been formalised in the context of input-output maps, thereby introducing the notion of exclusion independence. We review work on how this formalisation allows us to derive characterisation theorems that provide a basis for the design of robust systems.
Despite the near universal assumption of individuality in biology, there is little agreement about what individuals are and few rigorous quantitative methods for their identification. Here, we propose that individuals are aggregates that preserve a measure of temporal integrity, i.e.
A new canonical divergence is put forward for generalizing an information-geometric measure of complexity for both classical and quantum systems. On the simplex of probability measures, it is proved that the new divergence coincides with the Kullback-Leibler divergence, which is used to quantify how much a probability measure deviates from the non-interacting states that are modeled by exponential families of probabilities. On the space of positive density operators, we prove that the same divergence reduces to the quantum relative entropy, which quantifies many-party correlations of a quantum state from a Gibbs family.
We consider a general model of the sensorimotor loop of an agent interacting with the world. This formalises Uexküll's notion of a function-circle. Here, we assume a particular causal structure, mechanistically described in terms of Markov kernels.
We propose a model that explains the reliable emergence of power laws (e.g., Zipf's law) during the development of different human languages.
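The rank-frequency relation referred to here can be illustrated with synthetic data (this is not the paper's emergence model, just the target law): for an ideal Zipf distribution the log-log plot of frequency against rank is a line of slope -1.

```python
import numpy as np

# Synthetic word frequencies following Zipf's law f(r) ~ r^(-1).
ranks = np.arange(1, 101)
freqs = 1.0 / ranks

# Estimate the power-law exponent as the slope in log-log coordinates.
slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
print(slope)  # -1.0 for an ideal Zipf distribution
```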
We present a framework for designing cheap control architectures of embodied agents. Our derivation is guided by the classical problem of universal approximation, whereby we explore the possibility of exploiting the agent's embodiment for a new and more efficient universal approximation of behaviors generated by sensorimotor control. This embodied universal approximation is compared with the classical non-embodied universal approximation.
We quantify the relationship between the dynamics of a time-discrete dynamical system, the tent map T and its iterations T(m), and the induced dynamics at a symbolic level in information-theoretic terms. The symbol dynamics, given by a binary string s of length m, is obtained by choosing a partition point [Formula: see text] and lumping together the points [Formula: see text] s.t.
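A minimal sketch of the construction (the partition point `alpha` below stands in for the one denoted by the lost formula placeholder; its value here is an arbitrary illustrative choice):

```python
def tent(x):
    """One step of the tent map T on [0, 1]."""
    return 2 * x if x < 0.5 else 2 * (1 - x)

def symbol_string(x0, m, alpha=0.5):
    """Binary symbol string of length m for the orbit of x0:
    symbol 0 if the orbit point lies below the partition point alpha,
    symbol 1 otherwise (illustrative lumping, per the abstract)."""
    s, x = "", x0
    for _ in range(m):
        s += "0" if x < alpha else "1"
        x = tent(x)
    return s

print(symbol_string(0.2, 5))  # orbit 0.2, 0.4, 0.8, 0.4, 0.8 -> "00101"
```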
One of the main challenges in the field of embodied artificial intelligence is the open-ended autonomous learning of complex behaviors. Our approach is to use task-independent, information-driven intrinsic motivation(s) to support task-dependent learning. The work presented here is a preliminary step in which we investigate the predictive information (the mutual information of the past and future of the sensor stream) as an intrinsic drive, ideally supporting any kind of task acquisition.
We study a notion of knockout robustness of a stochastic map (Markov kernel) that describes a system of several input random variables and one output random variable. Robustness requires that the behaviour of the system does not change if one or several of the input variables are knocked out. Gibbs potentials are used to give a mechanistic description of the behaviour of the system after knockouts.
Information theory is a powerful tool to express principles to drive autonomous systems because it is domain invariant and allows for an intuitive interpretation. This paper studies the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process as a driving force to generate behavior. We study nonlinear and nonstationary systems and introduce the time-local predictive information (TiPI), which allows us to derive exact results together with explicit update rules for the parameters of the controller in the dynamical systems framework.
Two aspects play a key role in recently developed strategies for functional magnetic resonance imaging (fMRI) data analysis. First, it is now recognized that the human brain is a complex adaptive system and exhibits the hallmarks of complexity, such as the emergence of patterns arising out of a multitude of interactions between its many constituents. Second, the field of fMRI has evolved into a data-intensive, big-data endeavor with large databases and masses of data being shared around the world. At the same time, ultra-high-field MRI scanners are now available, producing data at previously unobtainable quality and quantity.
In recent years, information theory has come into the focus of researchers interested in the sensorimotor dynamics of both robots and living beings. One root of these approaches is the idea that living beings are information-processing systems and that the optimization of these processes should be an evolutionary advantage. Apart from these more fundamental questions, there has recently been much interest in the question of how a robot can be equipped with an internal drive for innovation or curiosity that may serve as a drive for an open-ended, self-determined development of the robot.
We develop a geometric approach to complexity based on the principle that complexity requires interactions at different scales of description. Complex systems are more than the sum of their parts of any size and not just more than the sum of their elements. Using information geometry, we therefore analyze the decomposition of a system in terms of an interaction hierarchy.
We improve recently published results about the resources of restricted Boltzmann machines (RBM) and deep belief networks (DBN) required to make them universal approximators. We show that any distribution p on the set {0,1}^n of binary vectors of length n can be arbitrarily well approximated by an RBM with k-1 hidden units, where k is the minimal number of pairs of binary vectors differing in only one entry such that their union contains the support set of p. In important cases this number is half the cardinality of the support set of p (given in Le Roux & Bengio, 2008).
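For concreteness, the distribution an RBM represents over its visible vectors can be computed exactly for tiny models by summing out the hidden units (the parameters below are arbitrary illustrations, not taken from the paper):

```python
import numpy as np
from itertools import product

def rbm_distribution(W, b, c):
    """Exact visible distribution of a tiny RBM, p(v) proportional to
    sum_h exp(v^T W h + b^T v + c^T h), by full enumeration
    (only feasible for small numbers of units)."""
    n, m = W.shape
    probs = {}
    for v_bits in product([0, 1], repeat=n):
        v = np.array(v_bits)
        # Marginalize over all hidden configurations h.
        total = sum(np.exp(v @ W @ np.array(h) + b @ v + c @ np.array(h))
                    for h in product([0, 1], repeat=m))
        probs[v_bits] = total
    Z = sum(probs.values())
    return {v: p / Z for v, p in probs.items()}

# Illustrative parameters: one hidden unit coupling two visible units.
W = np.array([[2.0], [2.0]])
b = np.zeros(2)
c = np.array([-3.0])
dist = rbm_distribution(W, b, c)
print(max(dist, key=dist.get))  # the coupling favors (1, 1)
```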
We present a tentative proposal for a quantitative measure of autonomy. This is something that, surprisingly, is rarely found in the literature, even though autonomy is considered to be a basic concept in many disciplines, including artificial life. We work in an information theoretic setting for which the distinction between system and environment is the starting point.
We provide a geometric framework for investigating the robustness of information flows over biological networks. We use information measures to quantify the impact of knockout perturbations on simple networks. Robustness has two components: a measure of the causal contribution of a node or nodes, and a measure of the change, or exclusion dependence, of the network following node removal.
Philos Trans R Soc Lond B Biol Sci
March 2007
In animal communication, signals are frequently emitted using different channels (e.g. frequencies in a vocalization) and different modalities (e.
It has been argued that information processing in the cortex is optimised with regard to certain information-theoretic principles. We have, for instance, recently shown that spike-timing dependent plasticity can improve an information-theoretic measure called spatio-temporal stochastic interaction, which captures how strongly a set of neurons cooperates in space and time. Systems with high stochastic interaction reveal Poisson spike trains but nonetheless occupy only a strongly reduced area in their global phase space; they reveal repeating but complex global activation patterns, and they can be interpreted as computational systems operating on selected sets of collective patterns or "global states" in a rule-like manner.
We extend Linsker's Infomax principle for feedforward neural networks to a measure for stochastic interdependence that captures spatial and temporal signal properties in recurrent systems. This measure, stochastic interaction, quantifies the Kullback-Leibler divergence of a Markov chain from a product of split chains for the single unit processes. For unconstrained Markov chains, the maximization of stochastic interaction, also called Temporal Infomax, has been previously shown to result in almost deterministic dynamics.
Spatial interdependences of multiple stochastic units can be suitably quantified by the Kullback-Leibler divergence of the joint probability distribution from the corresponding factorized distribution. In the present paper, a generalized measure for stochastic interaction, which also captures temporal interdependences, is analysed within the setting of Markov chains. The dynamical properties of systems with strongly interacting stochastic units are analytically studied and illustrated by computer simulations.
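One common entropic form of this measure (a sketch under the assumption of a known stationary distribution, not necessarily the paper's exact definition) expresses the stochastic interaction of a Markov chain as the excess of the units' separate conditional entropies over the joint one:

```python
import numpy as np

def cond_entropy(p_joint):
    """H(B | A) in bits from a joint table p(a, b)."""
    p_a = p_joint.sum(axis=1, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        return -np.nansum(p_joint * np.log2(p_joint / p_a))

def stochastic_interaction(P, pi):
    """Stochastic interaction of a 2-unit binary Markov chain:
    sum_i H(X_i(t+1) | X_i(t)) - H(X(t+1) | X(t)).
    States are indexed s = 2*x + y; P is the 4x4 transition matrix
    and pi the stationary distribution (illustrative sketch)."""
    joint = pi[:, None] * P                  # p(s, s')
    si = -cond_entropy(joint)
    for bit in (1, 0):                       # unit x is bit 1, unit y bit 0
        m = np.zeros((2, 2))
        for s in range(4):
            for t in range(4):
                m[(s >> bit) & 1, (t >> bit) & 1] += joint[s, t]
        si += cond_entropy(m)
    return si

# Two independently flipping units: the interaction vanishes.
flip = np.array([[0.1, 0.9], [0.9, 0.1]])
P = np.kron(flip, flip)                      # product (non-interacting) dynamics
pi = np.full(4, 0.25)                        # stationary for this chain
print(stochastic_interaction(P, pi))         # 0 up to floating-point error
```

Coupled dynamics, where one unit's transition depends on the other's past, make the measure strictly positive.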