Recurrent neural networks are used to forecast time series in finance, climate, language, and many other domains. Reservoir computers are a form of recurrent neural network that is particularly easy to train. Recently, a "next-generation" reservoir computer was introduced in which the memory trace involves only a finite number of previous symbols.
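As a rough sketch of that architecture (the window length, feature set, and ridge penalty below are illustrative assumptions, not the published design): the feature vector is built from a finite window of past inputs plus their quadratic monomials, and only a linear readout is trained.

```python
# Minimal next-generation reservoir computer (NG-RC) sketch: features
# are a finite window of past inputs plus quadratic terms, and the only
# trained part is a ridge-regression readout. Window length k and the
# ridge penalty are illustrative choices.
import numpy as np

def ngrc_features(x, k):
    """Stack linear and quadratic features of the last k inputs."""
    n = len(x) - k
    lin = np.stack([x[i:i + k] for i in range(n)])               # (n, k)
    quad = np.stack([np.outer(r, r)[np.triu_indices(k)] for r in lin])
    return np.hstack([np.ones((n, 1)), lin, quad])

def train_readout(features, targets, ridge=1e-6):
    """Closed-form ridge regression for the linear readout weights."""
    A = features.T @ features + ridge * np.eye(features.shape[1])
    return np.linalg.solve(A, features.T @ targets)

# One-step-ahead forecasting of a noisy sine wave as a toy example.
t = np.linspace(0, 40, 2000)
x = np.sin(t) + 0.01 * np.random.randn(len(t))
k = 4
Phi = ngrc_features(x, k)        # row i covers the window x[i : i+k]
y = x[k:]                        # the symbol following each window
W = train_readout(Phi, y)
pred = Phi @ W                   # one-step-ahead predictions
```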
Organisms play, explore, and mimic those around them. Is there a purpose to this behavior? Are organisms just behaving, or are they trying to achieve goals? We believe this is a false dichotomy. To that end, we attempt to unify two approaches to understanding complex agents, whether evolved or engineered.
Theory suggests that networks of neurons may predict their input. Prediction may underlie most aspects of information processing and is believed to be involved in motor and cognitive control and decision-making. Retinal cells have been shown to be capable of predicting visual stimuli, and there is some evidence for prediction of input in the visual cortex and hippocampus.
Experimentalists observe allele frequency distributions and try to infer mutation rates and selection coefficients. How easy is this? We calculate limits to their ability in the context of the Wright-Fisher model, first by finding the maximal amount of information that allele frequencies can provide about the mutation rate and selection coefficient (at least 2 bits per allele), and then by finding how organisms would have shaped their mutation rates and selection coefficients so as to maximize the information transfer.
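A minimal simulation of that setting, assuming a two-allele Wright-Fisher model with symmetric mutation and purely illustrative parameter values; repeated runs like this could feed an estimate of how informative observed frequencies are about the mutation rate and selection coefficient.

```python
# Wright-Fisher sketch: two alleles, symmetric mutation rate mu,
# selection coefficient s, drift via binomial sampling. All parameter
# values are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def wright_fisher(N=1000, mu=1e-3, s=0.01, generations=1000, p0=0.5):
    """Return the focal allele's frequency after `generations` steps."""
    p = p0
    for _ in range(generations):
        w = p * (1 + s) / (p * (1 + s) + (1 - p))   # selection
        q = w * (1 - mu) + (1 - w) * mu             # symmetric mutation
        p = rng.binomial(N, q) / N                  # genetic drift
    return p

samples = [wright_fisher() for _ in range(200)]
print(f"mean frequency {np.mean(samples):.3f}, sd {np.std(samples):.3f}")
```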
In recent years, the field of neuroscience has gone through rapid experimental advances and a significant increase in the use of quantitative and computational methods. This growth has created a need for clearer analyses of the theory and modeling approaches used in the field. This issue is particularly complex in neuroscience because the field studies phenomena that cross a wide range of scales and often require consideration at varying degrees of abstraction, from precise biophysical interactions to the computations they implement.
Potassium voltage-gated (Kv) channels need to detect and respond to rapidly changing ionic concentrations in their environment. With an essential role in regulating electric signaling, they would be expected to be optimal sensors that evolved to predict the ionic concentrations. To explore these assumptions, we use statistical mechanics in conjunction with information theory to model how animal Kv channels respond to changes in potassium concentrations in their environment.
Inferring models, predicting the future, and estimating the entropy rate of discrete-time, discrete-event processes are well-worn ground. However, a much broader class of discrete-event processes operates in continuous time. Here, we provide new methods for inferring, predicting, and estimating them.
Tools to estimate brain connectivity offer the potential to enhance our understanding of brain functioning. The behavior of neuronal networks, including functional connectivity and induced connectivity changes by external stimuli, can be studied using models of cultured neurons. Cultured neurons tend to be active in groups, and pairs of neurons are said to be functionally connected when their firing patterns show significant synchronicity.
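One common way to operationalize "significant synchronicity" is sketched below: count near-coincident spike pairs and compare the count against surrogates with jittered spike times. The coincidence window and jitter scale are illustrative assumptions, not the paper's protocol.

```python
# Pairwise synchrony test sketch: coincidence counts versus
# jitter surrogates. Window (5 ms) and jitter (50 ms) are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def coincidences(a, b, window=0.005):
    """Spikes in train `a` with a spike in `b` within `window` seconds."""
    b = np.sort(b)
    idx = np.clip(np.searchsorted(b, a), 1, len(b) - 1)
    nearest = np.minimum(np.abs(a - b[idx - 1]), np.abs(a - b[idx]))
    return int(np.sum(nearest <= window))

def synchrony_pvalue(a, b, n_surrogates=1000, jitter=0.05):
    """Fraction of jittered surrogates at least as synchronous."""
    observed = coincidences(a, b)
    null = [coincidences(a + rng.normal(0, jitter, len(a)), b)
            for _ in range(n_surrogates)]
    return np.mean([c >= observed for c in null])

a = np.sort(rng.uniform(0, 10, 200))                  # spike times (s)
b = np.sort(a[:150] + rng.normal(0, 0.002, 150))      # partly locked train
print(synchrony_pvalue(a, b))                         # small p => connected
```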
Reservoir computers (RCs) and recurrent neural networks (RNNs) can mimic any finite-state automaton in theory, and some studies have demonstrated that this can hold in practice. We test the ability of generalized linear models, RCs, and Long Short-Term Memory (LSTM) RNN architectures to predict the stochastic processes generated by a large suite of probabilistic deterministic finite-state automata (PDFA) in the small-data limit, according to two metrics: predictive accuracy and distance to a predictive rate-distortion curve. The latter provides a sense of whether or not the RNN is a lossy predictive feature extractor in the information-theoretic sense.
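For concreteness, here is how one such stochastic process can be sampled. The two-state "even process" below is a standard PDFA example, not the paper's full suite.

```python
# Sampling a binary sequence from a simple PDFA, the "even process":
# from state A, emit 1 and move to B with probability p, else emit 0
# and stay; from state B, always emit 1 and return to A. Consequently,
# 1s occur only in blocks of even length.
import numpy as np

rng = np.random.default_rng(2)

def sample_even_process(n, p=0.5):
    state, out = "A", []
    for _ in range(n):
        if state == "A":
            if rng.random() < p:
                out.append(1); state = "B"
            else:
                out.append(0)
        else:                       # state B: forced emission of 1
            out.append(1); state = "A"
    return np.array(out)

print(sample_even_process(20))   # e.g. [0 1 1 0 1 1 1 1 0 ...]
```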
How can individuals with schizophrenia best be equipped to distinguish delusions from accurate judgements about their environment? This study presents an approach based on the principles of Bayesian probability and reports the results of a series of tests in which a simulated observer classifies randomly generated data characteristic of a simulated environment. The complexity of the data ranges from scalars to vectors of variable lengths, and the simulated observer makes its decisions based on either perfect or imperfect models of its environment. We find that when a low-dimensional observation is considered characteristic of both real observations and delusions, the prior probabilities of any observation being real or fake are of greater importance to the final decision than the attributes of the observation.
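A worked sketch of that conclusion, with purely illustrative numbers: when the likelihoods under the "real" and "delusion" models are similar, Bayes' rule leaves the posterior dominated by the priors.

```python
# Two-hypothesis Bayes' rule: posterior probability that an
# observation is real. All numbers below are illustrative.
def posterior_real(lik_real, lik_fake, prior_real):
    """P(real | observation) for the real-vs-delusion comparison."""
    num = lik_real * prior_real
    return num / (num + lik_fake * (1 - prior_real))

# Similar likelihoods: the prior dominates the decision.
print(posterior_real(0.30, 0.25, prior_real=0.9))  # ~0.92
print(posterior_real(0.30, 0.25, prior_real=0.1))  # ~0.12
```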
Somehow, brains and other organisms manage to predict their environment. Behind this must be an input-dependent dynamical system, or recurrent neural network, whose present state reflects the history of environmental input. The design principles for prediction, in particular what kinds of attractors allow for greater predictive capability, are still unknown.
Diverse many-body systems, from soap bubbles to suspensions to polymers, learn and remember patterns in the drives that push them far from equilibrium. This learning may be leveraged for computation, memory, and engineering. Until now, many-body learning has been detected with thermodynamic properties, such as work absorption and strain.
Recently, researchers have found time cells in the hippocampus that appear to contain information about the timing of past events. Some researchers have argued that time cells are taking a Laplace transform of their input in order to reconstruct the past stimulus. We argue that stimulus prediction, not stimulus reconstruction or redundancy reduction, is in better agreement with observed responses of time cells.
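The Laplace-transform idea can be sketched as a bank of leaky integrators, dx_s/dt = -s x_s + f(t), where each unit holds the Laplace transform of the past stimulus at its own decay rate s. The rates and step size below are illustrative.

```python
# Bank of leaky integrators: each unit integrates the stimulus with
# its own decay rate s, so its state is the Laplace transform of the
# recent past evaluated at s. Rates and Euler step are illustrative.
import numpy as np

def laplace_memory(f, rates, dt=0.001):
    """Integrate a bank of leaky integrators driven by stimulus f(t)."""
    x = np.zeros(len(rates))
    trace = []
    for ft in f:
        x = x + dt * (-rates * x + ft)    # forward Euler update
        trace.append(x.copy())
    return np.array(trace)                # shape (timesteps, n_rates)

t = np.arange(0, 2, 0.001)
stimulus = (t < 0.1).astype(float)        # brief pulse at the start
rates = np.array([1.0, 5.0, 25.0])        # slow to fast decay
X = laplace_memory(stimulus, rates)
# Each column decays at its own rate, jointly encoding elapsed time.
```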
Cognitive systems exhibit astounding prediction capabilities that allow them to reap rewards from regularities in their environment. How do organisms predict environmental input and how well do they do it? As a prerequisite to answering that question, we first address the limits on prediction strategy inference, given a series of inputs and predictions from an observer. We study the special case of Bayesian observers, allowing for a probability that the observer randomly ignores data when building her model.
Given the stochastic nature of gene expression, genetically identical cells exposed to the same environmental inputs will produce different outputs. This heterogeneity has been hypothesized to have consequences for how cells are able to survive in changing environments. Recent work has explored the use of information theory as a framework to understand the accuracy with which cells can ascertain the state of their surroundings.
Biological sensors must often predict their input while operating under metabolic constraints. However, determining whether or not a particular sensor is evolved or designed to be accurate and efficient is challenging. This arises partly because the functional constraints are at cross purposes, and partly because quantifying the prediction performance of even in silico sensors can require prohibitively long simulations, especially when highly complex environments drive sensors out of equilibrium.
Evolved and engineered organisms must adapt to fluctuating environments that are often only partially observed. We show that adaptation to a second environment can be significantly harder after adapting to a first, completely unrelated environment, even when using second-order learning algorithms and a constant learning rate. In effect, there is a lack of fading memory in the organism's performance.
Building predictive sensors is of paramount importance in science. Can we make a randomly wired sensor "good enough" at predicting its input simply by making it larger? We show that infinitely large, randomly wired sensors are nonspecific for their input, and therefore nonpredictive of future input, unless they are close to deterministic. Nearly deterministic, randomly wired sensors can capture ∼10% of the predictive information of their inputs for "typical" environments.
Experimentalists observe phenotypic variability even in isogenic bacterial populations. We explore the hypothesis that in fluctuating environments this variability is tuned to maximize a bacterium's expected log-growth rate, potentially aided by epigenetic markers (here, all heritable nongenetic markers) that store information about past environments. Crucially, we assume a time delay between sensing and action, so that a past epigenetic marker is used to generate the present phenotypic variability.
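A minimal sketch of the objective, assuming a two-state environment and two phenotypes with illustrative growth factors and transition probabilities: the bet-hedging split is chosen from a marker of the past environment, and expected log-growth is averaged over where the environment goes next.

```python
# Expected log-growth with a sensing delay: the phenotype split q is
# conditioned on the *past* environment, and growth is averaged over
# the next environment. All numbers are illustrative.
import numpy as np

growth = np.array([[2.0, 0.5],    # growth[e, phi]: growth factor of
                   [0.5, 2.0]])   # phenotype phi in environment e
T = np.array([[0.9, 0.1],         # T[e_past, e_now]: environment
              [0.1, 0.9]])        # transition probabilities

def expected_log_growth(q):
    """Average log-growth when q[e_past] is the phenotype-0 fraction."""
    total = 0.0
    for e_past in range(2):
        for e_now in range(2):
            pop = q[e_past] * growth[e_now, 0] + (1 - q[e_past]) * growth[e_now, 1]
            total += 0.5 * T[e_past, e_now] * np.log(pop)
    return total

# Crude grid search over the two conditional bets q[0], q[1].
grid = np.linspace(0.01, 0.99, 99)
best = max(((a, b) for a in grid for b in grid), key=expected_log_growth)
print(best, expected_log_growth(best))
```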
Causal states are minimal sufficient statistics of prediction of a stochastic process, their coding cost is called statistical complexity, and the implied causal structure yields a sense of the process's "intrinsic computation". We discuss how statistical complexity changes with slight changes to the underlying model, in this case a biologically motivated dynamical model of a Monod-Wyman-Changeux molecule. Perturbations to kinetic rates cause statistical complexity to jump from finite to infinite.
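For a finite epsilon-machine, statistical complexity is the Shannon entropy of the stationary distribution over causal states. The two-state toy below (a stand-in, not the Monod-Wyman-Changeux model itself) shows the computation.

```python
# Statistical complexity of a finite epsilon-machine: entropy of the
# stationary causal-state distribution. Transition probabilities below
# are illustrative.
import numpy as np

T = np.array([[0.7, 0.3],     # T[i, j]: probability of moving from
              [0.4, 0.6]])    # causal state i to causal state j

# Stationary distribution: left eigenvector of T with eigenvalue 1.
vals, vecs = np.linalg.eig(T.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

statistical_complexity = -np.sum(pi * np.log2(pi))   # bits
print(f"C_mu = {statistical_complexity:.3f} bits")
```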
Recurrent networks are trained to memorize their input better, often in the hopes that such training will increase the ability of the network to predict. We show that networks designed to memorize input can be arbitrarily bad at prediction. We also find, for several types of inputs, that one-node networks optimized for prediction are nearly at upper bounds on predictive capacity given by Wiener filters and are roughly equivalent in performance to randomly generated five-node networks.
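The Wiener-filter benchmark can be sketched as the optimal linear one-step predictor obtained from the Wiener-Hopf (normal) equations built from the input's autocorrelation. The AR(1) test signal and filter length below are illustrative; the paper's inputs and bounds are not reproduced here.

```python
# Wiener-filter sketch: solve the normal equations R w = p, where R is
# the Toeplitz autocorrelation matrix and p the one-step-ahead
# cross-correlations. For an AR(1) signal, w should approach [0.9, 0, ...].
import numpy as np

rng = np.random.default_rng(4)
n, k = 20000, 8                        # signal length, filter taps
x = np.zeros(n)
for t in range(1, n):                  # AR(1) test signal
    x[t] = 0.9 * x[t - 1] + rng.normal()

r = np.correlate(x, x, "full")[n - 1:] / n     # autocorrelation estimates
R = np.array([[r[abs(i - j)] for j in range(k)] for i in range(k)])
w = np.linalg.solve(R, r[1:k + 1])             # Wiener-Hopf solution
print(np.round(w, 3))
```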
Scientific explanation often requires inferring maximally predictive features from a given data set. Unfortunately, the collection of minimal maximally predictive features for most stochastic processes is uncountably infinite. In such cases, one compromises and instead seeks nearly maximally predictive features.
In complex environments, there are costs to both ignorance and perception. An organism needs to track fitness-relevant information about its world, but the more information it tracks, the more resources it must devote to perception. As a first step towards a general understanding of this trade-off, we use a tool from information theory, rate-distortion theory, to study large, unstructured environments with fixed, randomly drawn penalties for stimuli confusion ('distortions').
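A minimal Blahut-Arimoto sketch of that setup, with an illustrative alphabet size and trade-off parameter beta: one point on the rate-distortion curve for a uniform source and a randomly drawn distortion matrix.

```python
# Blahut-Arimoto iteration for one point on the rate-distortion curve,
# with randomly drawn stimulus-confusion penalties as in the setup
# above. Matrix size and beta are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_stimuli, n_actions, beta = 32, 32, 4.0
d = rng.random((n_stimuli, n_actions))        # random distortions d(x, x̂)
p_x = np.full(n_stimuli, 1.0 / n_stimuli)     # uniform source

q = np.full(n_actions, 1.0 / n_actions)       # marginal over codewords
for _ in range(500):
    # Optimal channel given the marginal, then the updated marginal.
    p_xhat_given_x = q * np.exp(-beta * d)
    p_xhat_given_x /= p_xhat_given_x.sum(axis=1, keepdims=True)
    q = p_x @ p_xhat_given_x

rate = np.sum(p_x[:, None] * p_xhat_given_x *
              np.log2(p_xhat_given_x / q))            # I(X; X̂) in bits
distortion = np.sum(p_x[:, None] * p_xhat_given_x * d)
print(f"rate {rate:.3f} bits, distortion {distortion:.3f}")
```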
For many organisms, the number of sensory neurons is largely determined during development, before strong environmental cues are present. This is despite the fact that environments can fluctuate drastically both from generation to generation and within an organism's lifetime. How can organisms get by with a hard-coded number of sensory neurons? We approach this question using rate-distortion theory.