Networks with stochastic variables described by heavy-tailed lognormal distributions are ubiquitous in nature, and hence they deserve an exact information-theoretic characterization. We derive analytical formulas for the mutual information between elements of different networks with correlated, lognormally distributed activities. In a special case, we find an explicit expression for the mutual information between neurons when neural activities and synaptic weights are lognormally distributed, as suggested by experimental data. Comparing this expression with the short-tailed case reveals that the mutual information is generally larger when neural activities and synaptic weights have heavy tails, and that it can diverge at finite variances of presynaptic firing rates and synaptic weights. This result suggests that evolution might prefer brains with heterogeneous dynamics to optimize information processing.
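The paper's general formulas are not reproduced here, but the simplest case rests on a standard fact: mutual information is invariant under invertible transformations of each variable, so a jointly lognormal pair has the same MI as the underlying bivariate Gaussian, I(X;Y) = -1/2 ln(1 - rho^2) nats, where rho is the correlation of the log-activities. Below is a minimal Python sketch checking this numerically; the parameter values (rho, sample size, bin count) are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper).
rho = 0.7          # correlation of the log-activities
n = 200_000        # number of samples

# Jointly lognormal pair: exponentiate a correlated bivariate Gaussian.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
x, y = np.exp(z[:, 0]), np.exp(z[:, 1])

# Closed form: MI is invariant under the monotone map exp(), so the
# lognormal pair inherits the bivariate-Gaussian result.
mi_exact = -0.5 * np.log(1.0 - rho**2)

# Histogram (plug-in) MI estimate, computed in the log domain where
# the joint density is Gaussian and binning is well behaved.
bins = 60
pxy, _, _ = np.histogram2d(np.log(x), np.log(y), bins=bins)
pxy = pxy / pxy.sum()                      # joint probabilities
px = pxy.sum(axis=1, keepdims=True)        # marginal of X
py = pxy.sum(axis=0, keepdims=True)        # marginal of Y
mask = pxy > 0
mi_est = np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask]))

print(f"exact MI = {mi_exact:.4f} nats, histogram estimate = {mi_est:.4f} nats")
```

The histogram estimator carries a small positive bias, but with this sample size it lands close to the exact value; the heavy-tail effects discussed in the abstract arise in the more general settings the paper analyzes, not in this jointly-lognormal baseline.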
DOI: http://dx.doi.org/10.1103/PhysRevE.109.014117