Explicit mutual information for simple networks and neurons with lognormal activities.

Phys Rev E

Department of Mathematics, Informatics, and Mechanics, Institute of Applied Mathematics and Mechanics, University of Warsaw, Ulica Banacha 2, 02-097 Warsaw, Poland.

Published: January 2024

Networks with stochastic variables described by a heavy-tailed lognormal distribution are ubiquitous in nature, and hence they deserve an exact information-theoretic characterization. We derive analytical formulas for the mutual information between elements of different networks with correlated lognormally distributed activities. In a special case, we find an explicit expression for the mutual information between neurons when neural activities and synaptic weights are lognormally distributed, as suggested by experimental data. Comparison of this expression with the case in which these two variables have short tails reveals that the mutual information with heavy tails for neurons and synapses is generally larger and can diverge for some finite variances in presynaptic firing rates and synaptic weights. This result suggests that evolution might prefer brains with heterogeneous dynamics to optimize information processing.
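The paper's exact formulas are not reproduced in this abstract, but the simplest case it covers can be sketched from a standard fact: mutual information is invariant under invertible transformations of each variable, so for two jointly lognormal variables (i.e., whose logarithms are jointly Gaussian with correlation rho) the mutual information equals the bivariate Gaussian value, I = -(1/2) ln(1 - rho^2) nats. The sketch below illustrates this closed form and how it diverges as the log-domain correlation approaches 1; it is an illustrative assumption about the jointly lognormal case, not necessarily the paper's full result for neurons and synaptic weights.

```python
import math

def lognormal_mutual_info(rho):
    """Mutual information (in nats) between two jointly lognormal
    variables whose underlying Gaussians have correlation rho.
    MI is invariant under the invertible marginal map exp(), so this
    equals the bivariate Gaussian mutual information."""
    if not -1.0 < rho < 1.0:
        raise ValueError("rho must lie strictly between -1 and 1")
    return -0.5 * math.log(1.0 - rho * rho)

# MI grows without bound as the log-domain correlation approaches 1,
# echoing the divergence the abstract notes for heavy-tailed variables.
for rho in (0.2, 0.5, 0.9, 0.99):
    print(f"rho = {rho:.2f}   I = {lognormal_mutual_info(rho):.3f} nats")
```

Note that this only handles the jointly lognormal pair; the neuron-synapse case in the paper involves products of lognormal variables, which is where divergence at finite variances arises.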


Source
http://dx.doi.org/10.1103/PhysRevE.109.014117


