Organisms evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. For example, a compromise between the rate of information processing and the energy consumed might explain certain neurophysiological and neuroanatomical observations (e.g., average firing frequency and number of neurons). Using this perspective reveals that the randomness injected into neural processing by the statistical uncertainty of synaptic transmission optimizes one kind of information processing relative to energy use. A critical hypothesis and insight is that neuronal information processing is appropriately measured, first, by considering dendrosomatic summation as a Shannon-type channel (Shannon, 1948) and, second, by considering such uncertain synaptic transmission as part of the dendrosomatic computation rather than as part of axonal information transmission. Using such a model of neural computation and matching the information gathered by dendritic summation to the axonal information transmitted, H(p*), conditions are defined that guarantee synaptic failures can improve the energetic efficiency of neurons. Further development provides a general expression relating the optimal failure rate, f, to the average firing rate, p*, and is consistent with physiologically observed values. The expression providing this relationship, f ≈ 4^(−H(p*)), generalizes across activity levels and is independent of the number of inputs to a neuron.
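As a minimal numerical sketch of the stated relationship, the snippet below evaluates f ≈ 4^(−H(p*)) over a few low firing probabilities. It assumes H(p*) is the binary Shannon entropy of the axonal firing probability p*, which is consistent with the abstract's framing but is an interpretive assumption; the function names and example values are illustrative only, not the authors' code.

```python
import numpy as np

def binary_entropy(p):
    """Assumed interpretation: Shannon entropy (bits) of a Bernoulli firing variable with rate p."""
    p = np.asarray(p, dtype=float)
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))

def optimal_failure_rate(p_star):
    """Abstract's stated relation: optimal synaptic failure rate f ≈ 4**(-H(p*))."""
    return 4.0 ** (-binary_entropy(p_star))

if __name__ == "__main__":
    # Example (hypothetical) firing probabilities in the sparse-activity regime
    for p_star in (0.01, 0.05, 0.10, 0.20):
        H = binary_entropy(p_star)
        f = optimal_failure_rate(p_star)
        print(f"p* = {p_star:.2f}   H(p*) = {H:.3f} bits   f ≈ {f:.3f}")
```

Under this reading, low average firing rates (small H(p*)) yield high optimal failure rates, in line with the abstract's claim that the relation matches physiologically observed values.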
Source:
- PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6758790
- DOI: http://dx.doi.org/10.1523/JNEUROSCI.22-11-04746.2002