Self-backpropagation of synaptic modifications elevates the efficiency of spiking and artificial neural networks.

Sci Adv

Research Center for Brain-inspired Intelligence, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China.

Published: October 2021

Many synaptic plasticity rules found in natural circuits have not been incorporated into artificial neural networks (ANNs). We showed that incorporating a nonlocal feature of synaptic plasticity found in natural neural networks, whereby synaptic modification at the output synapses of a neuron backpropagates to its input synapses made by upstream neurons, markedly reduced the computational cost without affecting the accuracy of spiking neural networks (SNNs) and ANNs in supervised learning on three benchmark tasks. For SNNs, synaptic modification at output neurons generated by spike timing–dependent plasticity was allowed to self-propagate to a limited set of upstream synapses. For ANNs, synaptic weights modified via the conventional backpropagation algorithm at output neurons self-backpropagated to a limited set of upstream synapses. Such self-propagating plasticity may produce coordinated synaptic modifications across neuronal layers that reduce computational cost.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8528419
DOI: http://dx.doi.org/10.1126/sciadv.abh0146

Publication Analysis

Top Keywords (frequency)
neural networks (16), synaptic modifications (8), artificial neural (8), networks synaptic (8), synaptic plasticity (8), synaptic modification (8), modification output (8), computational cost (8), output neurons (8), limited upstream (8)
