Backpropagation With Sparsity Regularization for Spiking Neural Network Learning.

Front Neurosci

School of Information Science and Technology, Fudan University, Shanghai, China.

Published: April 2022

AI Article Synopsis

  • The article introduces a new learning algorithm for spiking neural networks (SNN) called backpropagation with sparsity regularization (BPSR), which aims to enhance energy efficiency by mimicking biological processes.
  • BPSR minimizes spiking firing rates while maintaining high accuracy through techniques such as spiking regularization and synaptic rewiring, optimizing the network's structure for better performance (a minimal sketch of such a rate-regularized loss follows this synopsis).
  • Experimental results across various datasets (such as MNIST and CIFAR10) show that BPSR not only achieves efficient learning with reduced spiking activity but also maintains or improves accuracy, making it a promising advance in low-power computing.
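
To make the regularization idea concrete, the sketch below shows one way a firing-rate penalty can be folded into an ordinary backpropagation loss. This is a minimal PyTorch illustration, assuming per-layer spike tensors are available; the names bpsr_loss, layer_spikes, and lambda_spike are illustrative and not taken from the paper.

    import torch
    import torch.nn.functional as F

    def bpsr_loss(logits, targets, layer_spikes, lambda_spike=1e-3):
        # Task loss plus a firing-rate penalty; an illustrative sketch, not the
        # paper's exact formulation.
        #   logits       : (batch, classes) readout of the SNN
        #   targets      : (batch,) class labels
        #   layer_spikes : list of binary spike tensors, one per layer,
        #                  e.g. shape (batch, time, neurons)
        #   lambda_spike : hypothetical weight of the sparsity term
        task_loss = F.cross_entropy(logits, targets)

        # Mean firing rate across all spiking layers: pushing this term down
        # trades a little accuracy for far fewer spikes.
        rate_penalty = sum(s.float().mean() for s in layer_spikes) / len(layer_spikes)

        return task_loss + lambda_spike * rate_penalty

In a training loop this loss simply replaces the plain cross-entropy term; gradients flow through the rate penalty via whatever surrogate-gradient method the SNN already uses.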

Article Abstract

The spiking neural network (SNN) is a possible pathway toward low-power, energy-efficient processing and computing that exploits the spike-driven operation and sparsity of biological systems. This article proposes a sparsity-driven SNN learning algorithm, backpropagation with sparsity regularization (BPSR), which aims to improve both spiking and synaptic sparsity. Backpropagation incorporating spiking regularization is used to minimize the spiking firing rate while guaranteeing accuracy. The backpropagation scheme captures temporal information and extends to spiking recurrent layers to support learning of brain-like structures. A rewiring mechanism with synaptic regularization is proposed to further reduce the redundancy of the network structure: rewiring based on weight and gradient regulates the pruning and growth of synapses. Experimental results demonstrate that the network learned by BPSR is synaptically sparse and closely resembles biological systems. The algorithm not only balances accuracy against firing rate but also facilitates SNN learning by suppressing information redundancy. We evaluate BPSR on the visual datasets MNIST, N-MNIST, and CIFAR10, and further test it on the MIT-BIH and gas sensor datasets. The results show that our algorithm achieves comparable or superior accuracy relative to related work, with sparse spikes and synapses.
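
The rewiring step described in the abstract (pruning weak synapses and growing new ones guided by weight and gradient magnitudes) could look roughly like the sketch below. This is a minimal PyTorch illustration under assumed rules: a magnitude threshold for pruning and a gradient-magnitude criterion for growth; prune_thresh and grow_fraction are hypothetical hyperparameters, not the paper's exact criterion.

    import torch

    @torch.no_grad()
    def rewire_step(weight, grad, mask, prune_thresh=1e-3, grow_fraction=0.01):
        # One prune-and-grow pass over a layer's weight matrix (illustrative only).
        #   weight : dense weight tensor of the layer
        #   grad   : gradient of the loss w.r.t. `weight` from the last backward pass
        #   mask   : binary tensor marking which synapses currently exist

        # Prune: deactivate existing synapses whose magnitude fell below the threshold.
        mask = mask * (weight.abs() > prune_thresh).float()

        # Grow: re-activate a small fraction of absent synapses where the gradient
        # magnitude is largest, i.e. where a new connection would reduce the loss most.
        absent = (mask == 0).float()
        n_grow = int(grow_fraction * mask.numel())
        if n_grow > 0:
            scores = grad.abs() * absent
            top = torch.topk(scores.view(-1), n_grow).indices
            mask.view(-1)[top] = 1.0

        # Keep the effective weights consistent with the connectivity mask.
        weight.mul_(mask)
        return weight, mask

Applied after each parameter update, such a step keeps the overall connection count roughly constant while letting the topology drift toward the sparse, brain-like wiring the abstract describes.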

Download full-text PDF

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9047717 (PMC)
http://dx.doi.org/10.3389/fnins.2022.760298 (DOI Listing)

Publication Analysis

Top Keywords

backpropagation sparsity: 8
sparsity regularization: 8
spiking neural: 8
neural network: 8
snn learning: 8
synaptic sparsity: 8
firing rate: 8
spiking: 6
backpropagation: 4
regularization: 4

Similar Publications

The brain's topology strongly reflects the complex cognitive functions of the biological brain shaped by millions of years of evolution. Learning from these biological topologies is a smarter and easier way to achieve brain-like intelligence with efficiency, robustness, and flexibility. Here we propose a brain topology-improved spiking neural network (BT-SNN) for efficient reinforcement learning.

View Article and Find Full Text PDF

Spiking neural networks fine-tuning for brain image segmentation.

Front Neurosci

November 2023

School of Electrical Engineering and Computer Science, Ohio University, Athens, OH, United States.

Introduction: The field of machine learning has undergone a significant transformation with the progress of deep artificial neural networks (ANNs) and the growing availability of annotated data. ANNs usually require substantial power and memory to achieve optimal performance. Spiking neural networks (SNNs) have recently emerged as a low-power alternative to ANNs due to their inherent sparsity.

View Article and Find Full Text PDF

Solving large-scale MEG/EEG source localisation and functional connectivity problems simultaneously using state-space models.

Neuroimage

January 2024

Intelligent Systems Research Centre, School of Computing, Engineering and Intelligent Systems, Ulster University, Magee campus, Derry∼Londonderry, United Kingdom; Bath Institute for the Augmented Human, University of Bath, Bath, BA2 7AY, United Kingdom.

State-space models are widely employed across various research disciplines to study unobserved dynamics. Conventional estimation techniques, such as Kalman filtering and expectation maximisation, offer valuable insights but incur high computational costs in large-scale analyses. Sparse inverse covariance estimators can mitigate these costs, but at the expense of a trade-off between enforced sparsity and increased estimation bias, necessitating careful assessment in low signal-to-noise ratio (SNR) situations.
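
As an aside on the sparsity/bias trade-off mentioned in this preview: the sketch below is not the paper's state-space estimator, but the graphical lasso in scikit-learn gives a minimal, self-contained illustration of sparse inverse covariance estimation, where the penalty alpha directly controls how much sparsity is enforced and how much estimation bias is introduced.

    import numpy as np
    from sklearn.covariance import GraphicalLasso

    # Toy data standing in for source time courses; not MEG/EEG data.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 20))      # 500 samples, 20 "sources"

    # Larger alpha -> sparser precision (inverse covariance) matrix, but more
    # estimation bias: the trade-off noted in the abstract above.
    model = GraphicalLasso(alpha=0.1).fit(X)
    precision = model.precision_
    print(np.count_nonzero(np.abs(precision) > 1e-8), "non-zero entries")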

View Article and Find Full Text PDF

Light absorption and scattering in the underwater environment can lead to blurring, reduced brightness, and color distortion in underwater images. Polarized imaging has the advantages of suppressing underwater scattering interference, enhancing contrast, and revealing material information about the detected object. In this paper, from the perspective of polarization imaging, different concentrations (0.

View Article and Find Full Text PDF

In recent years, Deep Convolutional Neural Networks (DCNNs) have surpassed the performance of classical algorithms for image restoration tasks. However, most of these methods are not designed for computational efficiency. In this work, we investigate Spiking Neural Networks (SNNs) for the specific and largely unexplored case of image denoising, with the goal of matching the performance of conventional DCNNs while reducing the computational cost.

View Article and Find Full Text PDF
