The spiking neural network (SNN) is a promising pathway toward low-power, energy-efficient processing and computing, exploiting the spike-driven and sparse nature of biological systems. This article proposes a sparsity-driven SNN learning algorithm, namely backpropagation with sparsity regularization (BPSR), aiming to achieve improved spiking and synaptic sparsity. Backpropagation incorporating spiking regularization is used to minimize the firing rate while maintaining accuracy. The backpropagation also captures temporal information and extends to spiking recurrent layers to support learning of brain-like structures. A rewiring mechanism with synaptic regularization is proposed to further reduce the redundancy of the network structure. Rewiring based on weight and gradient regulates the pruning and growth of synapses. Experimental results demonstrate that the network learned by BPSR exhibits synaptic sparsity that closely resembles biological systems. It not only balances accuracy and firing rate, but also facilitates SNN learning by suppressing information redundancy. We evaluate the proposed BPSR on the visual datasets MNIST, N-MNIST, and CIFAR10, and further test it on the sensor datasets MIT-BIH and a gas sensor dataset. Results show that our algorithm achieves comparable or superior accuracy relative to related works, with sparse spikes and synapses.
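The abstract does not give implementation details, but the two regularizers it describes can be illustrated with a minimal sketch: a loss term that penalizes the mean firing rate, and a rewiring step that prunes synapses by weight magnitude and grows new ones where the gradient magnitude is largest. The names (`bpsr_loss`, `rewire`), the coefficient `lam`, and the exact pruning fraction are hypothetical and not the authors' implementation.

```python
import numpy as np

def bpsr_loss(task_loss, spike_trains, lam=1e-3):
    """Task loss plus a spiking-regularization term penalizing the
    mean firing rate of the recorded spike trains (assumed form)."""
    firing_rate = np.mean([s.mean() for s in spike_trains])
    return task_loss + lam * firing_rate

def rewire(weights, grads, mask, prune_frac=0.05):
    """One rewiring step with synaptic regularization: prune the weakest
    active synapses by |weight|, then grow the same number of currently
    inactive synapses where |gradient| is largest."""
    active = np.flatnonzero(mask)
    inactive = np.flatnonzero(mask == 0)
    n = int(prune_frac * active.size)
    if n == 0:
        return weights * mask, mask

    # Prune: deactivate the n active synapses with the smallest |weight|.
    pruned = active[np.argsort(np.abs(weights.flat[active]))[:n]]
    mask.flat[pruned] = 0

    # Grow: activate the n inactive synapses with the largest |gradient|.
    grown = inactive[np.argsort(np.abs(grads.flat[inactive]))[-n:]]
    mask.flat[grown] = 1

    # Masked weights keep the synaptic connectivity sparse.
    return weights * mask, mask
```

In such a scheme, the mask would be reapplied to the weights after every update and the rewiring step called periodically during training.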
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9047717
DOI: http://dx.doi.org/10.3389/fnins.2022.760298
Front Neurosci
April 2024
Institute of Automation, Chinese Academy of Sciences, Beijing, China.
Brain topology closely reflects the complex cognitive functions of the biological brain, shaped by millions of years of evolution. Learning from these biological topologies is a smarter and easier way to achieve brain-like intelligence with efficiency, robustness, and flexibility. Here we propose a brain topology-improved spiking neural network (BT-SNN) for efficient reinforcement learning.
Front Neurosci
November 2023
School of Electrical Engineering and Computer Science, Ohio University, Athens, OH, United States.
Introduction: The field of machine learning has undergone a significant transformation with the progress of deep artificial neural networks (ANNs) and the growing accessibility of annotated data. ANNs usually require substantial power and memory to achieve optimal performance. Spiking neural networks (SNNs) have recently emerged as a low-power alternative to ANNs due to their sparse nature.
Neuroimage
January 2024
Intelligent Systems Research Centre, School of Computing, Engineering and Intelligent Systems, Ulster University, Magee campus, Derry∼Londonderry, United Kingdom; Bath Institute for the Augmented Human, University of Bath, Bath, BA2 7AY, United Kingdom.
State-space models are widely employed across various research disciplines to study unobserved dynamics. Conventional estimation techniques, such as Kalman filtering and expectation maximisation, offer valuable insights but incur high computational costs in large-scale analyses. Sparse inverse covariance estimators can mitigate these costs, but at the expense of a trade-off between enforced sparsity and increased estimation bias, necessitating careful assessment in low signal-to-noise ratio (SNR) situations.
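For readers unfamiliar with sparse inverse covariance estimation, the snippet below is an illustrative example only (not from this paper), using scikit-learn's GraphicalLasso on synthetic data; the regularization strength `alpha` controls the sparsity/bias trade-off mentioned above.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Synthetic zero-mean observations (500 samples, 10 variables).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))

# Larger alpha enforces more zeros in the estimated precision matrix,
# at the cost of increased estimation bias.
model = GraphicalLasso(alpha=0.1).fit(X)
precision = model.precision_
print(np.sum(np.abs(precision) > 1e-6), "nonzero entries")
```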
Light absorption and scattering in the underwater environment can lead to blurring, reduced brightness, and color distortion in underwater images. Polarized imaging has the advantages of suppressing underwater scattering interference, enhancing contrast, and detecting material information of objects in underwater detection. In this paper, from the perspective of polarization imaging, different concentrations (0.
Front Neurosci
August 2023
Université Côte d'Azur, CNRS, LEAT, Sophia Antipolis, France.
In recent years, Deep Convolutional Neural Networks (DCNNs) have surpassed the performance of classical algorithms for image restoration tasks. However, most of these methods are not designed for computational efficiency. In this work, we investigate Spiking Neural Networks (SNNs) for the specific and largely unexplored case of image denoising, with the goal of matching the performance of conventional DCNNs while reducing the computational cost.