Towards parameter-free attentional spiking neural networks.

Neural Networks

Department of Information Technology, Ghent University, Gent, Belgium.

Published: January 2025


Abstract

Brain-inspired spiking neural networks (SNNs) are increasingly explored for their potential in spatiotemporal information modeling and their energy efficiency on emerging neuromorphic hardware. Recent works incorporate attentional modules into SNNs, greatly enhancing their capabilities in handling sequential data. However, these parameterized attentional modules place a heavy burden on memory consumption, a resource that is tightly constrained on neuromorphic chips. To address this issue, we propose a parameter-free attention (PfA) mechanism that establishes a parameter-free linear space to bolster feature representation. The proposed PfA approach can be seamlessly integrated into the spiking neuron, improving performance without any increase in parameter count. Experimental results on the SHD, BAE-TIDIGITS, SSC, DVS-Gesture, DVS-Cifar10, Cifar10, and Cifar100 datasets demonstrate competitive or superior classification accuracy compared with other state-of-the-art models. Furthermore, our model exhibits stronger noise robustness than both conventional SNNs and those with parameterized attentional mechanisms. Our code is available at https://github.com/sunpengfei1122/PfA-SNN.
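The abstract describes the mechanism only at a high level. As a rough illustration, the sketch below shows one way a statistics-based, parameter-free attention (here a SimAM-style energy weighting, which is an assumption and not necessarily the paper's PfA) could modulate the input current of a leaky integrate-and-fire (LIF) neuron without introducing any learnable parameters. The function names (parameter_free_attention, lif_step) and all hyperparameter values are illustrative, not taken from the paper.

import torch

def parameter_free_attention(x: torch.Tensor, lam: float = 1e-4) -> torch.Tensor:
    # SimAM-style energy attention over the last (feature) dimension.
    # No learnable parameters: weights are derived purely from the input's
    # own statistics. (Assumed mechanism, for illustration only.)
    n = max(x.shape[-1] - 1, 1)
    mu = x.mean(dim=-1, keepdim=True)
    d = (x - mu) ** 2                       # squared deviation per feature
    v = d.sum(dim=-1, keepdim=True) / n     # variance estimate
    e_inv = d / (4.0 * (v + lam)) + 0.5     # inverse "energy" per feature
    return x * torch.sigmoid(e_inv)         # reweighted features, same shape

def lif_step(u, x_t, tau=0.5, v_th=1.0):
    # One discrete-time LIF update with attention applied to the input
    # current; hard reset of the membrane potential after a spike.
    u = tau * u + parameter_free_attention(x_t)
    spikes = (u >= v_th).float()
    u = u * (1.0 - spikes)
    return spikes, u

# Usage: run a (time, batch, features) input through the attentional neuron.
T, B, C = 10, 4, 128
inputs = torch.randn(T, B, C)
u = torch.zeros(B, C)
spike_train = []
for t in range(T):
    s, u = lif_step(u, inputs[t])
    spike_train.append(s)
spike_train = torch.stack(spike_train)      # binary spikes, shape (T, B, C)

Because the attention weights here are computed on the fly from feature statistics, a module of this kind stores no extra parameters, which is consistent with the memory argument made in the abstract.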


Source
http://dx.doi.org/10.1016/j.neunet.2025.107154

