CS-QCFS: Bridging the performance gap in ultra-low latency spiking neural networks.

Neural Netw

School of Electronic Science and Engineering, Nanjing University, Nanjing 210023, China.

Published: January 2025

Spiking Neural Networks (SNNs) are at the forefront of computational neuroscience, emulating the nuanced dynamics of biological systems. Among SNN training methods, conversion from ANNs to SNNs has attracted significant interest for its potential to produce energy-efficient and biologically plausible models. However, existing conversion methods often require long time-steps to ensure that the converted SNNs match the performance of the original ANNs. In this paper, we thoroughly investigate the ANN-SNN conversion process and identify two critical issues: the frequently overlooked heterogeneity across channels and the emergence of negative thresholds, both of which lead to the long time-step problem. To address these issues, we introduce a novel activation function, the Channel-wise Softplus Quantization Clip-Floor-Shift (CS-QCFS) activation function, which effectively handles disparities between channels and maintains positive thresholds. This enables high-performance SNNs, particularly at ultra-low time-steps. Our experimental results demonstrate that the proposed method achieves state-of-the-art performance on CIFAR datasets. For instance, we achieve a top-1 accuracy of 95.86% on CIFAR-10 and 74.83% on CIFAR-100 with only 1 time-step.
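To make the idea concrete, here is a minimal sketch of a quantization clip-floor-shift style activation with a per-channel Softplus-reparameterized threshold, which is one plausible reading of the abstract: the Softplus keeps each channel's effective threshold strictly positive, and each channel quantizes with its own threshold. The function names, the exact quantizer form, and the parameterization are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def softplus(x):
    """Softplus: log(1 + exp(x)), strictly positive for all real x."""
    return np.log1p(np.exp(x))

def cs_qcfs(x, raw_lambda, L=4):
    """Sketch of a channel-wise Softplus quantization clip-floor-shift activation.

    x          : pre-activations of shape (channels, ...)
    raw_lambda : per-channel trainable parameter, shape (channels,);
                 Softplus maps it to a strictly positive threshold
    L          : number of quantization steps (related to SNN time-steps)
    """
    lam = softplus(raw_lambda)                       # positive per-channel threshold
    lam = lam.reshape(-1, *([1] * (x.ndim - 1)))     # broadcast over spatial dims
    # clip-floor-shift quantizer: round x down to one of L+1 levels in [0, lam]
    q = np.clip(np.floor(x * L / lam + 0.5), 0, L)
    return lam * q / L
```

By construction the output is non-negative, never exceeds the channel threshold, and takes only L + 1 discrete levels per channel, which is what lets the quantized ANN activation match a rate-coded SNN at few time-steps.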

DOI: http://dx.doi.org/10.1016/j.neunet.2024.107076


Similar Publications

Traffic classification in SDN-based IoT network using two-level fused network with self-adaptive manta ray foraging.

Sci Rep

January 2025

Department of Computer Science, College of Computer and Information Sciences, Majmaah University, 11952, Al-Majmaah, Saudi Arabia.

The rapid expansion of IoT networks, combined with the flexibility of Software-Defined Networking (SDN), has significantly increased the complexity of traffic management, requiring accurate classification to ensure optimal quality of service (QoS). Existing traffic classification techniques often rely on manual feature selection, limiting adaptability and efficiency in dynamic environments. This paper presents a novel traffic classification framework for SDN-based IoT networks, introducing a Two-Level Fused Network integrated with a self-adaptive Manta Ray Foraging Optimization (SMRFO) algorithm.


Model optimization is a central concern and challenge in developing an image classification model. In image classification, selecting appropriate hyperparameters can substantially boost a model's ability to learn intricate patterns and features from complex image data. Hyperparameter optimization helps prevent overfitting by finding the right balance between model complexity and generalization.
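The hyperparameter search the abstract alludes to can be sketched as a plain random search over a candidate space, scored on a validation set; the function names and the search space below are illustrative assumptions, not this paper's actual optimizer.

```python
import numpy as np

def random_search(train_eval, space, n_trials=20, seed=0):
    """Minimal random hyperparameter search (illustrative sketch).

    train_eval : callable taking a config dict, returning a validation score
                 (higher is better)
    space      : dict mapping hyperparameter name -> list of candidate values
    """
    rng = np.random.default_rng(seed)
    best_cfg, best_score = None, -np.inf
    for _ in range(n_trials):
        # sample one value per hyperparameter, uniformly at random
        cfg = {k: v[rng.integers(len(v))] for k, v in space.items()}
        score = train_eval(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

Scoring on held-out data rather than training loss is what ties this to the overfitting/generalization trade-off the abstract mentions.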


We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays through two approaches: per-synapse delay learning via Dilated Convolutions with Learnable Spacings (DCLS) and a dynamic pruning strategy that also serves as a form of delay learning. In the latter approach, the network dynamically selects and prunes connections, optimizing the delays in sparse connectivity settings. We evaluate both approaches on the Raw Heidelberg Digits keyword spotting benchmark using Backpropagation Through Time with surrogate gradients.
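A per-synapse delay, as described above, can be sketched by shifting each presynaptic spike train before the weighted sum; the function name and the integer-delay representation are assumptions for illustration, not the DCLS formulation itself (which learns real-valued delay positions through dilated convolutions).

```python
import numpy as np

def delayed_projection(spikes, weights, delays):
    """Apply per-synapse integer delays before a weighted sum (sketch).

    spikes  : (T, n_in) binary spike trains over T time-steps
    weights : (n_out, n_in) synaptic weights
    delays  : (n_out, n_in) non-negative integer delays in time-steps
    Returns : (T, n_out) input currents to the postsynaptic neurons
    """
    T, n_in = spikes.shape
    n_out = weights.shape[0]
    out = np.zeros((T, n_out))
    for j in range(n_out):
        for i in range(n_in):
            d = delays[j, i]
            if d < T:
                # a spike emitted at time t arrives at time t + d
                out[d:, j] += weights[j, i] * spikes[:T - d, i]
    return out
```

Pruning in this picture simply zeroes a (weight, delay) pair, so delay learning and sparsification act on the same per-synapse parameters.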


The pursuit of artificial neural networks that mirror the accuracy, efficiency and low latency of biological neural networks remains a cornerstone of artificial intelligence (AI) research. Here, we incorporated recent neuroscientific findings on self-inhibiting autapses and neuron heterogeneity to build a spiking neural network (SNN) with enhanced learning and memory capacities. A bi-level programming paradigm was formulated to learn neuron-level biophysical variables and network-level synapse weights, respectively, for nested heterogeneous learning.


Directional intermodular coupling enriches functional complexity in biological neuronal networks.

Neural Netw

November 2024

Research Institute of Electrical Communication (RIEC), Tohoku University, Sendai, Japan; Graduate School of Engineering, Tohoku University, Sendai, Japan.

Hierarchically modular organization is a canonical network topology that is evolutionarily conserved in the nervous systems of animals. Within the network, neurons form directional connections defined by the growth of their axonal terminals. However, this topology is dissimilar to the network formed by dissociated neurons in culture because they form randomly connected networks on homogeneous substrates.

