Spiking Neural Networks (SNNs) hold great potential for mimicking the brain's efficient processing of information. Although biological evidence suggests that precise spike timing is crucial for effective information encoding, contemporary SNN research concentrates mainly on adjusting connection weights. In this work, we introduce Delay Learning based on Temporal Coding (DLTC), an approach that integrates delay learning with a temporal coding strategy to optimize spike timing in SNNs. DLTC uses a learnable delay shift that assigns varying levels of importance to different informational elements, complemented by an adjustable threshold that regulates firing times, allowing neurons to fire earlier or later as needed. We evaluate DLTC on vision and auditory classification tasks, where it consistently outperforms traditional weight-only SNNs. The results show that DLTC achieves notable improvements in accuracy and computational efficiency, marking a step toward bringing SNNs to real-world applications. Our code is available at https://github.com/sunpengfei1122/DLTC.
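As a rough illustration of the mechanism the abstract describes, the sketch below implements a spiking layer with per-synapse learnable delays (fractional delays realized by interpolating between adjacent time steps) and a learnable firing threshold, trained with a surrogate gradient; a readout that weights earlier output spikes more heavily stands in for the temporal coding scheme. This is a hypothetical sketch under those assumptions, not the authors' implementation: the names (`DelayedSpikingLayer`, `max_delay`), the surrogate gradient, and the readout are all illustrative choices, and the actual DLTC code is in the linked repository.

```python
# Hypothetical sketch of delay + threshold learning in an SNN layer (PyTorch).
# Not the authors' code; see https://github.com/sunpengfei1122/DLTC for DLTC itself.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate gradient."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out / (1.0 + 10.0 * v.abs()) ** 2


class DelayedSpikingLayer(nn.Module):
    """Integrate-and-fire layer with learnable per-synapse delays and a
    learnable firing threshold (illustrative names and choices)."""

    def __init__(self, n_in, n_out, max_delay=8):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(n_out, n_in) * 0.1)
        self.delay = nn.Parameter(torch.rand(n_out, n_in) * max_delay)  # real-valued delays
        self.threshold = nn.Parameter(torch.ones(n_out))
        self.max_delay = max_delay

    def forward(self, spikes):                        # spikes: (batch, T, n_in)
        B, T, N = spikes.shape
        d = self.delay.clamp(0.0, float(self.max_delay))
        frac = d - d.floor()                          # fractional part keeps delays differentiable
        lo = d.floor().long()                         # integer part used for time indexing
        pad = torch.zeros(B, self.max_delay + 1, N, device=spikes.device)
        padded = torch.cat([pad, spikes], dim=1)      # (B, T + max_delay + 1, N)
        cols = torch.arange(N, device=spikes.device)
        v = torch.zeros(B, self.weight.shape[0], device=spikes.device)
        outputs = []
        for t in range(T):
            # Read each synapse's input at time (t - delay), interpolating
            # linearly between neighbouring steps for fractional delays.
            idx = t + self.max_delay + 1 - lo         # (n_out, n_in)
            x = (1 - frac) * padded[:, idx, cols] + frac * padded[:, idx - 1, cols]
            v = v + (self.weight * x).sum(-1)         # integrate delayed, weighted input
            s = SurrogateSpike.apply(v - self.threshold)
            v = v - s * self.threshold                # soft reset after a spike
            outputs.append(s)
        return torch.stack(outputs, dim=1)            # output spikes: (batch, T, n_out)


if __name__ == "__main__":
    layer = DelayedSpikingLayer(n_in=100, n_out=10)
    x = (torch.rand(32, 50, 100) < 0.05).float()      # random input spike trains
    out = layer(x)                                    # (32, 50, 10)
    decay = torch.linspace(1.0, 0.1, 50).view(1, 50, 1)
    logits = (out * decay).sum(dim=1)                 # earlier spikes count more (temporal readout)
    print(logits.shape)                               # torch.Size([32, 10])
```

The interpolation between the floor and ceiling delay steps is one common way to let gradients flow into real-valued delays; whether DLTC uses this particular trick, this surrogate, or this readout is not stated in the abstract.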
DOI: 10.1016/j.neunet.2024.106678