Front Neurosci
November 2023
Recently, the accuracy of spiking neural networks (SNNs) has been significantly improved by mapping convolutional neural networks (CNNs) and their trained parameters onto SNNs. Deep convolutional SNNs, however, require large amounts of computation, which is the major bottleneck for energy-efficient SNN processor design. In this paper, we present an input-dependent computation reduction approach, in which relatively unimportant neurons are identified and pruned without seriously sacrificing accuracy.
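The abstract does not specify how neuron importance is measured, but the idea of input-dependent pruning can be sketched as follows. This is a hypothetical illustration, not the paper's method: it assumes importance is ranked by membrane-potential magnitude and keeps only the top fraction of neurons for each input.

```python
import numpy as np

def prune_unimportant_neurons(membrane_potentials, keep_ratio=0.5):
    """Hypothetical sketch of input-dependent neuron pruning.

    Ranks neurons by |membrane potential| (an assumed importance
    proxy) and returns a boolean mask selecting the top `keep_ratio`
    fraction; the remaining neurons would be skipped for this input.
    """
    importance = np.abs(membrane_potentials)
    k = max(1, int(keep_ratio * importance.size))
    # Indices of the k most "important" neurons for this input
    keep = np.argpartition(importance, -k)[-k:]
    mask = np.zeros(membrane_potentials.shape, dtype=bool)
    mask[keep] = True
    return mask
```

Because the mask is recomputed per input, the set of active neurons adapts to each image, which is what makes the computation reduction input-dependent.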
IEEE Trans Biomed Circuits Syst
June 2022
In this paper, we present a novel early-termination-based training acceleration technique for temporal-coding-based spiking neural network (SNN) processor design. The proposed early termination scheme efficiently identifies non-contributing training images during the feedforward pass of training and skips the remaining steps to save training energy and time. A metric evaluating each input image's contribution to training has been developed; it is compared with a pre-determined threshold to decide whether to skip the rest of the training process.
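The training loop the abstract describes — feedforward, metric, threshold comparison, conditional skip — can be sketched in a framework-agnostic way. Everything here is an assumption for illustration: the `forward`, `backward`, and `contribution` callables and the threshold are hypothetical stand-ins, not the paper's actual metric or architecture.

```python
def train_with_early_termination(images, labels, forward, backward,
                                 contribution, threshold):
    """Hypothetical sketch of early-termination training.

    For each image, run only the feedforward pass and compute a
    contribution metric; if the metric falls below the pre-determined
    threshold, skip the weight update (backward pass) entirely.
    Returns the number of skipped images.
    """
    skipped = 0
    for x, y in zip(images, labels):
        out = forward(x)                  # feedforward pass only
        if contribution(out, y) < threshold:
            skipped += 1                  # non-contributing image: skip
            continue
        backward(out, y)                  # full update for the rest
    return skipped
```

The savings come from the fact that the backward pass, typically the most expensive part of training, is bypassed for every image judged non-contributing.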