Low Latency Conversion of Artificial Neural Network Models to Rate-encoded Spiking Neural Networks

10/27/2022
by   Zhanglu Yan, et al.

Spiking neural networks (SNNs) are well suited for resource-constrained applications as they do not need expensive multipliers. In a typical rate-encoded SNN, a series of binary spikes within a globally fixed time window is used to fire the neurons. The maximum number of spikes in this time window is also the latency of the network in performing a single inference, and it determines the overall energy efficiency of the model. The aim of this paper is to reduce this latency while maintaining accuracy when converting ANNs to their equivalent SNNs. State-of-the-art conversion schemes yield SNNs with accuracies comparable to ANNs only for large window sizes. In this paper, we start by understanding the information loss incurred when converting pre-existing ANN models to standard rate-encoded SNN models. From these insights, we propose a suite of novel techniques that together mitigate the information lost in the conversion and achieve state-of-the-art SNN accuracies along with very low latency. Our method achieved a Top-1 SNN accuracy of 98.73% on the MNIST dataset and 76.38% (8 time steps) on the CIFAR-10 dataset. On ImageNet, our method achieved an SNN accuracy of 75.35%.
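To make the rate-encoding scheme described above concrete, here is a minimal sketch of how a normalized ANN activation can be turned into a binary spike train over a fixed time window T. This is an illustrative Bernoulli (Poisson-like) encoder, not the paper's specific conversion method; the function name `rate_encode` and its parameters are assumptions for this example.

```python
import numpy as np

def rate_encode(activations, T=8, rng=None):
    """Rate-encode normalized ANN activations (values in [0, 1]) into
    binary spike trains of length T, the global time window.

    Illustrative encoder: each activation a fires with probability a at
    every time step, so the expected spike count over the window is a*T.
    The window size T is also the inference latency of the SNN.
    """
    rng = np.random.default_rng() if rng is None else rng
    acts = np.clip(np.asarray(activations, dtype=float), 0.0, 1.0)
    # One binary frame per time step: shape (T, *acts.shape).
    return (rng.random((T,) + acts.shape) < acts).astype(np.uint8)

# Decoding the spike count back to a rate (spikes / T) approximates the
# original activation; a smaller T gives lower latency but a coarser,
# lossier approximation -- the information loss the paper targets.
acts = np.array([0.0, 0.25, 0.5, 1.0])
spikes = rate_encode(acts, T=8)
recovered = spikes.mean(axis=0)
```

Note the trade-off visible in the sketch: with T = 8, each activation is quantized to one of only nine possible rates (0/8 through 8/8), which is one source of the conversion error that grows as the time window shrinks.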
