A Hybrid Learning Rule for Efficient and Rapid Inference with Spiking Neural Networks
Emerging neuromorphic computing (NC) architectures have shown compelling energy efficiency for performing machine learning tasks with spiking neural networks (SNNs). However, due to the non-differentiable nature of spike generation, the standard error backpropagation algorithm is not directly applicable to SNNs. In this work, we propose a novel learning rule based on a hybrid neural network with shared weights, wherein a rate-based SNN is used during forward propagation to determine precise spike counts and spike trains, and an equivalent ANN is used during error backpropagation to approximate the gradients for the coupled SNN. SNNs trained with the proposed learning rule have demonstrated competitive classification accuracies on the CIFAR-10 and ImageNet-2012 datasets, with significant savings in inference time and total synaptic operations compared to other state-of-the-art SNN implementations. The proposed learning rule offers an intriguing solution for enabling on-chip computing on pervasive mobile and embedded devices with limited computational budgets.
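The sketch below is a minimal, hypothetical illustration of the hybrid (shared-weight) idea described above, not the authors' released code: the forward pass simulates a rate-based integrate-and-fire (IF) layer to produce spike counts, while the backward pass substitutes the gradient of an equivalent ANN rate approximation. The layer names, IF dynamics, and the particular ANN surrogate used here are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class HybridSpikeCount(torch.autograd.Function):
    """Forward: IF-neuron spike counts over T steps. Backward: ANN surrogate gradient."""

    @staticmethod
    def forward(ctx, weighted_input, T, v_threshold):
        # weighted_input: pre-activation (W x + b), injected as a constant
        # input current at each of the T simulation time steps (assumption).
        ctx.save_for_backward(weighted_input)
        ctx.T, ctx.v_threshold = T, v_threshold

        v = torch.zeros_like(weighted_input)            # membrane potential
        spike_count = torch.zeros_like(weighted_input)
        for _ in range(T):
            v = v + weighted_input                      # integrate
            spike = (v >= v_threshold).float()          # fire
            v = v - spike * v_threshold                 # soft reset by subtraction
            spike_count = spike_count + spike
        return spike_count

    @staticmethod
    def backward(ctx, grad_output):
        (weighted_input,) = ctx.saved_tensors
        # ANN approximation: spike_count ~ clamp(T * x / v_threshold, 0, T),
        # so d(count)/dx ~ T / v_threshold inside the linear region, 0 outside.
        rate = ctx.T * weighted_input / ctx.v_threshold
        in_linear_region = ((rate > 0) & (rate < ctx.T)).float()
        grad_input = grad_output * in_linear_region * (ctx.T / ctx.v_threshold)
        return grad_input, None, None


class HybridLinear(nn.Module):
    """Fully connected layer whose output is an IF spike count (shared weights)."""

    def __init__(self, in_features, out_features, T=10, v_threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.T, self.v_threshold = T, v_threshold

    def forward(self, x):
        return HybridSpikeCount.apply(self.fc(x), self.T, self.v_threshold)


if __name__ == "__main__":
    layer = HybridLinear(784, 100)
    x = torch.rand(32, 784)
    counts = layer(x)            # discrete spike counts from the SNN forward path
    counts.sum().backward()      # gradients flow through the ANN surrogate
    print(counts.shape, layer.fc.weight.grad.norm())
```

Because both passes operate on the same weights, updates computed with the ANN gradient directly train the SNN used at inference time; the inference cost is governed by the number of time steps T and the resulting spike counts.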