Leaky Integrate-and-Fire Spiking Neuron with Learnable Membrane Time Parameter
Spiking Neural Networks (SNNs) have attracted research interest due to their temporal information processing capability, low power consumption, and high biological plausibility. The Leaky Integrate-and-Fire (LIF) neuron model is one of the most popular spiking neuron models used in SNNs because it balances computing cost and biological plausibility. The most important parameter of a LIF neuron is the membrane time constant τ, which determines the decay rate of the membrane potential, and its value plays a crucial role in SNNs built from LIF neurons. However, τ is usually treated as a hyper-parameter that is preset before training and adjusted manually. In this article, we propose a novel spiking neuron, the Parametric Leaky Integrate-and-Fire (PLIF) neuron, whose τ is a learnable parameter rather than an empirical hyper-parameter. We evaluate the performance of SNNs with PLIF neurons on image classification tasks using both traditional static datasets (MNIST, Fashion-MNIST, CIFAR-10) and neuromorphic datasets (N-MNIST, CIFAR10-DVS). The experimental results show that SNNs augmented with PLIF neurons outperform those with conventional spiking neurons.
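To make the idea of a learnable τ concrete, the following is a minimal PyTorch-style sketch, not the authors' implementation: the class name `PLIFNeuron`, the reparameterization of 1/τ through a sigmoid of a trainable scalar `w`, the sigmoid surrogate gradient, and all hyper-parameter values are illustrative assumptions. It shows how the decay term can receive gradients during backpropagation so that τ is optimized jointly with the synaptic weights.

```python
import torch
import torch.nn as nn


class SurrogateHeaviside(torch.autograd.Function):
    """Heaviside spike function with a sigmoid surrogate gradient."""

    @staticmethod
    def forward(ctx, x, alpha=4.0):
        ctx.save_for_backward(x)
        ctx.alpha = alpha
        return (x >= 0).to(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        sig = torch.sigmoid(ctx.alpha * x)
        # Derivative of the sigmoid surrogate replaces the undefined
        # derivative of the step function.
        return grad_output * ctx.alpha * sig * (1.0 - sig), None


class PLIFNeuron(nn.Module):
    """Hypothetical LIF layer whose membrane time constant is learned.

    The decay factor 1/tau is represented as sigmoid(w), keeping it in
    (0, 1) while letting gradients flow into the trainable scalar w.
    """

    def __init__(self, init_tau=2.0, v_threshold=1.0, v_reset=0.0):
        super().__init__()
        # Choose w so that sigmoid(w) == 1 / init_tau at initialization.
        init_w = -torch.log(torch.tensor(init_tau - 1.0))
        self.w = nn.Parameter(init_w)
        self.v_threshold = v_threshold
        self.v_reset = v_reset

    def forward(self, x_seq):
        # x_seq: input current over time, shape [T, batch, ...].
        v = torch.zeros_like(x_seq[0]) + self.v_reset
        spikes = []
        for x in x_seq:
            # Leaky integration: v += (x - (v - v_reset)) / tau.
            v = v + torch.sigmoid(self.w) * (x - (v - self.v_reset))
            s = SurrogateHeaviside.apply(v - self.v_threshold)
            spikes.append(s)
            # Hard reset of the membrane potential after a spike.
            v = v * (1.0 - s) + self.v_reset * s
        return torch.stack(spikes)


# Usage sketch: a sequence of 8 time steps, batch of 4, 10 features.
if __name__ == "__main__":
    neuron = PLIFNeuron(init_tau=2.0)
    out = neuron(torch.rand(8, 4, 10))
    out.sum().backward()
    print(out.shape, neuron.w.grad)  # torch.Size([8, 4, 10]) and a scalar grad
```

Because `w` is an `nn.Parameter`, any optimizer that updates the network's weights will also update the membrane time constant, which is the key difference from a conventional LIF neuron with a fixed, hand-tuned τ.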