Poster
Temporal Misalignment in ANN-SNN Conversion and its Mitigation via Probabilistic Spiking Neurons
Velibor Bojkovic · Xiaofeng Wu · Bin Gu
West Exhibition Hall B2-B3 #W-213
Spiking Neural Networks (SNNs) offer a more energy-efficient alternative to Artificial Neural Networks (ANNs) by mimicking biological neural principles, establishing them as a promising approach to mitigate the increasing energy demands of large-scale neural models. However, fully harnessing the capabilities of SNNs remains challenging due to their discrete signal processing and temporal dynamics. ANN-SNN conversion has emerged as a practical approach, enabling SNNs to achieve competitive performance on complex machine learning tasks. In this work, we identify a phenomenon in the ANN-SNN conversion framework, termed temporal misalignment, in which random spike rearrangement across SNN layers leads to performance improvements. Based on this observation, we introduce biologically plausible two-phase probabilistic (TPP) spiking neurons, further enhancing the conversion process. We demonstrate the advantages of our proposed method both theoretically and empirically through comprehensive experiments on CIFAR-10/100, CIFAR10-DVS, and ImageNet across a variety of architectures, achieving state-of-the-art results.
Spiking neural networks (SNNs) are designed to mimic the functioning of biological neurons. In this work we discover and study a seemingly counterintuitive phenomenon in ANN-to-SNN conversion (a way to train SNNs via pretrained Artificial Neural Networks), which we term Temporal Misalignment. Namely, we find that randomly permuting spike trains after spiking layers significantly boosts model performance. We investigate why this happens and introduce the Two-Phase Probabilistic (TPP) Spiking Neuron—a novel, biologically plausible, and hardware-friendly neuron model. By mitigating Temporal Misalignment, our TPP neuron enables SNNs to closely match ANN accuracy with just a few time steps.
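To make the Temporal Misalignment observation concrete, here is a minimal sketch of the kind of random spike rearrangement described above: each neuron's spike train is independently shuffled along the time axis, which changes spike timing while preserving per-neuron spike counts (and hence firing rates). The function name and the (T, N) tensor layout are illustrative choices, not taken from the paper.

```python
import numpy as np

def shuffle_spike_times(spikes, rng=None):
    """Randomly permute each neuron's spike train along the time axis.

    spikes: binary array of shape (T, N) -- T time steps, N neurons.
    Each column (neuron) gets an independent permutation of its T entries,
    so spike counts per neuron are preserved; only spike timing changes.
    """
    rng = np.random.default_rng() if rng is None else rng
    T, N = spikes.shape
    # An independent permutation of the T time steps for every neuron:
    perms = np.argsort(rng.random((T, N)), axis=0)
    return np.take_along_axis(spikes, perms, axis=0)

# Toy spike trains: 4 time steps, 3 neurons.
spikes = np.array([[1, 0, 1],
                   [0, 0, 1],
                   [1, 1, 0],
                   [0, 0, 0]])
shuffled = shuffle_spike_times(spikes, rng=np.random.default_rng(0))
# Firing rates are intact: per-neuron spike counts are unchanged.
assert (shuffled.sum(axis=0) == spikes.sum(axis=0)).all()
```

Because rate-based ANN-SNN conversion only depends on spike counts, such a permutation leaves the rate code untouched; the surprising finding is that it nonetheless improves accuracy.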