Attention-SNN
Doubt about the accuracy claimed for Attention-SNN
I set 'dt' to 15 and 'T' to 60, as given in Table I for the DVS128 Gesture dataset, and got a testing accuracy of 89.9305% at epoch 138, whereas the paper reports 96.53%. Is there any modification to the code needed to reach the claimed testing accuracy?
Thank you for your response. I also use a CPU; I will try your 'dt' setting.
I have the same question. I got 90.1% accuracy when reproducing; can you provide the dt and T that give better accuracy?
I tested all A-SNNs for this paper, and my results are consistent with those reported. If you are getting accuracy around 90%-92%, it's because you are training a vanilla SNN without attention. To enable attention, specify the attention type in the Config.py file of each dataset via the self.attention hyperparameter, which defaults to "no". You can set it to CA, TA, SA, CSA, TCA, TSA, TCSA, or no. The results in the paper use dt = 15 and T = 60 for the DVS128 Gesture dataset. The only difference in my runs was that CSA gave the best test accuracy, at 96.32%. These are all GPU results; CPU results should be the same, just with longer training time.
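For anyone reproducing: here is a minimal sketch of what the relevant Config.py entries might look like. Only self.attention, dt, and T are confirmed in this thread; the class layout and comments are assumptions for illustration, not the repo's actual file.

```python
# Config.py (sketch) -- hypothetical layout based on this thread.
class Config:
    def __init__(self):
        # Attention type: one of "CA", "TA", "SA", "CSA", "TCA", "TSA",
        # "TCSA", or "no". Leaving it at "no" trains a vanilla SNN,
        # which is what produces the ~90% baseline accuracy.
        self.attention = "CSA"

        # Settings reported for DVS128 Gesture in the paper (Table I).
        self.dt = 15  # time-bin width for event frames (assumed unit: ms)
        self.T = 60   # number of simulation time steps
```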