
Doubt about the accuracy claimed for Attention-SNN

Open A227902 opened this issue 1 year ago • 3 comments

I set 'dt' to 15 and 'T' to 60 as given in TABLE I for the DVS128 Gesture dataset, and got a testing accuracy of 89.9305% at epoch 138, whereas the paper reports an accuracy of 96.53%. Is there any modification needed in the code to reach the claimed testing accuracy?

A227902 · Mar 18 '24

Thank you for your response. I also use a CPU; I will try checking with your 'dt'.

A227902 · Mar 20 '24

I have the same question. I got 90.1% accuracy when reproducing. Can you provide the dt and T values for better accuracy?

StCross · Apr 26 '24

I tested all A-SNNs for this paper, and my results are consistent with those reported. If you are getting accuracy around 90%–92%, it is because you are training a vanilla SNN without attention. To enable attention, specify the attention type in the Config.py file of each dataset via the self.attention = "no" hyperparameter; you can set it to CA, TA, SA, CSA, TCA, TSA, TCSA, or no (see the sketch below). The results in the paper use dt = 15 and T = 60 for the DVS128 Gesture dataset. The only difference in my runs was that CSA gave the best test accuracy, at 96.32%. These are all GPU results, but CPU results should be the same, just with longer training time.
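For reference, here is a minimal sketch of the relevant lines in a dataset's Config.py. Only self.attention (and the dt/T values from TABLE I) come from the discussion above; the class name and the other attribute names are illustrative assumptions about the repo's layout.

```python
# Minimal sketch of the attention-related hyperparameters in Config.py.
# self.attention is the switch mentioned above; the rest is assumed structure.
class Config:
    def __init__(self):
        # One of: "CA", "TA", "SA", "CSA", "TCA", "TSA", "TCSA", "no".
        # Leaving this as "no" trains a vanilla SNN, which plateaus near 90%.
        self.attention = "TCSA"
        self.dt = 15  # value used for the paper's DVS128 Gesture results
        self.T = 60   # number of timesteps used for the paper's results
```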

oteomamo · Apr 27 '24