
SNN Retraining Issue

Open · Sangyeob-Kim opened this issue 4 years ago • 6 comments

Hello.

I trained an ANN model on CIFAR10 using ann.py.

After that, I ran snn.py to train the SNN with STDB.

Converting the CNN to an SNN works fine. However, accuracy keeps decreasing as the epochs proceed. The result is the same no matter how small I set the learning rate, and it is the same even when I train the SNN with linear activation.

How can I solve this problem? I would appreciate your help, because SNN training is not making progress.

Thank you.

Sangyeob-Kim avatar Jun 17 '20 05:06 Sangyeob-Kim
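For context, STDB (spike-timing-dependent backpropagation) trains through the non-differentiable spike using a surrogate gradient that, per the hybrid-conversion paper, decays exponentially with the time elapsed since a neuron last fired. Below is a minimal PyTorch sketch of that idea; the constants and the spike-time bookkeeping are illustrative, and the actual implementation in snn.py may differ.

```python
import torch

class STDBSpike(torch.autograd.Function):
    # Illustrative constants; the repo's exact values may differ.
    alpha = 0.3
    beta = 0.01

    @staticmethod
    def forward(ctx, membrane, threshold, time_since_spike):
        # Fire wherever the membrane potential reaches the threshold.
        ctx.save_for_backward(time_since_spike)
        return (membrane >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # STDB-style surrogate: d(spike)/d(membrane) is approximated by
        # alpha * exp(-beta * dt), where dt is the time since the last spike.
        (dt,) = ctx.saved_tensors
        grad = STDBSpike.alpha * torch.exp(-STDBSpike.beta * dt)
        return grad_output * grad, None, None
```

At each timestep the caller would apply this function, reset time_since_spike to zero where a spike occurred (incrementing it elsewhere), and subtract the threshold from the membrane potential of the neurons that fired.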

Can you provide some more details, like the architecture, number of timesteps, dataset, CNN accuracy, converted SNN accuracy, optimizer, and other hyperparameters?

nitin-rathi avatar Jun 18 '20 15:06 nitin-rathi

  1. ANN training. Script for training the ANN:

         python ann.py --architecture VGG16 --learning_rate 1e-2 --epochs 100 --lr_interval '0.60 0.80 0.90' --lr_reduce 10 --dataset CIFAR10 --batch_size 64 --optimizer SGD --dropout 0.3

     Accuracy of ANN: 91% (test accuracy)

  2. SNN training (case 1 - large learning rate). Script for training the SNN:

         python snn.py --architecture VGG16 --learning_rate 1e-6 --epochs 100 --lr_interval '0.60 0.80 0.90' --lr_reduce 10 --dataset CIFAR10 --batch_size 50 --optimizer SGD --timesteps 100 --leak 1.0 --scaling_factor 0.7 --dropout 0.3 --kernel_size 3 --devices 0 --pretrained_ann './trained_models/ann/ann_vgg16_cifar10.pth' --log --activation STDB

     Accuracy of Converted SNN: 89% (test accuracy)
     Accuracy of Converted SNN after training: 10% (test accuracy)

  3. SNN training (case 2 - small learning rate). Script for training the SNN:

         python snn.py --architecture VGG16 --learning_rate 1e-12 --epochs 100 --lr_interval '0.60 0.80 0.90' --lr_reduce 10 --dataset CIFAR10 --batch_size 50 --optimizer SGD --timesteps 100 --leak 1.0 --scaling_factor 0.7 --dropout 0.3 --kernel_size 3 --devices 0 --pretrained_ann './trained_models/ann/ann_vgg16_cifar10.pth' --log --activation STDB

     Accuracy of Converted SNN: 89% (test accuracy)
     Accuracy of Converted SNN after training: 70% (test accuracy)

  4. There is no increase in accuracy even if the leak value is changed under the same conditions as in 2 and 3, or if the activation is changed to linear.

Sangyeob-Kim avatar Jun 19 '20 07:06 Sangyeob-Kim
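An aside on the flags in the commands above: one plausible reading of --lr_interval '0.60 0.80 0.90' with --lr_reduce 10 is that the learning rate is divided by 10 at 60%, 80%, and 90% of the epoch budget. A small Python sketch of that reading (the helper name is hypothetical; check snn.py for the actual scheduling logic):

```python
# Hypothetical helper mirroring one reading of --lr_interval/--lr_reduce:
# divide the learning rate by lr_reduce at each listed fraction of the
# total number of epochs.
def lr_at_epoch(epoch, base_lr=1e-6, epochs=100,
                lr_interval=(0.60, 0.80, 0.90), lr_reduce=10):
    lr = base_lr
    for frac in lr_interval:
        if epoch >= int(frac * epochs):
            lr /= lr_reduce
    return lr

# With the case-1 settings: lr_at_epoch(0) -> 1e-6,
# lr_at_epoch(60) -> 1e-7, lr_at_epoch(90) -> 1e-9.
```

Under that reading, the 1e-12 run spends its final epochs at 1e-15, effectively freezing the weights.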

You can try changing the activation to 'Linear' and the optimizer to 'Adam' for SNN training. Keep the learning rate at 1e-4.

nitin-rathi avatar Jun 19 '20 16:06 nitin-rathi

I tried it!

The results are as follows.


  1. ANN training. Script for training the ANN:

         python ann.py --architecture VGG16 --learning_rate 1e-2 --epochs 100 --lr_interval '0.60 0.80 0.90' --lr_reduce 10 --dataset CIFAR10 --batch_size 64 --optimizer SGD --dropout 0.3

     Accuracy of ANN: 91% (test accuracy)

  2. SNN training. Script for training the SNN:

         python snn.py --architecture VGG16 --learning_rate 1e-4 --epochs 100 --lr_interval '0.60 0.80 0.90' --lr_reduce 10 --dataset CIFAR10 --batch_size 50 --optimizer Adam --timesteps 100 --leak 1.0 --scaling_factor 0.7 --dropout 0.3 --kernel_size 3 --devices 0 --pretrained_ann './trained_models/ann/ann_vgg16_cifar10.pth' --log --activation Linear

     Accuracy of Converted SNN after training (after 6 epochs): 13.7% (train accuracy)


Still, when training the SNN, its accuracy stays low. Is there any difference between the code you have and the code uploaded to GitHub? Any help in this situation would be greatly appreciated.

Sangyeob-Kim avatar Jun 22 '20 01:06 Sangyeob-Kim

I am very curious why the accuracy from running snn.py alone is even better than from running ann.py first and then snn.py.

kugatsu-sudo avatar Dec 17 '20 03:12 kugatsu-sudo

Is the problem specific to VGG16, or does it also occur with the other VGG architectures? Could someone maybe post the script for training a smaller network (e.g., VGG11 or even VGG5), either first training an ANN and then converting it to an SNN, or directly training an SNN?

annahambi avatar Jan 08 '21 16:01 annahambi
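For what it's worth, assuming the scripts accept the smaller VGG variants through the same --architecture flag (worth verifying against the repo), the commands would only need the architecture swapped. A hedged sketch; the pretrained-ANN checkpoint path follows the naming pattern of the VGG16 one above and is an assumption:

```
# Assumption: VGG5 is a valid --architecture value; all other flags are
# copied from the VGG16 commands above and may need re-tuning.
python ann.py --architecture VGG5 --learning_rate 1e-2 --epochs 100 --lr_interval '0.60 0.80 0.90' --lr_reduce 10 --dataset CIFAR10 --batch_size 64 --optimizer SGD --dropout 0.3

# Assumption: the checkpoint filename mirrors the VGG16 pattern.
python snn.py --architecture VGG5 --learning_rate 1e-4 --epochs 100 --lr_interval '0.60 0.80 0.90' --lr_reduce 10 --dataset CIFAR10 --batch_size 50 --optimizer Adam --timesteps 100 --leak 1.0 --scaling_factor 0.7 --dropout 0.3 --kernel_size 3 --devices 0 --pretrained_ann './trained_models/ann/ann_vgg5_cifar10.pth' --log --activation Linear
```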