blockdrop
Accuracy keeps dropping
Hi. Thanks for your work. I am using your cl_training code. However, I found that the train and test accuracy keep dropping after training for 600 epochs. Is this normal?
@CheungBH I am experiencing the same issue. Were you able to resolve it eventually?
@CheungBH @akinsanyaayomide
Not sure, but it may be caused by the learning rate. The paper suggests that the lr should be decayed by a factor of 10 every 30 epochs. However, the epoch_step argument in cl_training.py is not set up properly (epoch_step=100000, which means the lr is never decayed during training).
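For reference, an epoch_step-based step decay usually boils down to something like the sketch below (the exact names in cl_training.py may differ; this is just an assumption about the shape of the schedule):

```python
# Hypothetical sketch of an epoch_step-based lr schedule.
# Assumes a decay factor of 0.1 every `epoch_step` epochs, as described in the paper;
# the actual code in cl_training.py may differ in details.
def adjust_learning_rate(optimizer, epoch, base_lr=1e-3, epoch_step=30, gamma=0.1):
    """Decay the learning rate by `gamma` every `epoch_step` epochs."""
    lr = base_lr * (gamma ** (epoch // epoch_step))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
    return lr
```

With epoch_step=100000, the exponent `epoch // epoch_step` stays 0 for any realistic run, so the lr never decays, which matches the behavior above.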
Instead, you can run: `python cl_training.py --epoch_step 30 --model R101_ImgNet --cv_dir cv/R101_ImgNet_cl/ --lr 1e-3 --batch_size 2048 --max_epochs 45 --data_dir data/imagenet/`.
Note that I set --epoch_step 30 to match the setup described in the paper.
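If you want to double-check that the lr is actually decaying, the same schedule can also be expressed with PyTorch's built-in StepLR (just an equivalent sketch, not necessarily what cl_training.py does internally):

```python
import torch

# Equivalent schedule with StepLR: multiply the lr by 0.1 every 30 epochs.
model = torch.nn.Linear(10, 2)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(45):
    # ... one epoch of training ...
    scheduler.step()
    print(epoch, optimizer.param_groups[0]['lr'])  # sanity-check that the lr drops at epoch 30
```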
good luck.