CloserLookFewShot
Can't reach specified accuracy on CUB dataset ResNet10
Getting around 80.78% accuracy (85.17% in the paper) on 5-shot with ResNet10 baseline++ and augmentation on.
Command used to train: python train.py --dataset CUB --model ResNet10 --method baseline++ --train_aug
Command used to test: python test.py --dataset CUB --model ResNet10 --method baseline++ --train_aug
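As an aside, few-shot results like these are usually reported as the mean accuracy over many test episodes together with a 95% confidence interval, so a gap between runs should be judged against that interval. A minimal sketch of how such an interval is computed (the episode accuracies below are made up for illustration, not from this repository):

```python
import math

def mean_and_ci95(accs):
    """Mean accuracy and 95% confidence interval over test episodes."""
    n = len(accs)
    mean = sum(accs) / n
    var = sum((a - mean) ** 2 for a in accs) / (n - 1)  # sample variance
    ci95 = 1.96 * math.sqrt(var / n)                    # normal approximation
    return mean, ci95

# Made-up per-episode accuracies, for illustration only.
episode_accs = [78.0, 82.0, 80.0, 84.0, 79.0, 81.0]
mean, ci = mean_and_ci95(episode_accs)
print(f"{mean:.2f} +/- {ci:.2f}")
```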
Hello, can you tell me your final epoch (200 or 400?) and the loss at that final epoch? I think it may be an overfitting issue. Thanks!
The number of epochs is 200. Here is the loss for the last epoch, last batch: Epoch 199 | Batch 360/368 | Loss 0.636699. The result was reproduced from a freshly cloned repository.
Hello, can you also tell me the result of running the baseline (with ResNet10)? Thanks!
You can try testing at epoch 100 or 150. I have the same problem on CUB 1-shot baseline++ with augmentation. The results are as follows:
Epoch 50: 66.0
Epoch 100: 69.7
Epoch 150: 64.5
Epoch 200: 61.0
Epoch 400: 60.7
@yuxiwang93 are your results from the original code? And are you getting comparable results on 5-shot as well?
Oh, have you solved the problem? I run the code from the command line:
python train.py --dataset CUB --model ResNet10 --method baseline++ --train_aug --stop_epoch 100 (also 150, 200)
python save_features.py --dataset CUB --model ResNet10 --method baseline++ --train_aug
python test.py --dataset CUB --model ResNet10 --method baseline++ --train_aug
Every run starts fresh (before each new run I delete the checkpoints from the previous one), yet the test results always stay around 80.50%. Could you give me some advice?
And my test accuracy on miniImagenet 5-way 5-shot is around 73.30%, while the reported result is 75.68%.
Hello, sorry it took me a while to work out what happened. Please refer to issue #31 for the explanation.