
Performance on Omniglot is slightly lower than reported in the paper

Open YaxinLi0-0 opened this issue 4 years ago • 3 comments

Hi, thank you for implementing MAML in PyTorch.

I have tried your code and got some results. On the Omniglot dataset, the accuracy I get is lower than with the original implementation: for 5-way 5-shot, the test-set accuracy is around 96%, while the paper reports 99.9% for the convolutional network. The same thing happens for 5-way 1-shot.

I checked the code and found that the model hyperparameters are the same as in the original code. Do you have any idea what causes this?

YaxinLi0-0 avatar Apr 25 '20 23:04 YaxinLi0-0

@I-am-Bot I think it might be because this code implements a first-order approximation of the original MAML algorithm. See #32 and this post for a discussion on it.
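Roughly, the difference is whether the inner-loop gradient keeps its graph. A minimal sketch of the distinction (not this repo's code; the toy MLP, loss, and learning rates are made up for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

net = nn.Sequential(nn.Linear(1, 40), nn.ReLU(), nn.Linear(40, 1))
meta_opt = torch.optim.Adam(net.parameters(), lr=1e-3)
inner_lr = 0.01

def functional_forward(x, params):
    # Forward pass with explicit weights so adapted weights stay in the graph.
    w1, b1, w2, b2 = params
    return F.linear(F.relu(F.linear(x, w1, b1)), w2, b2)

def inner_adapt(x_spt, y_spt, first_order):
    # One inner gradient step on the support set.
    params = list(net.parameters())
    loss = F.mse_loss(functional_forward(x_spt, params), y_spt)
    # create_graph=False (first-order) treats the inner gradient as a constant;
    # create_graph=True keeps the graph so second-derivative terms reach the
    # meta-update, as in full MAML.
    grads = torch.autograd.grad(loss, params, create_graph=not first_order)
    return [p - inner_lr * g for p, g in zip(params, grads)]

# Meta-update for one toy (support, query) task.
x_spt, y_spt = torch.randn(10, 1), torch.randn(10, 1)
x_qry, y_qry = torch.randn(10, 1), torch.randn(10, 1)

fast_weights = inner_adapt(x_spt, y_spt, first_order=True)  # or False for full MAML
qry_loss = F.mse_loss(functional_forward(x_qry, fast_weights), y_qry)
meta_opt.zero_grad()
qry_loss.backward()
meta_opt.step()
```

With `first_order=True` the query-loss gradient flows to the original parameters only through the `p - inner_lr * g` term with `g` treated as a constant, which is exactly the first-order approximation; the full version backpropagates through the inner update itself.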

lfrati avatar Jul 10 '20 15:07 lfrati

In the original implementation, 90-degree rotations are used as data augmentation on Omniglot. I can't find such preprocessing in this implementation. Maybe this is the reason? @dragen1860 @I-am-Bot
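For reference, the idea is roughly the following (a sketch of the augmentation, not the original TF code; the array shape and function name are assumptions for illustration): within each sampled task, every class is given a random rotation from {0°, 90°, 180°, 270°} applied to all of its images, which effectively multiplies the number of distinct character classes.

```python
import numpy as np

def augment_task(task_imgs, rng=np.random):
    # task_imgs: [n_way, k_shot + k_query, H, W] numpy array for one task,
    # with square images (Omniglot is 28x28 here).
    out = task_imgs.copy()
    for c in range(out.shape[0]):
        k = rng.randint(4)                         # 0, 1, 2, or 3 quarter turns
        out[c] = np.rot90(out[c], k, axes=(1, 2))  # same rotation for every image of class c
    return out
```

If the loader here produces per-task numpy arrays, a hook like this could be applied at sampling time, but that is only a guess at where it would fit in this codebase.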

haditabealhojeh avatar Mar 05 '21 06:03 haditabealhojeh

> In the original implementation, 90-degree rotations are used as data augmentation on Omniglot. I can't find such preprocessing in this implementation. Maybe this is the reason? @dragen1860 @I-am-Bot

I noticed this as well. Have you tried whether adding it improves the performance? It seems hard to add the rotation on top of this reproduction.

BigWhitePolarBear avatar Nov 23 '21 09:11 BigWhitePolarBear