MAML-Pytorch
Performance on Omniglot is slightly lower than paper report
Hi, thank you for implementing MAML in PyTorch.
I have tried your code and gotten some results. For the Omniglot dataset, the accuracy I get is lower than the original implementation: for 5-way 5-shot, the accuracy on the test set is around 96%, while the paper reports that the convolutional network reaches 99.9%. The same thing happens for 5-way 1-shot.
I checked the code and found the model parameters are the same as in the original code. Do you have any idea about this?
@I-am-Bot I think it might be because this code implements a first-order approximation of the original MAML algorithm. See #32 and this post for a discussion on it.
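For reference, here is a minimal sketch of where the two variants diverge (the functional `loss_fn(params, batch)` interface and the names are illustrative, not this repo's actual API): full MAML computes the inner-loop gradients with `create_graph=True` so the outer loss can backpropagate through the adaptation step, while the first-order approximation drops that graph.

```python
import torch

# Hypothetical single inner-loop step for one task.
# `params` is a list of tensors with requires_grad=True;
# `loss_fn(params, batch)` evaluates the model functionally on a batch.
def inner_update(params, support_batch, loss_fn, lr_inner, first_order=False):
    loss = loss_fn(params, support_batch)
    # create_graph=True keeps the graph of this gradient computation, so the
    # outer (meta) loss can differentiate through the update: full MAML.
    # With create_graph=False the gradients are detached, and backprop of the
    # query loss through the adapted weights reduces to the first-order variant.
    grads = torch.autograd.grad(loss, params, create_graph=not first_order)
    fast_weights = [p - lr_inner * g for p, g in zip(params, grads)]
    return fast_weights
```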
In the original implementation, 90 degree rotation is used as a data augmentation on omniglot. I can't find such preprocessing in this implementation. maybe this is the reason? @dragen1860 @I-am-Bot
I noticed that as well. Have you tried whether it improves the performance? It seems hard to add that rotation to this reproduction.
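For anyone who wants to try it, here is a rough sketch of the augmentation as described in the original MAML setup, where each 90-degree rotation of a character class is treated as a new class (4x more classes). The dict-of-arrays layout and function name are only illustrative, not how this repo actually loads Omniglot:

```python
import numpy as np

def augment_with_rotations(class_images):
    """class_images: dict mapping class_name -> array of shape [n, H, W].

    Returns a new dict where each 90-degree rotation of a class is its own
    class, as in the original MAML Omniglot preprocessing.
    """
    augmented = {}
    for name, imgs in class_images.items():
        for k in range(4):  # 0, 90, 180, 270 degrees
            # np.rot90 rotates the spatial axes (H, W) k times by 90 degrees
            augmented[f"{name}_rot{90 * k}"] = np.rot90(imgs, k=k, axes=(1, 2)).copy()
    return augmented
```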