MAML-Pytorch
Re-sampling tasks after each epoch increases the performance
The `create_batch` function is only called once, when the `MiniImagenet` dataset object is created, which means the same sampled tasks are reused in every epoch. I changed the code to second-order (according to https://github.com/dragen1860/MAML-Pytorch/issues/32) and call `create_batch` in every epoch; with this change the accuracy reaches 47.17%.
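For reference, a minimal sketch of what that change can look like in the training loop. The `MiniImagenet` constructor arguments, the `create_batch(batchsz)` signature, and the `DataLoader` settings below are assumptions based on this repo's conventions, not verbatim code:

```python
# Minimal sketch, not the repo's exact training script:
# re-sample the meta-training tasks at the start of every epoch.
from torch.utils.data import DataLoader
from MiniImagenet import MiniImagenet

epochs = 60
batchsz = 10000  # number of tasks sampled per epoch (illustrative value)

mini = MiniImagenet('path/to/miniimagenet/', mode='train',
                    n_way=5, k_shot=1, k_query=15,
                    batchsz=batchsz, resize=84)

for epoch in range(epochs):
    # Without this call, the tasks sampled once in __init__ are reused
    # in every epoch; re-sampling here gives each epoch fresh tasks.
    mini.create_batch(batchsz)

    db = DataLoader(mini, batch_size=4, shuffle=True,
                    num_workers=1, pin_memory=True)
    for step, (x_spt, y_spt, x_qry, y_qry) in enumerate(db):
        ...  # inner-/outer-loop MAML updates as in the original script
```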
Lol. Thanks for your result. I think the author is busy making money... He does not care about this little bug. For those who want to implement MAML, I recommend https://towardsdatascience.com/advances-in-few-shot-learning-reproducing-results-in-pytorch-aba70dee541d
Thanks, I'll check that.
@ShawnLixx Hi! I have run this code on miniImageNet and can get almost 47% accuracy on the test set. However, when I save the corresponding best model and load it in my test code, I only get 44% accuracy. Is there any insight you can provide to help me fix this, or can you tell me how you implemented your test code?
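For anyone debugging a similar gap, a minimal sketch of the `state_dict` save/load round-trip for a standalone test script, with a placeholder module standing in for the meta-learner (the actual module and its configuration are assumptions, not this repo's verified API). The checkpoint only restores weights, so the test script must rebuild the model and run test-time adaptation with the same settings (n_way, k_shot, inner-loop update steps, learning rates) as during training:

```python
# Hypothetical sketch: placeholder model stands in for the meta-learner.
import torch
import torch.nn as nn

model = nn.Linear(10, 5)  # stand-in for the meta-learner module

# In the training script, when a new best validation accuracy is reached:
torch.save(model.state_dict(), 'maml_best.pt')

# In a separate test script: rebuild the model with the *same* configuration
# used during training, then restore the saved weights before evaluating.
model = nn.Linear(10, 5)
model.load_state_dict(torch.load('maml_best.pt'))
```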