Reptile Vision example has constant meta learning rate
In the following Reptile example script https://github.com/learnables/learn2learn/blob/master/examples/vision/reptile_miniimagenet.py:111, the meta-learning rate annealing has a bug: the computed rate remains constant and never decays, because both interpolation terms use the same value:

```python
new_lr = frac_done * meta_lr + (1 - frac_done) * meta_lr  # always equals meta_lr
```
Compare this to the original reptile code:
https://github.com/openai/supervised-reptile/blob/master/supervised_reptile/train.py:55
```python
cur_meta_step_size = frac_done * meta_step_size_final + (1 - frac_done) * meta_step_size
```
To fix this, an additional parameter could be introduced, e.g. `meta_lr_final` (analogous to `meta_step_size_final` in the original code).
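The proposed fix can be sketched as a small interpolation helper. Note that `annealed_meta_lr` and `meta_lr_final` are hypothetical names for illustration; the actual parameter name would be decided in the PR. The original Reptile code anneals toward a final step size, which this mirrors:

```python
def annealed_meta_lr(iteration, num_iterations, meta_lr, meta_lr_final):
    """Linearly anneal the meta-learning rate from meta_lr to meta_lr_final.

    Sketch of the proposed fix: interpolate between the initial and final
    values instead of using meta_lr in both terms.
    """
    frac_done = iteration / num_iterations
    return frac_done * meta_lr_final + (1 - frac_done) * meta_lr

# At the start of training the full meta_lr is used,
# and it decays linearly to meta_lr_final by the end:
print(annealed_meta_lr(0, 100, 1.0, 0.0))    # 1.0
print(annealed_meta_lr(50, 100, 1.0, 0.0))   # 0.5
print(annealed_meta_lr(100, 100, 1.0, 0.0))  # 0.0
```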
Hello @mi92,
Thanks for reporting this discrepancy! Have you tried fixing it by using the same final learning rate as the original code? If it works better, would you like to submit a PR, since you found the bug?
I haven't yet tested which works better, but I can gladly start a PR adding `meta_lr_final` so the decay works properly (otherwise people may believe the rate is decaying when in fact it is not).
That would be great, thanks!
Closing since fixed.