
Reptile Vision example has constant meta learning rate

mi92 opened this issue on Aug 12, 2022 · 3 comments

In the reptile example script https://github.com/learnables/learn2learn/blob/master/examples/vision/reptile_miniimagenet.py:111, the meta-learning rate update has a bug: the expression simplifies to meta_lr, so the learning rate remains constant and never decays.

new_lr = frac_done * meta_lr + (1 - frac_done) * meta_lr

Compare this to the original reptile code, https://github.com/openai/supervised-reptile/blob/master/supervised_reptile/train.py:55:

cur_meta_step_size = frac_done * meta_step_size_final + (1 - frac_done) * meta_step_size

To fix this, an additional parameter could be introduced, e.g. meta_lr_final (analogous to meta_step_size_final in the original code), as sketched below.
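
A minimal sketch of the proposed fix, assuming a new hyperparameter meta_lr_final (a hypothetical name chosen to mirror meta_step_size_final in the original reptile code; the values below are placeholders, not the example's actual defaults):

# Sketch only: `meta_lr_final` and the values below are assumptions.
num_iterations = 100000
meta_lr = 1.0        # initial meta learning rate
meta_lr_final = 0.0  # meta learning rate reached at the end of training

for iteration in range(num_iterations):
    frac_done = iteration / num_iterations
    # Linearly anneal from meta_lr to meta_lr_final. The current code uses
    # meta_lr in both terms, which always evaluates to meta_lr.
    new_lr = frac_done * meta_lr_final + (1 - frac_done) * meta_lr
    # ... use new_lr for the meta update ...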

mi92 · Aug 12 '22

Hello @mi92,

Thanks for reporting this discrepancy! Have you tried fixing it by using the same final learning rate as the original code? If it works better, would you like to submit a PR for it, since you found the bug?

seba-1511 · Aug 12 '22

I haven't actually tested which works better yet, but I can gladly start a PR adding meta_lr_final so the decay works properly (otherwise people may believe the learning rate is decaying when in fact it is not).

mi92 · Aug 12 '22

That would be great, thanks!

seba-1511 · Aug 12 '22

Closing since fixed.

seba-1511 · May 29 '23