
Output current learning rate for RAdam and similar Optimizers

Open · jschuetzke opened this issue 4 years ago · 1 comment

Maybe I'm missing something here, but I'm using RAdam with warmup on a dataset and want to save the current learning rate at each step (to generate some plots, e.g. lr vs. loss). According to the documentation, one can set total_steps to activate warmup, and the learning rate is then adjusted automatically during training (according to warmup_proportion and related arguments).
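For reference, this is roughly the setup I have (the parameter values here are just placeholders):

```python
import tensorflow_addons as tfa

# RAdam with warmup enabled via total_steps (all values are placeholders).
# Per the docs, the lr ramps up over warmup_proportion * total_steps steps
# and then decays towards min_lr over the remaining steps.
optimizer = tfa.optimizers.RectifiedAdam(
    learning_rate=1e-3,
    total_steps=10000,
    warmup_proportion=0.1,
    min_lr=1e-5,
)
```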

Usually I can access the learning rate through model.optimizer.learning_rate or, when using an LRSchedule, by calling the schedule with the current iteration. Since the optimizer handles the learning rate changes internally here, is there a way to access the current learning rate without explicitly recomputing it from RAdam's internal algorithm? For RAdam, model.optimizer.learning_rate simply returns the base lr.
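For example, this is roughly the logging pattern I use with other optimizers (LrLogger is just an illustrative callback, not something from TFA); with RAdam it only ever records the base lr:

```python
import tensorflow as tf

class LrLogger(tf.keras.callbacks.Callback):
    """Illustrative callback that records the learning rate after each batch."""

    def __init__(self):
        super().__init__()
        self.lrs = []

    def on_train_batch_end(self, batch, logs=None):
        opt = self.model.optimizer
        lr = opt.learning_rate
        # If a LearningRateSchedule was passed, evaluate it at the current step.
        if isinstance(lr, tf.keras.optimizers.schedules.LearningRateSchedule):
            lr = lr(opt.iterations)
        self.lrs.append(float(tf.keras.backend.get_value(lr)))
```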

I checked the RAdam code but couldn't find an attribute to access the current learning rate, so help is appreciated!
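For completeness, the fallback I'm hoping to avoid is reconstructing the schedule by hand. A rough sketch, assuming the linear warmup followed by linear decay to min_lr that the RectifiedAdam docs describe (it ignores RAdam's variance rectification, so it's only an approximation):

```python
def approx_warmup_lr(step, base_lr=1e-3, total_steps=10000,
                     warmup_proportion=0.1, min_lr=1e-5):
    """Approximate scheduled lr, assuming linear warmup then linear decay."""
    warmup_steps = max(int(total_steps * warmup_proportion), 1)
    if step < warmup_steps:
        # Ramp linearly up to base_lr during warmup.
        return base_lr * step / warmup_steps
    # Decay linearly from base_lr towards min_lr afterwards.
    decay_steps = max(total_steps - warmup_steps, 1)
    frac = min((step - warmup_steps) / decay_steps, 1.0)
    return base_lr + frac * (min_lr - base_lr)
```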

jschuetzke · Oct 20 '21

/cc @szutenberg

bhack · Oct 20 '21

TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision: TensorFlow Addons Wind Down

Please consider sending feature requests / contributions to other repositories in the TF community with similar charters to TFA: Keras, Keras-CV, Keras-NLP

seanpmorgan · Mar 01 '23