DeepSpeech

learning rate scheduler added

soroushhashemifar opened this issue 4 years ago

soroushhashemifar · May 16 '21 10:05

Thanks @soroushhashemifar! Could you give some details about how you imagine this would be used, and why the existing "reduce on plateau" functionality doesn't already solve this?

ftyers · May 17 '21 16:05

Thanks for your attention @ftyers. I ran into this problem while training DeepSpeech for Persian. Reduce-on-plateau can let the model drift to worse regions of the parameter space, because it takes time to detect the plateau. Scheduling the learning rate by epoch makes more sense: as training progresses we want less oscillation in the parameters, so decaying the learning rate on a fixed schedule helps a lot. This feature is also provided by the well-known frameworks, e.g. TensorFlow and PyTorch, so why not in DeepSpeech? Finally, the parameter is easy to work with because of its pythonic format: it is a compact form of nested if/else blocks, and since it is evaluated at runtime, you can use any Python function inside the learning rate scheduler expression. A minimal sketch of what I mean is below.
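
Here is a minimal sketch of the idea, assuming a single user-supplied expression string; the name `LR_SCHEDULER_EXPR` and the evaluation mechanics are hypothetical illustrations, not DeepSpeech's actual API:

```python
import math

# Hypothetical schedule expression (not an actual DeepSpeech flag):
# a compact nested conditional, re-evaluated at every epoch.
LR_SCHEDULER_EXPR = (
    "0.001 if epoch < 10 "
    "else 0.0005 if epoch < 20 "
    "else 0.0005 * math.exp(-0.05 * (epoch - 20))"
)

def learning_rate_for(epoch):
    # Because the expression is evaluated at runtime, any Python
    # function (here math.exp) can appear in the schedule.
    return eval(LR_SCHEDULER_EXPR, {"math": math, "epoch": epoch})

for epoch in (0, 10, 25):
    print(epoch, learning_rate_for(epoch))
```

PyTorch exposes the same pattern as `torch.optim.lr_scheduler.LambdaLR`, and Keras as `tf.keras.callbacks.LearningRateScheduler`, both of which take a callable mapping the epoch index to a learning rate.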

soroushhashemifar · May 20 '21 05:05