DeepSpeech
learning rate scheduler added
Thanks @soroushhashemifar, could you give some details about how you imagine this should be used, and how the existing "reduce on plateau" functionality doesn't already solve this?
Thanks for your attention @ftyers. I actually ran into this problem while training DeepSpeech for Persian. Reduce on Plateau can let the model drift into worse regions of the parameter space, because it takes time to detect the plateau. Scheduling the learning rate by epoch makes more sense: as training progresses we want less oscillation in the parameters, so lowering the learning rate on a fixed schedule helps a lot. This feature is also provided by popular frameworks such as TensorFlow and PyTorch, so I was thinking, why not in DeepSpeech?! Finally, the parameter is easy to work with because of its pythonic format: it is a compact form of nested if/else blocks, and since it is evaluated at runtime, you can use any Python function inside the learning rate scheduler parameter.
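To make the idea concrete, here is a minimal sketch of how an expression-based, epoch-driven scheduler could look; this is not the exact implementation in this PR, and the flag name `--learning_rate_scheduler`, the variable names `epoch` and `lr`, and the example expression are assumptions for illustration only.

```python
# Sketch only: an epoch-based learning-rate schedule passed as a Python
# expression string, evaluated each epoch. The flag name and variable
# names below are hypothetical, not the PR's actual API.
import argparse
import math

parser = argparse.ArgumentParser()
parser.add_argument('--learning_rate', type=float, default=0.0001)
# The expression is evaluated with `epoch` and `lr` in scope, acting as a
# compact replacement for nested if/else blocks.
parser.add_argument('--learning_rate_scheduler', type=str,
                    default='lr if epoch < 10 else lr * 0.1')
args = parser.parse_args()

def scheduled_lr(epoch, base_lr, expression):
    # eval() is what makes the "pythonic" format possible: the string may
    # call any function exposed here, e.g. 'lr * math.exp(-0.05 * epoch)'.
    return eval(expression, {'math': math}, {'epoch': epoch, 'lr': base_lr})

for epoch in range(20):
    lr = scheduled_lr(epoch, args.learning_rate, args.learning_rate_scheduler)
    print(f'epoch {epoch}: learning rate = {lr}')
```

With the default expression above, the learning rate stays at its base value for the first 10 epochs and then drops by a factor of 10, which is the kind of step schedule that is hard to express with a plateau-based rule alone.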