Tilman Kamp
Log messages for new best validation checkpoints and for plateau encounters don't carry epoch numbers. This makes it harder than necessary to assess training progress with grep.
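A minimal sketch of what such log lines could look like; the helper and message format are illustrative, not DeepSpeech's actual logging code:

```python
import logging

log = logging.getLogger("training")

def report_validation(epoch, loss, best_loss):
    # Hypothetical helper: every message starts with the epoch number,
    # so `grep 'Epoch' train.log` lists both events in training order.
    if loss < best_loss:
        log.info("Epoch %d - new best validation loss %f, saving checkpoint", epoch, loss)
        return loss
    log.info("Epoch %d - plateau: loss %f did not beat best %f", epoch, loss, best_loss)
    return best_loss
```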
Command-line parameters for skipping samples would allow for better bisecting of faulty samples in new corpora. Changing the sample ordering would help in determining the maximum batch size.
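A sketch of how such flags could look, assuming an argparse-based CLI; `--skip`, `--limit`, and `--reverse` are hypothetical names, not existing DeepSpeech options:

```python
import argparse

parser = argparse.ArgumentParser()
# All three flags are hypothetical illustrations of the proposal.
parser.add_argument("--skip", type=int, default=0,
                    help="number of leading samples to skip (for bisecting faulty samples)")
parser.add_argument("--limit", type=int, default=0,
                    help="cap on the number of samples to use (0 = no cap)")
parser.add_argument("--reverse", action="store_true",
                    help="sort samples longest-first, so an oversized batch fails fast")

def select_samples(samples, args):
    """Apply ordering plus skip/limit to a list of (path, duration) pairs."""
    if args.reverse:
        samples = sorted(samples, key=lambda s: s[1], reverse=True)
    samples = samples[args.skip:]
    if args.limit > 0:
        samples = samples[:args.limit]
    return samples
```

Bisecting a faulty sample would then amount to repeatedly halving the `--skip`/`--limit` window until a single sample remains.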
Training samples should be augmented with different reverberation environments.
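One common way to do this is to convolve each sample with a recorded room impulse response (RIR). The sketch below assumes NumPy/SciPy and a supply of RIR arrays; none of it is existing DeepSpeech code:

```python
import numpy as np
from scipy.signal import fftconvolve

def reverberate(samples, rir, wet=0.5):
    """Mix the dry signal with its convolution against a room impulse response."""
    wet_signal = fftconvolve(samples, rir, mode="full")[:len(samples)]
    wet_signal /= np.max(np.abs(wet_signal)) + 1e-9  # normalize to avoid clipping
    return (1.0 - wet) * samples + wet * wet_signal
```

Drawing a random RIR (and a random `wet` ratio) per sample would expose the model to a different reverberation environment on each epoch.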
Support for hyper-parameter schedules during training. They could be specified as comma-separated value sequences:

```
python DeepSpeech.py ... --dropout_rate 0.40,10:0.30,50:0.20 ...
```

This example also shows how colon-separated prefixes could mark the epoch from which a value takes effect.
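A sketch of how the proposed syntax could be parsed and resolved per epoch; the helper names are hypothetical, and the `epoch:value` semantics are inferred from the example above:

```python
def parse_schedule(spec):
    """Turn '0.40,10:0.30,50:0.20' into sorted (start_epoch, value) pairs."""
    entries = []
    for part in spec.split(","):
        if ":" in part:
            epoch, value = part.split(":")
            entries.append((int(epoch), float(value)))
        else:
            entries.append((0, float(part)))  # unprefixed value applies from epoch 0
    return sorted(entries)

def value_for_epoch(schedule, epoch):
    """Return the value in effect at the given epoch."""
    current = schedule[0][1]
    for start, value in schedule:
        if epoch >= start:
            current = value
    return current

# value_for_epoch(parse_schedule("0.40,10:0.30,50:0.20"), 25) -> 0.3
```

The same parser could serve every schedulable hyper-parameter, since the syntax carries no flag-specific information.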