
DOCKERFILE Merge flag TRANSFER_LEARNING and DROP_SOURCE_LAYER

Open Mte90 opened this issue 3 years ago • 2 comments

(image attachment)

Mte90 avatar Aug 28 '20 12:08 Mte90

What should be done here? Just remove every use of TRANSFER_LEARNING in the DeepSpeech folder and add a comment noting that DROP_SOURCE_LAYER > 0 implies transfer learning? Or is something else needed as well?

ilyasmg avatar Oct 03 '20 20:10 ilyasmg

Basically yes. As written in the DeepSpeech docs, when DROP_SOURCE_LAYER is > 0 you drop the specified number of layers and then start training from a previous checkpoint from another training run (e.g. the English checkpoint). Right now the script checks whether TRANSFER_LEARNING is true and then starts downloading the ENG DeepSpeech checkpoints.
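A minimal sketch of what the consolidated check could look like once TRANSFER_LEARNING is removed. The function name and output strings are assumptions for illustration, not the actual script:

```shell
#!/bin/sh
# Hypothetical helper: DROP_SOURCE_LAYERS alone decides whether we are
# doing transfer learning (and thus need the ENG checkpoint download).
need_eng_checkpoint() {
    drop_layers="$1"
    if [ "$drop_layers" -gt 0 ]; then
        # DROP_SOURCE_LAYERS > 0 implies transfer learning:
        # the script would download the ENG release checkpoint here.
        echo "transfer-learning"
    else
        # 0 means training from scratch: no checkpoint download.
        echo "from-scratch"
    fi
}

need_eng_checkpoint "${DROP_SOURCE_LAYERS:-0}"
```

With this, the TRANSFER_LEARNING flag becomes redundant and can be deleted everywhere.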

There is a recurring case that needs to be handled: how to continue a previous training run. If I start again with the same flags (DROP_SOURCE_LAYER and TRANSFER_LEARNING), the script (if I remember correctly) will skip the ENG checkpoint download but will still DROP N layers from the previous iteration. And that could be a problem :)
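One way to guard against re-dropping layers on resume is to drop them only when no local checkpoint exists yet. This is just a sketch; the checkpoint directory layout and the `checkpoint` marker file are assumptions, not the real script's behavior:

```shell
#!/bin/sh
# Hypothetical guard: only drop source layers on the FIRST run,
# when the ENG checkpoint is initially loaded. On resume, drop 0.
resolve_drop_layers() {
    ckpt_dir="$1"        # local checkpoint directory (assumed path)
    requested_drop="$2"  # value of DROP_SOURCE_LAYERS from the env
    if [ -f "$ckpt_dir/checkpoint" ]; then
        # A previous run already dropped the layers when it loaded the
        # source checkpoint; dropping again would discard trained weights.
        echo 0
    else
        echo "$requested_drop"
    fi
}
```

The training command would then use `$(resolve_drop_layers "$CKPT_DIR" "$DROP_SOURCE_LAYERS")` instead of the raw flag value.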

In the second bullet point I added a rather extreme case: loading a previous IT checkpoint and dropping some layers (for example, if we want to start a new training run from the IT release checkpoint while adding new characters to the alphabet, this could be an option).

nefastosaturo avatar Oct 04 '20 16:10 nefastosaturo