Tatiana Likhomanenko
Could you try `--minisz=-1`? Can you confirm that you still see `I0914 08:24:05.279836 112 Utils.cpp:102] Filtered 1/1 samples` in that case?
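For context, the `Filtered 1/1 samples` message comes from a min/max size filter over the dataset. Below is a rough Python sketch of how such min-size filtering typically behaves; the function and parameter names are illustrative, not the actual `Utils.cpp` implementation, and the assumption that a negative threshold disables the check mirrors the suggested `--minisz=-1`:

```python
def keep_sample(input_len, target_len, min_input_len=0, min_target_len=0):
    """Return True if the sample passes the size filters.

    A negative threshold disables that check (mirroring --minisz=-1).
    These semantics are an assumption for illustration.
    """
    if min_input_len >= 0 and input_len < min_input_len:
        return False  # audio shorter than the minimum input size
    if min_target_len >= 0 and target_len < min_target_len:
        return False  # transcription shorter than the minimum target size
    return True

# An empty transcription (target_len == 0) is dropped when the minimum
# target size is positive, but kept when that check is disabled:
print(keep_sample(5855, 0, min_input_len=-1, min_target_len=1))   # False
print(keep_sample(5855, 0, min_input_len=-1, min_target_len=-1))  # True
```

This is why a sample with a non-empty audio file but an empty transcription can still be filtered out even with `--minisz=-1`.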
cc @vineelpratap is there any option to accept empty transcriptions?
> Yes, `Filtered 1/1 samples`
>
> I've tried `--minisz=-1` and my lst file contains:
> `flac /root/host/flac.wav 5.855` (it is audio from the librispeech dataset converted to wav)
>
> ...
@mironnn could you confirm that the decoder works for you when the transcription is not empty? If so, we will fix the issue with empty transcriptions in the future (the decoder hangs because...
Could you pull the latest w2l? Or you can set `--warmup=1` in your config (the default value for warmup was 8000, but in the latest commit we changed it to 1 to be consistent with...
@rajeevbaalwan

> Consider for transformer architecture the batch size in the given config is 8 but in doc it is mentioned that total batch size is 128. So I can...
Yep, we know that most people don't have that many resources. We specified the total batch size so that you understand how you need, for example, to scale the learning...
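The scaling idea above can be sketched as follows. This is the common linear-scaling heuristic (learning rate proportional to effective batch size), written out as an assumption, not an official w2l formula; the base lr value is illustrative:

```python
def scale_lr(base_lr, base_total_batch, batch_per_gpu, n_gpus):
    """Linearly scale the learning rate to the effective batch size."""
    effective_batch = batch_per_gpu * n_gpus
    return base_lr * effective_batch / base_total_batch

# The config specifies batch size 8 per GPU with a total batch of 128
# (i.e. 16 GPUs). On 4 GPUs the effective batch is 32, so a
# hypothetical base lr of 0.4 would shrink by 4x:
print(scale_lr(0.4, 128, 8, 4))  # 0.1
```

Warmup steps are often rescaled in the opposite direction (more steps when the effective batch is smaller), so both knobs may need adjustment when training on fewer GPUs than the reference config.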
Could you post the full log, and also say whether this is the CTC or s2s criterion? Please also add a link to, or info about, your training config.
@AlexandderGorodetski Are you using the latest version of w2l? Please pull, because we recently fixed some problems with the lr schedule. Then please add `--warmup=64000` to the config (we didn't fix...
I see that you are running on 1 GPU only, is that right? Then the lr and warmup possibly need to be tuned.