Yubei

9 issue results for Yubei

Has anyone trained it on the full LibriSpeech training sets (train-clean-100, train-clean-360, train-other-500)? Could you share the WER when training on them? Thank you!

I found that you directly adapt all of the network layers from ADDA instead of only the feature extraction layers. I am confused: is this still feature adaptation?
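
For context on the question above: in the original ADDA formulation, adversarial adaptation is applied only to the target feature extractor, while the source classifier is kept frozen. The following is a minimal, illustrative PyTorch sketch of that setup; the module and variable names are placeholders and are not taken from this repository.

```python
# Illustrative ADDA-style sketch: ONLY the target feature extractor is adapted,
# the classifier is frozen. Names here are placeholders, not repo code.
import torch
import torch.nn as nn

feat_dim, num_classes = 256, 10

target_encoder = nn.Sequential(nn.Linear(40, feat_dim), nn.ReLU())  # adapted
classifier     = nn.Linear(feat_dim, num_classes)                   # frozen, reused at test time
discriminator  = nn.Linear(feat_dim, 1)                             # domain critic

for p in classifier.parameters():   # the classifier is NOT adapted in ADDA
    p.requires_grad = False

opt_enc  = torch.optim.Adam(target_encoder.parameters(), lr=1e-4)
opt_disc = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def adaptation_step(src_feats, tgt_feats, source_encoder):
    # 1) Train the discriminator to separate source features from target features.
    with torch.no_grad():
        f_src = source_encoder(src_feats)
    f_tgt = target_encoder(tgt_feats)
    d_loss = bce(discriminator(f_src), torch.ones(len(f_src), 1)) + \
             bce(discriminator(f_tgt.detach()), torch.zeros(len(f_tgt), 1))
    opt_disc.zero_grad(); d_loss.backward(); opt_disc.step()

    # 2) Update ONLY the target encoder so its features fool the discriminator.
    g_loss = bce(discriminator(target_encoder(tgt_feats)),
                 torch.ones(len(tgt_feats), 1))
    opt_enc.zero_grad(); g_loss.backward(); opt_enc.step()
```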

Hi~ Thank you for sharing! Would you like to share your PER on TIMIT and CER on the Chinese corpus? Thank you!

Hi, I've been training models for almost two days. Today, the GPU utilization suddenly dropped to 0%, but all of the GPU memory was still occupied by the experiment. In addition, the experimental...

I have a total of two GPU servers and one CPU server. I ran 4 trainers on each GPU server. I tried to run the ps and controller jobs on...
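
For reference, a generic TensorFlow cluster description for a layout like the one above (two GPU servers running 4 trainers each, one CPU server for the ps/controller jobs) might look as follows. This is an illustrative sketch only: hostnames, ports, and the mapping of jobs are placeholders, and the actual launch flags and the `controller` job are defined by this repository, not by this snippet.

```python
# Illustrative sketch of a distributed cluster layout (not this repo's launch code).
import tensorflow as tf

cluster = tf.train.ClusterSpec({
    # 4 trainer tasks per GPU server -> 8 worker tasks in total.
    "worker": [
        "gpu-server-1:2222", "gpu-server-1:2223",
        "gpu-server-1:2224", "gpu-server-1:2225",
        "gpu-server-2:2222", "gpu-server-2:2223",
        "gpu-server-2:2224", "gpu-server-2:2225",
    ],
    # Parameter server (and, in this codebase, the controller job) on the CPU server.
    "ps": ["cpu-server-1:2222"],
})

# Each process is started with its own job name and task index, e.g. the ps task:
server = tf.distribute.Server(cluster, job_name="ps", task_index=0)
```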

Hi! I finished shallow LM fusion in ASR using your `ComputeLogitsWithLM` in asr/fusion.py. I then tested it many times with different parameters that may affect the LM fusion results. However,...
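
For context, shallow fusion typically adds the external LM log-probabilities to the ASR log-probabilities at each decoding step, scaled by a single fusion weight, which is usually the parameter with the largest effect on results. The sketch below is illustrative only, assumes NumPy, and uses a hypothetical helper name; it does not reproduce `ComputeLogitsWithLM`.

```python
# Minimal shallow-fusion sketch (illustrative; `shallow_fusion_scores` is hypothetical).
import numpy as np

def shallow_fusion_scores(asr_logits, lm_logits, lm_weight=0.3):
    """Combine per-step ASR and LM scores over the same vocabulary.

    asr_logits, lm_logits: [vocab_size] unnormalized scores for the next token.
    Returns fused scores: log p_asr + lm_weight * log p_lm.
    """
    asr_logp = asr_logits - np.logaddexp.reduce(asr_logits)  # log-softmax
    lm_logp = lm_logits - np.logaddexp.reduce(lm_logits)
    return asr_logp + lm_weight * lm_logp

# During beam search, these fused scores replace the plain ASR scores when
# ranking candidate tokens at each decoding step.
```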

Could you share your pretrained model? Thank you!

Thank you for sharing your code! Could you share your WER on LibriSpeech?