leonwlw

Results: 3 comments by leonwlw

I also ran into this situation. I trained a model with conf/asr/transformer_mma/lc_transformer_mma_subsample8_ma4H_ca4H_chunk16_from4L_32_32_32.yaml on the AISHELL corpus. It could decode test_set in offline mode, but when I set recog_chunk_sync to true for decode_streaming, it failed. There...

> > But when decoding use the command
> >
> > ./pruned_transducer_stateless2/decode.py \
> >   --simulate-streaming 1 \
> >   --bpe-model ./data/lang_bpe_5000/bpe.model \
> >   --decode-chunk-size 16 \
> >   --causal-convolution 1 \
> >   --epoch 22 \
> >   --avg 10 \
> >   ...
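For context, a decode run of this shape in icefall's pruned_transducer_stateless2 recipe usually also specifies an experiment directory and a decoding method. The sketch below is only illustrative and does not complete the truncated command above; the --exp-dir, --max-duration, --left-context, and --decoding-method values are assumptions, and exact flag availability depends on the icefall revision.

```bash
# Illustrative simulated-streaming decode invocation (assumed values,
# not the commenter's original command).
./pruned_transducer_stateless2/decode.py \
  --simulate-streaming 1 \
  --causal-convolution 1 \
  --decode-chunk-size 16 \
  --left-context 64 \
  --bpe-model ./data/lang_bpe_5000/bpe.model \
  --epoch 22 \
  --avg 10 \
  --exp-dir pruned_transducer_stateless2/exp \
  --max-duration 600 \
  --decoding-method greedy_search
```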

> I think it would be better if we added the experiment results on AIShell-1 and LibriSpeech, to show that we can get a consistent and solid gain by using the model. @robin1001...