
[WIP] Fix MMI recipe

Open pkufool opened this issue 3 years ago • 3 comments

This PR tries to fix https://github.com/k2-fsa/icefall/issues/685 and to tune better results for the MMI models.

  • [x] Make current model converge.
  • [ ] Tune a better result.
  • [ ] Apply Zipformer to MMI recipe.

pkufool avatar Nov 21 '22 13:11 pkufool

Here are the results I have for libri-100:

| model | ctc-decoding | HLG 1best | HLG + 4-gram rescoring | HLG + 4-gram rescoring + attention decoder |
| --- | --- | --- | --- | --- |
| conformer-mmi (with attention decoder) | 95.68 & 122.38 | 6.15 & 17.58 | 5.6 & 16.02 (lm-scale=1.3) | 5.37 & 16.28 (lm-scale=2.0; att-scale=1.0) |

Decoded with `--epoch 29 --avg 12`.
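As an aside on the `--epoch 29 --avg 12` decode setting: this conventionally means averaging the parameters of the last 12 epoch checkpoints (ending at epoch 29) before decoding. A minimal sketch of that averaging, using plain dicts of floats in place of real PyTorch state_dicts (the function and toy data are illustrative assumptions, not icefall's actual implementation):

```python
def average_checkpoints(state_dicts):
    """Element-wise mean over a list of parameter dicts."""
    n = len(state_dicts)
    return {k: sum(sd[k] for sd in state_dicts) / n for k in state_dicts[0]}

# Toy "checkpoints" for epochs 18..29 (the last 12 up to epoch 29).
ckpts = [{"w": float(e)} for e in range(18, 30)]
print(average_checkpoints(ckpts))  # → {'w': 23.5}
```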

The ctc-decoding results are weird; I haven't figured out the issue yet.
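As context on the lm-scale and att-scale values in the table: in n-best rescoring the n-gram LM score (and optionally the attention-decoder score) is typically combined with the lattice score as a weighted sum, and the scales are tuned. A minimal sketch, assuming a simple linear combination over a toy n-best list (the function, hypothesis names, and all scores are illustrative assumptions, not icefall's actual code):

```python
# Hedged sketch: combining scores when rescoring an n-best list.
def combined_score(am_score, lm_score, att_score=0.0,
                   lm_scale=1.3, att_scale=0.0):
    """Linearly combine acoustic, n-gram LM, and attention-decoder log-scores."""
    return am_score + lm_scale * lm_score + att_scale * att_score

# Toy n-best list: (hypothesis, am, lm, attention-decoder) log-scores.
nbest = [
    ("hyp_a", -10.0, -4.0, -0.5),
    ("hyp_b", -11.0, -2.5, -2.8),
]

# HLG + 4-gram rescoring (lm-scale=1.3, no attention decoder).
best_lm = max(nbest, key=lambda h: combined_score(h[1], h[2], lm_scale=1.3))

# Adding the attention decoder (lm-scale=2.0, att-scale=1.0) can change
# which hypothesis wins.
best_att = max(
    nbest,
    key=lambda h: combined_score(h[1], h[2], h[3], lm_scale=2.0, att_scale=1.0),
)
print(best_lm[0], best_att[0])  # → hyp_b hyp_a
```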

Training command:

./conformer_mmi/train-with-attention.py \
  --exp-dir conformer_mmi/exp_100_att \
  --lang-dir data/lang_bpe_500 \
  --full-libri 0 \
  --max-duration 200 \
  --world-size 2 \
  --start-epoch 0 \
  --num-epochs 30 \
  --master-port 35673

pkufool avatar Nov 23 '22 06:11 pkufool

In my opinion, a bad ctc-decoding result is expected for an MMI model. I think your decoding results with HLG look good now.

Btw, does the & sign in your table mean both MMI only and MMI with attention decoder? I'm asking because I have only tried the MMI model, not MMI with the attention decoder.

dan-legit avatar Nov 24 '22 03:11 dan-legit

> Btw, does the & sign in your table mean both MMI only and MMI with attention decoder? I'm asking because I have only tried the MMI model, not MMI with the attention decoder.

No, & means test-clean and test-other. I am running MMI without the attention decoder and will update the results here soon.

pkufool avatar Nov 24 '22 12:11 pkufool