snowfall

Moved to https://github.com/k2-fsa/icefall

Results: 73 snowfall issues

I notice the --full-libri option defaults to true in mmi_att_transformer_train.py (specified in librispeech.py). This may be unexpected as it's not the previous behavior. Just pointing it out (we'll see what...

The loss we backprop is normalized by the number of supervisions: https://github.com/k2-fsa/snowfall/blob/5d1b00dd2ab4c809714f588fa6d2487cde5ea46c/egs/librispeech/asr/simple_v1/mmi_att_transformer_train.py#L114-L117 But the loss we report is normalized by the number of frames: https://github.com/k2-fsa/snowfall/blob/5d1b00dd2ab4c809714f588fa6d2487cde5ea46c/egs/librispeech/asr/simple_v1/mmi_att_transformer_train.py#L267-L278 It looks like it wasn't...
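For what it's worth, a toy sketch of that mismatch (the variable names, shapes, and values below are made up for illustration; the real code is at the links above):

```python
import torch

# Toy stand-ins: `tot_score` plays the role of the summed LF-MMI objective
# over the batch, `supervision_segments` follows the usual
# (sequence_idx, start_frame, num_frames) layout.
tot_score = torch.tensor(-1234.5, requires_grad=True)
supervision_segments = torch.tensor([[0, 0, 500],
                                     [1, 0, 450]], dtype=torch.int32)

num_supervisions = supervision_segments.shape[0]       # 2 supervisions
num_frames = supervision_segments[:, 2].sum().item()   # 950 frames

# The loss that is backpropagated is normalized by the number of supervisions...
loss = -tot_score / num_supervisions
loss.backward()

# ...but the objf that is reported is normalized by the number of frames,
# so the two numbers are on different scales.
reported_objf = tot_score.item() / num_frames
print(loss.item(), reported_objf)
```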

It would be nice to have a choice of which topology to use, passed in somehow, e.g. as a string when we do training. E.g. to have...
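A rough sketch of what that could look like with a string-valued command-line option; the flag name and the builder functions below are hypothetical placeholders, not existing snowfall API:

```python
import argparse

# Placeholder builders; in practice these would wrap whatever
# topology-construction code the recipe already has.
def build_ctc_topo(tokens):
    raise NotImplementedError('placeholder for a CTC-style topology builder')

def build_hmm_topo(tokens):
    raise NotImplementedError('placeholder for an HMM-style topology builder')

TOPOLOGY_BUILDERS = {
    'ctc': build_ctc_topo,
    'hmm': build_hmm_topo,
}

parser = argparse.ArgumentParser()
parser.add_argument('--topo-type', type=str, default='ctc',
                    choices=sorted(TOPOLOGY_BUILDERS),
                    help='Which topology to build for training.')
args = parser.parse_args()

topo_builder = TOPOLOGY_BUILDERS[args.topo_type]
```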

This PR implements iterated loss from https://github.com/k2-fsa/snowfall/issues/179#issuecomment-830565127. Reference: https://arxiv.org/pdf/1910.10324.pdf. The following results could be reproduced with:

```
python mmi_att_transformer_train.py --world-size 2 --full-libri 0 --use-ali-model 0 --max-duration 250 --iterated-layers 5 --iterated-scale...
```
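For context, a minimal sketch of the iterated-loss idea from the referenced paper: an auxiliary output head is attached to an intermediate encoder layer, and its loss is added, scaled, to the final loss. The class, layer structure, and names below are illustrative only, not the PR's actual code:

```python
import torch
import torch.nn as nn

class EncoderWithAuxOutput(nn.Module):
    """Toy encoder that also emits log-probs from an intermediate layer."""

    def __init__(self, num_layers: int, d_model: int, num_classes: int,
                 aux_layer: int):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Linear(d_model, d_model) for _ in range(num_layers)])
        self.aux_layer = aux_layer
        self.aux_head = nn.Linear(d_model, num_classes)    # extra output head
        self.final_head = nn.Linear(d_model, num_classes)

    def forward(self, x):
        aux_out = None
        for i, layer in enumerate(self.layers):
            x = torch.relu(layer(x))
            if i + 1 == self.aux_layer:
                # Auxiliary log-probs taken partway through the encoder.
                aux_out = self.aux_head(x).log_softmax(dim=-1)
        final_out = self.final_head(x).log_softmax(dim=-1)
        return final_out, aux_out


model = EncoderWithAuxOutput(num_layers=12, d_model=256, num_classes=500,
                             aux_layer=5)
final_out, aux_out = model(torch.randn(2, 100, 256))

# In training, both outputs would feed the same objective, roughly:
#   loss = lfmmi_loss(final_out) + aux_scale * lfmmi_loss(aux_out)
```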

python3 ./mmi_bigram_train.py
World size: 1
Rank: 0
Traceback (most recent call last):
  File "./mmi_bigram_train.py", line 502, in <module>
    main()
  File "./mmi_bigram_train.py", line 280, in main
    lexicon = Lexicon(lang_dir)
  File "/tmp/lingvo/snowfall/build/lib/snowfall/lexicon.py", line...

I am referring to the following: https://github.com/k2-fsa/snowfall/blob/b7f76b663ba61c42c7b63aed4a101ed4e3f0fc16/snowfall/objectives/ctc.py#L38 https://github.com/k2-fsa/snowfall/blob/b7f76b663ba61c42c7b63aed4a101ed4e3f0fc16/snowfall/objectives/mmi.py#L85 This is a bit unusual to me. Was there a particular motivation? @csukuangfj it seems like you were the one who chose...

Perhaps @zhu-han can try this... we should be able to implement label smoothing for our LF-MMI system by adding some small constant times `-nnet_output.mean(1).sum() / (len(texts) * accum_grad)` to the...
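A minimal sketch of that suggestion, using toy tensors in place of the quantities from the training loop; the 0.1 weight and all shapes are just example values:

```python
import torch

# Toy stand-ins for quantities that already exist in the training loop.
num_seqs, num_frames, num_classes = 4, 100, 500
nnet_output = torch.randn(num_seqs, num_frames, num_classes,
                          requires_grad=True).log_softmax(dim=-1)
texts = ['dummy transcript'] * num_seqs
accum_grad = 1
mmi_loss = -nnet_output.sum() / num_seqs   # placeholder for the real LF-MMI loss

smooth_scale = 0.1  # the "small constant"; would need tuning
smoothing_term = -nnet_output.mean(1).sum() / (len(texts) * accum_grad)

loss = mmi_loss + smooth_scale * smoothing_term
loss.backward()
```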

It would be nice if we could print version information to make it easier to reproduce experiments and load trained models. I am thinking of printing things like git branch...

help wanted
easy
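Regarding the version-information request above: a minimal sketch of how the git details could be collected and logged at startup, assuming the script runs from inside the git checkout (the helper name is made up):

```python
import logging
import subprocess


def get_git_info() -> dict:
    """Collect git metadata for reproducibility logging.

    Returns empty values if the script is not run inside a git checkout.
    """
    def run(cmd):
        try:
            return subprocess.check_output(
                cmd, stderr=subprocess.DEVNULL).decode().strip()
        except (OSError, subprocess.CalledProcessError):
            return ''

    return {
        'git_branch': run(['git', 'rev-parse', '--abbrev-ref', 'HEAD']),
        'git_commit': run(['git', 'rev-parse', '--short', 'HEAD']),
        'git_dirty': run(['git', 'status', '--porcelain']) != '',
    }


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    logging.info('Version info: %s', get_git_info())
```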

Right now, if I change the script I'm running before a job has finished, I get errors (and I'm not 100% sure, but I think I've seen errors if I...

Hi guys, I have installed the latest lhotse, k2, and snowfall and run the 100h librispeech example using mmi_att_transformer_train.py. The model does not seem to converge well, as the valid objf...