snowfall
Moved to https://github.com/k2-fsa/icefall
It implements https://github.com/k2-fsa/snowfall/pull/106#issuecomment-803796177

> BTW, since it seems this is hard to get to work, if you feel like it you could work on a simpler idea. In training time...
You are one of the main groups that have been using Lhotse throughout the last year. I’d like to solicit your feedback on Lhotse’s API — do you think there is...
Hi, for my experiment, the built-in CTC loss (cuDNN CTC) is about 2.5 times faster than k2's CTC loss. I was wondering if this is normal and would like to make sure my program is correct....
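For anyone wanting to reproduce such a comparison, below is a minimal timing sketch (not taken from the issue). It assumes k2 exposes `ctc_graph`, `DenseFsaVec`, and `ctc_loss` as in recent releases; names and defaults may differ across k2 versions, and `torch.nn.CTCLoss` only takes the cuDNN path when its constraints (blank=0, concatenated int32 targets on CPU, etc.) are met.

```python
# Minimal timing sketch: built-in (cuDNN-backed) CTC loss vs. k2's CTC loss.
# k2.ctc_graph / k2.DenseFsaVec / k2.ctc_loss follow recent k2 releases and
# may be named or parameterized differently in older versions.
import time

import torch
import k2

device = torch.device("cuda")
N, T, C, U = 8, 500, 500, 50  # batch, frames, vocab size (blank = 0), target length

log_probs = torch.randn(N, T, C, device=device).log_softmax(-1).requires_grad_(True)
targets = torch.randint(1, C, (N * U,), dtype=torch.int32)          # concatenated, on CPU
input_lengths = torch.full((N,), T, dtype=torch.int32)
target_lengths = torch.full((N,), U, dtype=torch.int32)


def bench(fn, warmup=3, iters=20):
    for _ in range(warmup):
        fn()
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        fn()
    torch.cuda.synchronize()
    return (time.time() - start) / iters


# 1) Built-in CTC loss (uses cuDNN when its constraints are satisfied).
builtin_ctc = torch.nn.CTCLoss(blank=0, reduction="sum")


def run_builtin():
    log_probs.grad = None
    loss = builtin_ctc(log_probs.permute(1, 0, 2), targets, input_lengths, target_lengths)
    loss.backward()


# 2) k2 CTC loss: compile a CTC topology graph per utterance and intersect it
#    with the network output wrapped in a DenseFsaVec.
supervision_segments = torch.stack(
    [
        torch.arange(N, dtype=torch.int32),      # sequence index
        torch.zeros(N, dtype=torch.int32),       # start frame
        torch.full((N,), T, dtype=torch.int32),  # number of frames
    ],
    dim=1,
)
decoding_graph = k2.ctc_graph(targets.view(N, U).tolist(), modified=False, device=device)


def run_k2():
    log_probs.grad = None
    dense_fsa_vec = k2.DenseFsaVec(log_probs, supervision_segments)
    loss = k2.ctc_loss(decoding_graph, dense_fsa_vec, output_beam=10, reduction="sum")
    loss.backward()


print(f"built-in CTC: {bench(run_builtin) * 1e3:.1f} ms/iter")
print(f"k2 CTC:       {bench(run_k2) * 1e3:.1f} ms/iter")
```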
## With unigram LM for P

```
export CUDA_VISIBLE_DEVICES="0"

./mmi_att_transformer_train.py \
  --master-port=12355 \
  --full-libri=0 \
  --use-ali-model=0 \
  --max-duration=500 \
  --use-unigram=1

./mmi_att_transformer_decode.py \
  --use-lm-rescoring=1 \
  --num-paths=100 \
  --max-duration=300 \
  --use-unigram=1
```
...
From https://github.com/k2-fsa/snowfall/pull/173#issuecomment-833624666

> BTW, someone should have a close look at SpeechBrain to see whether we might be able to use it with Lhotse and k2 as the base for...
A small vocab size, e.g., 200, is used to avoid OOM when the bigram P is used. After removing P, it is possible to use a large vocab size, e.g., 5000....
WER results of this PR (with models loaded from the ESPnet model zoo):

```
test-clean  2.43%
test-other  5.79%
```

This PR implements the following procedure with models from the ESPnet model...
Usage:

```bash
$ snowfall net compute-post -m /ceph-fj/model-jit.pt -f exp/data/cuts_test-clean.json.gz -o exp
```

I find that there is one issue with the TorchScript module: we have to know the...
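For context, a rough sketch (not the actual snowfall code) of the core of such a compute-post step with a scripted model. `load_manifest` and `Cut.load_features` are Lhotse's public API; the model path, the (T, F) feature layout, and the model's forward signature are assumptions made here for illustration.

```python
# Rough sketch: compute posteriors with a TorchScript model over a Lhotse
# cut manifest.  The forward signature ((N, T, F) features in, (N, T, C)
# log-posteriors out) is an assumption; a scripted module does not expose
# this contract by itself, so it has to be known in advance.
import torch
from lhotse import load_manifest

model = torch.jit.load("exp/model-jit.pt", map_location="cpu")  # placeholder path
model.eval()

cuts = load_manifest("exp/data/cuts_test-clean.json.gz")

with torch.no_grad():
    for cut in cuts:
        feats = torch.from_numpy(cut.load_features())  # (T, F)
        post = model(feats.unsqueeze(0))               # assumed (N, T, C)
        print(cut.id, tuple(post.shape))
```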
The current setup inherits building lexicon FSTs from Kaldi. I think it makes sense to have the ability to build them directly in Python, which should make building new recipes...
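As a starting point, here is a hedged sketch of building a simple L (lexicon) transducer directly in Python with k2, mirroring the Kaldi-style loop construction. The helper names (`lexicon_to_fst`, `phone2id`, `word2id`) are just for illustration; disambiguation symbols and optional silence are omitted, and `k2.Fsa.from_str(..., acceptor=False)` follows the k2 API of that era (newer releases spell the transducer option differently).

```python
# Sketch: build a lexicon transducer L (phones in, words out) in Python with
# k2.  No disambiguation symbols or optional silence are handled here.
import k2


def lexicon_to_fst(lexicon, phone2id, word2id):
    """lexicon is a list of (word, [phone, ...]) pairs.
    Each arc line has the form: src dst phone_label word_aux_label score."""
    loop_state = 0   # words start and end here
    next_state = 1   # next unallocated state
    arcs = []
    for word, phones in lexicon:
        labels = [phone2id[p] for p in phones]
        cur = loop_state
        for i, label in enumerate(labels):
            aux = word2id[word] if i == 0 else 0  # word on the first arc, epsilon afterwards
            if i < len(labels) - 1:
                arcs.append([cur, next_state, label, aux, 0.0])
                cur = next_state
                next_state += 1
            else:
                arcs.append([cur, loop_state, label, aux, 0.0])  # last phone returns to the loop
    final_state = next_state
    arcs.append([loop_state, final_state, -1, -1, 0.0])  # final arcs carry label -1 in k2
    arcs.append([final_state])                            # the final state itself
    arcs = sorted(arcs, key=lambda a: a[0])
    text = "\n".join(" ".join(str(x) for x in a) for a in arcs)
    return k2.arc_sort(k2.Fsa.from_str(text, acceptor=False))


# Tiny usage example with a made-up two-word lexicon.
phone2id = {"<eps>": 0, "HH": 1, "AH": 2, "L": 3, "OW": 4, "W": 5, "ER": 6, "D": 7}
word2id = {"<eps>": 0, "hello": 1, "world": 2}
L = lexicon_to_fst(
    [("hello", ["HH", "AH", "L", "OW"]), ("world", ["W", "ER", "L", "D"])],
    phone2id,
    word2id,
)
print(L)
```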