Aurora-6

14 comments of Aurora-6

> What is your OS version? (I know it's centos, but what's the exact version?) > > The conda package is built using Ubuntu 18.04. If your OS was released...
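
One quick way to answer that on the affected machine is to print the exact distro release and the system glibc version, since a conda package built on Ubuntu 18.04 generally needs a glibc at least as new as the 2.27 that release ships. A minimal sketch (release-file paths vary by distro):

```bash
# Exact OS release; on CentOS this is also available as /etc/centos-release
cat /etc/os-release
# System glibc version; Ubuntu 18.04 ships glibc 2.27, so anything older here
# is a likely reason a conda-built binary package refuses to load
ldd --version | head -n 1
```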

I deleted the k2 scripts and re-installed k2 from the GitHub source. I ran `python3 setup.py install` and the installation finished with "Finished processing dependencies for k2==1.17.dev20220714+cuda11.1.torch1.8.1". But when I import k2,...
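
When a build reports success but `import k2` still misbehaves, it helps to check which copy of k2 the interpreter actually resolves, since a stale install elsewhere on the path can shadow the fresh one. A small check along these lines (k2 ships a `k2.version` helper that prints its build summary):

```bash
# Show where the interpreter resolves k2 from; if the import fails,
# the traceback usually names the shared library that cannot be loaded
python3 -c "import k2; print(k2.__file__)"
# Print k2's build/environment summary
python3 -m k2.version
```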

> Please remove all files/folders whose name contains k2 inside the directory > /root/miniconda3/lib/python3.8/site-packages > and then re-run `python3 setup.py install`. Sorry, I forgot to thank you for your help. I've been...
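
For reference, the clean-up suggested above amounts to something like the following (the site-packages path is the one from this thread; check what the glob matches before deleting anything):

```bash
SITE_PACKAGES=/root/miniconda3/lib/python3.8/site-packages
# First inspect every entry whose name contains k2
ls -d "${SITE_PACKAGES}"/*k2*
# Then remove them and rebuild/install from the source checkout
rm -rf "${SITE_PACKAGES}"/*k2*
python3 setup.py install
```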

> > Does current k2 support streaming training and decoding? > > Yes, it does. > > Please see > > * https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/RESULTS.md#with-lower-latency-setup-training-on-full-librispeech > * https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/RESULTS.md#librispeech-bpe-training-results-pruned-stateless-streaming-conformer-rnn-t > * https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/RESULTS.md#librispeech-bpe-training-results-pruned-stateless-conv-emformer-rnn-t >...

> > ok, I get it. Are the streaming training and decoding scripts shown in the streaming branch of Icefall? > > Please don't use the streaming branch at all....

> @Aurora-6 You can train a streaming model using recipes pruned_transducer_statelessX, all you need to do is specify the related parameters, mainly `dynamic-chunk-training`, `causal-convolution` and `num-left-chunks`. > > Take pruned_transducer_stateless2...
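
To make that advice concrete, a streaming-style training run with pruned_transducer_stateless2 might look roughly like this; the three flags are the ones named in the reply, while the remaining values (GPU count, epochs, left-chunk count) are placeholders rather than a recommended configuration:

```bash
# Sketch only: train with dynamic chunks and causal convolution so the
# resulting model can later be decoded in streaming / simulated-streaming mode
./pruned_transducer_stateless2/train.py \
  --world-size 4 \
  --num-epochs 30 \
  --exp-dir pruned_transducer_stateless2/exp \
  --dynamic-chunk-training 1 \
  --causal-convolution 1 \
  --num-left-chunks 4
```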

I have trained a transducer model with the hand-crafted cuts.jsonl.gz following the scripts (https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/pruned_transducer_stateless2). But when I decode with the command `./pruned_transducer_stateless2/decode.py \ --simulate-streaming 1 \ --bpe-model ./data/lang_bpe_5000/bpe.model \ --decode-chunk-size 16 \...
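
For context, a simulated-streaming decode with that recipe usually combines the flags quoted above with the usual checkpoint and decoding options, roughly as below; the epoch/avg/left-context values and paths are placeholders, not the exact command that was run here:

```bash
# Sketch of a simulated-streaming decode; chunk size and left context
# control the attention window the model sees at decode time
./pruned_transducer_stateless2/decode.py \
  --simulate-streaming 1 \
  --decode-chunk-size 16 \
  --left-context 64 \
  --bpe-model ./data/lang_bpe_5000/bpe.model \
  --exp-dir pruned_transducer_stateless2/exp \
  --epoch 30 \
  --avg 10 \
  --decoding-method greedy_search
```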

Here are the steps I use to generate the final cuts.jsonl.gz.
Step 1: `lhotse kaldi import -f 0.01 ${librispeech_data}/test-clean 16000 data/manifests/${name}` to generate features.jsonl.gz, recordings.jsonl.gz and supervisions.jsonl.gz.
Step 2: run...
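
Step 2 is cut off above; for completeness, one common way to turn the imported manifests into a cuts.jsonl.gz with the lhotse CLI is sketched below. Whether this matches the exact step used here is an assumption, and the paths simply follow step 1 (if the option names differ in your lhotse version, `lhotse cut simple --help` lists them):

```bash
# Sketch: pair each supervision with its recording (and the imported Kaldi
# features) to produce the final cut manifest
lhotse cut simple \
  -r data/manifests/${name}/recordings.jsonl.gz \
  -f data/manifests/${name}/features.jsonl.gz \
  -s data/manifests/${name}/supervisions.jsonl.gz \
  data/manifests/${name}/cuts.jsonl.gz
```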