MohitTare
Hi @Najat91, as discussed over LinkedIn and confirmed by @perone, you can use the above method to load a custom dataset. In other specific cases you can use nibabel...
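For completeness, here is a minimal sketch of reading a NIfTI volume with nibabel (the filename is only a placeholder):

```python
import nibabel as nib
import numpy as np

# Load a NIfTI volume from disk (path is just an example)
img = nib.load("subject01_T1w.nii.gz")

# Voxel data as a numpy array, plus the affine and voxel spacing
data = np.asarray(img.dataobj, dtype=np.float32)
affine = img.affine

print(data.shape, img.header.get_zooms())
```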
Thanks for this wonderful project. While exploring the library I put together an example [notebook](https://github.com/MohitTare/medicaltorch/blob/master/examples/Using%20Medicaltorch%20for%20Medical%20imaging%20NIFTI%20datasets.ipynb) on how to create a dataloader using medicaltorch. Will try to work...
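For readers who can't open the notebook, the rough shape of such a pipeline might look like the sketch below. The class and helper names (`MRI2DSegmentationDataset`, `mt_collate`, `ToTensor`) are my reading of the medicaltorch API and may differ between versions; the file paths are placeholders:

```python
from torch.utils.data import DataLoader
from torchvision import transforms
from medicaltorch import datasets as mt_datasets
from medicaltorch import transforms as mt_transforms

# Pairs of (input NIfTI, ground-truth NIfTI) file paths -- placeholder names
filename_pairs = [("image_01.nii.gz", "mask_01.nii.gz")]

# Convert slices to tensors; more medicaltorch transforms can be composed here
composed = transforms.Compose([
    mt_transforms.ToTensor(),
])

# 2D segmentation dataset built from the NIfTI pairs (API may vary by version)
dataset = mt_datasets.MRI2DSegmentationDataset(filename_pairs, transform=composed)

loader = DataLoader(dataset, batch_size=4, shuffle=True,
                    collate_fn=mt_datasets.mt_collate)

for batch in loader:
    # medicaltorch batches are dicts; 'input' and 'gt' hold image and mask slices
    inputs, labels = batch["input"], batch["gt"]
    break
```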
Thanks @perone for checking out the notebook. I have opened a PR. You can review it and let me know in case of any changes (PS: I am a bit...
Ok, here are the specifics. The error for the non-streaming version is measured using the provided Decoder binary. For the streaming ASR we are using the [MultithreadedStreamingBinary](https://github.com/facebookresearch/wav2letter/blob/v0.2/inference/inference/examples/MultithreadedStreamingASRExample.cpp) and the error rate...
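For anyone reproducing the comparison, the metric itself is a plain word-level edit distance. A minimal sketch is below; this is a generic illustration, not the wav2letter decoder's implementation:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by the reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn the first i ref words into the first j hyp words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + sub)  # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

print(word_error_rate("the cat sat", "the cat sit"))  # -> 0.333...
```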
Thanks for your input. > https://github.com/facebookresearch/wav2letter/blob/v0.2/inference/inference/examples/AudioToWords.cpp#L47 to 1000 or 1500. We tried this, but the error rate didn't improve drastically (a 2-3 point difference). I am sharing our token [set](https://drive.google.com/file/d/17rlJdntqoAfo-lk9raeY7e7zQ_TmtZC2/view?usp=sharing)...
Ok, so I found the issue. The large difference was due to the default sampling frequency of 16k [here](https://github.com/facebookresearch/wav2letter/blob/v0.2/inference/inference/module/feature/LogMelFeature.h#L31). Passing the appropriate value of 8k for the mu-law encoded audio fixed the issue and the...
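In case it helps others hitting the same mismatch: a quick way to sanity-check the input sample rate before running the streaming example. The soundfile/scipy usage is my own choice for illustration, not part of wav2letter, and the filename is a placeholder:

```python
import soundfile as sf
from scipy.signal import resample_poly

# Read the audio; mu-law telephony data is commonly recorded at 8 kHz
audio, sr = sf.read("utterance.wav")
print("sample rate:", sr)

# If the feature pipeline expects 16 kHz but the audio is 8 kHz, either pass
# the correct rate to the pipeline (as above) or upsample the audio first.
if sr == 8000:
    audio_16k = resample_poly(audio, up=2, down=1)
```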
Hey @vineelpratap, we have now made the change to stride 4 by editing [here](https://github.com/facebookresearch/wav2letter/blob/master/recipes/streaming_convnets/librispeech/am_500ms_future_context.arch#L19). Does this affect the future context of 500ms during streaming? We are getting a performance...
Hello, I am trying to train it and you can follow the steps below:
1. Clone the repo and set it up by appending the repo path to the PYTHONPATH...
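A minimal sketch of step 1 from inside a script or notebook, assuming a hypothetical clone location (adjust the path to wherever you cloned the repo):

```python
import os
import sys

# Make the cloned repo importable, equivalent to exporting PYTHONPATH in the shell.
repo_path = os.path.expanduser("~/work/repo")  # hypothetical clone location
if repo_path not in sys.path:
    sys.path.insert(0, repo_path)
```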