Andrés
Any updates on this? Maybe this helps https://stackoverflow.com/a/75858606/4544940.
For the GPU image, adding the `--no-cache` flag reduced the image size from 14.6GB to 11.2GB.
It would be helpful to clarify in the README that using `BatchedInferencePipeline` may lead to degraded transcription quality.
In my case I evaluated against 150 samples of TV/radio media in Spanish, and with the same model and default settings I got a WER of ≈ 22 for batch processing, ...
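
For anyone wanting to reproduce this kind of comparison, here is a minimal sketch (it assumes faster-whisper's documented API; the model name, audio path, reference transcript, and the use of `jiwer` for WER are placeholders, not details from this thread):

```python
# Minimal sketch: compare sequential vs. batched transcription and compute WER.
# "large-v3", "sample.wav", the reference transcript, and jiwer are
# placeholders/assumptions, not taken from this thread.
from faster_whisper import WhisperModel, BatchedInferencePipeline
import jiwer

model = WhisperModel("large-v3", device="cuda", compute_type="float16")
batched = BatchedInferencePipeline(model=model)

reference = "texto de referencia ..."  # placeholder ground-truth transcript


def transcribe_text(pipeline, audio_path, **kwargs):
    """Run transcription and join segment texts into one hypothesis string."""
    segments, _info = pipeline.transcribe(audio_path, language="es", **kwargs)
    return " ".join(segment.text.strip() for segment in segments)


sequential_hyp = transcribe_text(model, "sample.wav")
batched_hyp = transcribe_text(batched, "sample.wav", batch_size=16)

print("sequential WER:", jiwer.wer(reference, sequential_hyp))
print("batched WER:   ", jiwer.wer(reference, batched_hyp))
```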