
Batch Inference

Open nickmitchko opened this issue 1 year ago • 2 comments

Hi, is it possible to batch inference requests the way LLM serving does? For example, could I provide 10 transcripts and batch them together to increase total throughput?

nickmitchko avatar Jun 28 '24 19:06 nickmitchko

TTS inference already runs in batch mode; right now it just repeats the same transcript across the batch and selects the shortest output. The code needs some tweaks to support batching different transcripts.

jasonppy avatar Jun 29 '24 14:06 jasonppy
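For readers wondering what such a tweak might look like, here is a minimal PyTorch sketch of the idea: pad several different transcripts to a common length so each batch element carries its own text, rather than repeating one transcript. This is not VoiceCraft's real API; `tokenize_transcript`, `model.generate_batch`, and `decode_audio` are hypothetical placeholders standing in for the repo's tokenization, generation, and codec-decoding steps.

```python
# Hedged sketch only: the callables passed in below are hypothetical
# placeholders, not VoiceCraft's actual entry points.
import torch
from torch.nn.utils.rnn import pad_sequence

def batch_tts(model, tokenize_transcript, decode_audio, transcripts,
              pad_id=0, device="cuda"):
    """Run several *different* transcripts through the model in one batch.

    The thread describes the current behavior as repeating one transcript
    across the batch and keeping the shortest output; the change sketched
    here is to pad distinct transcripts into a single rectangular batch.
    """
    # Tokenize each transcript separately (hypothetical tokenizer).
    token_seqs = [torch.tensor(tokenize_transcript(t), device=device)
                  for t in transcripts]
    lengths = torch.tensor([len(s) for s in token_seqs], device=device)

    # Pad to the longest sequence so the batch is rectangular.
    batch = pad_sequence(token_seqs, batch_first=True, padding_value=pad_id)

    # Mask out the padded positions.
    attn_mask = (torch.arange(batch.size(1), device=device)[None, :]
                 < lengths[:, None])

    # One generation call for the whole batch (hypothetical signature).
    with torch.no_grad():
        codec_tokens = model.generate_batch(batch, attention_mask=attn_mask)

    # Decode each element's codec tokens back to a waveform.
    return [decode_audio(codec_tokens[i]) for i in range(len(transcripts))]
```

The sketch only illustrates the batching/padding pattern; hooking it up would require the repo's real tokenizer, sampling loop, and codec decoder.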

Isn't inference_tts_scale.py the way to do batch inference?

thivux avatar Aug 23 '24 05:08 thivux