Retrieval-based-Voice-Conversion-WebUI

Batch size 4 vs 8

itsthepatrick opened this issue 10 months ago • 2 comments

Hi, I have a 3050 6GB, so the best batch size I can work with is 4. I'd like to know how much time would be saved on a 3080 12GB, which would let me use batch size 8. Is it worth renting one online?

(Right now, with ~15 min of vocals, training takes around 1:05 per epoch.)

itsthepatrick avatar Jan 30 '25 09:01 itsthepatrick

In my experience, doubling the batch size didn't make the training much faster. You can simulate its benefits by training your models on Colab and comparing the training time with different batch sizes.

comicdodge avatar Sep 09 '25 13:09 comicdodge
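The suggestion above can be put into a back-of-envelope model (all numbers below are hypothetical placeholders, not measurements): epoch time is roughly the number of steps times the seconds per step, and doubling the batch size halves the steps but usually makes each step slower, so the wall-clock saving is well under 2×.

```python
# Back-of-envelope model of why doubling the batch size rarely halves
# epoch time. All numbers are illustrative assumptions, not measurements.

def epoch_time(num_samples, batch_size, step_time):
    """Epoch wall-clock time = number of steps * seconds per step."""
    steps = -(-num_samples // batch_size)  # ceiling division
    return steps * step_time

num_samples = 1200  # hypothetical number of segments from ~15 min of vocals

# Hypothetical per-step times: a bigger batch does more work per step,
# so each step is slower (here, not quite 2x slower for a 2x batch).
t_bs4 = epoch_time(num_samples, batch_size=4, step_time=0.20)
t_bs8 = epoch_time(num_samples, batch_size=8, step_time=0.35)

print(f"batch size 4: {t_bs4:.0f} s/epoch")
print(f"batch size 8: {t_bs8:.0f} s/epoch")
print(f"speedup: {t_bs4 / t_bs8:.2f}x")  # well below 2x under these assumptions
```

Under these made-up numbers the speedup is only about 1.14×, which matches the experience that doubling the batch size doesn't make training much faster; the real ratio has to be measured, e.g. on Colab as suggested.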


Hi, what about using a newer GPU? Does that make a big difference? (For example, a 4060 instead of that 3050.)

I ask because ChatGPT says a model that takes 5 hours to train would take around two hours on a 4060, which I doubt.

itsthepatrick avatar Sep 14 '25 09:09 itsthepatrick
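To sanity-check claims like "5 hours becomes 2 hours", one can scale the current epoch time by an assumed relative speedup (the speedup values below are hypothetical placeholders; real speedups depend on the model, precision, and data pipeline, so the only reliable answer is to benchmark a few epochs on each card):

```python
# Rough scaling estimate: new_epoch_time ≈ old_epoch_time / relative_speedup.
# The relative_speedup values are hypothetical assumptions, not benchmarks.

def scaled_epoch_time(old_seconds, relative_speedup):
    """Estimate the new epoch time given an assumed relative speedup."""
    return old_seconds / relative_speedup

old_epoch = 65  # seconds per epoch on the current card (from the question: ~1:05)

for label, speedup in [("modest upgrade (assumed 1.5x)", 1.5),
                       ("optimistic upgrade (assumed 2.5x)", 2.5)]:
    print(f"{label}: ~{scaled_epoch_time(old_epoch, speedup):.0f} s/epoch")
```

The 2.5× figure ChatGPT implies (5 h → 2 h) sits at the optimistic end; training is often bottlenecked by data loading or memory bandwidth rather than raw compute, so a short benchmark on a rented instance is the only way to know the real number.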